English dictionary
West Berlin — meaning and definition
Definition of West Berlin (noun)
- the part of Berlin under United States and British and French control until 1989
Source: Princeton University WordNet