
WEST BERLIN noun

Definition of West Berlin (noun)

  1. the part of Berlin under United States, British, and French control until 1989
Source: Princeton University Wordnet
