Definition of West Germany (noun)
- a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990
- synonyms: Federal Republic of Germany
Source: Princeton University WordNet