English dictionary
Germany: meaning and definition
Definition of Germany (noun)
- a republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990
- synonyms: Deutschland, Federal Republic of Germany, FRG
Source: Princeton University WordNet