English dictionary

Germany meaning and definition

Definition and meaning of Germany at MeaningMonkey.org, from the English dictionary.

GERMANY noun

Definition of Germany (noun)

  1. a republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990
Source: Princeton University WordNet
