
Deutschland: meaning and definition

Definition and meaning of Deutschland in the English dictionary at MeaningMonkey.org.

DEUTSCHLAND noun

Definition of Deutschland (noun)

  1. a republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990
Source: Princeton University WordNet
