Deutschland: meaning and definition
Definition of Deutschland (noun)
- a republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990
- synonyms: Federal Republic of Germany, FRG, Germany
Source: Princeton University WordNet