English dictionary
Germanism: meaning and definition
Definition of Germanism (noun)
- a custom that is peculiar to Germany or its citizens
Source: Princeton University WordNet