English dictionary


GERMANISM noun

Definition of Germanism (noun)

  1. a custom that is peculiar to Germany or its citizens
Source: Princeton University Wordnet
