centrism meaning and definition
Definition of centrism (noun)
- a political philosophy of avoiding the extremes of left and right by taking a moderate position or course of action
- synonyms: moderatism
Source: Princeton University WordNet