
centrism meaning and definition


CENTRISM noun

Definition of centrism (noun)

  1. a political philosophy of avoiding the extremes of left and right by taking a moderate position or course of action
Source: Princeton University WordNet
