English dictionary

western meaning and definition


WESTERN noun

Definition of Western (noun)

  1. a film about life in the western United States during the period of exploration and development
  2. a sandwich made from a western omelet

WESTERN adjective

Definition of western (adjective)

  1. relating to or characteristic of the western parts of the world or the West as opposed to the eastern or oriental parts
    • "the Western world"; "Western thought"
    • antonym: eastern
  2. of or characteristic of regions of the United States west of the Mississippi River
  3. lying toward or situated in the west
    • "our company's western office"
  4. (of wind) coming from the west
Source: Princeton University Wordnet

