
Western United States: definition and meaning


WESTERN UNITED STATES noun

Definition of Western United States (noun)

  1. the region of the United States lying to the west of the Mississippi River
Source: Princeton University Wordnet
