western United States: meaning and definition
WESTERN UNITED STATES (noun)
Definition of western United States (noun)
- the region of the United States lying to the west of the Mississippi River
- synonyms: West
Source: Princeton University WordNet