English dictionary
west meaning and definition
WEST noun
Definition of West (noun)
- the countries of (originally) Europe and (now including) North America and South America
- synonyms: Occident
- the cardinal compass point that is at 270 degrees
- the region of the United States lying to the west of the Mississippi River
- synonyms: western United States
- the direction corresponding to the westward cardinal compass point
- British writer (born in Ireland) (1892-1983)
- synonyms: Cicily Isabel Fairfield, Dame Rebecca West, Rebecca West
- United States film actress (1892-1980)
- synonyms: Mae West
- English painter (born in America) who became the second president of the Royal Academy (1738-1820)
- synonyms: Benjamin West
- a location in the western part of a country, region, or city
WEST adjective
Definition of west (adjective)
- situated in or facing or moving toward the west
- antonym: east
WEST adverb
Definition of west (adverb)
- to, toward, or in the west
- "we moved west to Arizona"; "situated west of Boston"
Source: Princeton University WordNet