English dictionary

west meaning and definition


WEST noun

Definition of West (noun)

  1. the countries of (originally) Europe and (now including) North America and South America
  2. the cardinal compass point that is at 270 degrees
  3. the region of the United States lying to the west of the Mississippi River
  4. the direction corresponding to the westward cardinal compass point
  5. British writer (born in Ireland) (1892-1983)
  6. United States film actress (1892-1980)
  7. English painter (born in America) who became the second president of the Royal Academy (1738-1820)
  8. a location in the western part of a country, region, or city

WEST adjective

Definition of west (adjective)

  1. situated in or facing or moving toward the west

WEST adverb

Definition of west (adverb)

  1. to, toward, or in the west
    • "we moved west to Arizona"; "situated west of Boston"
Source: Princeton University WordNet
