
West Country: meaning and definition


WEST COUNTRY noun

Definition of West Country (noun)

  1. the southwestern part of England (including Cornwall, Devon, and Somerset)
Source: Princeton University WordNet
