West Country (noun)
Definition of West Country (noun)
- the southwestern part of England (including Cornwall and Devon and Somerset)
Source: Princeton University WordNet