West Coast: meaning and definition
Definition of West Coast (noun)
- the western seaboard of the United States from Washington to southern California
Source: Princeton University WordNet