West Indies: meaning and definition
Definition of West Indies (noun)
- the string of islands between North America and South America; a popular resort area
- synonyms: the Indies
Source: Princeton University WordNet