West Indies meaning and definition

WEST INDIES noun

Definition of West Indies (noun)

  1. the string of islands between North America and South America; a popular resort area
Source: Princeton University WordNet
