
Haiti: meaning and definition

Definition and meaning of Haiti at MeaningMonkey.org.

HAITI noun

Definition of Haiti (noun)

  1. a republic in the West Indies on the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest and most illiterate nation in the western hemisphere
  2. an island in the West Indies
Source: Princeton University WordNet
