English dictionary

Florida meaning and definition

Definition and meaning of Florida at MeaningMonkey.org.

FLORIDA noun

Definition of Florida (noun)

  1. a state in the southeastern United States between the Atlantic and the Gulf of Mexico; one of the Confederate states during the American Civil War
Source: Princeton University WordNet
