Definition of Florida (noun)
- a state in the southeastern United States, between the Atlantic Ocean and the Gulf of Mexico; one of the Confederate states during the American Civil War
- synonyms: Everglade State, FL, Sunshine State
Source: Princeton University WordNet