
France: meaning and definition

FRANCE noun

  1. a republic in western Europe; the largest country wholly in Europe
  2. French writer of sophisticated novels and short stories (1844-1924)
Source: Princeton University Wordnet
