English dictionary

California meaning and definition

Definition and meaning of California at MeaningMonkey.org.

CALIFORNIA noun

Definition of California (noun)

  1. a state in the western United States on the Pacific; the 3rd largest state; known for earthquakes
Source: Princeton University WordNet
