English dictionary

katar meaning and definition

KATAR noun

Definition of Katar (noun)

  1. an Arab country on the peninsula of Qatar; achieved independence from the United Kingdom in 1971; the economy is dominated by oil
  2. a peninsula extending northward from the Arabian mainland into the Persian Gulf
Source: Princeton University WordNet
