Namibia meaning and definition


NAMIBIA noun

Definition of Namibia (noun)

  1. a republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa
Source: Princeton University WordNet
