English dictionary
Namibia: meaning and definition
Definition of Namibia (noun)
- a republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of southern Africa
- synonyms: Republic of Namibia, South West Africa
Source: Princeton University Wordnet