West Africa
Definition of West Africa (noun)
- an area of western Africa between the Sahara Desert and the Gulf of Guinea
Source: Princeton University WordNet