English dictionary

West Africa: meaning and definition

Definition and meaning of West Africa in the English dictionary at MeaningMonkey.org.

WEST AFRICA noun

Definition of West Africa (noun)

  1. an area of western Africa between the Sahara Desert and the Gulf of Guinea
Source: Princeton University WordNet
