Definitions of Namibia:
  • noun:   a republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia forms part of the high plateau of South Africa

See Namibia used in context: 1 rhyme, several books and articles.
