tropical medicine: meaning and definition

TROPICAL MEDICINE noun

Definition of tropical medicine (noun)

  1. the branch of medicine that deals with the diagnosis and treatment of diseases that are found most often in tropical regions
Source: Princeton University WordNet
