tropical medicine
Definition of tropical medicine (noun)
- the branch of medicine that deals with the diagnosis and treatment of diseases that are found most often in tropical regions
Source: Princeton University WordNet