dermatology meaning and definition
Definition of dermatology (noun)
- the branch of medicine dealing with the skin and its diseases
Source: Princeton University WordNet