dermatology meaning and definition

DERMATOLOGY noun

Definition of dermatology (noun)

  1. the branch of medicine dealing with the skin and its diseases
Source: Princeton University WordNet
