dermatology
1. (noun) the branch of medicine dealing with the skin and its diseases
Related Words:
medicine