Definition of Anatomy
Anatomy: The study of human or animal form, by observation or examination of the living being, examination or dissection of dead specimens, microscopic examination, and/or textbooks.

Source: MedTerms™ Medical Dictionary
Last Editorial Review: 3/19/2012