Definition of Western medicine
Western medicine: Conventional medicine, as distinct from alternative forms of medicine such as Ayurvedic or traditional Chinese medicine.
Source: MedTerms™ Medical Dictionary
Last Editorial Review: 9/20/2012