Definition of Mainstream medicine
Mainstream medicine: Medicine as practiced by holders of M.D. or D.O. degrees and by their allied health professionals, such as physical therapists, psychologists, and registered nurses. The term "mainstream medicine" implies that other forms of medicine are outside the mainstream.
Source: MedTerms™ Medical Dictionary
Last Editorial Review: 9/20/2012