Definition of WHI
WHI: The Women's Health Initiative, a long-term health study sponsored by the National Institutes of Health (NIH) focused on strategies for preventing heart disease, breast cancer, colorectal cancer, and osteoporosis in postmenopausal women.
Source: MedTerms™ Medical Dictionary
Last Editorial Review: 6/14/2012