Definition of WHI

WHI: The Women's Health Initiative, a long-term health study sponsored by the National Institutes of Health (NIH) that focuses on strategies for preventing heart disease, breast cancer, colorectal cancer, and osteoporosis in postmenopausal women.

Reviewed on 9/14/2016