Definition of WHI

Reviewed on 3/29/2021

WHI: The Women's Health Initiative, a long-term health study sponsored by the National Institutes of Health (NIH) that focuses on strategies for preventing heart disease, breast cancer, colorectal cancer, and osteoporosis in postmenopausal women.
