Are Female Doctors Better? Here's What to Know

A new study suggests that female doctors may provide better care to their patients, especially when those patients are women.
Source: WebMD Health - Category: Consumer Health News Source Type: news