Are Female Doctors Better? Here's What to Know

A new study suggests female doctors may provide better care to their patients, especially when those patients are women. Here's what to know.
