White Men's Grip on U.S. Health Care May Be Slipping

TUESDAY, July 20, 2021 -- The U.S. medical field is less dominated by white men than it used to be, but Black and Hispanic doctors, dentists and pharmacists remain scarce, a new study finds. The study, which looked at trends over the past 20...
Source: Drugs.com - Daily MedNews