Immigrant Doctors Provide Better Care, According to a Study (Harvard Business Review)

International medical graduates are vital to providing health care in the U.S., and policies that discourage doctors from other countries from practicing here are likely to have unintended consequences for the health of Americans, especially those living in traditionally underserved areas.
#care #compassion #wholeness #emotionalhealth #communitydevelopment