The history of American medical education is one of increasing standardization and professionalization. In colonial America, medical education was haphazard and inconsistent, but medical schools gradually emerged to provide more uniform training. As the body of medical knowledge grew, medical education became more grounded in science and less dependent on wisdom handed down from practicing doctors to their apprentices. By the 1960s and 1970s, universities had established the system of American medical education familiar to late 20th- and early 21st-century Americans.
Colonial Period
During the 1600s and 1700s, colonial Americans who aspired to become doctors generally did so by apprenticing themselves to established physicians. Wealthy colonists had the option of traveling to Great Britain or continental Europe to receive training at a school or hospital. Some doctors entered the profession through less formal means; a person could, for instance, establish a reputation as a healer by nursing sick acquaintances or selling curatives, according to William G. Rothstein, a sociologist who has studied the history of American medicine.
The First Medical Schools
In the mid-1700s, Americans began establishing their own medical schools, starting with the College of Philadelphia (later the University of Pennsylvania) in 1765, followed by King's College (later Columbia University) in 1767 and Harvard University in 1782. At first these schools offered only a Bachelor of Medicine, but they soon began awarding the Doctor of Medicine degree. By 1820, according to Rothstein, the United States had 13 medical schools. Apprenticeships also became more standardized during this period, lasting around three years and including both practical and theoretical studies, Rothstein reports.
A Scientific Curriculum
The initial requirements for graduation from an American medical school included between 32 and 40 weeks of lectures plus an apprenticeship, according to Rothstein. Toward the late 1800s, medical schools developed a more rigorous curriculum as medical knowledge became more firmly grounded in science, according to medical historian Kenneth M. Ludmerer. Johns Hopkins University's School of Medicine, founded in 1893, set the standard for schools that wanted to institute a more scientific curriculum, with more laboratory work and longer hands-on clinical training.
The Flexner Report
In 1910, educator Abraham Flexner issued a report on the state of American medical education that accelerated the reform of medical schools along the Johns Hopkins model. His report castigated the substandard education many physicians were receiving; within a decade, 30 percent of American medical schools had closed, according to Ludmerer. Other schools strove to raise the quality of doctors' education, for instance by establishing teaching hospitals and, increasingly, internship and residency training.
Rise of Residency and Specialty Training
Hand in hand with residency training came the rise of specialization and sub-specialization among American doctors by the 1930s and 1940s. Specialties such as ophthalmology, surgery and obstetrics/gynecology, along with their associated certification boards, arose during the first few decades of the 20th century. Of the 15 specialty-certifying boards that existed in 1942, 12 mandated at least three years of residency training, according to Rosemary Stevens, a sociologist of science and medicine. By the 1960s and 1970s, the basic components of the modern American medical education system were in place: students attended medical school for four years and then completed an internship followed by a residency, taking board examinations along the way.
Beyond the 1970s
By the 1980s, the last two years of medical school consisted of required clerkships, which combined the major clinical disciplines with electives and assigned students to teams led by attending physicians, according to a report from the Association of American Medical Colleges, "The Education of Medical Students: Ten Stories of Curriculum Change." The focus was largely on working with seriously ill patients in hospital settings, a model flawed by the variability among hospitals and among individual patients. Reforms included more training in ambulatory care, such as that provided in physicians' offices, but this change came slowly and was not widely implemented until the 1990s, when medical students began spending time each week with physicians in their offices. Critics point out, however, that physicians often lack the time to teach and that the study and treatment of chronic diseases is complex and not well served by either in-office or in-hospital training, suggesting a need for further review and change in the 21st century. The rise of managed care has also affected physician practice, but most managed care firms have been reluctant to participate in medical education.
References
- "History and Health Policy in the United States: Putting the Past Back In"; Rosemary Stevens; 2006
- "Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care"; Kenneth M. Ludmerer; 1999
- "American Medical Schools and the Practice of Medicine: A History"; William G. Rothstein; 1987
- "Higher Education in Transition"; John Seiler Brubacher and Willis Rudy; 1997
- "The Education of Medical Students: Ten Stories of Curriculum Change"; Association of American Medical Colleges
- "The Effect of Medical Student Teaching on Patient Satisfaction in a Managed Care Setting"; PubMed
Writer Bio
Georgia Alton holds a Doctor of Philosophy in history from Emory University, specializing in 20th-century U.S. history. Alton has written articles on subjects such as World War I and colonial America for ABC-CLIO encyclopedias. She also works as a freelance writer, with articles on eHow, Answerbag and Brighthub.