The major features of contemporary medical schools took shape in the late nineteenth and early twentieth centuries. What occurred at that time thoroughly transformed medical education. Subsequent developments rarely had the same far-reaching effects as the changes put in place from the 1870s through the 1920s.
Early Medical Schools
Before the 1870s, most U.S. medical schools supplied only part of the education necessary to be a competent physician. The typical curriculum featured one year of courses in six to eight subjects. The instruction was didactic. A lecturer occasionally demonstrated procedures for the class to watch, but few instructors let their pupils carry out the tasks. Students spent little, if any, time in laboratories or with patients. To learn by doing required securing an apprenticeship, summer schooling, European study, or private instruction. What the medical school offered was a series of lectures from faculty who usually owned the school, concentrated on their private practices, and saw research and writing as optional rather than mandatory.
In contrast, medical education in France and Germany was more rigorous. In those countries, medicine was studied as an experimental science best grasped through investigation in the laboratory and service on the hospital ward. In exchange for free care, impoverished patients filled the hospitals where faculty undertook research and taught students. A year or two of European study capped the education of many ambitious American doctors, and they brought home the conviction that their colleagues should emulate the spirit of painstaking experimentation that pervaded the great medical centers of Paris, Berlin, and Vienna.
Another source of dissatisfaction with the patchy American medical curriculum stemmed from the surge of discoveries in the late nineteenth century. Advances in medical knowledge made the short course of study seem wholly inadequate to understand the germ theory of disease, vaccines and antitoxins, precise tests of bodily fluids and tissues, and other diagnostic and therapeutic breakthroughs. Medicine as a profession acquired a firmer and deeper knowledge base, even if many diseases were not yet fully understood. The greater authority and prestige of the profession elevated the standing of the better schools and pressured the weaker ones to improve or close.
The rise of research universities offered a third reason for revamping traditional medical education. As several dozen prominent colleges added graduate divisions and professional schools in order to become universities, they reassessed their relationships with local proprietary medical schools. What had been a casual connection often became a formal affiliation. Mergers appealed to the cadre of medical reformers who believed that the profession would never flourish in the absence of research careers for talented physicians. For the universities, the link was another way to establish their legitimacy as the best regional sources of theoretical and practical knowledge.
By the 1880s, several medical schools stood apart from others as models of what was possible. At Harvard, Michigan, and Pennsylvania, the three-year curriculum included extensive hands-on work in laboratory science courses taught by full-time faculty. The clinical courses took small groups of students to the bedside and to outpatient dispensaries. Written examinations replaced perfunctory oral exams, and instructors gradually began to test students on the practical work they did in laboratories or with patients. Entrance requirements also stiffened, with graduation from a good high school expected and some college coursework encouraged.
In the 1890s, the new standard-bearer, Johns Hopkins, adopted a four-year curriculum and required a bachelor’s degree for admission. Above all, Hopkins led the way by virtue of extensive clinical opportunities in its teaching hospital. The senior faculty in the school controlled the major divisions of the hospital, where students served as clerks responsible for many aspects of patient care. In most hospitals, the board of trustees limited faculty involvement and instructional activity. Often, trustees regarded educational work with suspicion, and understandably so in an era when most medical students were meagerly trained. Hopkins and then other schools demonstrated that well-taught students and their faculty mentors enhanced patient care. Academic physicians were increasingly respected as the best physicians, and properly supervised students were desirable assistants. Previously seen as a branch of public welfare, the hospital redefined itself as the best place for anyone to receive state-of-the-art treatment.
Lack of money slowed the pace of change at many schools. The vision of good medical education was in sight long before the funding was in hand. Learning by doing cost much more than learning by listening. The student/teacher ratios fell markedly whenever large lecture classes gave way to small group instruction. Unlike part-time and adjunct faculty, full-time faculty were expensive. The space and equipment necessary for laboratory science were also costly. Without the support of local donors, state legislators, and rich philanthropists like J. P. Morgan, John D. Rockefeller, Edward Harkness, and Cornelius Vanderbilt, the transformation of American medical education would have been impossible. Benefactors shunned the proprietary schools, and by 1930, all for-profit medical schools had either closed or affiliated with a university.
Another ongoing challenge was the shape of the curriculum. There were constant discussions of how much time each subject deserved and when it should be taught. Those debates intensified as new fields such as biochemistry, immunology, and preventive medicine emerged. State licensing boards often stipulated how many hours to allocate each subject, but that rigidity provoked outcries about overburdened students with no free time to read, write, or study. How to link the first two years of basic science with the final two years of clinical study also baffled most instructors, as did the matter of special provisions for the very best students. Over time, most medical educators acknowledged the impossibility of covering everything and sought instead to cultivate habits of mind that would equip the young doctor to continue to learn after graduation. Through learning by doing, cross-disciplinary connections, and lifelong education, medical education embodied central tenets of progressive education, the vision of active learning designed to enliven American elementary and secondary schools.
By the 1920s, the enduring features of the modern American medical school had taken hold. No one questioned the value of teaching hospitals, university affiliations, faculty research, and hands-on learning. Medicine had become one of the most respected professions in the country, and the public no longer tolerated third-rate schools where an unprepared student could graduate quickly and easily.
The Price of Prosperity
Not everyone benefited from the improvements in medical education. The longer course of study excluded some working-class youth who could not afford four years of tuition as well as an unpaid fifth year for a hospital internship, which became common after World War I. As the medical schools’ budgets soared, only one women’s medical school survived, and female enrollments elsewhere rarely exceeded 5 percent until the 1960s. The two medical schools serving Black students barely stayed alive. Jewish students also suffered: when applications exceeded spaces in the 1920s, many schools discriminated against Jewish applicants, accepting only a few each year. Furthermore, as the number of medical schools declined from 1900 to 1930, the number of doctors per capita fell, and rural areas could not attract as many new doctors as the cities, where lucrative practice as a specialist drew many graduates.
Even so, medical schools became one of the strongest branches of the university, and throughout the 1930s and 1940s, they flourished. Enrollments and endowments held up despite the economic and military turmoil in those decades. Research generated a steady flow of useful discoveries, including antibiotics and vitamins. Faculty salaries were far below what talented physicians could earn in private practice, but academic careers continued to attract many of the best doctors.
As support for research continued to rise throughout the 1950s and 1960s, the composition of the faculty changed. There were more specialists, including many PhDs. The size of the typical faculty expanded more rapidly than the student population, with some departments larger than the entire school had been twenty years earlier. Because promotion and tenure hinged on excellent research, many instructors gave less time to teaching. The overall shape of the curriculum remained unchanged, although a few schools created new courses around particular organs and diseases, and several others either shortened (the six-year BA/MD) or extended (Stanford’s five-year MD) the traditional four-year span.
What also altered the nature of faculty work was the dramatic increase in third-party payments for patient care. Private insurance as well as federal Medicare and Medicaid dollars reimbursed teaching hospitals for care that had previously been pro bono work. As a result, the fraction of faculty time devoted to patients rose sharply. The prosperity enlarged the typical faculty, with nearly all the growth in the clinical ranks. From 1965 to 1990, patient care revenues rose from 6 percent to nearly 50 percent of medical schools’ annual income. Faculty salaries increased rapidly, approaching the earnings of colleagues in private practice. From the 1970s on, the size and scale of most medical schools were immense, with annual revenue often exceeding $100 million; by contrast, only nine schools in 1910 had budgets over $100,000. Although cost-containment efforts by insurance companies from the 1980s on, coupled with less robust growth in federal funds, forced modest retrenchments here and there, the overall financial health of medical schools usually matched or exceeded that of other professional schools in the university.
With research projects and patient care burgeoning, the task of teaching students became less central to the school’s work. Many senior faculty preferred teaching postdoctoral students and research fellows, delegating other instruction to adjuncts, interns, and the growing number of residents pursuing a specialty. The camaraderie between professors and students that marked turn-of-the-century schools diminished; the heavy workload and ensuing sense of exhaustion and stress did not. The rigorous expectations did not discourage a wider range of applicants from seeking and winning admission. The old restrictions on Jewish students ended after World War II, and in the 1970s, the number of female and minority students increased sharply. Financial aid allowed more working-class students to attend. Foreigners served as interns and residents, and some college seniors who were denied admission went abroad to inferior schools and then reapplied to an American school.
Counterbalancing the lower priority of teaching was renewed interest in curricular reforms. The familiar challenges of an overburdened curriculum, endless memorization of facts, and weak articulation of the basic sciences with the clinical work sparked a variety of innovations: fewer hours devoted to laboratory exercises, more emphasis on problem solving, and new opportunities to see patients and their families in the first two years. Many schools added instruction on interpersonal relations to encourage empathic concern for all aspects of patients’ well-being. As another sign of interest in educational improvement, most schools established divisions charged with the evaluation of teaching. Throughout their history, medical schools benefited from the keen American faith in science. The respect for physicians extended to physicians in training and their instructors. A good medical school was cherished by the community as a reliable sign of local vitality, not weakness. The ability of the profession to retain that trust will be necessary in the future to sustain the remarkable record achieved since the 1870s.