Circumstances that shaped early US Healthcare – 1760-1850

Part 2 in the series “The importance of determining ‘value’ in medicine”

This is the second in a series of articles in which Dr Arby Khan examines the concept of ‘value’ in healthcare. In this issue he looks at the historical background that shaped healthcare in the US, an understanding of which is essential to grasping the current economic determinants of medical practice there. In future articles in the series Dr Khan will show what lessons can be gleaned by countries currently developing their healthcare systems and infrastructure, such as those in the Middle East. The previous article – Part 1 of the series – is summarized below.

A summary of the article published January 2012

Healthcare costs are rising in all countries, but especially steeply in the US. This is due largely to a highly fragmented system that has evolved piecemeal over the last 150 years – primarily under pressure from special interest groups and political conditions. This high degree of fragmentation leads to extremely high administrative and healthcare delivery costs. To rein in such costs, the concept of ‘value’ was introduced, which aims to provide better outcomes at similar or lower cost. ‘Value’ is currently defined as ‘outcomes/cost’ – however, defining and quantifying outcomes is difficult. Because of the subjectivity inherent in the definition of ‘value’, it is imperative that a framework of definitions and axioms be created so that a coherent, consistent, and aligned system can be built in which ‘value’ can be usefully defined. There are important lessons here for nations that are in the process of developing their healthcare systems. A national goal, or set of goals, should be created that unifies the care of the people, systems should be developed that generate the requisite data to achieve those goals, and financial incentives should be aligned with these goals. While these unifying goals may vary from country to country, or even from region to region, they must always facilitate and encourage high-‘value’ processes. One of those goals should certainly be prevention of disease – it is the highest-‘value’ intervention there is. The eradication of smallpox is a premier example of the high ‘value’ of prevention – and the same concept can be applied to the five major conditions that account for 75% of healthcare costs: heart disease, cancer, stroke, diabetes, and chronic lung disease.
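The ‘outcomes/cost’ relationship can be made concrete with a small, purely illustrative calculation. The figures and outcome scores below are invented for this sketch – in practice, as the article stresses, defining and quantifying outcomes is the hard part:

```python
# A toy illustration of the 'value = outcomes / cost' idea.
# Outcome scores and costs here are hypothetical, not real data.

def value(outcome_score: float, cost: float) -> float:
    """Return value as outcome achieved per unit of cost."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return outcome_score / cost

# Hypothetical comparison: preventing a disease vs. treating it late.
prevention = value(outcome_score=90.0, cost=1_000.0)       # e.g. a vaccination programme
late_treatment = value(outcome_score=60.0, cost=20_000.0)  # e.g. hospital care after onset

# In this toy example, prevention delivers far more outcome per dollar,
# echoing the article's point that prevention is the highest-'value' intervention.
assert prevention > late_treatment
```

The arithmetic is trivial; the substance lies entirely in how the outcome score is defined, which is the framework problem the series addresses.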

Introduction: Any given institution comes into being based on how various members of society act and pursue their interests under the prevailing conditions over, usually, long periods of time. Healthcare is no exception. Many forces shaped the structure and directed the evolution of healthcare infrastructure in the US – physicians, politics (both national and international), pharmaceutical companies, and insurance companies, to name a few[2, 3]. Even though it is convenient to describe and understand the evolution of healthcare by dividing it into time periods – as in this article – the reader will allow that social and other phenomena may overlap across them. The reader will also appreciate that relating the history of American medicine in a few short articles necessitates certain omissions – these omissions, however, should not detract from the major, underlying themes. We shall use the following time periods as guides: 1760-1850, 1850-1915, 1915-1935, 1935-1950, 1950-1969, 1969-1980, 1980-1991, 1991-1995, 1995-2006, and 2006 and beyond.

The Beginning – 1760-1850: As one might surmise, the practice of medicine in this era was rudimentary and rarely produced positive results. This lack of results, quite obvious to everyone, provided the local doctor no real advantage, and created the context for the development of three spheres of practice – each essentially equivalent in popularity and medical authority: medicine of the domestic household (where the American mother reigned supreme as healer), medicine of the physicians, and medicine of lay healers.

To appreciate the status of medicine in this period of time (1760-1850), it is useful to review a few landmarks in the history of medicine.

Massachusetts General Hospital in Boston was not established until 1811 and not functional until 1821. There were only two other “General” hospitals in America – the Pennsylvania Hospital and the New York Hospital[4]. The most elementary of stethoscopes would not be invented until 1816. Infections were by far the greatest source of morbidity and mortality, and Louis Pasteur and Joseph Lister would not be born until 1822[5] and 1827[6] respectively. It would not be until the mid-1860s that Lister would combine his observations with those of Pasteur to surmise that microbes caused infections and that antisepsis might be effective. In 1861, Lister had observed that 45-50% of amputation patients died from sepsis (and these were the ones who had survived surgery). In 1865, putting his and Pasteur’s theories to the test, he used phenol as an antiseptic and reduced mortality in his ward to 15% within four years. He is today deservedly regarded as the founder of antisepsis[6].

Pasteur was French and Lister English. Europe – based on the medical knowledge, science, and educational structures of countries such as Germany, Austria, Switzerland, England, and France – was considered the leader in medicine and surgery until the early 1900s[7-9]. However, even the leaders had little to offer when it came to actually curing patients during 1760-1850. For posterity’s sake, it is fortunate that one of England’s leading surgeons, Sir Frederick Treves, was also a gifted writer and historian, and that after retiring from practice he devoted his time to recording the evolution of medicine and surgery. His impressive account looks back at the operating room of a London hospital before antisepsis, and gives us an appreciation of prevailing conditions[5].

If this was the state of medicine and surgery in the leading centers, one can only imagine the situation in America between 1760 and 1850. It is hardly surprising, then, that non-physicians, including the American mother, wielded authority over the practice of medicine equal to that of physicians. There were three major ways to obtain medical treatment during this era. Let us look at each in more detail.

DOMESTIC MEDICINE: The family was the center of social and economic life in early America. Women were expected to deal with illness in the home and to be familiar with a host of remedies and medicines – which consisted mostly of medicinal herbs. If there was an illness that the lady of the house did not understand, she would reach out to a network of other mothers and wives for advice and assistance. Domestic practice was driven mostly by oral tradition and there were few written texts. There were, however, some published guides, the best known being William Buchan’s Domestic Medicine. Buchan believed that ordinary people were fully competent to treat any and all illnesses, and upheld the view that professional knowledge and training were not necessary to treat most diseases. It would not be difficult to prove, he said, “that everything valuable in the practical part of medicine is within reach of common sense, and that the Art would lose nothing by being stripped of all that any person imbued with ordinary abilities cannot comprehend.” He further proclaimed that most people “trust too little to their own endeavors”[2].

If the professional physician had only misery to offer – and painful misery at that – there was every reason for the common man to trust his “own endeavors”, and this ideology resonated with many. There was also a mistrust of complicated, Latinate, medical-sounding language and terminology. If the formal doctors could not really heal, why did they make their field so difficult to understand? Complicated, Latin-sounding words were thus construed largely as an attempt to confuse the lay person and mystify medicine so that it could become the monopoly of a few physicians. This is the main reason a subsequent text on domestic medicine, written by John C Gunn, which later replaced William Buchan’s guide as the favorite, was described on its title page as written “In Plain Language, Free from Doctor’s Terms… Intended Expressly for the Benefit of Families… Arranged on a New Simple Plan, By Which the Practice of Medicine is Reduced to Principles of Common Sense”. Gunn further stressed that Latin names for common medicines and diseases were “originally made use of to astonish the people” and aid the learned in deception and fraud. He railed: “The more nearly we can place men on a level in point of knowledge, the happier we would become in society with each other, and the less danger there would be of tyranny…”[2].

The focus of Buchan’s and Gunn’s approach to healing was mainly naturalistic and secular. There was no suggestion of magic or witchcraft, and their advice consisted mainly of admonishing people that “proper attention to AIR and cleanliness would do more to preserve the health of mankind than all the endeavors of the faculty”. For some maladies, bloodletting, purging, and blistering were also suggested. Incantations and charms, however, were not part of their therapeutic regimen.

This is not to say that this naturalistic outlook towards disease and healing was untouched by religion. A moral interpretation of the etiology of disease was indeed common. Clerics frequently warned that immorality and sin were predisposing causes of illness and that prayer was an appropriate response. Most Americans of that era believed that illness was a sign of God’s displeasure and a warning to the dissolute. This religious interpretation, however, waned over subsequent years. Paul Starr[2] cites an analysis by Charles Rosenberg[10] of the evolving clerical responses to three cholera epidemics in America – in 1832, 1849, and 1866 – which shows diminishing responsibility being attributed to God. In 1832 clergymen believed that while cholera might follow the laws of nature, it was really sent by God to punish sin. One newspaper warned Sunday school students: “The cholera is not caused by intemperance and filth, in themselves, but it is a scourge, a rod in the hand of God…” During the 1849 epidemic, clerical attacks on science were more common, but religious connotations had decreased significantly. Finally, in the epidemic of 1866, there were essentially no religious associations, as public health methods and organizations had assumed more effective authority. The period 1760-1850 was notable in that prevailing beliefs shifted towards a natural, rather than magical, perspective, in which the prevention, cause, and treatment of disease did not entail witchcraft or incantations. Thus, even though domestic medicine guides challenged the professional authority of physicians – by claiming that the American mother could do just as well if not better – they actually helped lay the foundations of modern medicine by insisting on a predominantly natural and secular view of illness.

PROFESSIONAL MEDICINE: Physicians, as we know them today, did not exist in the 17th and 18th centuries. There were no formal medical schools or medical degrees, and citizens of any background could appropriate the title of doctor. It was common, for example, for clergymen to offer combined medical and religious services to their congregations. People ranked lower in society than clergymen could also serve as doctors – there were essentially no boundaries. In the state of Virginia, for example, one historian records a doctor “who besides drugs, sold tea, sugar, olives, grapes, anchovies, raisins and prunes”. Another doctor, described as a surgeon, was also a wigmaker. Yet another practitioner, a woman, advertised in 1773 that besides practicing midwifery, she cured “ringworms, scald heads, piles, worms” and also made ladies’ dresses and bonnets in the newest fashion[2].

Medicine in these circumstances could have little collective identity or influence. However, those doctors serious about healing gradually came to practice medicine full time, or at least for the vast majority of their time. Some American doctors travelled to Europe to be schooled in the art and science of medicine and were rewarded with higher status. These doctors brought back with them an awareness of the degree of formality in training and organization that would be required for them to be taken seriously. The first medical school was chartered in Philadelphia in 1765 and the first medical society was organized in New Jersey in 1766. At the time of the American Revolution, about 1775-1781, there were about 3,500-4,000 physicians in the newly formed United States of America. Of these, only 400 had formal medical training (as in an apprenticeship of sorts) and, of these, only 200 held medical degrees. Over time, however, doctors were legitimized as bona fide professionals – people who practiced only medicine – and this is reflected in the decreasing proportion of clergymen in the medical society of New Jersey. In 1766, the year the society was created, its president was a clergyman; similarly, the first president of the College of New Jersey was both a physician and a minister. In the early years of the society, six of its thirty-six members (17%) were pastors. By 1796, the medical society had ninety-one members but only seven (8%) were clergymen, and of the last fifty-five to join by that time, only one (2%) was a member of the clergy.

Even though medical schools were now being created, they were actually just supplemental and in fact optional. Apprenticeship was the principal form of medical training. Successful doctors took in young hopefuls who served as assistants, read medical books, and also performed household chores for the “professor”. In turn, the apprentices were clothed, fed, and, at the end of three years, given a certificate of completion and of good character. There were no formal standards for what knowledge an apprentice should possess after three years of training.

The lack of standards for the knowledge and skill of a physician prevented the establishment of the field as a bona fide profession. It became apparent over time that if doctors were to command the same respect and status as in Europe, all doctors would need to emanate from professional, recognized, and standardized seats of learning. Yet legitimization of their profession was not the only concern doctors had – there was of course the issue of making money – and medical schools seemed like a good opportunity. Broadly, two types of physicians started medical schools: first, those who wanted to raise the American profession to the dignity and privileges that doctors enjoyed in Europe[7], and second, those who sought financial gain. Medical schools multiplied slowly at first and then rapidly after 1820. Typically, a group of physicians would approach a local college to create a medical school. From the doctors’ perspective, the college lent its legal authority to grant degrees; from the college’s perspective, it gained the prestige of having a medical school at no cost. The whole enterprise, including the physicians’ salaries, was funded by the tuition medical students paid. The curriculum was quite interesting – even though it took two years to graduate, only three to four months of every year were actually spent studying. What is more, the second year consisted merely of repeating the same courses completed in the first year. The concept of offering distinct courses in the first and second years was introduced in 1850 and was considered almost revolutionary.

Originally, medical schools offered both a bachelor’s and a doctoral degree. However, most students would not return after the first year, once they had received their bachelor’s degree. Thus, in 1789, the medical school at the College of Philadelphia introduced stiffer requirements. It eliminated the bachelor’s degree, made the doctoral degree (MD) a necessity, and promulgated the following requirements: a knowledge of Latin and experimental philosophy, three years of tutelage as an apprentice, attendance at two terms of lectures, the passing of all examinations, and the writing of a thesis. However, medical schools created by physicians whose sole motive was financial gain did not follow suit and in fact proceeded in the opposite direction. The length of study was kept to a minimum, requirements were frequently sacrificed, and student fees were driven down (owing to the large number of medical schools); as a result, the quality of physicians worsened. And there was an even bigger problem: students were not obliged to pay the tuition – which funded the physicians’ salaries – unless they passed their exams. So the physicians would pass any and all medical students[7]. Ironically, in seeking to raise their status as individual physicians, they were actually undermining the entire field of medicine.

By no means, however, were all physicians selfish. John Morgan, founder of the medical school at the College of Philadelphia, was clearly among those who truly wanted to raise the standard, status, and authority of American medicine. He attempted to create medical societies that would have the authority to license physicians and thus regulate entry into the field. Physicians in the state of Connecticut made a similar attempt. Unfortunately, both bids were rejected by the respective legislatures. In New York, even though the concept was accepted and legislated, it was never enforced against unlicensed practitioners. Interestingly, as medical societies were established in other states, their respective legislatures also bestowed licensing authority upon them. Still, there was no enforcement, no standard for licensure, no authority to rescind a license once given, and no serious penalty for violating the law. In fact, growing suspicion that licensure existed merely to give friends and family an easy route into the field led to the repeal of licensing authority by most states (e.g. Ohio, 1833; Alabama, 1832; Mississippi, 1836; South Carolina, Maryland, and Vermont, 1838; Georgia, 1839; New York, 1844; Louisiana, 1852)[2, 7].

Thus, the era of 1760-1850 was one of confusion, created by the competing interests of the individual physician and the profession as a whole. The inadequate scientific state of the profession, with its dismal outcomes, could not confer the prestige that some physicians desired. So they tried to gain authority by creating medical schools, medical societies, and the concept of licensure. However, many individual physicians sought to hijack these institutions for their own personal and financial gain. As a result, by the end of this era, there was much confusion and no order. There were three ways one could be considered a legitimate doctor: completing an apprenticeship, obtaining a degree from a medical school, or receiving a license from a medical society – each being sufficient on its own. No one institution or pathway held definitive authority, and there were no standards for what knowledge and skills a person needed to possess to call himself a physician. As we shall later see, however, advances in antisepsis, the discovery of anesthesia, and the subsequent, rapid improvement in surgical outcomes in the late 1800s and early 1900s gave the profession an air of legitimacy and authority that quite rapidly established medicine as a highly regarded profession.

LAY OR POPULAR MEDICINE: This can be considered an area of medicine occupying a niche somewhere between domestic and professional medicine, encompassing folk medicine and people who had remedies of their own to offer. It was, however, more organized and influential than it might appear – it was in fact an active rival of domestic and professional medicine and had its own coherent infrastructure[7]. One of the major movements in popular medicine was the Thomsonian movement. Samuel Thomson, who had no formal education, was a botanic (herbalist) from New England. He obtained a patent from the federal government for his system of botanic medicine, enabling him to sell rights to the use of his methods. His followers were mostly rural and could be found from New England through the Mohawk Valley to western New York. The Thomsonians were quite organized – they boasted medical societies and conventions, and published journals. Thomson wrote a book called New Guide to Health, which was essentially the bible of botanic medicine.

Thomson’s system of medicine was rather simple to understand, which is probably why it was so popular. All animals, including humans, consisted of four elements: earth, air, fire, and water. Earth and water were the solids, while air and fire, or heat, were considered the cause of all life and motion. All diseases had one cause, and as such there was only one treatment for all diseases: cold was the cause and heat was the remedy. If disease struck, heat could be restored either directly – by clearing the system of obstructions so that the stomach could digest food and thus generate heat – or indirectly, by perspiration. Thus Thomson’s principal remedies included a violent emetic known as Lobelia inflata (Indian tobacco), red pepper, steam, and hot baths. He opposed all remedies offered by the medical profession.

Outside the mainstream Thomsonian system, there were several other types of lay healers, as it was relatively easy to become a “doctor” in that era. For example, the autobiography of a freed slave recounts that in 1844 he bought a book of medical botany and began preparing remedies for his family[7]. These remedies were then offered to his neighbors, who gladly accepted them – and the remedies actually worked – and thus his star rose. He continued to teach himself and eventually took up healing as a practical vocation. Botanics and midwives were probably the most numerous of the lay therapists, but there were also numerous cancer doctors, bonesetters, inoculators, abortionists, and sellers of nostrums. The midwives, bonesetters, and inoculators were “specialists”, whereas the botanics and nostrum vendors were generalists who treated all ailments. Of these groups, the bonesetters were perhaps the most remarkable. They specialized in treating fractures and dislocations and were basically artisans with no formal education, yet they were quite skilled in applying mechanical craftsmanship to medicine. As a group, lay healers had quite a following, and almost one third of the population looked to them for treatment.


Medicine in 1760-1850 – even though armed with long, complicated, Latin words and hard-to-understand diagnoses – did not produce any significant, positive outcomes. It is hardly surprising, then, that people mistrusted professional doctors and looked elsewhere for answers. Naturally, alternative forms of medicine – domestic medicine and lay or popular medicine – flourished and even commanded equal respect. There was, of course, some overlap between these three forms of medicine, but they existed mostly as distinct entities. Domestic medicine was dominated by the American mother, who was charged with taking care of and healing her family. Professional medicine was more confusing: there were three ways one could become a doctor – apprenticeship, medical school, or licensure – and none of them had strict standards of required knowledge or skill. The few requirements that did exist were never enforced. Lay or popular medicine essentially consisted of “specialists” such as midwives, bonesetters, and inoculators, and “generalists” such as the botanics (herbalists) and nostrum vendors, who treated all ailments. All three types of medicine wielded essentially equal influence and treated comparable numbers of patients. This state of affairs, in conjunction with poor outcomes in medicine and surgery, resulted in healers generally being held in low esteem. Doctors’ efforts to bolster this esteem failed, mostly because there was no real science, methodology, standards, or results to back them up. This was soon to change, however: the invention of the stethoscope, the discovery of anesthesia and microbes, the development of antisepsis, and the evolution of surgical techniques, all in the late 1800s and early 1900s, would eventually elevate the physician to respectable heights.

The next article will examine the period 1850-1915, a scientifically productive era in the history of medicine whose positive patient outcomes convinced the world that the growing complexity of medicine was indeed beyond the limits of lay competence.


1. Khan, A. The Importance of Determining ‘Value’ in Healthcare. Middle East Health, Jan-Feb 2012.
2. Starr, P. The Social Transformation of American Medicine. New York: Basic Books, 1982.
3. Starr, P. Remedy and Reaction: The Peculiar American Struggle over Healthcare Reform. New Haven, CT: Yale University Press, 2011.
4. A Narrative History of Mass General. 2011. Available from: narrativehistory/.
5. Nuland, S.B. “To Tend the Fleshly Tabernacle of the Immortal Spirit”, in Doctors: The Illustrated History of Medical Pioneers, S.B. Nuland, Editor. New York: Black Dog and Leventhal Publishers, Inc, 2008.
6. Joseph Lister - Biography. 2011. Available from: joseph-lister-37032.
7. Starr, P. Medicine in a Democratic Culture, in The Social Transformation of American Medicine, P. Starr, Editor. New York: Basic Books, 1982.
8. Imber, G. Becoming a Surgeon, in Genius on the Edge: The Bizarre Double Life of Dr. William Stewart Halsted, G. Imber, Editor. New York: Kaplan Publishing, 2011.
9. Nuland, S.B. Medical Science Comes to America, in Doctors: The Illustrated History of Medical Pioneers, S.B. Nuland, Editor. New York: Black Dog and Leventhal Publishers, Inc, 2008.
10. Rosenberg, C.E. The Cholera Years. Chicago: University of Chicago Press, 1962.


Arby Khan, MD, FACS, MBA is the Deputy National Director for Surgery for the United States Veterans Health Administration - which oversees 152 acute care facilities and 965 outpatient clinics. Dr Khan is a regular contributor to Middle East Health. He has written on a range of subjects – such as Human Resources management in hospitals, Change Management in GCC hospitals, Brain Death and Hospital Resource Management and organ transplant-related legislation, among others – with a view to improving healthcare in the UAE and the wider region. He is a multi-organ Transplant Surgeon and Immunologist and has successfully started, from the ground up, two multi-organ transplantation programmes – one in the United States and one in Abu Dhabi. He is the author of many clinical and basic immunology papers, and has been educated, trained and employed variously at University of California - Berkeley, McGill University, University of California - San Francisco, Harvard Medical School, Yale University - Graduate School of Immunobiology, University of Pittsburgh - Starzl Transplantation Institute, University of Vermont - School of Medicine, and Columbia University (NY). He also holds an MBA, with Distinction, from London Business School.

– The views expressed in this article are those of the author and do not necessarily represent the views of the institutions for which Dr Khan has worked or currently works.

Date of upload: 24th Mar 2012


Copyright © 2012 All Rights Reserved.