Wednesday, August 23, 2017

A salute to General Anesthesia

The term "anesthesia" was coined in 1846 by the American physician and poet Oliver Wendell Holmes. The term has Greek origins and refers to the inhibition of sensation. General anesthesia is a medically induced state of unconsciousness accompanied by a loss of protective reflexes. The development of general anesthetics has enabled medical procedures that would otherwise be too painful to carry out.

The earliest records of anesthetics date back to 3400 B.C., when the Sumerians cultivated the opium poppy. Around 2225 B.C. the Sumerian territory became part of the Babylonian empire, which extended from Persia in the east to Egypt in the west, resulting in the dissemination of the knowledge and use of the opium poppy. Prior to the introduction of opium, the civilizations of ancient India and China used cannabis, a potent hallucinogen, and aconitum, a highly toxic plant with analgesic properties. Ancient Indian texts also advocated the use of wine with the incense of cannabis for anesthesia.

Figure 1: A signet ring from 1500 B.C. Demeter, the Greek goddess of harvest, is shown seated below the double-axe handing over three poppy heads to Persephone, the Greek goddess of the underworld. Source

The Persian and Arabic physicians were the first to utilize inhaled anesthetics. Several of their records from the 9th-11th century describe the use of a "soporific sponge", a sponge imbued with aromatics and narcotics which was usually placed under the patient's nose during surgeries. In England, a potion called dwale gained popularity during the 12th century. This was a potent mixture of opium, bile, and hemlock; the patients were usually revived by rubbing salt and vinegar on their cheekbones. Dwale became so ubiquitous that it was mentioned in several works of literature, including Shakespeare's "Hamlet" and John Keats's "Ode to a Nightingale".

The next improvement in the quality of anesthetics was made in 1525, during the Renaissance, by Paracelsus, a Swiss physician and alchemist. Paracelsus noticed that ether ((C2H5)2O), prepared in "oil" form by mixing alcohol (C2H5OH) and sulfuric acid (H2SO4), induced sleep in chickens when it was incorporated into their feed.

                                      2C2H5OH  →  (C2H5)2O + H2O     (dehydration catalyzed by H+ provided by the acid)

He realized that ether was a potent analgesic. The use of ether as an anesthetic came much later; initially it was used for "ether frolics", mostly by medical students. Even today the exact mechanism of action remains unknown, although ether is known to suppress the nervous system. Ether is still used in poorer countries that lack elaborate anesthetic equipment. Ether is highly soluble in the bloodstream and therefore easily induces muscle relaxation. Its primary disadvantages are that it is highly flammable and can cause unpleasant side effects, including bronchial spasms and vomiting.

In the 18th century nitrous oxide (better known as laughing gas) was discovered by the English polymath Joseph Priestley; the same Mr. Priestley who featured in a previous blog post about carbonated beverages. To synthesize nitrous oxide (N2O), he heated iron (Fe) filings that had been dampened with nitric acid (HNO3).

                                       4Fe + 10HNO3  →  4Fe(NO3)2 + N2O + 5H2O

The analgesic effects of nitrous oxide were first observed by the Cornish chemist Humphry Davy, who also coined the term "laughing gas". He had a chamber built where he would study the effects of the gas. Interestingly, his notebooks also suggest that he succeeded in mixing the gas with wine as a cure for hangovers. Unsurprisingly, Davy eventually became addicted to laughing gas. Nitrous oxide has several advantages over ether: it allows rapid recovery, does not cause nausea or bronchial irritation, and is non-flammable. Nitrous oxide is still used extensively, especially in dentistry. Its main limitation is that its administration requires expensive equipment.

Figure 2: Depiction of a laughing gas party in the 19th century. Also known as hippie crack, nitrous oxide was frequently used recreationally by the British upper class. It gained popularity due to its powerful "mystical" and "spiritual" effects on the user. Source.

The use of anesthetics in surgery began in the 1820s. Henry Hill Hickman, an English physician, experimented with carbon dioxide: he would administer the gas to an animal until it was almost insensible and then gauge the anesthetic effect by amputating its limbs. In 1824 he submitted his report to the Royal Society but was met with criticism and dismissed as a "Surgical Humbug". His work was later reexamined, and he is now recognized as one of the fathers of anesthesia.

The other pioneers, Crawford Long, Horace Wells, and William Morton, owe their eureka moments to ether frolics; all of them were regular attendees of these events. Crawford Long, an American surgeon, noticed that participants of the frolics were oblivious to the pain of inadvertent bruises. In 1845, he became the first obstetric anesthetist by having his wife inhale ether while giving birth. Horace Wells, an American dentist, pioneered the use of nitrous oxide as an anesthetic in dental surgery. Unfortunately, in 1845, during a demonstration at Massachusetts General Hospital in Boston, his patient cried out in pain during the procedure, which prompted the audience of students to jeer at him. Humiliated, Wells returned home the very next day. The patient later admitted that although he had cried out, he had no recollection of the pain or the tooth extraction. William Morton, also an American dentist, used ether for tooth extractions. His method became famous in 1846, when he administered ether for the American surgeon John Collins Warren, who painlessly removed a neck tumor. Warren reportedly quipped "Gentlemen, this is no humbug" after the procedure. Reports of this demonstration travelled across the world and ether quickly became the predominant anesthetic.

Figure 3: The Ether Monument in Boston's Public Garden. Also known as The Good Samaritan, the monument depicts a doctor holding the body of a drooping patient on his knee. The doctor holds a piece of cloth in his left hand suggesting the use of ether. Source.

In 1847, James Young Simpson, a Scottish obstetrician, discovered the use of chloroform as a general anesthetic. Dr. Simpson, along with his assistants, would sit in his dining room every evening and test new chemicals to see if any of them had anesthetic effects. On inhaling chloroform, an atmosphere of cheer set in, followed by a loss of consciousness. The next morning, in an attempt to reproduce the results from the previous night, Dr. Simpson tried it on his niece. She fell asleep while singing the words "I am an angel!". Thereafter, he used chloroform extensively and its use spread rapidly through Europe. However, it was later abandoned in favor of ether due to side effects that included cardiac toxicity.

The final class of anesthetics still in use today is the barbiturates, a class of sedative drugs derived from barbituric acid. Barbituric acid was first synthesized and named by the German chemist Adolf von Baeyer in 1864. There are several interesting theories about why he called the compound barbituric acid, including the conjecture that he named it after St. Barbara because he had discovered the compound on her feast day (December 4th). His work was closely followed by Hermann Emil Fischer, a German chemist and Nobel prize recipient, and Joseph von Mering, a German physician. Together, they developed the first barbiturate, diethylbarbituric acid, in 1902. They realized that this compound could be used for treating insomnia, epilepsy, and anxiety, and as an anesthetic. Unlike ether, nitrous oxide, and chloroform, barbiturates are administered intravenously. Barbiturates are still widely used because they cause smooth and rapid sedation. However, they can cause cardiorespiratory depression and tissue damage when not administered carefully.

Figure 4: An advertisement from an American medical journal in 1933 highlighting the "short but powerful hypnotic effect and prolonged sedative action from a small dosage". Source.

The ideal anesthetic should be non-flammable, dissolve easily in the bloodstream, have no toxicity, and not irritate the respiratory system. Usually anesthetics are used in combination for optimum results. It is important to remember that there are inherent risks and drug interactions specific to each patient. Currently, a wide range of anesthetics is in use, including derivatives of ether, nitrous oxide, and barbiturates. These compounds can be administered either by inhalation or by intravenous injection.

Friday, August 11, 2017

A round of shots for everybody!

Vaccination is the process by which vaccines are introduced into the body to stimulate the defense mechanisms of the immune system, thereby enabling the body to defend itself against a future infection. Vaccines contain antigens: molecules whose structural components usually resemble the target pathogens and are used to elicit an immune response in the host. The immune response involves mounting a defense against the antigens as well as activating memory cells that will "remember" them. The latter aspect is especially important because if, on a later occasion, the pathogen is detected in the body, it will be recognized immediately and the immune system will mount a much stronger response, thus preventing the development of the disease.

As early as 430 B.C. it was observed that smallpox survivors were immune to subsequent infections of the disease, enabling the survivors to nurse the afflicted. In the 10th century, the Chinese attempted to immunize susceptible individuals by transferring the contents of smallpox pustules onto cotton and then placing the cotton in the individuals' nostrils. Furthermore, there are documented examples of inoculation from the 17th century in Africa, India, and China. The process of inoculation involved introducing the smallpox virus subcutaneously into non-immune individuals. Interestingly, in India the process of inoculation was carried out in conjunction with the worship of Shitala Devi, the goddess of smallpox. Other deities of smallpox include T'ou-Shen Niang-Niang in China, Tametomo in Japan, St. Nicaise in Europe, and Sopona in Africa.

Figure 1: Shitala Devi represented as a young maiden. She carries a short broom to dust off all the germs and a pot full of viruses for vaccination. Source.

The concept of inoculation spread to Europe in the 18th century via Turkey. The Turks were aware of the benefits of inoculation because it was a routine practice among Circassian women, who would introduce smallpox pustules into their infants via arm incisions. Inoculation was primarily carried out to prevent facial disfiguration, a common outcome of the disease. The procedure was of utmost importance because girls were sold as slaves and retaining their beauty was paramount. Reports on the process of inoculation were presented to the Royal Society of London in 1714 by the physicians Jacob Pylarini and Emanuel Timonius; both had independently sent their accounts from Constantinople. Unfortunately, their accounts were not taken seriously at the time. The practice was later popularized by Lady Montagu, who lived in Turkey with her husband, the British ambassador to the Ottoman Empire. She had witnessed the inoculation process firsthand, and in 1721 had her daughter inoculated under the scrutiny of the King's physician, Sir Hans Sloane. This event, coupled with studies done in America, helped spread the idea from England throughout western Europe.

In America the idea of inoculation was introduced by Cotton Mather, a reverend who is well known for his support of the Salem witch trials. In 1706, Mather's slave Onesimus explained to him how he had been inoculated as a child in Africa. Fascinated by the idea, Mather read the letters of Timonius and convinced Zabdiel Boylston, a physician, to inoculate people in Boston during the smallpox outbreak of 1721. Of the 248 people inoculated, only six died, confirming the effectiveness of the procedure. These statistics helped convince British physicians to adopt variolation (inoculation against the smallpox virus, also known as the variola virus). After repeated trials, the practice spread among the royal families of Europe, followed by general adoption of the procedure by the public.

Figure 2: Zabdiel Boylston's account of smallpox inoculation to the Princess of Wales. Source.

In 1757, a boy from Gloucester was one of thousands to be variolated. He developed a mild case of smallpox and was subsequently immune to the disease. His name was Edward Jenner, and he went on to pioneer the world's first vaccine: the smallpox vaccine. Before Jenner popularized vaccination, it was well documented that individuals infected with cowpox (a disease that affects both humans and cows) were immune to smallpox. The mechanism was unknown at the time; we now know that the cowpox virus is similar to, but much milder than, the smallpox virus, making it the perfect antigen for a smallpox vaccine. This phenomenon had been reported in 1768 by an English physician, John Fewster, and later in 1782 by a French politician, Jacques Antoine Rabaut. The first person to put these findings into practice was a Dorset farmer, Benjamin Jesty, in 1774. Since he had already been variolated, he tried to immunize his wife and sons during a smallpox outbreak. To do so, he used a darning needle to transfer pustular material from diseased cows to scratches on their arms. It worked perfectly. Unfortunately, his neighbors met his discovery with hostility and labelled him "inhuman". On May 14th, 1796, Edward Jenner performed his first vaccination, transferring material from a cowpox pustule of a milkmaid to the arm of a young boy. Two weeks later he inoculated the boy with smallpox. The boy was immune. Jenner published his findings in 1798 and coined the term vaccination. By 1801, his report had been translated into six languages and over 100,000 people had been vaccinated.

Figure 3: "The Cow-Pock-or-the Wonderful Effects of the New Inoculation!" by the British caricaturist James Gillray, 1802. It depicts Jenner vaccinating patients who feared that it would make them grow cow-like appendages.  Source.

Although vaccination had become commonplace, it did not confer lifelong immunity to smallpox. This fact became apparent during the smallpox pandemics of 1824 and 1837, during which there was a high incidence of mild illness in previously vaccinated adults. In the 1930s Germany was one of the first countries to recognize the need for revaccination, which led to a decline in the incidence of smallpox. Today these additional rounds of vaccination are known as booster doses: they restore immunity against an antigen to protective levels. This is important because memory against an antigen declines over time.

Standing on the shoulders of giants

The next breakthrough in vaccine development came from Louis Pasteur, a French microbiologist famous for his many contributions to the field. He was working on chicken cholera, caused by the bacterium Pasteurella multocida. In 1879, Pasteur asked his assistant Charles Chamberland, another famous French microbiologist, to inoculate the chickens with the bacteria while Pasteur was on holiday. Like any good assistant, Chamberland failed to do so because he went on holiday himself. When he came back after a month, he introduced the month-old culture into the chickens. After a brief period of illness the chickens recovered. Subsequently, when Pasteur inoculated the chickens with the infectious form of the bacteria, the chickens survived. He had thus invented the first attenuated vaccine: a vaccine created by reducing the virulence of the pathogen without affecting its viability. This vaccine was different from the smallpox vaccine because the strains had been artificially weakened, so a naturally weakened form of the disease organism was not required.

Figure 4: A painting of Louis Pasteur in his laboratory by Albert Edelfelt, 1885. Source


Another famous attenuated vaccine is the BCG (Bacillus Calmette-Guérin) vaccine. In 1908, Albert Calmette, a French physician, and Camille Guérin, a veterinarian, were trying to develop less virulent strains of the tuberculosis bacteria. They noticed that bacteria grown in a mixture of glycerin, potato, and bile were less virulent. They wanted to test whether subculturing, i.e. transferring bacteria from a previous growth medium to a fresh one, could produce a strain sufficiently attenuated to serve as a vaccine. After 239 subcultures over 13 years, they isolated the BCG strain. The vaccine was first used on humans in 1921 and was the first live tuberculosis vaccine. The WHO (World Health Organization) currently recommends the vaccine for children in countries where tuberculosis is highly endemic.

During the first half of the 20th century there were pandemic outbreaks of polio all around the world. Many famous people, from President Roosevelt to the nuclear physicist Robert Oppenheimer, were victims of the disease. From 1935 to 1950 many scientists tried to invent polio vaccines, which they tested on themselves and their families. Unfortunately, all their efforts failed. In 1952, Jonas Salk developed the first effective polio vaccine. He used formaldehyde to kill the three types of poliovirus and then administered the inactivated strains by intramuscular injection. The vaccine worked, and the resulting vaccination program was the first of its kind, with 20,000 health officials, 220,000 volunteers, and 1,800,000 school children participating in the trial. Although the Salk vaccine helped prevent most of the complications of polio, it did not prevent the initial intestinal infection. In 1954, Albert Sabin developed a live, attenuated vaccine that blocked the polio virus from entering the bloodstream from the intestine. Furthermore, the vaccine could be administered orally, provided longer-lasting immunity, and was cheaper to produce. These advantages enabled mass production of the vaccine and played a key role in nearly eradicating polio.

Figure 5: Newspaper headlines about the polio vaccine from 1955. A Gallup poll showed that more Americans knew about the polio trials than the President's full name (Dwight D. Eisenhower). Source.

Another important contributor to the field of vaccination was Maurice Hilleman, an American microbiologist who developed over 40 vaccines, eight of which are now part of routinely recommended vaccine schedules. One of the most noteworthy is the mumps vaccine, developed in 1967. Hilleman's daughter Jeryl Lynn had come down with mumps, and he cultured the mumps virus from her throat to develop the vaccine. This strain is still used today in the MMR vaccine, also developed by Hilleman, which was the first approved vaccine to incorporate multiple live virus strains.

To vaccinate or not to vaccinate?

Controversies over vaccination began approximately 80 years before the terms vaccine and vaccination were coined. There were severe religious objections to variolation, one of which was a sermon delivered in 1722 by an English theologian, Edmund Massey. The sermon, titled "The Dangerous and Sinful Practice of Inoculation", decried variolation, stating that diseases were God's way of punishing sin and that any attempt to prevent smallpox was diabolical. This sermon was published and soon reached North America, where it contributed to the initial opposition movement. As with variolation, vaccination was also initially opposed on religious grounds. However, several clergymen, including Robert Ferryman and Rowland Hill, were influential advocates of vaccination. Some of the opposition to vaccination came from variolators, who had lost a lucrative monopoly. Ultimately, the anti-vaccine movement was galvanized by legislation, both in England and North America, that made vaccination compulsory. Although vaccination worked, there were no techniques to ensure quality control. The vaccines were therefore sometimes contaminated with pathogens, which led to diseases such as tuberculosis, syphilis, and tetanus. As a compromise, governments introduced clauses that allowed parents to opt out of compulsory vaccination, provided that they fully understood the associated risks. Furthermore, as stated previously, smallpox did recur in previously vaccinated individuals, which allowed anti-vaccine proponents to point out that vaccine protection was not absolute.

Figure 6: 19th century image from a Victorian era anti-vaccination journal. It depicts a police officer reminding a mother to vaccinate her child, while a skeleton is touching the child at the site where vaccines are routinely administered. Source.  

In the 20th century there were several mishaps in vaccine formulation that caused major setbacks for pro-vaccination advocates. In 1901, antitoxin from a horse named Jim was contaminated with tetanus and killed 13 children in St. Louis, Missouri. The antitoxins were prepared from animals that had been immunized against diphtheria; introducing the horse serum into humans conferred passive immunity, i.e. short-term immunity due to the introduction of antibodies from another person or animal. In the same year, 9 children in Camden, New Jersey died from contaminated smallpox vaccines. In 1955, Cutter Laboratories produced 120,000 doses of the Salk vaccine that were contaminated with live polio virus. The vaccine caused 40,000 cases of polio, 53 cases of paralysis, and 5 deaths. The resulting polio outbreak was one of the worst pharmaceutical disasters in U.S. history.

One of the biggest vaccine controversies today involves the claim that thimerosal, a preservative used in some DPT vaccines, causes autism. Thimerosal gained popularity during World War I because its incorporation into vaccines improved their shelf life and prevented the growth of pathogenic contaminants. Furthermore, thimerosal did not have any adverse health effects. The first real concern over thimerosal arose in the 1970s, when it was discovered that methylmercury, which was used as a fungicide in agriculture, caused adverse health effects. This form of mercury is distinct from thimerosal and, unlike thimerosal, is harmful to humans. Methylmercury also affects the fetus, resulting in neurological defects in newborns. This observation set the stage for the autism controversy. In the 1990s several studies were carried out to understand the factors that contribute to the development of autism. Unfortunately, there were several conflicting theories on the causes and treatment of autism. Disillusioned by the lack of scientific consensus, parents of autistic children blamed thimerosal. Unfortunately, this erroneous belief is still propagated today. Currently one out of seven types of DPT vaccine uses thimerosal, and the dose used has been shown to be non-toxic. Furthermore, thimerosal has been phased out of most vaccines purely as a precautionary measure.

The other noteworthy vaccine controversy is the claim that the MMR vaccine causes autism. The controversy started in 1998 with the publication of a fraudulent research paper by Andrew Wakefield. This paper has been singularly damaging because media attention has led to the repeated propagation of its findings. This is concerning because it has been conclusively shown that the paper's conclusions are inaccurate, that much of its data was manipulated, and that the paper has been fully retracted. According to the paper, eight of the twelve children investigated had developed pathologies related to autism after the vaccine was administered. Although the link between autism and these pathologies is real, the paper did not provide conclusive evidence that the MMR vaccine was involved. Several studies, including a 2012 study involving 14,700,000 children, have concluded that there is no relation between the vaccine and the incidence of autism.

This article has highlighted the centuries of work that have gone into developing vaccines. It is this work that has eradicated smallpox, nearly eradicated polio, saved lives, and prevented infectious diseases including anthrax, cholera, influenza, and typhoid. Forgetting the benefits of vaccines, listening to adverse media coverage, and believing fraudulent articles has terrifying and far-reaching consequences. Routine vaccinations are a cost-effective practice for disease prevention: in 2001 it was calculated that routine childhood immunizations in the U.S. saved over $10 billion in direct health care costs. More importantly, incomplete vaccine coverage weakens herd immunity, whereby the population of immune individuals protects those who are not immune by reducing the chance that a non-immune individual will come into contact with a carrier of the infection. The increase in measles outbreaks in different parts of the world over the past decade is evidence of this fact.
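
The arithmetic behind herd immunity can be sketched with the classic threshold formula: an outbreak shrinks once the immune fraction of the population exceeds 1 - 1/R0, where R0 is the number of people one infected person would infect in a fully susceptible population. As a minimal illustration (the R0 values below are rough textbook estimates, not figures from this article):

```python
# Herd immunity threshold: the minimum fraction of a population that
# must be immune so that each infection causes, on average, fewer than
# one new infection. Derived from R0 * (susceptible fraction) < 1.

def herd_immunity_threshold(r0: float) -> float:
    """Return the minimum immune fraction needed to halt spread."""
    if r0 <= 1:
        return 0.0  # the disease cannot sustain itself anyway
    return 1.0 - 1.0 / r0

# Rough textbook R0 estimates, used here only for illustration.
estimates = {"measles": 15.0, "polio": 6.0, "influenza": 1.5}

for disease, r0 in estimates.items():
    print(f"{disease}: ~{herd_immunity_threshold(r0):.0%} must be immune")
```

The high threshold for a disease as contagious as measles is why even a modest drop in vaccine coverage can trigger the outbreaks mentioned above.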
Figure 7: The benefit of herd immunity.  Source.