The Knowledge: How to Rebuild Our World From Scratch
Author: Lewis Dartnell
Arriving at the correct diagnosis of a disease is useful only if you’ve already developed a set of pharmaceutical preparations that are known
to be effective against particular ailments. For much of human history this has been a real stumbling block, and before the twentieth century the doctor’s medicine bag was largely ineffectual: imagine the frustration at understanding the diseases killing your patients but being powerless to stop them.
Many modern drugs and treatments derive from plants, and the traditions and folklore of herbal medicine are as old as civilization itself. Almost 2,500 years ago Hippocrates—renowned for the Hippocratic oath of the physician’s ethical code—recommended chewing willow to alleviate pain, and ancient Chinese herbal medicine similarly prescribes willow bark to control fever. The essential oil extracted from lavender has antiseptic and anti-inflammatory properties and is therefore useful as an external balm for cuts and bruises, whereas tea tree oil has been used traditionally for its antiseptic and antifungal action. Digitalin is extracted from foxgloves and can slow down the heart rate of those suffering from a fast irregular pulse, while the bark of the cinchona tree contains the antimalarial drug quinine, which gives tonic water its characteristic bitter flavor (and led to the British colonial penchant for sipping gin and tonics).
One particular class of drugs we’ll linger on for a moment is that used for pain relief, or analgesia. While these pharmaceuticals are palliative, targeting the symptom rather than the cause, they are the most commonly taken drugs in the world, for everything from everyday discomforts like headache to more serious injuries. Analgesia is an essential prerequisite for the redevelopment of surgery. Limited pain relief can be achieved by chewing willow bark, and topical analgesia, suitable for superficial injuries or minor surgical procedures such as lancing boils, is provided by chili peppers. The capsaicin molecule that gives chilies their illusory fiery burn in the mouth is known as a counterirritant, and, like the contrary cooling effect of menthol from mint plants, can be rubbed onto the skin to mask pain signals (both capsaicin and menthol are used in muscle-easing heat patches or ointments like Tiger Balm).
But the universal painkiller, used since antiquity, is provided by the poppy. Opium is the name of the milky pink sap that can be harvested from the poppy after it has flowered, and it has considerable pain-relieving qualities. Traditionally, opium is collected daily by making several shallow slices in the swollen, golf-ball-size seedpod of the poppy plant, allowing the sap to seep out and dry to a black latex encrustation that is scraped off the following morning. Morphine and codeine are the major narcotics in opium: the dried sap can contain up to 20 percent morphine. These opiates are far more soluble in ethanol than in water, and a potent (but addictive) tincture of opium, laudanum, is made by dissolving powdered opium in alcohol. A much less labor-intensive system developed in the 1930s uses several washes of water (often slightly acidic to improve solubility) to extract opiates from the poppy after the plant has been reaped, threshed, and winnowed—the poppy seeds kept for eating or replanting, just as you would do with cereals. In fact, 90 percent of medical opiates today are still harvested from poppy straw.
The risk, though, in taking crude decoctions or tinctures of plant extracts is that, without the capability for chemical analysis, you don’t know the actual concentration of the active ingredient, and taking too much can be dangerous (particularly if, like digitalin, it interferes with your heart rate). The dosage may offer only a narrow therapeutic window: you must administer enough to be effective, but not so much as to become lethal.
For the vast majority of serious and ultimately fatal conditions, from pervasive infection and septicemia to cancer, no effective treatment at all is available from simple herbal concoctions. The key enabling technology that started the phenomenal medical revolution after the Second World War was prowess in organic chemistry for
isolating and manipulating pharmaceutical compounds. Pharmaceuticals today are available in precisely known concentrations, and either have been synthesized artificially, or a plant extract has been modified using organic chemistry to increase the potency or decrease the side effects of the compound.
For example, a relatively simple chemical modification is made to the active ingredient in willow bark, salicylic acid, that allows it to retain its efficacy as a fever-reducing painkiller but reduces the side effect of stomach irritation. The result is aspirin, the most widely used drug in history.
The key practice in evidence-based medicine that you’ll need to return to after the Fall is running a fair test to see if a particular compound or treatment actually works—or whether it should be thrown out alongside useless snake oils, witch-doctor potions, and homeopathic concoctions. Ideally, objectively testing a treatment’s effectiveness in a clinical trial involves a meaningfully large number of patients split into two groups: one to receive the putative therapy and the other, the control group that forms the baseline for comparison, to be given a placebo or the current best drug. The two pillars of successful clinical trials are the random assignment of test subjects to the groups so as to remove bias, and the use of “double blinding”: neither patients nor practitioners know who has been assigned to which group until the results are analyzed. During the redevelopment of medical science after a Fall there will be no shortcut around meticulous, methodical work, which may also call for disagreeable practices like animal testing for the sake of easing human suffering.
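The two pillars described here, random assignment and blinding, can be sketched in a few lines of code. This is a minimal illustration, not a real trial protocol; the function name and the opaque subject codes are invented for the example.

```python
import random

def randomize_trial(patient_ids, seed=None):
    """Randomly split patients into equal treatment and control groups,
    and issue opaque blinded codes so that neither patients nor
    practitioners can tell the groups apart until unblinding."""
    rng = random.Random(seed)
    ids = list(patient_ids)
    rng.shuffle(ids)          # random assignment removes selection bias
    half = len(ids) // 2
    assignment = {pid: "treatment" for pid in ids[:half]}
    assignment.update({pid: "control" for pid in ids[half:]})
    # Blinded labels: everyone involved sees only "subject-NNN" until
    # the results are analyzed and the code is broken.
    codes = {pid: f"subject-{i:03d}" for i, pid in enumerate(sorted(assignment))}
    return assignment, codes
```

The key design point is that the assignment table is kept sealed away from everyone administering or receiving the treatment until the data are in.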
For some conditions, the best course of action is surgery: to physically correct or remove the faulty or troublesome component of the body’s machinery. But before you can even think about attempting surgery (with a reliable chance of patient survival)—intentionally creating a wound to open the body, having a look inside, and tinkering with the workings inside like a car mechanic—there are several prerequisites that a post-apocalyptic society will need to develop. These are the three As: anatomy, asepsis, anesthesia.
We have already seen that you need to know how our body is built so that you can tell a diseased organ from a healthy one. And without a detailed grasp of anatomy, your surgeons are literally poking around in the dark. You need to have a comprehensive map of the body’s internal makeup, the normal forms and structures of each of its components; you need to understand their function and know the paths of major blood vessels and nerves so that you don’t accidentally sever them.
Asepsis is the principle of preventing microbes getting into the body in the first place during surgery, rather than trying to clean the wound later with antiseptics like iodine or ethanol solution (antiseptics are your only option for an accidental, dirty wound). To maintain aseptic conditions, scrupulously clean the operating theater and filter the air supply. The site of the operation can be cleansed with 70 percent ethanol solution before the incision is made, and the patient’s body covered with sterile drapes. Surgeons must wear clean surgical gowns and face masks, scrub their hands and forearms, and operate with heat-sterilized surgical instruments.
The third crucial element is
anesthetics. These are drugs that don’t cure disease but do something just as valuable: they can temporarily pause all sensitivity to pain, or even induce complete unconsciousness. Without this, surgery is an abominably traumatic experience and can
be attempted only as a last resort. The surgeon must work rapidly, slicing through muscular tensions and spasms as the patient writhes in agony, and only simple procedures can be considered: removal of a kidney stone or the brutish amputation of a gangrenous limb with a butcher’s saw. With an insensate patient, however, surgeons can afford to work much more slowly and carefully, and are able to risk invasive operations on the chest and abdomen, as well as exploratory surgery to see what might be the underlying causes of an ailment.
The first gas to be recognized for its anesthetic properties was
nitrous oxide, or “laughing gas”: when it is inhaled at high enough doses, its exhilarating sensation can give way to true unconsciousness, suitable for surgery or dental work. Nitrous oxide is generated from the decomposition of ammonium nitrate as it is heated—be careful, though, as the compound is unstable and may explode if it gets much hotter than 240°C—and the anesthetic gas is then cooled and cleaned of other impurities by bubbling through water. Ammonium nitrate can itself be produced by reacting ammonia and nitric acid (see Chapter 11). Nitrous oxide alone is good for dulling the sensation of pain, but it is not very powerful as an anesthetic. If, however, it is administered with other anesthetics, such as diethyl ether (often abbreviated to ether), it acts to potentiate them, enhancing their effectiveness. Ether can be produced by mixing ethanol with a strong acid, such as sulfuric acid, and then extracting ether from the reaction mixture by distillation. It is a reliable inhalation anesthetic, and although ether is relatively slow to work and can produce nausea, it is medically safe (although it is an explosive gas). The advantage of ether is that not only does it induce unconsciousness, but it also acts to relax muscles during surgery and provides pain relief.
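The decomposition described above, ammonium nitrate breaking down on gentle heating into nitrous oxide and steam, has a simple mass balance worth knowing if you are planning production. A quick stoichiometric sketch (standard atomic masses, nothing assumed beyond the reaction NH4NO3 → N2O + 2 H2O):

```python
# Molar masses from standard atomic weights (g/mol)
M_NH4NO3 = 2 * 14.007 + 4 * 1.008 + 3 * 15.999   # ~80.04 g/mol
M_N2O = 2 * 14.007 + 15.999                      # ~44.01 g/mol

def n2o_yield_grams(grams_ammonium_nitrate):
    """Theoretical maximum mass of nitrous oxide from complete
    decomposition: NH4NO3 -> N2O + 2 H2O (heated well below the
    ~240 C danger point noted in the text)."""
    return grams_ammonium_nitrate * M_N2O / M_NH4NO3
```

So 100 grams of ammonium nitrate can give at most about 55 grams of nitrous oxide; the other 45 percent of the mass leaves as water vapor.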
But what if, generations after the Fall, society has regressed so much that the vital knowledge of germ theory has been lost, and pestilence is once again attributed to bad air (
mal aria
) or fractious gods? How could a post-apocalyptic civilization rediscover the existence of unimaginably tiny creatures invisible to the eye that cause food spoilage, festering wounds, putrefaction of corpses, and infectious diseases?
In fact, bacteria and other single-celled parasites can be seen with some beguilingly simple equipment. A rudimentary microscope is surprisingly easy to make from scratch. You’ll need to start with some good-quality, clear glass. Heat the glass and draw it out into a thin strand, and then melt the tip of this in a hot flame so that it drips. The globule cools as it falls, and with luck you’ll produce some very tiny glass beads, perfectly spherical in shape. Use a thin strip of metal or cardboard with a hole in the middle to mount your spherical lens, and hold it over a sample. This simple microscope works because the tiny ball of glass has a very tight spherical curvature and thus a powerful focusing effect on light waves passing through it. This also means that the focal length is exceedingly short, though, and you will need to position the lens and your eyeball right down close to the target.
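The optics of the glass-bead microscope can be put in numbers. For a ball lens, the effective focal length follows from its diameter and refractive index, and the magnification of a simple magnifier is conventionally quoted relative to the eye's 250 mm near point. The refractive index of 1.5 below is an assumed typical value for ordinary glass:

```python
def ball_lens_focal_length_mm(diameter_mm, n=1.5):
    """Effective focal length (measured from the sphere's center) of a
    glass ball lens: f = n*D / (4*(n - 1)). n=1.5 is an assumed
    refractive index for common soda-lime glass."""
    return n * diameter_mm / (4 * (n - 1))

def magnification(focal_length_mm, near_point_mm=250):
    """Angular magnification of a simple magnifier, relative to the
    standard 250 mm near point of the human eye."""
    return near_point_mm / focal_length_mm

# A 1 mm bead gives f = 0.75 mm, i.e. roughly 330x magnification,
# which is why the lens and eye must sit so close to the sample.
```

This is why smaller beads are better: halving the diameter halves the focal length and doubles the magnification, at the cost of an even shorter working distance.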
*
The realization born of your instrument-enhanced senses is that there’s a whole teeming universe of invisibly small organisms down there—astonishingly diverse varieties of new wildlife for post-apocalyptic micro-naturalists to identify and sort into related families and groups. With the rigor demanded of scientific proof, you can demonstrate not only that microbes are present in infected wounds or spoiled milk, but that food is preserved if microbes are not present. If you seal nutritious broth or corruptible meat within an airtight jar and heat it to inactivate any microbes already present, no decomposition will occur: things don’t spoil spontaneously. Better microscopes can be constructed, similar to a telescope, from combinations of lenses, and in time you’ll be able to link the presence of specific microorganisms to particular infectious diseases.
You can even grow and study these microorganisms in captivity, culturing them in flasks of liquid broth or as colonies on the surface of a solid nutrient. Petri dishes can be molded from glass, filled with nutrient-enriched agar poured in to set, and fitted with a lid to prevent contamination. Agar is a gel-forming substance extracted from boiled red algae or seaweed (and common in Asian cuisine), similar to the gelatin derived from cattle bones but indigestible by most microbes.
In earlier chapters we have seen that this fundamental microbiology is needed for the optimization of processes such as making leavened bread, brewing beer, preserving food, and producing acetone. But perhaps most important in improving the human condition after the
Fall, microbiology provides the prerequisite knowledge base for discovering more targeted methods than noxious antiseptic chemicals for killing bacteria and curing infection.
In 1928 Alexander Fleming had been working on cultures of Staphylococcus aureus bacteria from skin abscesses before leaving for a holiday. On his return he started clearing his lab bench and washing up the old Petri dishes. Randomly picking up one from the top of a pile in the sink that had not yet been treated with disinfectant, he noticed a small patch of mold surrounded by a ring clear of bacteria on an otherwise overgrown plate. It seemed that some substance secreted by the mold, later identified as a species of Penicillium, had inhibited bacterial growth. Penicillin, the secreted compound, and the numerous other antibiotics discovered or synthesized since are extremely effective at treating microbial infections and save millions of lives every year.
“The most exciting phrase to hear in science, the one that heralds new discoveries,” said science fiction author Isaac Asimov, “is not ‘Eureka!’ [“I found it!”], but ‘That’s funny . . .’” This is certainly true of Fleming’s chance finding, along with many other serendipitous discoveries, but only if the implications are grasped. Indeed, fifty years earlier other microbiologists had noticed that Penicillium prevented bacterial growth, but had not made the next conceptual leap from this observation to pursuing the ramifications for medicine.
With hindsight, however, and knowing of the existence of such effects, could a rebooting society replicate a similar series of experiments to deliberately search for effective molds and so rapidly rediscover antibiotics? The basic microbiology is straightforward. Fill Petri dishes with a beef-extract nutrient bed that is hard-set by seaweed-derived agar, smear them with Staphylococcus bacteria picked out of your nose, and expose different agar plates to as many sources of fungal spores as you can, such as air filters, soil samples, or decaying fruits and vegetables. After a week or two, look carefully for molds that have inhibited the growth of bacteria around them (or indeed for other bacterial colonies that do so: many antibiotics are produced by bacteria locked in an evolutionary arms race with one another). Pick them off to isolate the strain and attempt to grow it in liquid broth to make the secreted antibiotic more accessible. Antibiotic screens have now found numerous compounds from fungi and bacteria, although Penicillium molds are so common in the environment that they are likely to be among the first re-isolated after the apocalypse. They’re one of the principal causes of spoiling food: in fact, the Penicillium strain responsible for most of the penicillin antibiotic produced worldwide today was isolated from a moldy cantaloupe in a market in Illinois.
However, even for a rough-and-ready post-apocalyptic therapy you can’t simply inject the antibiotic-containing “mold juice” because, without refining, its impurities will trigger anaphylactic shock in the patient. The chemistry worked out by Howard Florey’s research group at the end of the 1930s to purify penicillin from the growth medium exploits the fact that the antibiotic molecule is more soluble in organic solvents than in water. Strain the growth culture to remove bits of mold and detritus, add a little acid to this filtrate, and then mix and shake with ether (we saw earlier in this chapter how to make this versatile solvent). Much of the penicillin will pass from the watery growth fluid into the ether, which you need to let separate and rise to the top. Drain off the bottom watery layer, and then shake the ether with some alkaline water to entice the antibiotic compound to pass back into the aqueous solution, now cleansed of much of the crud in the growth fluid. The daily dose of penicillin for a single person prescribed today requires up to 2,000 liters of mold juice to be processed, and so post-apocalyptic antibiotics will demand a high level of organized effort to produce. By the end of 1941 Florey’s team had scaled up production to make enough penicillin for clinical trials, but they were forced by wartime shortages of equipment to improvise. Mold cultures were grown in racks of shallow bedpans and makeshift extraction equipment built using an old bathtub, trash cans, milk churns, scavenged copper
piping, and doorbells, all secured in a frame made from an oak bookcase discarded by the university library—inspiration, perhaps, for the scavenging and jury-rigging necessary after the apocalypse.
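The back-and-forth solvent extraction Florey's group used can be understood through a single equilibrium quantity, the partition coefficient K, the ratio of the solute's concentration in the organic phase to that in the aqueous phase. A small sketch of the single-stage mass balance; the numerical value of K below is purely illustrative, not a measured figure for penicillin:

```python
def fraction_extracted(K, v_organic, v_aqueous):
    """Single-stage liquid-liquid extraction: fraction of the solute
    that ends up in the organic phase at equilibrium, given partition
    coefficient K = C_organic / C_aqueous and the two phase volumes."""
    return K * v_organic / (K * v_organic + v_aqueous)

# Illustrative numbers only: suppose penicillin favors ether over the
# acidified filtrate with K ~ 25, and 1 L of ether is shaken with
# 10 L of strained growth fluid.
recovered = fraction_extracted(25, 1.0, 10.0)
```

Even with a small ether volume, a high K pulls most of the antibiotic across in one shake; acidifying the filtrate raises K for the first step, and making the second water wash alkaline lowers it again, driving the now-purer penicillin back into the aqueous phase.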
So while the discovery of penicillin is often portrayed as accidental and almost effortless, Fleming’s observation was only the very first step on a long road of research and development, experimentation and optimization, to extract and purify the penicillin from the “mold juice” to create a safe and reliable pharmaceutical. In the end, the United States provided the large-scale fermentation to supply enough for widespread treatment. Similarly, once it understands the necessary science, a post-apocalyptic civilization will need to reattain a certain level of sophistication before it can produce enough antibiotic for it to have an impact across the population.