
This argument might seem esoteric but its implications are not. If Pickering were right, then logically anyone whose blood pressure was higher than ‘average' should benefit from having it lowered – hence the claim, following one important study, that 24 million United States citizens had ‘hypertension' that was either ‘undetected, untreated, or inadequately treated'. This was clearly good news for the pharmaceutical industry, which had sponsored the study, for the prospect of finding 24 million as yet undiscovered patients and treating them with regular medication for life was nothing less than a gold mine. The catch was that the evidence of benefit from treating the millions with ‘mild' hypertension was less than compelling.16

There are two predictable effects of telling someone his raised blood pressure needs treatment. The first is to make him worry about his health and become more aware of his mortality. Such fears are likely to be hidden and so difficult to measure, but a study of 5,000 steelworkers in 1978 found ‘dramatically increased rates of absenteeism where steelworkers are labelled hypertensive'. Those ‘labelled' as having raised blood pressure tended to see themselves as vulnerable to a stroke, which naturally encouraged their adoption of a ‘sick role'.17
The second predictable adverse consequence is that, no matter how relatively free of side-effects chlorothiazide and propranolol (and similar drugs) might be, they still prove unacceptable to a small percentage of those to whom they are prescribed. Both drugs cause lethargy, dizziness and headache in 5 per cent of those taking them and – in men – impotence in 20 per cent and 6 per cent respectively.18 When these drugs are taken by millions of people, the cumulative burden of these adverse effects is ‘not inconsiderable'. Is it worth it?

Back in 1967 the study of US military veterans readily demonstrated that treating markedly raised blood pressure for only a year could dramatically reduce the risk of stroke. But the results of treating those with ‘mild' hypertension, as it turned out, were much more equivocal – 850 people would have to be treated for a year to prevent just one stroke. Eight hundred and forty-nine out of the 850 in any one year would not benefit from taking medication.19
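In today's epidemiological language this is a ‘number needed to treat' (NNT) of 850 – a framing that is mine, not the author's. A minimal sketch of the arithmetic, assuming the trial's figure of one stroke prevented per 850 patient-years of treatment:

$$\mathrm{NNT} = \frac{1}{\mathrm{ARR}} \quad\Longrightarrow\quad \mathrm{ARR} = \frac{1}{850} \approx 0.12\%\ \text{per year}$$

where ARR is the absolute risk reduction: treating ‘mild' hypertension lowered each patient's annual risk of stroke by roughly a tenth of a percentage point.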

Nonetheless, the ‘Pickering paradigm' of ‘the lower the blood pressure the better' prevailed and, with hypertension defined as any level higher than the average, the obvious implication was that surreal numbers of people needed to take blood-pressure-lowering medication. By 1996 more than one in three Americans between the ages of thirty-five and seventy-four was taking medication to lower their blood pressure, generating an annual revenue for the pharmaceutical industry of $6 billion.20

In the 1990s, the same argument was to be repeated, but this time with cholesterol, where again the benefits of treating those with high levels were extrapolated downwards. The notion of ‘the lower the cholesterol the better' prevailed and millions were started on cholesterol-lowering drugs. And so it is that the great – and very desirable – project of preventing strokes by treating hypertension has vastly expanded the scope of medicine from treating the sick to finding, in the majority who are well, ‘illnesses' they do not necessarily have, and treating them at enormous cost.

10
1971: Curing Childhood Cancer

In medicine – as in life – some problems are more complex than others and, science being the art of the soluble, it is only sensible to leave the apparently intractable aside, hoping perhaps that at some time in the future something will happen to open the doors to their resolution. It is, nonetheless, a distinctive feature of post-war medicine that many doctors and scientists attempted, against all the odds, to take on ‘the insoluble'. Here the long march in the search for the cure for childhood cancer, and Acute Lymphoblastic Leukaemia (ALL) in particular, stands in a league of its own. Whereas the effectiveness of the other discoveries of the post-war years – such as antibiotics and steroids – was immediately apparent, the anti-cancer drugs were different. They worked, but not very well, prolonging the life of a child by, at most, a few months. So the cure of ALL, as will be seen, required not just one drug discovery but four quite separate ones combined together. Further, it was not sufficient merely to dispense these drugs and observe what happened; rather, a vast intellectual machine had to be created to assess the outcomes of different treatment combinations and reveal the small incremental gains that eventually made ALL a treatable disease. Finally, the patients involved were children and the drugs very toxic. It needed an extraordinary sense of purpose to persist when most doctors believed that inflicting nasty drugs on children to prolong a lethal illness by only a few months was immoral. For all these reasons the cure of ALL ranks as the most impressive achievement of the post-war years.

Acute Lymphoblastic Leukaemia is a malignant proliferation of lymphoblasts (precursors of the white blood cells in the bone marrow). Those afflicted – usually children around the age of five or six – died within three months from a combination of symptoms caused by this lymphoblastic proliferation packing out the bone marrow, thus preventing the formation of the other components of the blood: the reduction of red blood cells resulted in anaemia; the paucity of platelets caused haemorrhage; and the absence of normal white blood cells created a predisposition to infection. The children were pale and weak and short of breath because of the anaemia, they bruised easily because of the low platelets, and the slightest injury could precipitate a major haemorrhage into a joint or even the brain. It was, however, their vulnerability to infection that posed the greatest risk, as they were defenceless against the bacteria that cause meningitis or septicaemia. The ‘inevitable' could be postponed for a month or two, with blood transfusions to correct the anaemia and antibiotics to treat these infections. But so dismal was the prognosis that some doctors even disputed whether these supportive treatments should be given. Professor David Galton of London's Hammersmith Hospital summarises the prevailing pessimistic view at that time: ‘Children were sent home as soon as they were discovered to have the disease. Even blood transfusions might be withheld on the grounds that it only kept the child alive to suffer more in the last few weeks.'1

From the first attempts to treat ALL in 1945 it took more than twenty-five years before a truly awesome combination of chemotherapy (or ‘chemo') with cytotoxic (cell-killing) drugs and radiotherapy was shown to be capable of curing the disease.2
The origins and rationale of this treatment will be considered in detail later but in broad outline it took the following form. The treatment started with a massive assault on the leukaemic cells in the bone marrow, with high doses of steroids and the cytotoxic drug vincristine, lasting six weeks. This was followed by a further week of daily injections of a cocktail of three more cytotoxic drugs: 6-mercaptopurine (6-mp), methotrexate (MTX) and cyclophosphamide. Next came two weeks of radiation treatment directed at the brain, and five doses of MTX injected directly into the spinal fluid. This regimen, which eliminated the leukaemic cells from the bloodstream in 90 per cent of the children, was called ‘remission induction' (i.e. it induced a ‘remission' of the disease) and was followed by ‘maintenance therapy', two years of continuing treatment to keep the bone marrow free of leukaemic cells – weekly injections of the cocktail of three cytotoxic drugs already mentioned at lower doses, interspersed every ten weeks with fourteen-day ‘pulses' of the ‘induction' regime (steroids and vincristine).

It is impossible to convey the physical and psychological trauma this regime imposed on the young patients and their parents. Each dose of treatment was followed by nausea and vomiting of such severity that many children were unable to eat and became malnourished, stopped growing and ceased to put on weight. Then there were the side-effects caused by the action of the drugs, which poisoned not only the leukaemic cells but also the healthy tissues of the body: the children's hair fell out, their mouths were filled with painful ulcers, and they developed chronic diarrhoea and cystitis. It is not for nothing that chemo has been described as ‘bottled death'.3

This terrible burden of physical suffering would be just acceptable were it to result in a cure, but there was no certainty that this would be the case. Prior to the introduction of this particular regime of treatment in 1967, a survey of nearly 1,000 children treated over the previous two decades found that only two could be described as having been cured – having survived for more than five years – and one of these subsequently relapsed and died.4
Looking back now, it seems astonishing that those responsible for devising this highly toxic regime, Dr Donald Pinkel and his colleagues at St Jude's Hospital in Memphis, should have imposed it on these desperately ill children, not least because of the profound scepticism of Pinkel's professional colleagues that ‘success' – a major improvement in the prospects of survival – was achievable. This ambivalence is well caught by the remarks of a fellow specialist, Dr Wolf Zuelzer, a paediatrician at the children's hospital in Michigan, at an international conference on leukaemia: ‘The side-effects of treatment outweigh those directly attributable to the disease,' he observed, and after reviewing recent progress he could only express the hope that ‘others might find grounds for greater optimism than I have been able to distil from the facts now at hand'.5

But unknown to Zuelzer, the protocol of treatment devised by Pinkel in 1967 would indeed produce the ‘cures' that many had begun to think might elude them for ever, as the cure rate soared from 0.07 per cent to over 50 per cent. ‘We conclude from the results of this study that complete remission of childhood ALL is significantly prolonged by intensive combination chemotherapy. The toxicity and infection encountered are significant but certainly not prohibitive in view of the results obtained,' Pinkel observed in 1971, and everyone agreed. The following year, when he gave the annual guest lecture at the Leukaemia Research Fund in London, he outlined to an ‘entranced audience' of doctors from all parts of Britain the clinical studies co-ordinated at St Jude's over many years.6
‘There is now no place for the palliative treatment of leukaemia,' The Lancet commented in an editorial. ‘Dr Pinkel's results are impressive, not least for the methodical manner in which seemingly intractable problems have been solved at every stage.'7

To properly appreciate Pinkel's achievement it is necessary to clarify the fundamental problems of treating cancer. When a malignant tumour is limited to one part of the body – say, the breast or the gut – it can be removed and, with luck, cured, either by surgery or radiotherapy. But when the cancer is dispersed – as with acute leukaemia (and the same applies to any cancer that has spread or ‘metastasised') – the only hope lies in drug treatment, which can selectively kill the cancer cells wherever they might be. This would be relatively straightforward were the cancer cells different in some special way that could be interfered with, thus making it possible to kill them while leaving the healthy cells untouched. But though cancer cells are indeed different from normal ones, it has never been possible to turn those differences to therapeutic advantage. The problem was aptly summarised by Professor W. H. Woglom, a distinguished cancer researcher, back in 1945: ‘Those who have not been trained in chemistry or medicine may not realise how difficult the problem really is. It is almost, not quite, but almost as hard as finding some agent that will dissolve away the left ear but leave the right ear untouched.' In the thirty years following Woglom's disheartening analogy, hundreds of thousands of chemicals were investigated for their anti-cancer activity, of which a handful, thirty or so, were found to be of any value. Virtually all owe their origins to chance observation or luck.

The first was nitrogen mustard. At the outbreak of the Second World War, it was anticipated that the Axis powers, Germany and Japan, would resort to chemical warfare – and the use of mustard gas in particular – on a massive scale. Alarmed at the prospect, the US military authorities set up the Chemical Warfare Service to find an antidote. The immediate incapacitating effect of mustard gas is to cause a severe watery inflammation of the eyes (conjunctivitis) and painful blistering of the skin. Its lethality, however, results from its effect on the bone marrow, where it destroys developing blood cells, leaving its victims vulnerable to haemorrhage and overwhelming infection. These effects had first been documented at the close of the First World War and were confirmed in 1943, when a German raid on the US fleet in Bari harbour on the Italian coast sank a ship – the John Harvey – with 100 tonnes of mustard gas on board.8 In a medical report compiled on those exposed to the gas, Colonel Stewart Alexander of the US Medical Corps observed that ‘the effects upon the white blood cells was most severe – on the third or fourth day the count began to drop in a steep downward trend'.9
