Author: James Le Fanu
While the everyday practice of medicine, whether in hospital or general practice, is likely to remain much the same, the two driving forces of intellectual enquiry and therapeutic advance (the Big Science that medical research has become, and Big Pharma) can scarcely survive in their present form.
The Big Science of the New Genetics is in big trouble in a way that could scarcely have been anticipated in the heady atmosphere surrounding the completion of the Human Genome Project. There is no arguing with the crushing verdict of the 'missing 95 per cent' of the heritability of common disorders. For the moment the view prevails that current techniques will, by generating yet more billions of gigabytes of basic biological data, bulldoze a causeway through current perplexities to an understanding of the phenomena of illness. But on past experience the reverse is likely to be the case, with an inverse relationship over the past fifty years between the scale of medical research funding and its 'productivity' in terms of original and useful findings.
'The problem may be that Big Science is inappropriate for generating medical progress,' writes Professor Bruce Charlton of Newcastle University.
The dominant research paradigm is the 'basic-to-applied' model, the assumption that expanding 'basic' medical research leads predictably to an increase in 'applied' clinical breakthroughs. The continuing failure to [do so] makes it apparent this assumption is, at best, only partially valid. [Rather] the excessive monolithic funding of the 'basic-to-applied' model may be stifling a diversity of more fruitful research strategies, at present disregarded because they are not amenable to the Big Science model.1
Jonathan Rees, Professor of Dermatology at Edinburgh University, concurs. 'The financial hegemony of the "basic-to-applied" paradigm [of Big Science] accounts for the slowing in real therapeutic advance.' The three most significant developments in his specialty over the past twenty years, he points out, all originated in traditional methods of clinical observation: the dramatic effectiveness of the Vitamin A derivative tretinoin in the treatment of acne; ultraviolet ray treatment for those with psoriasis; and the discovery that the common skin condition seborrhoeic dermatitis is due to a chronic yeast infection, curable with anti-fungal drugs. 'These "clinically driven therapeutic advances"', Professor Rees points out, 'are not marginal, but astonishingly effective. Most of the key insights came from clinical investigators with a history of success in more than one field. The corollary of this is that the ability to experiment or make observations at the level of the whole human is incredibly important, [whereas] predictions from basic research to patients are frequently, if not usually, inadequate.'2
The further problem is structural. Big Science means big projects with big budgets, to the tune of millions of pounds, which can only be justified on the basis that they will produce 'results'. That means pursuing the sort of research that is technically routine and can be guaranteed to generate new facts that can then be written up in a science journal. There is therefore little space or opportunity for originality, although it is precisely the pursuit of anomalous and unexpected findings that in the past has proved so fruitful.3
'Those applying for research funds are expected to have a clearly defined programme for three to five years,' observes Professor Morton Meyers of the State University of New York. 'Implicit is the assumption that nothing unforeseen would be discovered during this time, and even if it were, it would not distract from the approved line of research.'4
It is obviously highly significant that Big Science is, on practically every count, the antithesis of that which gave rise to the therapeutic innovation of the post-war years. It is carried out by groups of scientists (or technicians) rather than original and determined individuals; it is well funded rather than operating on a shoestring; it is predictable rather than speculative, and unidirectional rather than multi-faceted and interrelated. Big Science is intrinsically conservative in its outlook, committed to 'more of the same', which is then interpreted so as to fit with the prevailing understanding of how things are. It systematically ignores the 'unexpected', whose investigation from Galileo onwards has been the lifeblood of scientific progress. This would matter less were it not that the big guns of Big Science who dominate the grant-giving agencies are scarcely inclined to allocate funds to those whose more radical ideas might challenge the certainties on which their professional reputations rest.
This comparison with the past is highly injurious to Big Science, and so, perhaps, when the full implications of that 'missing heritability' are more generally appreciated, the pressure will grow to reorganise and redirect medical research towards more productive forms of scientific enquiry.
The parallel with Big Pharma is obvious enough, for it is similarly confronted by the potentially lethal combination of escalating research costs and low productivity. The vast multinational drug companies are prone to similar problems of organisational inertia and a desire to 'play it safe', cutting back on support for individual scientists pursuing original research in favour of copying the blockbusters produced by their competitors.
The recent switch in emphasis to 'biological therapies' may provide a reprieve from the potentially catastrophic loss of revenue from those blockbusters 'falling off a cliff' but, as pointed out, this is a high-risk strategy with substantial implications for the viability of the health services of the Western world. It is even possible that the immensely successful synergy between profit-driven, innovatory capitalist enterprise and medicinal chemistry has served its purpose and is drawing to an end. For all the pharmaceutical companies' attempts to blur the boundaries, their interests are not synonymous with those of the wider medical enterprise. 'The primary mission of any drug company', as Marcia Angell points out, 'is, like any investor-owned business, to increase the value of their shareholder stock. That is their fiduciary responsibility and they would be remiss if they didn't uphold it. All their other activities are means to their end.'5
Their purpose is still to develop profitable drugs, but where that conflicts with the common (or higher) good then they too will come under pressure to reform themselves or face the prospect, like the US automobile industry, of extinction in their present form.
Meanwhile the stakes could scarcely be higher. The creation of the modern state-funded health service, ensuring the prodigious benefits of medicine are available to all, is undoubtedly amongst the supreme achievements of Western civilisation. But like any beneficent institution it is vulnerable to those powerful forces, epitomised by Big Science and Big Pharma, whose priorities, inadvertently or not, favour their own sectional interests. The best defence against such threats remains, as ever, a proper historical understanding of medicine's recent past: that 'high point of advantage' from which alone we can see the age in which we live.
It now seems clearer than ever that the future prospect of medical advance is predicated on confronting the central, unanswered but most potent of all questions: the biological cause of diseases such as multiple sclerosis, rheumatoid arthritis and, indeed, the vast majority of the conditions in the medical textbooks (as described in 'The Unsolved Problem'). Crack that, and so much that is currently obscure will become clear, along with the opportunity to prevent, or indeed cure, what for the most part still remain incurable illnesses. The lengthening list of those possible causes now includes the viruses implicated in cancers of the cervix and liver and adenocarcinoma of the lung; half a dozen potential culprits to account for the rise and fall of heart disease; and four different viruses implicated in diseases such as multiple sclerosis. That was the essential issue ten years ago and it remains so today. The prospects for the next ten years stand or fall on progress in its further elucidation.
The fate of arthritic patients in the 1930s has been described as 'horrendous': 'old women crippled with rheumatoid with their knees tucked under their chins and nails grown through the back of their hands . . . spondylitics so bent they could only see the ground and sometimes it was easier to progress backwards looking between their legs than trying to see in front of them'. In 1948 everything changed with rheumatology's annus mirabilis: the discovery of the dramatic effectiveness of cortisone, or compound E, in the treatment of rheumatoid arthritis. Rheumatologists used the discovery of cortisone (and, as important, the recognition that it had severe side-effects that, they argued, meant that only specialists should administer it) to transform their specialty. Prior to this their main responsibility had been in the field of physical rehabilitation, supervising physiotherapists in the administration of their physical treatments, a specialty described as being 'as unscientific as it was unfashionable'. Now they had their own highly effective medical treatment to link them much more strongly to the mainstream of general medicine, a trend powerfully reinforced as cortisone was found to be effective in an ever-widening circle of illnesses.1
But there was even more to 1948 than cortisone, as in the same year a marker, rheumatoid factor, was discovered in the blood of patients with rheumatoid arthritis and a similar marker was identified in systemic lupus erythematosus (SLE). These two diagnostic tests led to the elaboration of the concept of autoimmunity as the unifying feature of the rheumatological diseases otherwise known as connective tissue disorders: rheumatoid arthritis, SLE and polyarteritis nodosa. This profoundly altered the focus of the science of immunology away from the infectious diseases that had been its main preoccupation for 100 years (and had already been 'solved' by the discovery of antibiotics) towards rheumatological diseases. So rheumatology moved from being 'unscientific and unfashionable' to become one of the most scientifically based of all disciplines.2
Nonetheless, the new drugs that were to have such an impact on the treatment of arthritis and rheumatism in the post-war years did not come from the science of immunology but rather were discovered either blindly or by chance. Fortuitous events led to the discovery of cortisone, as already described, but there were other important innovations.
Antibiotics transformed the specialty of rheumatic disorders, most notably with the eradication of rheumatic fever caused by the group A beta-haemolytic streptococcus, much the commonest cause of arthritis in children. When the specialist children's hospital for rheumatological disorders was founded in Taplow in 1947, ninety-six out of 100 beds were reserved for children with rheumatic fever. But over the next ten years rheumatic fever all but disappeared, mainly because of 'the totally irresponsible use of antibiotics by general practitioners' for the treatment of sore throats. Thus, perhaps without realising it, the humble family doctor 'effectively wiped out a major cause of debility and death'. That was not all, as antibiotics also eliminated a further major group of joint diseases responsible for much misery and disability: acute and chronic infections of the joints and spine, particularly those resulting from tuberculosis.3
Robert Koch reported to an international conference in Berlin that the combination of gold and cyanide was much the most effective of all antiseptic agents against tuberculosis, though ineffective in curing experimentally infected animals. Little more was heard about the possible therapeutic uses of gold until the 1930s, when J. Forestier in Paris, influenced by the prevailing view that arthritis arose from chronic infection, started treating rheumatoid patients with gold injections. The results were surprisingly good, though accompanied by severe side-effects, and in the absence of any other effective remedies its use spread rapidly. With the decline of the infective theory of rheumatoid arthritis, the role of gold clearly needed to be re-evaluated. In the late 1950s Stanley Davidson, Professor of Medicine at Edinburgh, initiated a multi-centre trial, which 'to the surprise of three-quarters of the participants, came out directly in favour of the gold-treated cases'.4
The serious side-effects associated with steroid treatment encouraged the drug companies to search for a safer compound, resulting in the non-steroidal anti-inflammatory drugs: phenylbutazone, indomethacin and ibuprofen.
Phenylbutazone:
The powerful and widely used analgesic amidopyrine was found to have the potentially lethal side-effect of markedly reducing the white blood cell count, exposing those taking it to the danger of serious infection. The drug company Geigy sought to minimise this side-effect by making the drug in an injectable form, on the grounds that smaller doses would be safer. As amidopyrine was insoluble, it had to be coupled with a solvent, the most effective of which was found to be an acidic analogue, phenylbutazone. When it was subsequently discovered that the blood levels of the solvent were much higher than those of the active ingredient amidopyrine, an astute research chemist wondered whether the solvent might be an effective anti-inflammatory drug in its own right. This was duly investigated, leading Geigy to market the solvent on its own under the trade name Butazolidine.5