Infected Blood Inquiry: The Report

Volume 3

Presented to Parliament pursuant to section 26 of the Inquiries Act 2005.

© Crown copyright 2024

This publication is licensed under the terms of the Open Government Licence v3.0 except where otherwise stated. To view this licence, visit nationalarchives.gov.uk/doc/open-government-licence/version/3.

Where we have identified any third party copyright information you will need to obtain permission from the copyright holders concerned.

This publication is available at www.gov.uk/official-documents.

Any enquiries regarding this publication should be sent to us at [email protected].

ISBN 978-1-5286-4685-7
E03066671 05/24

3.1 Risk

Attitudes to risk go to the heart of the matters this Inquiry is tasked to look into.

More than one politician giving evidence accepted that a first duty of a state is to keep its citizens safe. They were right to do so. It is clear that unless the safety of citizens is regarded as a first consideration there may be harm, and that harm might have been avoidable.

In today’s society risk, meaning the possibility of physical or mental injury, is ubiquitous and inescapable. Almost everything we do comes with the possibility of some adverse consequence of one form or another. Driving a car risks injuring other people, as well as the driver and any passenger, but the purpose of driving it would be defeated if the existence of these risks required cars to be banned. Instead, action can be taken to reduce the likelihood of risks – by measures such as airbags, better car design, more effective braking, speed limits where appropriate, and by taking individually protective steps such as wearing seat belts. Where the harm that might happen is not that of accidental injury, but of disease, or of a consequence of medical treatment, the principle “first do no harm” may be engaged. However, any treatment will come with some consequences, just as a decision not to treat actively may. Those consequences may be serious. That is why pharmaceuticals distributed in the UK come with a product leaflet pointing out risks which might transpire. These leaflets do not aim to stop a person taking the medicine, but to enable clinicians to alert people to what might be an unwanted consequence: that person can then make their own choice whether to take the medicine, and run those risks, or to decline it.

Whether it is necessary to take precautions against a risk depends on the balance to be struck between the magnitude of the risk on the one hand, and the importance of the purpose to be achieved by running it on the other, taken together with the practicability and cost of preventative measures. The assessment will take into account whether alternatives offering less risk are available. Care may need to be taken that any precaution does not itself cause unreasonable risk.

The magnitude of the risk is a combination of the likelihood that what is feared will occur, coupled with its seriousness if it does. Thus, catching the common cold, especially in winter time, is highly likely to occur, but the infection is unlikely to be of any great severity if it does. If the risk is that of flu, it is less likely to occur, but more serious if it does: and in that case, for those who are particularly vulnerable to serious damage to their health if they do catch flu, there are vaccinations to ward against it. A small chance of a more deadly illness occurring – for instance cancer caused by the use of some industrial solvents – would be a risk of much greater magnitude, even though much less likely to occur.

In just the same way, it may be very rare that a bolt on the wing or body panel of an airliner is left loose when it should be tight, especially when thinking about the number of flights taken every day, worldwide, without that being known to happen. But the consequence of leaving it loose is a risk that the airliner might crash. A small chance of that serious harm plainly requires preventative action to be taken. That action does not need to wait for an accident to happen, if it can be foreseen that one might. In short, the magnitude (or size) of a risk is not simply a question of counting the number of cases there have been, but of the foreseeability that they may arise and the gravity of the harm if they do.

The Inquiry has centrally concerned the risks of three diseases by name – hepatitis (of two principal sorts, B and C, undistinguished one from the other for the first 20 years or so of the NHS); AIDS (HIV being the virus which led to it); and vCJD. The magnitude of those risks is now undoubted.

Looking back some 70 years, though, hepatitis (“serum hepatitis”, as the mixture of what was largely B and C was then called) was seen as a definite risk of transfusion, and it was already recognised that it could be a serious condition (see the chapter on Knowledge of Risk Before 1970). It was thus already seen to be of some magnitude. The importance of the purpose to be achieved depended on the need for the intended transfusion, or treatment: and (as will be seen) some protective measures were taken. An issue in respect of hepatitis, discussed in later chapters, is whether many health professionals became so inured to their patients running the risk that those measures did not go far enough, quickly enough; whether alternatives were available which carried no or less risk; and whether those people who had to face the risk (the patients) were told by their treating clinicians enough of what was known about the risk to enable the patient to make their own judgement about running it.

When it was realised, after it became possible to test for the presence of Hepatitis B virus, that another as yet unidentified virus (“non-B”, or “non-A non-B”) was responsible for a greater proportion of the hepatitis which had followed transfusion or the taking of blood products, the magnitude of the risk it posed was thought debatable by many health professionals. Whereas Hepatitis B was undoubtedly potentially serious, it was unclear to them that this was also true of the unidentified virus. If it had not been serious, then the practicability and cost of protective measures against it would come into question. Whether such an assumption was an appropriate way of approaching the risk which the unidentified virus posed is considered further in the chapter Hepatitis Risks 1970 and After.

With AIDS, the issue at first was centrally that of the magnitude of the risk. First, did treatment for bleeding disorders or transfusion risk causing the disease (or was the cause something entirely different)? Second, if it did, was the magnitude of the risk sufficient to call for a protective response? There was little doubt, from mid 1981 onwards, that AIDS itself was a very serious disease. A risk of suffering it, therefore, would almost unarguably be of significant magnitude: thus, even if the chances of treatment causing or transmitting it were slim, protective measures or alternative means of meeting the purpose of giving the treatment would be called for. The issues which arose (at least until 1984, as later chapters reveal) were whether there was in truth any real risk from the treatment; and if so, whether the chances of treatment causing AIDS were so slim that the risk could properly be regarded as requiring little or nothing to be done to protect against it.

In assessing the chances of occurrence of infection, it must be remembered that it is of the nature of many diseases that an infection which causes symptoms recognisable by a patient may be caused by events some time previously. The lag time between infection and the occurrence of the obvious ill effects of it may be short – but it might be months and even years. In assessing the likelihood that that disease will eventuate, regard must be had to the fact that what is coming out of the tap may not be the full extent of what is in the pipeline. If a disease takes two years to show itself, then one has to go back two years to see what the cause is. If that is identified, then it may also be true that that has been the cause of other infections throughout the entire two-year period, which have yet to show themselves.

In an epidemic, where cases of a disease being transmitted from one person to another multiply exponentially, it will follow that the number of transmissions two years ago will be far lower than a year ago, the rate of transmissions a year ago far lower than six months ago, and the rate six months ago far lower than now. To look at the number of individual cases of disease which have, by now, shown themselves is to look at the incidence of infections two years ago: it does not give an accurate picture of the risk of transmissibility to anyone newly exposed to its transmission, nor is it an accurate reflection of the number of cases that will emerge over the next two years. They will be escalating, exponentially, unless something happens to prevent this.
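To make the arithmetic of that lag concrete, the short sketch below uses invented figures – an assumed doubling time, an assumed lag between infection and visible illness, and an assumed current rate of new infections – purely to illustrate how far the cases visible today can understate what is already in the pipeline.

```python
# Illustrative sketch only: the doubling time, lag and infection figures below
# are invented, not drawn from the Inquiry's evidence.

DOUBLING_MONTHS = 6        # assumed doubling time of transmissions
LAG_MONTHS = 24            # assumed lag between infection and visible illness
new_infections_now = 64    # assumed new infections occurring this month

# With steady exponential growth, the infections that are only now producing
# visible cases happened LAG_MONTHS ago, when transmission was much lower.
infections_then = new_infections_now / (2 ** (LAG_MONTHS / DOUBLING_MONTHS))

print(f"Cases becoming visible this month (infections {LAG_MONTHS} months ago): "
      f"{infections_then:.0f}")
print(f"New infections actually occurring this month, not yet visible: "
      f"{new_infections_now}")
```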

The Observer on 1 May 1983[1] expressed these ideas concisely in everyday language in relation to AIDS when it said: “The disease, characterised by a collapse in the body’s ability to fight infection, is a medical time-bomb for Britain. Although only fifteen cases have been reported since it first crossed the Atlantic in December 1981, many more people could be harbouring it, for the incubation period is up to three years.” In other words, the risk that requires a response is the risk of what may be coming.

In assessing the degree of likelihood of the infection as part of judging the magnitude of the risk, it should also be borne in mind that one reason why diseases sometimes seem to emerge suddenly in parts of the world new to them is the modern ease and speed with which people cross borders, travelling distances in hours that once took weeks or months. There is danger in thinking that a disease is somewhere else, and not here, when there is no basis to do so. The chapter on the Knowledge of the Risks of AIDS amply demonstrates this in the context of this Inquiry: AIDS was too often regarded either as an “American problem” or “more of an American problem”, which might have been true at the time if one looked at the number of reported cases in North America, but actually said nothing about the risk that it might happen here. The question to ask should have been “Is there any reason to think it will not come to the UK?” Epidemics can be exported, or refuelled, from around the world.

If, then, it is reasonable for protective measures (or alternatives) to be called for, what can be said about their timing and nature? First, it is elementary that a response to a real risk that a disaster might occur should not wait. What must be addressed is the risk, not the certainty or near certainty, of damage; and the time for addressing it is before damage occurs. Speed of reaction to risk is of the essence.

As to the nature of the response, if a risk appears to be real, and if that risk is one of sufficient magnitude, then it is a fallacy to suppose that the response should be “all or nothing” – that is, if the risk cannot be eliminated, it must be tolerated. There is a real value in reducing the risk.

Two examples of this may be drawn from the subject of the Inquiry. First, that of Hepatitis B. Once it was identified and a test made available for it in about 1970, a universal screening test for donations was quickly introduced by 1972. This was not sensitive enough at first to identify more than around a third of the infective units. Nonetheless, though this lack of sensitivity was known, a universal test was introduced in the UK (at some expense, as well as at some further cost in human resource). To incur that expense (of time, effort and money) was nonetheless an appropriate reaction, when combined with attempts to improve the quality of the test. It was (rightly) thought valuable to reduce the risk even to the limited extent achieved by the test.[2]

So, too, in 1998 a decision was taken to reduce a risk which was by then only theoretical: that plasma might transmit the prion which caused vCJD to develop in the brain. The removal of white blood cells from plasma (“leucodepletion”) was adopted. This precaution looks, on available current evidence, to have significantly reduced the risk of plasma products transmitting the infective prion which caused the condition. Yet at the time it was taken there had been no known case of transmission of vCJD by blood. If government had waited until the first case was reported, more cases might have followed than actually did.

These are two examples of responses to risk which may fall short of complete elimination of the risks involved, but were nonetheless (rightly) considered appropriate by the authorities at the time.

Identifying that there was real risk would, in the cases of both hepatitis and HIV, lead to the obvious answer that it was a risk of significant magnitude, given the seriousness of the consequences of infection. The importance of running that risk, coupled with the availability of protective measures and alternatives, then called for evaluation. An issue in this Inquiry has been that the purpose was said to be ensuring that people with bleeding disorders received treatment which was life-saving; an approach which begged the question whether there were safer alternatives, carrying less risk, which were also life-saving, and/or other measures which would reduce the risk even if there were no suitable alternative.[3] The same is true for transfusion as a treatment: whilst it can obviously be life-saving, the evidence before the Inquiry indicates that transfusions were also often given in circumstances where there was no threat to life.

Summing up, after identifying a real risk to health and safety, what is critical in the process is acting with speed to address it:

  1. assessing its magnitude not by simple regard for the number of times the risk has already resulted in harm but by the prospect it will do so in future and the seriousness of the harm that will be done;
  2. taking steps in the light of its magnitude to reduce the risk where it can be reduced;[4]
  3. taking a critical look at what is truly important about running the risk to the extent that the risk cannot be eliminated;
  4. making sure that the people most affected by the risk have the information to make informed decisions.

A successful system of identifying and reacting to risks of sufficient magnitude to justify a reaction thus involves constant vigilance, “horizon scanning” and the taking of urgent coordinated action in response. It also will demand a way of assessing whether the consequences involved in taking that urgent action in response to the risk (the “knock-on” effects) are themselves not unreasonable, and do not create worse problems than they solve.

So far as biological threats to people in the UK are concerned, such an approach demands a system of public health capable of achieving this. One of the paradoxes of a good public health system is that there will seem to be no need for it – if it succeeds in identifying and taking steps which will ward off the emergence in the UK of a threat which is real, then it may seem that there has been nothing to worry about. The public may wonder what the point is of spending money when it seems there has never been a case of any such infection.

It is not always easy to identify whether the risk is of a kind which can be ignored, which is why it has never emerged, or whether the system to prevent it taking hold has worked. The sense of security into which a good public health system may lull the population may thus, paradoxically, lead to the complacency about it which is its worst enemy. One of the lessons from the account which follows is that the UK, as a nation, had become too complacent about the risk of infectious diseases, and not sufficiently concerned about resourcing public health.

I regard it as a very important principle in relation to the way risk is handled that so far as practicable those whose safety is at risk should be told this is the case, in sufficient detail and with sufficient clarity for them to understand what the risk is and what they may do to prevent or reduce it, and indeed to allow them to decide for themselves whether they choose to face the risk at all. When considering patient safety, this involves issues of communication, and consent, which are addressed in the next chapter on Consent.

The risks which this Report goes on to consider are those of transmission through blood, and blood products. The blood services of the four nations of the UK form a front line of our protections so far as future risks from blood are concerned. All have now adopted the same, systematic approach to their evaluation and management: a Risk-Based Decision-Making Framework which aims to ensure that the wellbeing of transfusion recipients is central to blood safety decision-making, as well as helping to align resources with health outcomes and to produce evidence-based decisions.[5] It applies eight general principles,[6] seeks to define when risks may be tolerable in light of the benefits gained by running them, and states a policy as to how they should be assessed. Where a risk is identified, this approach involves first identifying the options for dealing with it, next assessing these measures and the risk itself, and then addressing whether the risk is intolerable, tolerable or acceptable.[7] In the light of this assessment, risk management options are determined and scored so as to come to a recommended decision as to what should be done. Where there is insufficient evidence to make a risk-based decision, this does not preclude the use of the “precautionary principle”.
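By way of illustration only, the sketch below follows the sequence just described – identify the options, assess them against the risk, classify the residual risk as intolerable, tolerable or acceptable, then score the options to reach a recommendation. The option names, thresholds and scoring weights are hypothetical; they are not taken from the blood services’ Risk-Based Decision-Making Framework itself.

```python
# Hypothetical schematic of an assess-classify-score sequence; the options,
# thresholds and weights are invented for illustration and do not reproduce
# the blood services' actual framework.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    risk_reduction: float  # assumed fraction of the risk removed (0-1)
    cost_score: float      # assumed relative cost/burden (0-10)

def classify(residual_risk: float) -> str:
    """Classify residual risk against illustrative thresholds."""
    if residual_risk > 0.10:
        return "intolerable"
    if residual_risk > 0.01:
        return "tolerable"
    return "acceptable"

def recommend(baseline_risk: float, options: list[Option]) -> Option:
    """Score each option (higher is better) and return the best scoring one."""
    def score(opt: Option) -> float:
        residual = baseline_risk * (1 - opt.risk_reduction)
        penalty = 100 if classify(residual) == "intolerable" else 0
        return opt.risk_reduction * 10 - opt.cost_score - penalty
    return max(options, key=score)

options = [
    Option("do nothing", 0.0, 0.0),
    Option("screen donations", 0.6, 3.0),
    Option("additional processing step", 0.9, 7.0),
]
print(recommend(baseline_risk=0.2, options=options).name)  # "screen donations"
```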

The narrative which follows shows the value of a systematic approach to addressing risk. Readers will see that a system was needed, designed with careful unpressured forethought, agreement and patient involvement, which provided sufficient flexibility to allow for coping with the risks discussed here. A proper system to protect against future risks must be capable (as the Risk-Based Decision-Making Framework is) of dealing with the several different ways in which risks of different magnitudes can emerge. It has come too late to affect what happened: but it is encouraging to see that the transfusion services now have a much better system for dealing with risks arising from and through blood than was the case before the mid 1990s.

This chapter however ends with the simple message with which it began. Safety – of the health of populations, of groups within them, and of individuals within those groups – is the starting point. And that centrally involves recognising and reacting to risks to safety so as to reduce or eliminate those risks as far as is practicable.

3.2 Consent

The fundamental principle of patient consent can be shortly stated. Medical treatment can only be given to a person with their consent.[8] That consent must be informed: in other words, the patient must have been given sufficient information about the risks and benefits of treatment, alternative treatments and the right to refuse treatment.[9] That principle is well understood. It is founded in ethical norms:[10] in particular the principle of autonomy: “by knowingly considering, and then accepting rather than rejecting a proposed course of action based on adequate information, a patient expresses their autonomy and their responsibility for the decision, while also accepting the expertise of the clinician.”[11]

Since that principle has been so poorly respected in the context of the treatment with blood and blood products that has infected so many people,[12] this chapter considers the principle in a little more detail.

Consent is more than the patient simply agreeing to or refusing what is proposed: it “should be voluntary, denoting an absence of control by others, and informed, requiring sufficient information and understanding to allow autonomous choice. Three elements of consent therefore include agency (capacity), liberty (absence of coercion), and autonomy.” In considering coercion, “one can effectively coerce someone without intending to do so, particularly if one holds a position of power and operates within a context where the other person is disempowered by their circumstance or role.”[13]

If material risks are not explained, “the patient is denied an effective choice”;[14] if information about risks of reasonable treatment alternatives is withheld, “then the patient cannot make an informed selection and is more reliant on the paternalistic considerations of the clinician in choosing on their behalf.”[15]

The Inquiry’s medical ethics expert group told the Inquiry that “respect for autonomy has always been an ethical cornerstone of medicine”.[16] Autonomy involves the concept of self-determination (“that each of us has our own life that we should be free to fashion and shape as we see fit … each of us should be free to live our lives according to our own values and choices”[17]) and it may be said to embrace too the concept of dignity.[18] Not being informed “fundamentally undermines your autonomy because you are denied the opportunity of real choice.”[19]

In the 1970s interest in medical law and ethics was, the expert group said, renewed. The influential 1979 publication Principles of Biomedical Ethics[20] set out four principles: autonomy (protecting the rights of individuals to make their own choice); justice (fairness, equity and equality); beneficence (doing good); and non-maleficence (expected benefits should outweigh expected harms). In the UK, Professor Ian Kennedy’s highly influential Reith Lectures in 1980[21] called for a greater role for ethics and characterised the “so-called therapeutic privilege” (that a doctor may withhold information from the patient if in the doctor’s judgement it is not in the patient’s best interests to know) as “clearly a device created by doctors to do what is in the best interests of doctors”.[22]

These principles found their expression in guidance and other publications relevant to the time period under consideration by the Inquiry.[23] They were not new: as the ethical experts said, the principles are fundamental. Thus, a booklet produced by the Medical Defence Union (“MDU”)[24] in or around 1953 explained that: “It is not sufficiently widely known by practitioners that, in law, consent must be given by a patient before an examination can be conducted or treatment administered.”[25] The booklet continued:

“To obtain consent it is necessary for the practitioner to explain carefully to the patient in non-technical language the need for an examination to arrive at a diagnosis or decide on the line of treatment. The character and the likely results of the treatment should be outlined to the patient in such terms that he can appreciate fully what is proposed and what may ensue. A practitioner, aware of the uncertainties of treatment, should avoid sweeping promises; and should not minimise the risks that may be inherent in the procedure he proposes … The consent thus obtained must be genuine consent; not merely an apathetic acquiescence but a real expressed willingness by the patient to undergo the treatment after he has had its nature, its risks and its objective clearly explained.”

The MDU’s 1966 memorandum Consent To Treatment began with a quotation from a 1912 decision by the US courts: “No amount of professional skill can justify the substitution of the will of the surgeon for that of his patient”.[26] It recorded the right of the patient not to submit themselves to medical treatment if they do not wish to do so. The MDU emphasised the importance of consent, and of informing the patient of the nature and effect of the treatment and its risks: “If an inadequate or misleading explanation is given there is the danger that the apparent consent obtained will be held to be ineffective.”[27]

In 1970 the British Medical Association (“BMA”)[28] published guidance entitled Medical Ethics.[29] This guidance said relatively little about informed consent[30] but it described the basis of the relationship between doctor and patient “as that of absolute confidence and mutual respect.”[31] It may be thought inherent in the idea of “mutual respect” that a patient be provided with sufficient information regarding the risks of treatment, and the availability of other treatments, to enable them to give properly informed consent.

In 1980 the BMA published its Handbook of Medical Ethics which expressly addressed consent to treatment in these terms: “The patient’s trust that his consent to treatment will not be misused is an essential part of his relationship with his doctor, but for a doctor to touch a patient without consent is an assault. Consent is valid when freely given if the patient understands the nature and consequences of what is proposed. Assumed consent or consent obtained by undue influence is valueless.” It explained that: “The onus is always on the doctor carrying out the procedure to see that an adequate explanation is given.”[32] An amended version of the Handbook of Medical Ethics published the next year added that “Doctors offer advice but it is the patient who decides whether or not to accept the advice.”[33]

In autumn 1981 the World Medical Association adopted the Declaration of Lisbon on the Rights of the Patient. These rights included “the right to accept or to refuse treatment after receiving adequate information”.[34]

In 1988 the BMA published a revised ethics handbook – Philosophy & Practice of Medical Ethics. This discussed consent, and the underlying ethical principles, in more detail than previous handbooks.[35] On consent, it said this: “The basis of any discussion about consent is that a patient gives consent before any investigation and treatment proposed by the doctor. Doctors offer advice, but the patient decides whether to accept it. Before a patient can consent the options have to be presented in such a fashion as to allow a decision to be made. Consent must involve the ability to choose.” Referring to implied consent (for example “in attendance for an inoculation which implies that the patient expects the inoculation”) the handbook noted that this does not absolve the doctor from explaining any risks.[36]

Addressing the question of paternalism, the handbook identified it as being “in direct conflict with the principle of autonomy”. Although “the concept of autonomy is not new, it is now becoming a central influence on the expectations of patients. In the past many patients would accept without question decisions made by their doctor. Today, patients are more critical.” The handbook identified “truth telling” as “another principle by which people address medical ethics”: “the doctor and the patient are bound by an unspoken, unwritten agreement which is based on the patient’s ability to trust his doctor. Truthfulness is therefore seen as important because it is a moral imperative in itself and on utilitarian grounds produces a good social relationship.”[37]

In August 1990 the Department of Health published a Guide to Consent for Examination or Treatment[38]: a similar guide (A Guide to Consent to Examination, Investigation, Treatment or Operation) was published in Scotland in October 1992.[39] These too emphasised the right of the patient to give or withhold consent and the entitlement of the patient to receive sufficient information in a way they could understand about any proposed procedure, possible alternatives and any substantial risks so that they could make a balanced judgement.

Medical Ethics Today: Its Practice and Philosophy was published in 1993 by the BMA as “a practical guide which reflects contemporary ethical thinking.”[40] It contained detailed guidance on the issue of consent. It referred to the relationship between doctor and patient as: “based on the concept of partnership and collaborative effort … the basic premise is that treatment is undertaken as a result of patients being actively involved in deciding what is to be done to them.”[41] It advised that “As a prerequisite to choosing treatment patients have the right to receive information from doctors and to discuss the benefits and risks of appropriate treatment options.”[42] Further:

“Some people see the purpose of consent as chiefly being the provision of a defence for doctors against legal liabilities … In the BMA’s view, respect for others and their rights lies at the heart of the issue of consent. A feature of our present society is the emphasis on the value and dignity of the individual. It is said that principles of inherent natural rights dictate that each person who is competent to do so should decide what happens to his or her own body. The patient exercises this autonomy by deciding which treatment option to accept. The decision is based on information given by the clinician. For consent to be valid, the patient must know what options are available and have the ability to choose.”[43]

In 1995 the World Medical Association’s Declaration of Lisbon was expanded to provide that:

The 1995 Declaration of Lisbon also created a right for the patient to receive information about themselves recorded in any part of the medical records and to be fully informed about their health status, including the medical facts and their condition.[44]

1995 also saw the issue by the General Medical Council (“GMC”)[45] of Good Medical Practice, which sought to articulate the fundamental duties of a doctor registered with the GMC. Those duties included: making the care of the patient the doctor’s first concern; listening to patients and respecting their views; providing information to patients in a way they can understand; and respecting the rights of patients to be fully involved in decisions about their care and to refuse treatment.[46]

The GMC published its first specific guidance on consent in November 1998: Seeking patients’ consent: the ethical considerations. This explained that “Patients must be given sufficient information, in a way that they can understand, in order to enable them to exercise their right to make informed decisions about their care.”[47]

Some of the evidence which the Inquiry has received suggested that clinicians’ approach to consent[48] in the 1970s and 1980s reflected a culture of medical paternalism. Whilst it is no doubt correct that, to some extent at least, “medical decision-making was previously paternalistic” and that it is “now recognised that decision-making should be shared and that informed patients should have the power to decide what happens to their lives”,[49] that is neither an excuse nor a defence for a failure to ensure that people treated with blood or blood products were given information about the risks of viral transmission, such that they could give (or withhold) consent on an informed basis. Nor does it justify the multiple examples which the Inquiry has heard of people being tested for HIV or hepatitis without their knowledge or consent.

Whilst guidance relating to clinicians has over time become more detailed, the underlying ideas and principles have not. The degree of articulation of those principles may have shifted over the years, but the principles themselves predate the events with which this Inquiry is concerned. Judged against stable and consistent “fundamental moral principles”, “past behaviours and practices, that may have been unchallenged or standard practice at the time, may still be considered morally questionable.”[50] Although “people may have been operating in line with contemporary moral norms, their actions can be challenged where we can identify relevant fundamental moral values which should have been respected, irrespective of time or place.”[51]

The descriptions given in the chapter on People’s Experiences show what too often happened: that people were not given adequate information about risks, or about alternatives; that too often doctors made decisions for, rather than with, patients; that clinical freedom was misunderstood as allowing doctors the freedom to decide what their patient’s treatment should consist of; and that the guidance this chapter has recorded as having been set out consistently since 1953 was not followed. It also demonstrates the consequences of this, not just in terms of the infections that followed but in the significant and destructive loss of trust that patients then had in their doctors.

3.3 Blood and Transfusion

Introduction

Almost from the dawn of human civilisation, blood has held a special significance. Before the principles of circulation became known, societies appreciated that blood was critical to life. We speak of “lifeblood”. Or say “It’s in the blood...”

Blood has held a significance that goes beyond it being seen as necessary to sustain life. It has been associated throughout time with the way in which that life is lived; with the central character of a person. There has been a sense that personalities are linked to blood – “hot-blooded”, or someone who acts “in cold blood”.

It is an easy step from thinking that blood dictates human characteristics and behaviour to thinking that if only blood could be replaced by better blood it would benefit recipients. They might then show some of the desirable characteristics of the animal or human from which the blood came, or find their own illnesses cured. The origins of blood transfusion to improve health lie in this instinctive sense that blood can convey beneficial characteristics.

It can be easy to overlook the other side of the coin: that someone else’s blood may contain undesirable characteristics too.

What makes up blood?

If blood is taken out of the body it will in most cases quickly coagulate. If however it is mixed with a suitable anticoagulant it will settle in a test tube into three portions. At the top will be a straw-coloured fluid called plasma. This is the major component of blood by volume (about 55%). At the bottom will be tightly packed red blood cells (amounting to 45% by volume). Between them is what is known as the “buffy coat” (less than 1%). This contains platelets and white blood cells.

The illustration shows a vial of whole blood being separated into plasma, buffy coat and red blood cells.

Figure 1. Composition of blood

Red blood cells are produced in the bone marrow. What gives them their distinctive red colour is the iron in the haemoglobin they contain. This important protein carries oxygen from the lungs to all parts of the body, and carries carbon dioxide as a waste product away from the tissues and back to the lungs for oral and nasal exhalation. Though described as “cells”, red blood cells do not contain a nucleus, nor do they contain mitochondria.[52] They are very tiny – small enough to pass through the narrowest of blood capillaries. If red blood is lost, it may take some time progressively to replace the loss: to the extent that blood donors providing a pint (just over half a litre) of whole blood are not asked to donate again within 12 weeks (if male) or 16 weeks (if female).[53]

Platelets (thrombocytes) are fragments of bone marrow cells, which again contain no nucleus. They are naturally sticky, and play an important role in the control of bleeding, described further below.

White blood cells (leucocytes) help to fight infections and so help the immune process. They consist of granulocytes, monocytes, and lymphocytes.

Granulocytes[54] make up about two thirds of the white cells. They consist of three kinds: neutrophils, which engulf foreign particles such as bacteria, which they then destroy with their powerful digestive enzymes; eosinophils, which are concerned with parasitic infection, and with allergic reactions; and basophils, which are responsible for inflammatory reactions and produce compounds such as histamine and serotonin which coordinate immune responses.

Monocytes, which are the largest of the white cells, are the scavengers of the blood circulation. They go for the damaged red cells and larger particles often found around chronic infections, and are able to enter tissue spaces and clear from them the debris of both infections and the bodily response to them.

Lymphocytes are central to many of the Inquiry’s concerns. They play an important role in the immune system, more fully described below.

Plasma contains thousands of proteins, performing a wide range of functions. The proteins described as the “classical plasma proteins” include albumin, clotting factors, immunoglobulins and fibrinogen.[55]

The clotting process is the body’s response to a loss or threatened loss of blood. It begins when the lining of a blood vessel is damaged. Platelets in the blood (contained in the buffy coat) are activated in a process which eventually leads to them sticking together in clumps to form a plug at the site of injury. Clotting factors respond in a cascade – always in the same order – to form fibrin strands from the fibrinogen in the blood. These strands strengthen the platelet plug: fibrin in effect sticks the platelets together in a way which prevents further loss of blood.

Immune response

The essence of the immune system lies in its ability to recognise “self” from “foreign” and to tackle “foreign” material. A molecular structure which leads to an immune response is known as an “antigen”.[56] Two of the most important parts of this immune response, cells (lymphocytes) and antibodies (immunoglobulins), are found in blood.

Lymphocytes make up about 30% of the white cells. They are mostly very small and travel throughout the body, sampling the surrounding environment. They originate in the bone marrow, where they separate into two different classes: B lymphocytes and T lymphocytes. (The T indicates that this class of lymphocytes matures in the thymus, a small ductless gland just below the throat at the top of the chest, whereas the B indicates an origin in the bone marrow.)

T cells are classified into three main groups: killer cells, helper cells, and suppressor cells. Each T helper cell carries a number of molecular structures on its outer surface, including a T cell receptor.

Where such a T cell receptor recognises a specific antigen presented to it, it will bind with it. The interaction is like that of lock and key, initiating an immune response. There are a multitude of potential antigens, all different; but to meet them there is also a multiplicity of T cells, some of which will have the appropriate receptor to enable them to bind to the antigen.

When a helper T cell first locks on to an antigen, it proliferates into two general subtypes, one of which will activate killer T cells to bind with and kill the infected target cell, so as to begin its destruction; the other of which will cause B cells to proliferate, differentiate, and produce antibodies. The antibodies float free and themselves bind to and neutralise (or destroy) antigens.[57]

Following recognition of the antigen, the B lymphocyte not only rapidly produces copies of itself to produce more antibodies but also produces memory cells which can induce a secondary immune response if there is contact with the same pathogen in the future. These memory cells are important in speeding up the response to any future attack by an identical pathogen, which is necessary because the process just described takes quite a time to develop into an effective response.

Thus there are different parts to an immune response, mediated by different components of blood. There is the response in which T cells themselves attack and kill the foreign bodies carrying the relevant antigen (“the cell-mediated response”) and one in which the antibodies which the B cells have produced attack the antigens (the “antibody-mediated response”, also known as the “humoral response”).

CD4/CD8 T cell ratio

In summary, the helper T cells (CD4 T cells) work by triggering a response when faced with an antigen, for example, from a pathogen. Killer cells (CD8 T cells) respond by attacking the tagged pathogen and neutralising it. Suppressor cells regulate CD4 activity, “turning it off” when sufficient immune response has been achieved.[58]

In a healthy individual, the proportion of CD4 T cells (which are more numerous) to CD8 T cells varies, but is generally between 1.5 and 4 to 1. This is described as a ratio of (for example) 1.5:1, or 4:1.

When a person is first exposed to HIV, a virus which attacks CD4 T lymphocytes,[59] there is generally a drop in the number of CD4 T cells, since HIV targets those cells and depletes their numbers. By contrast, CD8 T cells generally increase by around 40%, although their ability to neutralise the virus will wane over time as there are simply fewer CD4 T cells to trigger an effective immune response.

An inverted CD4:CD8 T cell ratio is thus an indication of serious problems: if HIV therapy is initiated in a timely manner, the ratio will generally return to normal. However, if it is delayed the body’s ability to create new CD4 T cells weakens, and if and when this happens the ratio may never normalise.[60]
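As a purely illustrative piece of arithmetic (the cell counts below are invented, chosen only to match the description above of a fall in CD4 cells and a rise of around 40% in CD8 cells), the ratio is simply the CD4 count divided by the CD8 count:

```latex
% Illustrative counts only (cells per microlitre); not figures from the Inquiry's evidence
\[
\text{healthy:}\ \frac{\mathrm{CD4}}{\mathrm{CD8}} = \frac{800}{400} = 2.0
\qquad
\text{untreated HIV infection:}\ \frac{\mathrm{CD4}}{\mathrm{CD8}} = \frac{300}{560} \approx 0.5\ \text{(an inverted ratio)}
\]
```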

Before there was certainty that a virus was the cause of AIDS, a significant sign that AIDS might be about to develop in an individual was such an inverted T cell ratio or a low absolute CD4 T cell count.[61]

The story of transfusion

In the mid 17th Century Sir Christopher Wren (the famous architect) used a hollow goose quill attached to a bladder to perform intravenous injections on dogs. There were apparently no ill effects. It was not long before his friend Richard Lower experimented with the direct transfer of blood between two dogs using a system of hollow goose quills and tubing.[62] That was in 1666.[63] The first transfusion was then attempted from an animal (a lamb) to a human (a boy suffering a fever), by Dr Jean-Baptiste Denis (physician to King Louis XIV) in France.[64] It was followed shortly after by a presentation at a meeting of the Royal Society by Richard Lower. In front of the lecture theatre he transfused the blood of a lamb into a human, Arthur Coga, who suffered from a malady. Arthur Coga survived without ill effect: history does not record what happened to the lamb.[65] Dr Denis conducted a handful of further transfusions, but one recipient died. The public outcry that followed resulted in blood transfusions being banned by the French parliament and the practice discredited.[66]

It was nearly 140 years before the practice of transfusion began again. A significant number of maternal deaths occurred through haemorrhage during and after childbirth. In 1818 Dr James Blundell (an obstetrician) became interested in transfusion to treat postpartum haemorrhage which would otherwise prove fatal. He eventually succeeded and thereafter repeated the procedure.[67]

The first article about inherited bleeding disorders was published in 1803 and the term “haemophilia” was first used in 1828.[68] Samuel Lane at St George’s Hospital, advised by Dr Blundell, performed the first transfusion successfully to treat haemophilia in 1839.[69] Dr Blundell’s work was replicated in Edinburgh, with a successful transfusion to a woman with severe uterine bleeding in 1845.[70] All transfusions were direct – the donor being connected by tubing to the recipient with no intermediate bladder, or bottle, and no intermediate storage. There could be no storage for very long, since if blood were taken out of the body and left to stand for any length of time it would begin to coagulate.

Two significant advances occurred just after the start of the twentieth century. Success in transfusion had been hit and miss, since the recipient could react seriously, and sometimes fatally, to it. Then in 1901 Dr Karl Landsteiner discovered blood groups.[71] People’s blood can be classified as falling into one of four groups: A, B, AB, or O. Blood group A has A antigens on the red blood cells, with anti-B antibodies in the plasma; blood group B has B antigens on the red blood cells with anti-A antibodies in the plasma; blood group AB has both A and B antigens but no antibodies; and blood group O has no antigens but both anti-A and anti-B antibodies in the plasma. These differences mean that someone with blood group A cannot safely receive blood from someone with group B, and vice versa.

Because blood group O has no antigens, it may be given to anyone, regardless of their blood group, without being seen as foreign by the recipient’s immune system and creating a devastating immune response. It is known as the “universal donor” blood group. Almost half of the UK population has blood group O. Blood group AB is known as “universal recipient” because it has no antibodies.[72] Once the system became known it was apparent why a number of recipients had reacted so badly to receiving transfusions of blood, and doctors soon learned how to “group” blood, and to know what transfusions to avoid.
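The grouping rule described above can be captured in a few lines. The sketch below is illustrative only: it encodes the ABO antigens and antibodies set out in the previous paragraphs and checks whether a recipient’s plasma antibodies would react with a donor’s red cell antigens (real compatibility testing also considers RhD and other blood group systems not discussed here).

```python
# Illustrative encoding of the ABO rule described in the text: a donation is
# unsafe if the recipient's plasma carries antibodies against an antigen on
# the donor's red cells.

ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}
ANTIBODIES = {"A": {"B"}, "B": {"A"}, "AB": set(), "O": {"A", "B"}}

def red_cells_compatible(donor: str, recipient: str) -> bool:
    """True if the recipient's antibodies do not react with the donor's antigens."""
    return not (ANTIGENS[donor] & ANTIBODIES[recipient])

print(red_cells_compatible("O", "AB"))  # True: group O, the "universal donor"
print(red_cells_compatible("A", "B"))   # False: group B plasma carries anti-A antibodies
print(red_cells_compatible("A", "AB"))  # True: group AB, the "universal recipient"
```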

The second significant development was the discovery a few years later that sodium citrate could be used as an anticoagulant.[73] On 27 March 1914 came a first tentative use of citrated blood which had been taken from a donor shortly before.[74] The feasibility of indirect transfusion was thus established. This would eventually enable red blood to remain effective for around 21 days, during which it could be used for indirect transfusion.

The First World War led to further advances. The importance of keeping a fighting force on its feet, and treating casualties to preserve life as far as possible, led to direct transfusions being conducted near the front line. Though citrated blood was not used extensively, its use did increase especially towards the end of the War.[75]

A further lesson from war was that the principal cause of death following a severe wound was not so much loss of red blood, but the effects of post traumatic shock. Shock caused by haemorrhage or overwhelming infection is characterised in most cases by a weak pulse, low blood pressure and cold sweaty skin. Its effect is to reduce blood flow through the small vessels or capillaries. Circulating volume is not maintained. Without that, the organs of the body may lack the oxygen necessary to avoid being damaged irretrievably. The use of plasma counteracted these symptoms.[76] This was centrally because of the albumin present in significant quantities in plasma. This acted as a volume expander. Giving a seriously wounded serviceman plasma was thus at first more effective in maintaining life than restoring the lost red blood cells to his circulation.

In due course, this led to a focus on the use of plasma on its own to treat traumatic shock, and in turn this led to the use of albumin once that could successfully be separated from plasma.

Following the First World War, and the recognition that transfusions could keep troops alive, there was fertile ground for the development of the use of transfusion in civilian medical emergencies. The first organisation of blood donations for transfusion in the UK began in 1921 when Percy Lane Oliver, as honorary secretary of the Camberwell Division of the British Red Cross, organised a panel of donors who were willing to donate their blood voluntarily at hospitals around London. This became the London Blood Transfusion Service. It served only a handful of central London hospitals – blood was taken directly, donors had to live near to the transfusion centre, and the experience for the donor in giving blood, lying alongside or close to a recipient who was often in pain, could be unpleasant. Nonetheless, Percy Lane Oliver’s enthusiasm was such that after he lectured at St Thomas’ Hospital on 11 November 1924 his system was adopted by the Red Cross for use across the country. In 1926 the British Red Cross transfusion service formally began, limited to urban areas at first because it depended on “walking donors” giving their blood directly.[77] In 1929 in Edinburgh, Jack Copland inspired a group of walking donors to give transfusions of their blood to the Royal Infirmary in Edinburgh.[78]

It took another war, after another ten years, for the next significant development to occur in the UK. During the Spanish Civil War, Frederic Durán-Jordà established one of the first blood banks in Barcelona. Blood was transfused indirectly, having first been donated into glass bottles containing anticoagulant, and stored in a “bank” from which the bottles might be drawn to treat casualties.[79] In 1939 Frederic Durán-Jordà fled to London, where he published about the Barcelona Blood Service in The Lancet and shared his knowledge with Dr Janet Vaughan who was influential in creating four transfusion depots around London, administered by the Medical Research Council (“MRC”), in readiness for the expected war.[80] These were followed by regional depots in 1940.[81] The Scottish National Blood Transfusion Association was established in January 1940 with five regional blood transfusion centres.[82] The Army Blood Supply Depot was also set up, with a donor pool of some 5,000 at the start of the war and over half a million donors by the end of the war.[83] Indirect transfusion, and short-term storage in a blood bank, had become established.

By the start of the Second World War plasma had long been separated from whole blood. Indeed, frozen plasma was first developed in 1925[84] and plasma itself was first used in the treatment of haemophilia a year earlier.[85] However, the volume of plasma (or thawed fresh frozen plasma) was likely to be unsuitably large for many purposes. Its volume often hindered easy transport in bulk. This was especially the case where it was the albumin it contained which was essentially what was needed initially to keep troops with serious wounds alive. By the Second World War, however, it could be freeze dried, which made it more transportable, easier to store without deterioration, and more easily usable.[86]

In 1941 the Treasury War Emergency Committee decided to finance two facilities in the UK to prepare freeze-dried human plasma, to take advantage of these benefits. One of these was to be situated “in the north”.[87] Thus a unit for drying plasma was constructed in an underground site at the Royal Infirmary Edinburgh; and production near London centred on Cambridge (later relocated to Elstree).[88]

1941 saw a further major advance, which together with freeze drying improved the ease of supply of blood derivatives to theatres of war. In the US, Dr Edwin Cohn and his team at Harvard discovered how the constituent parts of plasma could be separated for use. His process, known as “Cohn fractionation”, was developed then, and its essential principles remain in use to this day.[89] He knew that what uniquely distinguished one protein from another was its solubility under different conditions – almost to the extent of this being a fingerprint. Using ethanol allowed the process to be conducted at a low temperature, and minimised the risk of bacterial growth. In a five stage process, some of the proteins would become insoluble at each stage. They would drop to the bottom of the fractionating vessel (or in a centrifuge be spun to the sides), leaving a fluid containing the remaining proteins in solution – the supernatant – to pass to the next stage of fractionation. “Cold ethanol fractionation” deposited fibrinogen at its first stage, together with two proteins present in such small quantities as to be described as “trace proteins” – which became known as Factor 8, and von Willebrand factor. The next stage saw the deposit of Factor 9, and after further stages the last would leave albumin.[90] Dr Cohn’s discovery of his fractionation process was timely: the albumin which was extracted, in concentrated form, saved many lives in the immediate aftermath of the attack on Pearl Harbour.[91]

The flowchart shows the different stages of the Cohn fractionation process, with Factor 8 produced first and then Factor 9 and albumin at the end.

Figure 2. Cohn fractionation process

Fractionation by the Cohn process was enthusiastically adopted in both the Edinburgh and Cambridge centres before the end of the Second World War. The principal aim of it was to produce the two fractions of (then) greatest therapeutic interest – immunoglobulins and albumin. Clotting factor proteins came to prominence later.

The intense demands of warfare, and escalating use of transfusion during it, alerted administrators, scientists, researchers and clinicians to the fact that a transfusion could do harm as well as good. By the end of the Second World War, if not earlier, this had become well established by experience. More is said about this in the chapter Knowledge of Risk Before 1970: it is sufficient here to note that by 1944 The British Medical Journal was drawing attention to the risk that transfusion could transmit hepatitis;[92] and on 19 August 1946 Dr William d’A Maycock, who had become the consultant advisor in blood transfusion to the Ministry of Health, recorded in a letter that he agreed that “users must be told that [plasma] is a potentially lethal fluid which should be used with discretion.”[93] In October 1947, the Journal of the American Medical Association, the prestigious US medical journal, contained an article which ended by observing that: “Plasma, as well as other forms of transfusion therapy, should be administered only when the clinical indications are absolute.”[94]

In December 1964 a circular from the Scottish Home and Health Department echoed the same theme: “All blood for transfusion must be regarded as potentially contaminated … The most important transmissible disease in this country is homologous serum jaundice or serum hepatitis … No transfusion should be undertaken unless the benefits outweigh the risk of hepatitis.”[95] Dr Jean Grant, the director of the regional transfusion centre in Oxford, wrote the following year: “The practitioner should satisfy himself that it is really necessary to give blood and that no other treatment would be equally efficacious even though it might take a little longer to achieve results.”[96]

Just over 30 years later, similar words were used by Sir Colin Walker, when as chairman of the National Blood Transfusion Service he wrote in a foreword to a booklet celebrating 50 years of the blood transfusion service in the UK that “Our blood supply is amongst the safest in the world but, even so, medical advice is always likely to be that the best transfusion is no transfusion.”[97]

3.4 Nature of the Diseases

Hepatitis

Hepatitis is an inflammation of the liver.[98] It may be caused in a number of ways: by infection (most commonly viruses, but also bacteria and parasites), by medication, or by a high alcohol intake. It may be the result of a fatty liver (steatosis),[99] of autoimmune disease or some metabolic disorders.

Inflammation of the liver is diagnosed by a blood test which measures the levels of certain enzymes in the blood: ALT[100] and AST.[101] These are released when cells in the liver are damaged or start to die. Though a low normal level is always present, hepatitis is suspected when one or more of these enzymes is elevated above the normal range. However, a reading above the normal does not identify on its own what the likely cause is: a wide range of factors may cause a temporary elevation of ALT or AST levels; “an evening’s sustained drinking” may do so, for instance. For that reason, where there is an elevated reading, much will depend upon the extent to which the level is higher than the normal range, and upon whether, after a short period, a further test still shows an elevated reading. Further testing to help discover the cause of the hepatitis usually involves a panel of blood tests, ultrasound imaging of the liver, and (in some cases) may involve a liver biopsy.

There are five hepatitis viruses – lettered A–E. Of these, the two most important viral causes of hepatitis are Hepatitis B virus and Hepatitis C virus; Hepatitis D only occurs if Hepatitis B is present.[102]

Both Hepatitis B and C can range in severity from being very mild, where an individual has no significant symptoms in the six months after infection and clears it naturally, to being so severe that the liver can no longer carry out its essential functions and fails. If it fails there is a high risk of death. A liver transplant may be the only solution where there is liver failure.

Both Hepatitis B and Hepatitis C are transmissible by blood. Hepatitis A is sometimes transmitted by blood, but its normal route of transmission is oro-faecal. It comes via environmental factors, such as contaminated food or water.

The letters “A”, “B” and “C” were not used until around 1970 (for A and B) and 1988 (for C). Before 1970 no distinction was made between Hepatitis B and Hepatitis C. Together they were termed “serum hepatitis” (which indicated that they were carried by the serum (or plasma) component of blood). By contrast, what is now known as Hepatitis A was termed “infectious hepatitis”, which indicated it was not usually transmitted in that way.

By 1944 it had become well known that serum hepatitis could be transmitted by blood (this might be either by sharing a needle which had been used to vaccinate a number of people in turn, or by transfusion from one person to another).

There was no laboratory test for either constituent of serum hepatitis (indeed, it was not recognised that there was more than one constituent virus) until the discovery of an antigen associated with serum hepatitis in 1965.[103] When this was followed in 1970 by the identification of a virus-like particle by Dr David Dane of the Middlesex Hospital (known, therefore, as “the Dane particle”), it became known as Hepatitis B to distinguish it from infectious hepatitis, which then became known as Hepatitis A.[104]

After Hepatitis B was isolated, and a test for it developed in the early 1970s, it was progressively realised that it had not been the sole cause of serum hepatitis. A larger component was a virus which had not yet been precisely identified. Since this was neither A nor B, it became known as non-A, non-B Hepatitis (virus) (“NANBH”). Since compromised liver function could be tested for by measuring ALT and AST levels, and the presence of past Hepatitis B infection could be shown by the presence of anti-HBc,[105] NANBH was established as a diagnosis of exclusion. Where liver enzyme counts taken on at least two occasions close in time were elevated, tests for Hepatitis A or B were negative, and there was no other more obvious explanation for the elevation, then NANBH was diagnosed.

The Hepatitis C virus was cloned in the US; the Chiron Corporation announced on 10 May 1988 that it had achieved this. The scientific details of its achievement were not fully released until 21 April 1989. Tests for the virus were then developed, NANBH stopped being used as a label, and screening blood for the virus became possible. The history of Hepatitis C screening is the subject of a later chapter.

Together, Hepatitis B and Hepatitis C viruses are amongst the leading causes of mortality globally, responsible for more deaths each year than malaria or HIV.[106]

The first six months of infection by either Hepatitis B or Hepatitis C is known as the “acute phase”; those infections which last for over six months are known as “chronic”. In this context, the words “acute” and “chronic” are no indication of the severity of symptoms. They are simply a measure of the length of time for which infection has persisted, whether the symptoms are significant or not. However, long-term (chronic) hepatitis infection is likely to lead to progressive scarring of the liver (fibrosis) which can lead to cirrhosis (a nodular form of scarring) and then an increased risk of liver cancer (hepatocellular carcinoma).

Hepatitis C

About 180 million people are infected worldwide: it is likely that the disease has been around for over 3,000 years. The onset of Hepatitis C can be insidious. The body can clear the disease naturally. When it does so, this usually occurs within the first six months after infection. Where chronic, its effects may take a long time to show. The symptoms are non-specific, presenting at first like many other illnesses. They may simply be understood by a patient as the ravages of time on the human body, and in particular cases (where a woman is infected around the time that she gives birth) have been ascribed to the tiredness that comes with childbirth, and the strains of bringing up children; and later attributed to the menopause and then to increasing age. However, in a significant number of cases, cirrhosis develops. Although fewer than 20% of Hepatitis C patients experience the typical symptoms of acute hepatitis, such as malaise, fatigue and jaundice, the virus can persist in the liver and silently begin to cause liver inflammation and scarring. After 20 years, approximately 20 to 30% develop cirrhosis. After 30 years it is 40%, and after 40 to 50 years, 60%. (Roughly, therefore, 1 to 2% per year.) Between 2 and 8% a year of those who have cirrhosis will develop liver cancer: successful treatment for Hepatitis C will reduce, but not eliminate, the risk of cancer, to around one third of these figures.[107]

There are eight genotypes of Hepatitis C, of which genotype 1 is the most common globally. In the UK, genotypes 1 and 3 each account for approximately 40% of the infections. Patients with genotype 3 tend to progress more rapidly to fibrosis and cirrhosis, and have a higher prevalence of severe steatosis (fatty liver) and a higher incidence of liver cancer. Early treatments for Hepatitis C were based on interferon: these genotypes responded differently, so the choice of treatment and its duration depended upon them. Genotypes remain of some, though diminishing, importance now that modern treatments are active across all genotypes.[108]

Where there is coinfection with HIV and Hepatitis C, the disease progresses more quickly to fibrosis. The likelihood of spontaneous clearance is reduced. Sustained virological response (or cure) after ribavirin combination therapy was significantly lower in HIV coinfected individuals, particularly those with genotype 1 Hepatitis C.[109]

Treatment was initially, in the early 1990s, by interferon, then interferon coupled with ribavirin, an antiviral. In the restrained language sometimes used in medical reports, it is said that this treatment was often “poorly tolerated”. The experiences described in the chapter on People’s Experiences give more colour to this, and set out just how brutal and destructive earlier treatments often were. As explained by the Hepatitis Expert Panel, adverse events associated with interferon and its toxicity include a range of infections (such as bronchitis, respiratory infections, herpes, viral and bacterial infections, skin infections, endocarditis, otitis externa), serious complications (such as thrombocytopenia, immune system disorders and sarcoidosis, thyroiditis, rheumatoid arthritis, endocrine disorders, metabolism and nutrition disorders) and psychiatric symptoms and disorders (including depression, anxiety, insomnia, aggression, mood alteration, emotional disorders, decreased libido, suicidal ideation and suicide, hallucinations, psychotic disorder, mania and bipolar disorders).[110] Ribavirin could also cause significant side-effects.[111]

In 2014 and 2015, direct-acting antivirals (without either interferon or ribavirin) were developed. These are more effective, cheaper, can be taken orally in tablet form, have far fewer side-effects, and require a shorter period of treatment.

There is now a worldwide push to eliminate Hepatitis C altogether. The current aim in the UK is to eliminate Hepatitis C infections by 2025.

In contrast to Hepatitis B,[112] Hepatitis C is only rarely transmitted by sex: there are some, but few, cases reported to the Inquiry where this has happened. It is usually transmitted by blood – via shared needles, razors or toothbrushes, or in childbirth – and may be transmitted vertically from mother to child during pregnancy.

Hepatitis C is more easily transmissible than HIV. A small amount can contaminate a whole pool of blood, and it is more difficult to eliminate. Dry heating of blood products at 80°C for 72 hours eventually proved effective in the UK in destroying Hepatitis C, whereas a lower temperature for a shorter time was sufficient to eliminate HIV.[113] Hepatitis C is also highly resistant to chemical and physical methods of elimination, can survive for months in the frozen state, and withstands repeated thawing and refreezing.

In very young children (under the age of five) only about 10% would have symptoms or signs of the disease in the first phase of infection. In older children and adults, between one fifth and one third have symptoms or signs. Symptoms could include nausea, loss of appetite, fatigue in particular, and vague abdominal pain. There could also be a skin rash, muscle aches and joint pains. There might be a dull pain in the right upper quadrant of the abdomen. Symptomatic patients usually had jaundice (yellowing of the skin), yellowing of the eyes and dark urine. Signs of the infection during the acute phase might include jaundice, tenderness over the liver, and a patchy red rash over the trunk or whole body.[114]

Where the infection becomes chronic many patients suffer from neurocognitive symptoms including fatigue, anxiety, depression, “brain fog”, attention deficit and impairment of memory. These symptoms are associated with a low level inflammation in the brain, and with functional changes which are identifiable. They may last even beyond successful treatment. There can be a skin rash and peripheral nerve damage, and a loss of sensation in the fingers. Again, any nerve damage may not improve after treatment. Hepatitis C can also lead to a wide variety of other health conditions and complications.[115] If the hepatitis leads to advanced liver disease, the loss of liver function and increasing pressure in the abdominal veins causes a variety of symptoms. They include abdominal swelling due to the collection of fluid (“ascites”),[116] jaundice, confusion and coma (known as encephalopathy), oesophageal varices, which are essentially varicose veins in the gastrointestinal system which may rupture and bleed, fatigue, breathlessness, and a susceptibility to bruising due to a loss of clotting factors. These symptoms and signs are associated with a limited life expectancy unless there is a transplant.[117] Many of the symptoms to which Hepatitis C infection led in individuals are described in the chapter on People’s Experiences. There are also rare, but serious, complications associated with Hepatitis C such as kidney damage (which can lead to kidney failure and require dialysis) and lymphoma.[118] Further information about the range of other health conditions or complications that can be caused or contributed to by hepatitis can be found in the Expert Report to the Infected Blood Inquiry: Hepatitis.[119]

Hepatitis B

An estimated 257 million people are living with chronic Hepatitis B infection worldwide and it has been found in human remains up to 4,500 years old. The symptoms of Hepatitis B are similar to those described for Hepatitis C. It is more likely than Hepatitis C to produce jaundice, and symptoms, in its acute phase. There are eight recognised Hepatitis B genotypes but, unlike Hepatitis C, their clinical relevance is relatively limited. Where Hepatitis B infection occurs in infancy, it is estimated that between 15 and 40% of people infected will develop cirrhosis during their lifetime. If infected in early adulthood, the virus is usually present at a high level, but in its chronic phase does not cause liver inflammation or damage. However, this phase (known as the immune tolerant phase) is of variable duration, and though it can last decades, over time patients develop liver injury as the immune system interacts with the virus (the immune active phase). Liver injury may thus persist and cirrhosis develop. It is not yet possible to cure Hepatitis B. There was no treatment until interferon started to be used experimentally in the 1980s and was approved for use in 1992. Direct-acting antivirals are now used but, with very few exceptions, have to be taken for the rest of the person’s life: once chronic, Hepatitis B is almost always life-long. Most people with chronic Hepatitis B are recommended to have six-monthly testing (usually involving an ultrasound of the liver and an alpha-fetoprotein blood test) due to the risk of advanced liver disease and cancer.[120]

HIV

An estimated 39 million people have died worldwide as a result of HIV infection; at least 16 million have become orphans as a result; and nearly 80 million people worldwide have been infected. At least 36.9 million people worldwide are currently living with HIV infection. HIV exists in two types – HIV-1 and HIV-2 – with HIV-1 being responsible for the majority of infections worldwide; it is HIV-1 with which this Inquiry is concerned.

HIV (Human Immunodeficiency Virus) is not AIDS. HIV attacks the cells that control the body’s immune system and, by damaging them, compromises the body’s ability to fight infection. This renders the body vulnerable to life-threatening infections. AIDS is a failure of the immune system, manifested by a collection (syndrome) of infections which are a consequence of infection by HIV. HIV infection typically passes through three phases: first, an acute phase, in which the viral load is particularly high; then a chronic phase, which may well be asymptomatic but during which the CD4 T-cell count declines;[121] and third, the AIDS stage. Because the immune system is less effective, some infections which would normally be suppressed can take hold. Characteristic opportunistic infections which occur in the presence of HIV infection are pneumocystis pneumonia and Kaposi’s Sarcoma, a form of skin cancer. There may often be candidiasis (thrush/fungal infection), herpes diseases, histoplasmosis, cytomegalovirus, HIV-related encephalopathy, Burkitt’s lymphoma, wasting syndrome and tuberculosis. In 1986 the CDC recognised 23 clinical conditions, including opportunistic infections, as “AIDS defining”.[122]

HIV is a blood-borne infection, and may be transmissible by sex, though not by kissing or social contact. It is less readily transmissible than is Hepatitis C, and more easily inactivated by heat treatment.[123]

AIDS first came to attention in the western world after 5 June 1981, when the CDC reported in the Morbidity and Mortality Weekly Report that there was a strange disease of the immune system in five young gay men in Los Angeles. It was not then labelled AIDS, but plainly serious symptoms had been caused by a failure of the immune system for some reason. Later it was discovered that HIV infection had most probably begun before 1981 – in the US in 1978, in the UK in 1979 – though the first AIDS patient in the UK was identified as such in December 1981.[124] There is a long period between infection and AIDS becoming manifest.[125] Without treatment, the life expectancy after the occurrence of the first AIDS-defining event (or the reduction of the CD4 cell count to below 200/uL) is very poor, usually around two years.[126]

Treatments began experimentally with AZT in 1986. There were very serious side-effects, with the dose used in the early trials twice that now licensed. Some of the side-effects were long-lasting, including lipodystrophy (fat redistribution disorder), lipohypertrophy (fat accumulation), lipoatrophy (subcutaneous fat loss), peripheral neuropathy, chronic liver disease, exocrine pancreatic deficiency, low bone mineral density and chronic kidney disease.[127] There was no other known means of treating the HIV infection itself, although some of the symptomatic opportunistic infections which occurred could be individually treated – for instance, pentamidine could be used to mitigate the effects of pneumocystis pneumonia. Treatments focussed on the way in which, through the activities of enzymes, the HIV virus attacks a CD4 T lymphocyte. Some therapies aim to prevent entry of the virus into the host cell in the first place (entry inhibitors). Some aim to prevent the enzyme replicating a strand of its RNA to form a double strand of DNA (nucleoside reverse transcriptase inhibitors, “NRTIs”). Some aim to prevent the viral double strand becoming part of the host cell DNA (integrase inhibitors); some aim to prevent the DNA of the host cell assembling building blocks of protein to form new proto-virus particles, which might go on to infect further cells (protease inhibitors); and some aim to stop the viral replication cycle (non-nucleoside reverse transcriptase inhibitors). Ever since the mid to late 1990s a combination of three drugs (usually two NRTIs plus one other) has been used in antiretroviral therapy (“ART”) – this is known as HAART treatment, the “HA” standing for “highly active”.[128]

Although efforts continue, HIV infection cannot currently be “cured”. But with treatment – assuming resistance to that treatment does not build up, and that the effect of highly active treatment is to diminish the possibility of the virus mutating – the viral load can be reduced to undetectable levels. If undetectable, it is untransmissible. The government has pledged to end new HIV infections by 2030.[129]

Acute HIV infection is frequently undiagnosed or misdiagnosed because the non-specific nature of many of the symptoms and signs means that it may be confused with other viral infections such as glandular fever (Epstein Barr virus). Accordingly, a diagnosis of HIV infection is rarely made without specific testing. A proportion of people will develop symptoms during early HIV infection, most frequently a raised temperature, sore throat, mouth ulceration, enlarged lymph nodes, aching muscles and joints, and tiredness. A short-lived faint pale pink rash is sometimes seen. Nausea, diarrhoea and weight loss can occur. Neurological symptoms are common and may include headache and aversion to light. In rare cases there may be signs of meningitis or of encephalopathy. In most people these illnesses last up to three weeks and resolve on their own. Without treatment, most people with HIV infection experience a gradual decline in CD4 count over a period of approximately eight to ten years before the development of symptomatic disease. People who begin treatment late in the course of their HIV infection may suffer severe inflammatory symptoms in addition. Even with effective antiretroviral treatment, people with HIV have higher levels of multimorbidity, occurring at a younger age, than those who are HIV negative. Frailty and its associated disabilities appear to occur at a younger age in people with HIV. Leading causes of hospital admission of people with HIV in Europe in 2017 included respiratory illness, psychiatric conditions, and cardiovascular, renal and neurological disorders. Data from the UK show 75% of those living with HIV have at least one other long-term condition, including mental health conditions, hypertension, lipid disorders and diabetes.[130]

Coinfection

For people coinfected with HIV and one or more hepatitis viruses, the likelihood of spontaneous clearance of Hepatitis B and Hepatitis C decreases and Hepatitis B and Hepatitis C progress more quickly. Interferon-based treatment for Hepatitis C was less effective with HIV, leading to longer treatment and a higher risk for cumulative interferon and ribavirin toxicity. Treatment for Hepatitis B that can be taken with HIV treatment was not available until relatively recently. Treatment options are still poor for people infected with Hepatitis D as well as Hepatitis B and HIV. For people coinfected with more than one hepatitis virus, there is an interaction between them and if one is successfully treated another may flare. For people coinfected with more than one genotype of Hepatitis C, a genotype present at low levels may re-emerge after treatment of other coinfecting strains, particularly with the older treatments.[131]

3.5 Treatment of Bleeding Disorders[132]

Haemophilia is usually inherited. Inheritance is gender-linked: it is much more common for men to suffer than women. It is a clotting disorder. Some proteins are central to the process by which blood clots. Most are known as clotting factors. People vary in the amounts of these factors in their blood. The average level in humans is 100%.[133] There are thus many who have a level in excess of 100%. However, if someone has less than 1%[134] of Factor 8[135] he[136] is usually classed as having severe Haemophilia A. If he has less than 1% of Factor 9 he is usually classed as having severe Haemophilia B. For moderate Haemophilia A or B the level is between 1% and 5%; for mild, between 5% and 30%.[137] Rather than lacking Factor 8 itself, a person may lack the desired level of a protein (von Willebrand factor) which is necessary to help Factor 8 function: this is von Willebrand disorder. A deficiency of von Willebrand factor is not a gender-linked characteristic. Accordingly, women will experience it as often as men.[138] The consequence of lacking von Willebrand factor is very similar to that of lacking Factor 8.[139]

Though the response to a lack of clotting factor inevitably varies from person to person, in the absence of treatment, people with severe haemophilia will bleed spontaneously, sometimes up to two or three times per week. By contrast, spontaneous bleeding is rare in those who have mild haemophilia: rather, they will bleed excessively only after accidents, injuries, surgery and dental extractions.

Though a common image of haemophilia is of a person bleeding uncontrollably and externally from a small, minor wound, this is far from the usual case. Bleeding is internal: usually into joints and muscles, but more seriously still into the gut or the brain.[140] Where blood collects in a joint it begins to destroy parts of the lining of the joint, the synovium, which causes it to swell and then tear, in turn causing more swelling until it “resembles a sponge laden with blood”,[141] making it yet more liable to further bleeding, and causing the joint to become a “target joint”. This leads to arthritis – painful, stiff joints.[142] That in turn causes serious immobility as well as pain. Being immobile compromises life. The consequence is that without treatment, a person with severe haemophilia had a life expectancy in the twenties before the 1950s,[143] and only a little more than that by the late 1960s.

A person with severe haemophilia would typically be diagnosed in early infancy, as mobility developed and joints were banged. From then on (until treatments improved) life was far from easy. Schooling was fractured by the need to go to hospital to treat bleeds – as many as one or two a week, each losing one or two days’ education.[144] It was thus more difficult for those with severe haemophilia to get good qualifications. Earning a living was affected by the need for frequent spontaneous absences from employment. Many jobs were manual, and were best avoided by someone whose joints and muscles might suffer all the more if he were to do them.

Early treatments

Post-war treatment focused on restocking the bloodstream with enough Factor 8 or 9 to meet the immediate need to stop a bleed, because the underlying problem is a deficiency of clotting factor. Transfusion of whole blood was an option. The adult human body has eight or nine pints of circulating blood. To raise the level of, say, Factor 8 in the whole bloodstream from 1% or 2% to a level of 25% (which usually would be just sufficient to resolve most immediate bleeding problems)[145] would obviously require a large additional volume. That could not easily be provided without imposing a huge strain on the heart. Further, the donation used might not itself have as much as the average 100% of relevant factor: it was difficult, therefore, to titrate a dose appropriate to the recipient. The process took time, and could only be done in hospital. It would never be done in advance, but rather in response to a major bleed. The patient would have to get, or be brought, to hospital whilst bleeding internally.
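
By way of illustration only – a minimal worked calculation using assumed round figures (about 4.5 litres of circulating blood, a starting Factor 8 level of 2%, and donated whole blood at the average 100% activity), rather than figures found by the Inquiry – treating the transfusion as a simple mixing problem gives the volume V of whole blood needed to reach a level of 25%:

\[ \frac{(0.02 \times 4.5) + (1.00 \times V)}{4.5 + V} = 0.25 \quad\Rightarrow\quad V \approx 1.4 \text{ litres} \]

That is roughly a third of the patient’s own blood volume added on top of it, and more still if the donation carried less than the average level of the factor, which illustrates why whole blood alone could not raise clotting factor levels adequately without overloading the circulation.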

Blood consists of three main parts: red blood cells (approximately 45% by volume), plasma (approximately 55%), and a small remainder of platelets and white blood cells.[146] If an appropriate anticoagulant is applied to a phial of blood and it is allowed to settle, the platelets and white blood cells form a “buffy coat” between the red cells below and the straw-coloured plasma above.

Whereas donated blood had historically a useful life of 21 days after donation,[147] plasma, if frozen at the point of donation, could last for several months or even longer. Fresh frozen plasma (“FFP”) was therefore often used to provide the necessary clotting factors. The volume needed was less than the volume of whole blood. It was easier to handle. But the process still took time.

Preparations of clotting factors derived from cows or pigs were often used;[148] so too, sometimes, was the venom of a Russell’s viper (also known as Stypven), though, because of its toxicity, it could not be transfused and could only be injected topically to stop a bleed into a joint.[149] Gradually greater and greater use was made of clotting factors derived from human plasma.

Plasma contains proteins. Early in the Second World War, Dr Edwin Cohn had developed ways of separating plasma into separate protein fractions, by treating whole blood with anticoagulant, separating off the plasma, and “fractionating” the plasma under varying conditions of acidity and proportions of ethanol so as to separate one fraction of it from another.[150] It proved possible to freeze-dry fractions which contained Factors 8 or 9. In 1957 The Lancet was able to describe some of the development,[151] which led to the preparation of a concentrated form of Factor 8, then known as antihaemophilic factor (“AHF”).[152] Freeze-dried granules of this could be mixed with sterile water to reconstitute what has been described as “dreadful soups and thick creamy stuff”.[153] The volume needed was still regarded as challenging.[154]

From around 1968, Factor 8 concentrate was produced under the aegis of the Lister Institute at Elstree and Oxford and by the Blood Products Unit in Edinburgh (renamed the Protein Fractionation Centre in 1970);[155] Factor 9 was produced in Oxford. Until about 1974, Factor 8 concentrate contained large amounts of protein (in particular fibrinogen) other than the protein of interest. Fibrinogen is not readily soluble, and the dose needed for injection was usually greater than 100 ml, making the early Factor 8 product far more suitable for administration in hospitals than at home.[156]

Cryoprecipitate becoming a mainstay of treatment

When FFP was slowly thawed under certain conditions of acidity and concentrations of alcohol, a sludge dropped out of the solution. In 1959 Dr Judith Pool at Stanford University discovered that this “cold-insoluble precipitate contains considerable quantities of the plasma antihemophilic globulin”.[157] It became known as cryoprecipitate. The “supernatant” (the solution which remained) contained many of the other proteins which were in the plasma. When this was further fractionated, Factor 9 dropped out of the solution too.[158] The great advance for which Dr Pool was then responsible came in 1965 when she realised that plastic bags (by then increasingly used for transfusions in place of bottles[159]) could be used in a closed, sterile system to hold the separated-out cryoprecipitate, which could then be stored and used by blood banks.[160]

Cryoprecipitate provided a greater quantity of clotting factor in a smaller volume. Where derived from a pool of different donors, the concentrations of this factor in the pool would tend towards 100%, because of the natural averaging out that comes with numbers of individual donations of differing percentages. Cryoprecipitate could be given as the product of a single donation, rather than pooled. Where there were pools they were small: they did not need to be large. Cryoprecipitate became the mainstay of treatment for Haemophilia A before the mid 1970s. It had drawbacks: it was usually kept in a hospital deep freeze, had to be raised very carefully to around 4°C, and transfused into a vein slowly. It thus typically[161] involved attendance at hospital to administer it, and the process could take a number of hours to complete. Where it came from a single donor, there had to be a presumption that there might not be enough clotting factor, because it was not known how great a concentration of it was contained in that particular donation. It tended to be used to treat bleeds which had led to hospitalisation, after the event, rather than administered in advance to provide a sufficient level of the necessary factor in the bloodstream to deal with potential bleeds as and when they might arise. Because the precipitate or sludge from which the Factor 8 or 9 was drawn contained a number of other proteins, it could in a number of cases provoke an adverse reaction from the immune system of the recipient, which might produce “inhibitors”. These inhibitors limited the effectiveness of the “foreign” Factor 8 or 9 protein.

Cryoprecipitate was enthusiastically (and rightly) hailed as a breakthrough in treatment. For instance, Dr Peter Jones wrote in 1967 in a letter to The Lancet that “Cryoprecipitate is now the method of choice in treating bleeding episodes in patients with haemophilia, but, when not available, adequate therapy with fresh frozen plasma is possible and can be made relatively safe”,[162] and returned to the same theme of praising cryoprecipitate in July 1972 when he wrote that: “Thirty years ago most haemophilics died of exsanguination in childhood. Today they can expect to live a normal life-span.”[163] He attributed this increased life expectancy to the identification of Factor 8, the organisation of the blood transfusion services post war – and to cryoprecipitate.[164] Commercial concentrate was to be used in the management of severe bleeds not as first choice, but as second best to cryoprecipitate: to be used “when insufficient cryoprecipitate is available”.[165] By 1977, with by now over ten years’ experience of its use, he remained of the view that cryoprecipitate had made a significant difference to the advantage of patients – talking of both cryoprecipitate and freeze-dried concentrates in the same breath as having made a difference, but then pointing out that Factor 8 concentrate was not without its problems, principal among which was that it was a significant cause of hepatitis.[166]

This enthusiastic embrace given to cryoprecipitate was echoed both contemporaneously and in retrospect by other clinicians. Thus Dr Elizabeth Mayne described its advantages amongst which were “efficacy, low donor exposure, simplicity of manufacture” set against some disadvantages; its discovery “transformed Haemophilia treatment”.[167] Even by 1990, after concentrates had become very widely used, she summarised her opinion as being “the selection of mode of treatment depends upon the category or type of patient concerned. In respect that cryoprecipitate and Factor VIII concentrates are both efficacious treatments, preference of the one over the other depends on the age, the severity of the haemophilia and whether the patient is on a self-treatment programme, or requires major or minor surgery.”[168] Two years earlier she had expressed herself more enthusiastically still: in a profile of haemophilia management in Northern Ireland she described how “In 1967 a milestone occurred; a revolutionary concentrate was produced called ‘cryoprecipitate’. It was prepared from single plasma donation according to the methodology discovered by Poole, 1965 … the patients were ecstatic about the new treatment. A simple dental extraction was normalised and no longer constituted a major ordeal necessitating many weeks in hospital.”[169]

Some haemophilia clinicians have, in their written and oral evidence to the Inquiry, identified a number of disadvantages to cryoprecipitate, including that it was less effective clinically than concentrate; that it was laborious to reconstitute; that it had to be stored in a deep freeze and was therefore not suitable for home treatment; and that it caused side effects.[170] However, whilst it is undoubtedly correct that cryoprecipitate was more laborious to use, it could be used, and had been used, to raise Factor 8 levels. Side effects were for the most part transient.[171] As to the lack of suitability for home treatment, whilst there can be no doubt that cryoprecipitate was less convenient than concentrates, the evidence available to the Inquiry confirms that home treatment with cryoprecipitate could and did occur.

Cryoprecipitate was not used for the treatment of Haemophilia B. Prior to the availability of Factor 9 concentrates, treatment for Haemophilia B was with fresh frozen plasma (FFP).[172]

Haemophilia treatments in the 1970s[173]

The 1970s saw a gradual shift from the use of cryoprecipitate, largely in hospitals,[174] to the use of factor concentrates, particularly at home.[175] Improvements were steadily made in producing freeze-dried AHF, so that it became far more user-friendly than its early precursor in 1957.

Until 1973, when the first commercially manufactured concentrates were licensed for importation,[176] factor concentrate was produced at three sites in the UK. The principal English site was run by the Lister Institute (a privately funded research and production enterprise) at premises in Elstree. It operated as the Blood Products Laboratory (“BPL”) until 1978, though funded for the production of factor concentrates there by the Department of Health and Social Security (“DHSS”). Thereafter Lister was no longer able to continue its operations because of a lack of resources, and the laboratory continued to operate under the same name, the BPL.[177] A secondary site in England was that at Oxford, which was known as the Plasma Fractionation Laboratory (“PFL”), and the second major site was the Protein Fractionation Centre at Liberton in Edinburgh (“PFC”). At the start of the decade they tended to use small pools: a paper by Dr Rosemary Biggs in 1974[178] showed that the mean pool size in 1971 for Factor 8 was 192 donations.[179]

Competitive mass production led to commercial products being manufactured in the US and in Europe from pools which consisted of several thousand individual donations (“large pool concentrates”). The products, whether large pool concentrates from the US, or products manufactured by the three UK production facilities from smaller pools (“NHS concentrates”) revolutionised the treatment of haemophilia. The product, being concentrated and freeze-dried, could be stored in a small volume. It was relatively easy to transport. A deep freeze was not required. It was relatively easy to reconstitute the concentrate by adding sterile water: the earlier difficulties caused by “impurities”[180] (proteins other than Factors 8 and 9 contained in the final product, such as fibrinogen) had been reduced.[181] The quantity to be infused was no longer so great as to make injection difficult. A clinician could more accurately know the units of Factor 8 or 9 activity contained in a given volume in any injection, because of the averaging effect of the pools.[182] The preparation and administration of an injection was quick. It was convenient.

Though in the early 1970s injections of factor concentrates (Factors 8 and 9) were largely administered in response to a bleed, and in hospital, these characteristics meant that the phial of concentrate could easily be kept in a domestic fridge. Home treatment became easier. Initially, that too would have been responsive to a bleed.[183] But since the underlying constitutional problem was a relative lack of clotting factor in the patient’s bloodstream, it eventually began to be given to some patients in advance of any bleed, designed to stave one off by maintaining the level of clotting factor in the bloodstream sufficiently to cope with most bleeds.

Prophylactic treatment as a policy was not well developed until later in the 1980s or early 1990s,[184] but an element of prophylaxis occurred in some of the uses to which home treatment was put. Thus if a patient on home therapy knew that they were about to engage in an activity which had an elevated risk of causing a bleed, they might well choose to take some concentrate in advance. It seemed to some clinicians that patients chose to use rather more product than was strictly needed, just to be sure they would achieve the desired effect. It soon became clear that home therapy, and occasional prophylaxis, together with the success of the factor replacement therapy adopted since 1967[185] in extending the lifespan of people with severe haemophilia (who needed the greatest number of units of product),[186] greatly increased the overall consumption of factor concentrates. The graph in Figure 1 shows that until 1973 almost all of the requirements for factor concentrate were met by NHS manufactured product. After that, the demand for factor concentrate was increasingly met by imported commercial concentrate: from 1977 until 1983 more than half of the Factor 8 concentrate supplied in the UK was commercially produced. The use of cryoprecipitate diminished until 1978, after which it comprised only a small portion of the total consumption of products supplying Factor 8.[187] Factor 9, though, was mostly domestically produced:[188] for that reason, there is no corresponding graph for Factor 9.

The graph shows that the most used product was cryoprecipitate until 1977, and thereafter commercial Factor 8. Total product usage increased from 1969 to 1990.

Figure 1. Total UK Consumption (Factor 8)[189]

Cryoprecipitate nonetheless remained in constant use for treating von Willebrand disorder.[190] And it had the considerable advantage that it required little by way of equipment to manufacture it – a centrifuge was perhaps the critical technology – compared with the much more sophisticated processes used for making freeze-dried concentrate. An implication of these two facts (that expertise in making it was retained, and the equipment needed was relatively simple and commonplace) is that it would need little additional resource to scale up production of cryoprecipitate if it proved desirable to do so.

Consumption of factor replacement therapy[191] before 1971 is difficult to measure – after that there was an internationally agreed standard of Factor 8 activity by which quantities could reliably be assessed. From about 1973 (when the first commercial Factor 8 concentrate was imported for general use, rather than for use by specific patients on particular request[192]) there was a steady increase in the UK from under 20 million international units to just under 80 million international units. This was a fourfold increase. It was fuelled by a combination of (a) the availability of a product which in its concentrated form had become more readily available, easier to use and easier to store (in a domestic fridge), leading first to its use as a home treatment (which resulted in a greater quantity of concentrates being used than would be in hospital) and then to a sporadically increasing prophylactic use (which, as it developed, led to some[193] people with severe haemophilia infusing Factor 8 at home three or more times per week to maintain their baseline levels[194]); and (b) an increasing number of people with haemophilia requiring treatment, since a consequence of the success of treatment both with cryoprecipitate and factor concentrate was an increase in life expectancy. Factor concentrate formed an increasing proportion of coagulation therapy, with cryoprecipitate supplying the balance.[195] By 1982, almost all the consumption of Factor 8 was supplied by factor concentrates, as the graph shows.

A common side effect of treatment for Haemophilia A was the development of Factor 8 inhibitors. These are antibodies produced by the immune system and can occur in people with (usually severe) Haemophilia A: the reason why some people develop inhibitors and some do not “is complicated and only partially understood.”[196] Treatment options in the 1970s included Factor 9 concentrate, FEIBA[197] and Autoplex.[198] People with Haemophilia B and Type 3 von Willebrand disorder could also develop inhibitors, though less frequently.[199]

In the late 1970s a synthetic product – desmopressin (known as DDAVP) – became available as a treatment for Haemophilia A (in particular, but not solely, people with mild Haemophilia A) and for von Willebrand disorder. In 1977 it was reported, following a trial, that DDAVP infusion caused a marked increase in Factor-8-related properties in patients with moderate and mild haemophilia and von Willebrand disorder. The administration of DDAVP before dental surgery and in the early postoperative period was followed by a two- to threefold rise in Factor 8 coagulant activity. It led Professor Pier Mannucci to write in The Lancet that DDAVP could be “a promising pharmacological alternative to plasma concentrates in the management of some patients with haemophilia and vWd.”[200] Von Willebrand disorder would otherwise be treated with cryoprecipitate or with Factor 8 concentrates.[201]

There is little doubt that concentrates were seen at first as a wonder drug which might revolutionise the lives of people with haemophilia. They might not need to take as much time out of school. They could play sports. Concentrates could ensure substantially less time off work.[202] They would not have to wait for their treatment to be thawed and prepared for use. People with severe haemophilia attended the clinics so frequently the doctors became well known to them, and they became well known to the doctors. It was to be expected that doctors saw and welcomed the significant improvements factor concentrates brought to the patients for whom they were caring. Sadly, this almost certainly fuelled a desire to reject any suggestion that the product might actually be doing more harm than good to those who received it, and an inclination to reject any critical appraisal of the value of administering factor concentrates as they then were.[203]

The principal developments in the 1980s involved two matters: first, the birth and development of effective techniques of viral inactivation in factor concentrates (for which, see the chapter on Viral Inactivation), which when established (after 1984 for HIV in NHS concentrate produced at PFC, 1985 for both hepatitis and HIV in NHS concentrate produced at BPL, though in insufficient quantities for all English and Welsh patients, mid 1987 for hepatitis in NHS concentrate produced at PFC, and at some later stage for commercial concentrates) ensured that cryoprecipitate was much less used in haemophilia therapy except for von Willebrand disorder; and, second, the development of “higher purity” concentrates. “Purity” referred not to freedom from external contaminants or microbes, but to the relative absence of proteins other than Factor 8 or Factor 9. These would be proteins such as fibrinogen or fibronectin, which were present in very much greater quantities in plasma than were clotting factors, which were trace proteins only. Throughout most of the period with which the Inquiry is centrally concerned, factor concentrates were described as of “intermediate purity”; towards and moving into the 1990s, “higher purity” products became more prevalent. The advantage of higher purity was that patients would receive a lesser amount of unnecessary protein from a human source foreign to the recipient than would be the case if being given intermediate purity products. This meant that they were theoretically less likely to suffer an adverse reaction to the factor product.

3.6 Knowledge of Risk Before 1970

This chapter assesses the state of knowledge up to 1970 of the risk of hepatitis and its transmission through blood.


Key dates

1923 report in Sweden suggests jaundice as an adverse effect of vaccination with a viral cause.

1942 recognition that human serum in yellow fever vaccine transmits hepatitis.

1942 Ministry of Health acknowledges connection between transfusion and jaundice and concludes that this association “may have been overlooked”.

1942 Emergency Blood Transfusion Service recognises that blood transfusion may result in delayed jaundice.

1944 MRC Jaundice Committee is told that hepatitis caused by serum transfusion after long intervals is “beyond doubt”.

1952 WHO Expert Committee on Hepatitis recognises serum hepatitis as a serious problem and suggests five preventative measures.

1954 The Lancet reports that larger pools cause higher incidence of hepatitis.

1964 Scottish Home and Health Department circular states that “All blood for transfusion must be regarded as potentially contaminated … No transfusion should be undertaken unless the benefits outweigh the risk of hepatitis.”

1965 onwards blood donations in Germany are tested for raised ALT.

1970 onwards blood donations in Italy are tested for raised ALT.


Abbreviations

ALT alanine transaminase

MRC Medical Research Council

WHO World Health Organization


Just as a first principle of medical practice is often summarised as “First, do no harm”,[204] a primary duty of government is to protect the safety of its citizens. That involves keeping them free of unreasonable risks to their health and safety wherever possible.

Blood transfusions are intended to benefit the recipient. But they are not like most medications, which are derived from chemicals. They are of human origin, and as a result vary just as humans do. So, just as the recipient may be given blood or a blood product which restores their circulation or provides them with a protein they are lacking, so too will they receive whatever may be harmful in the blood of the donor. This may be a virus, a parasite (as in malaria), a microorganism (such as the spirochaete in syphilis), a prion (as in variant Creutzfeldt-Jakob disease), or a protein which may cause a dangerous reaction. The risk that the benefit may be outweighed by the harm of a transfusion is ever present.

Any decision to administer a transfusion thus involves risks to the patient, which go well beyond the risks of infection from breaking the skin to insert a transfusion needle. There is no general principle that all risk must be avoided: some risks simply cannot be. It may be desirable to run inevitable risks because of the importance of the object which is to be achieved by doing so. A balance has to be struck between, on the one hand, the magnitude of the risk – itself a combination of the likelihood of the risk materialising and its probable severity if it does – and, on the other hand, the importance of the purpose to be achieved by incurring it, and the availability and expense of protective measures against it. Thus the risk of catching the common cold is almost certainly increased by working closely together in offices and travelling on public transport to get to those offices. It is however a risk of low magnitude, since the probability is that any cold will be short-lived and have no serious long-term consequences. Compared with that, the magnitude of even a small chance of incurring a terminal cancer is clearly very much greater, especially if suitable protective measures cannot be taken and if it is likely that any attempted cure will be ineffective. Avoiding the risk may then be appropriate.[205]

To assess where the balance lies between risk and benefit when transfusing blood or a blood product to an individual, one must first know that there is a risk; second, how serious a risk it is; third, what steps can be taken to avoid or minimise that risk; and fourth, whether it is unreasonable, or unreasonably difficult, to take those steps. The question is not whether taking the steps is reasonable – the fact that a step could be taken to prevent or minimise the risk of harm means it should be taken, unless it would be unreasonable to do so. That is a more demanding standard than simply doing what is reasonable in the circumstances.

The risks may not simply be to the individual patient. Patients have families. They live in communities. A risk of infection to a patient may also be a risk to the health and wellbeing of others. Though it is for the clinician to identify the clinical need for treatment of some kind, what the available treatments are, and to advise the patient about their risks and benefits, and those of any reasonable alternative, it is a fundamental ethical principle that the ultimate decision whether to take or reject that treatment is for the patient to make.[206] Where risk may also be caused to others in the patient’s family or community, the patient will almost always be best placed to know how significant that risk is to them.

The answers to the first two of these questions (“is there a risk, and how serious is it?”) slowly began to become clear a hundred years ago in relation to hepatitis suffered as a consequence of transfusion.

In 1885 it was reported that 191 of 1,289 shipyard workers in Bremen who had been vaccinated against smallpox by a particular form of lymph[207] developed jaundice; two other groups of workers vaccinated by a different lymph did not. It seemed that some infectious agent in the lymph had caused this.[208]

From at least as early as 1923 it had not only been reported that jaundice[209] might follow, and be caused by, vaccination, but also that a virus was the probable cause.[210] Human serum was used to carry an inoculation into the bloodstream of the recipient: it appeared to be infective.

Well before the inception of the NHS in 1948, scientists, state authorities and medical practitioners in the field knew that after any transfusion of blood or plasma (of which human serum is part) there was a risk that hepatitis would develop. Knowledge of this “post-transfusion hepatitis” became well established during the Second World War at the latest, in particular after an epidemic of hepatitis amongst US and Allied troops inoculated against yellow fever. In April 1942 the US Surgeon General, upon determining that human serum could transmit hepatitis from donors to recipients of the vaccine, ordered the omission of human serum from yellow fever vaccine production. The epidemic stopped.[211] Later the same year, in June, the UK Ministry of Health reported an outbreak of jaundice amongst children given measles convalescent serum from a particular batch (K60).[212] There was “no conclusive proof”[213] that the serum transmitted hepatitis, but enough concern that it might have done so as to justify investigation, and recall of the implicated batch. These events were reviewed by a senior medical officer in the Ministry of Health, who summarised the K60 incident, and noted that jaundice had more recently followed receipt of mumps convalescent serum, yellow fever vaccine, and whole blood or plasma transfusion. He noted an “almost complete absence of reported cases” after transfusion of blood or plasma and concluded that the association between transfusion and jaundice “may have been overlooked”.[214]

Confirmation of the association followed swiftly. By 1 December 1942, the Emergency Blood Transfusion Service had discussed the fact that blood transfusion might result in delayed jaundice, caused by human serum.[215] On 2 December, a review of previous reports of post-transfusion hepatitis began with the words: “It must now be recognised that under certain circumstances at present undefined hepatic necrosis may follow the parenteral administration[216] of human blood products.”[217]

“Serum hepatitis”, as it was called,[218] was well recognised by 1944.[219] That label distinguished it from “infectious hepatitis”, which was caused by environmental conditions and spread principally through the oral-faecal route.

It was known, at least by the time that the NHS began in 1948, that:

  1. transfusions carried a risk of post-transfusion hepatitis;
  2. this was often a delayed complication;[220]
  3. it was transmitted by a virus;
  4. it could be fatal;
  5. those transfused with plasma from a pool comprising a number of donations (about 500, according to an article in The British Journal of Social Medicine in 1947)[221] suffered higher rates of jaundice than those receiving single donations of whole blood. In other words, pooling plasma from a number of donors significantly increased the risk that anyone receiving a transfusion from that pool would suffer jaundice in consequence.[222] Avoiding a large pool lessened this risk.

Jaundice was the label by which hepatitis was then more commonly known, because of the yellowing of the skin which was a frequent and obvious diagnostic sign of it.

Thus, by 1951, large pool plasma filtration was abandoned in favour of small pools because the prevalence of homologous serum jaundice was 10% in the former compared with 1% in the latter.[223] The Lancet reported in 1954 that “large pool” plasma (by which it meant a pool derived from 300 or more donations of blood) caused the highest incidence of hepatitis, but that those receiving small pool plasma had only a slight risk, similar to that carried by single donations of blood, and that it had already been decided to reduce the size of plasma pools to small pools to “restrict the dissemination of the infective agent.”[224]

Although serum hepatitis was known to be caused by viral infection, the viruses which were causative were not identified for many years. The precise microbiological configuration of the viruses did not however need to be known in order to understand that post-transfusion hepatitis could have serious consequences. This became very well understood, whether the transfusion was of whole blood or plasma. As early as 1946, Dr William d’A Maycock recorded an agreement that users of plasma “must be told that it is a potentially lethal fluid which should be used with discretion.”[225]

In 1966, the British Medical Journal reported that when relatively large plasma pools were used the incidence of serum hepatitis could reach the alarming figure of 11.9%, though if the pool was of ten bottles or fewer the figure fell to 1.3%.[226]

By 1970 it was authoritatively reported that serum hepatitis caused death at a rate of approximately 1 in every 150 transfusions in the US in people over the age of 40. An infection rate of 3.6% of all transfused hospital patients was reported. US studies showed that the mortality rate in late middle age of those infected could be as high as 40%, and in patients over 60 about 50%.[227]

Deaths were known to be the consequence of liver failure, caused by chronic infection leading first to fibrosis, chronic active hepatitis, then to cirrhosis of the liver, and then in a significant number of cases to liver cancer. Though in many cases of serum hepatitis the effects might appear short-lived (a noticeable but quickly fading jaundice) or barely apparent at all, it was known that in many cases chronic infection would follow. Liver failure and liver cancer were then almost inevitably fatal. The risks of these consequences of chronic hepatitis were sufficiently appreciated not only by scientists but by the Government, such that the Scottish Home and Health Department said in a circular in December 1964 that: “All blood for transfusion must be regarded as potentially contaminated … The most important transmissible disease in this country is homologous[228] serum jaundice or serum hepatitis … No transfusion should be undertaken unless the benefits outweigh the risk of hepatitis.”[229]

Because the viruses had not been identified, it was impossible (a) to develop a test to show if they were present in any donation of blood or plasma; or (b) to treat infection effectively, since it could not be shown except by passage of a considerable time that the virus had been inactivated in the person infected. The lack of any known serological marker[230] was often echoed by an absence of any specific symptom – many of the consequences of infection, such as tiredness, itchy skin, depression, a muzzy head or “brain fog”, could so easily be ascribed to the general activities of life, to ageing, to the demands of childcare or to hormonal changes. This, coupled with the challenges of relating current infection to an event possibly ten or twenty years earlier that might have caused it, undoubtedly meant that it often went unrecognised by the person who was infected, was under-reported, and was often diagnosed on clinical grounds far too late.

However, the risks of post-transfusion hepatitis were appreciated not only in the US and the UK, but also internationally. In 1952, an Expert Committee on Hepatitis of the World Health Organization (“WHO”) reported serum hepatitis as a serious problem, which in many cases came on insidiously and in some cases persisted beyond five years.[231] It suggested five measures to reduce the risks of the disease: (1) the selection of donors; (2) the control of pool size; (3) the treatment of plasma; (4) the maintenance of records; and (5) reporting.[232] If these measures had been taken appropriately during the next 30 years, in relation to the supply of blood and blood products in the UK, it is reasonable to think that a significant part of the suffering on which this Inquiry is focussed would not have occurred.

Elsewhere precautions were taken to lessen the chance that those risks might materialise. Thus in Germany, from 1965 onwards, all blood donated for possible transfusion was tested to see if it contained abnormally high levels of a liver enzyme, alanine transaminase (“ALT”). Raised ALT levels are not specific for hepatitis – they may be caused by a number of other circumstances too, for instance the consumption of alcohol, or obesity – but they can indicate liver dysfunction.[233] In this country, some comfort was taken from the fact that donors had no motive in giving blood other than to benefit another human being: they donated as a gift. By contrast, in the US, many, if not most, of those who gave blood and plasma did so to benefit themselves directly or indirectly, and inevitably had far less interest in the quality of what they were providing. The incentive was instead for them to be dishonest about their state of health.[234] The first of the five measures recommended by the Expert Committee on Hepatitis of the WHO in 1952 was thus addressed (at least in part) by reliance on the voluntary nature of blood donation in the UK.[235] It might also be addressed by asking some questions of a would-be blood donor, to help assess whether taking and using their donation might be more risky than usual.

“Infectious hepatitis”, which became known as Hepatitis A, was not identifiable as such until 1973.[236] However, in 1965, researchers identified an antigen which was associated with serum hepatitis.[237] This led to a virus being isolated in 1967, termed Hepatitis B,[238] though it took until 1970 when the “Dane particle” was discovered for the complete virus to be demonstrated by electron microscopy. By around 1970, there was finally a test to screen blood for the presence of Hepatitis B, though it was not sensitive enough to identify every case of infection in the blood.[239]

The Department of Health and Social Security (“DHSS”) appointed an advisory group which advised in July 1971 that the regional transfusion centres should begin testing “at the earliest possible date” and the DHSS accepted the recommendation.[240] The regional transfusion directors met that month and noted that the transfusion centres at Sheffield and Edgware were testing all donations, some centres were testing some donations and some would not be able to start until late in 1972.[241] From November 1971 the Blood Products Laboratory tested the plasma used to make blood products and from December 1972 all blood donations in England and Wales were screened, then mostly by immunoelectrophoresis.[242] The Scottish National Blood Transfusion Association annual report for the year ending 31 March 1972 recorded that routine screening had been in place in Scotland for at least a year in all centres and longer in some.[243] By 1975, two years after Hepatitis A was identified, there was a test for both Hepatitis A and Hepatitis B.[244]

Despite the increasing ability in developed economies to screen out Hepatitis B from the blood supply, Italy also introduced ALT testing of donations with effect from 1970. This increased the protection against infections which might cause hepatitis but which Hepatitis B testing, particularly in its early forms, failed to spot.[245] The UK never adopted ALT screening.[246]

3.7 Hepatitis Risks 1970 and After


Four themes are explored in this chapter: a developing awareness of non-A non-B Hepatitis transmission through blood and blood products, and of its potential seriousness; the repercussions of introducing a screening test for Hepatitis B; the risks associated with paid donations; and the risks of pooled donations.


Key dates

February 1970 The Gift Relationship discusses the increased risks of hepatitis with paid donations.

1972 Professor Garrott Allen emphasises the increased risk of serum hepatitis “from transfusions derived from prison and Skid Row populations”.

November 1972 publication by Dr Alter and others recognises that Hepatitis B does not account for all post-transfusion hepatitis.

August 1974 Dr Prince and others report in The Lancet that an agent other than Hepatitis B is the cause of 71% of cases of post-transfusion hepatitis and may be implicated in chronic liver disease.

January 1975 Professor Garrott Allen attempts to persuade Dr Maycock of the dangers of importation of factor concentrates.

January 1975 Dr Owen pledges to fund self-sufficiency in blood products.

May 1975 WHO urges self-sufficiency and supports voluntary non-remunerated blood donation.

August 1975 The Lancet reports on a hepatitis outbreak at Bournemouth haemophilia centre following treatment with Hemofil.

December 1975 broadcast of Blood Money highlights hepatitis risks of commercial concentrates.

February 1976 article by Dr Alter and others suggests the long-term prognosis of NANBH may be similar to Hepatitis B.

July 1976 article by Dr Hoofnagle and others suggests that until the nature of NANBH is “elucidated”, blood and blood products should be considered as potentially infectious.

1977 articles by Dr Alter, Dr Hoofnagle and others emphasise that NANBH can progress to chronic hepatitis and cause liver disease.

September 1978 publication in The Lancet of study by Dr Preston and others showing chronic liver disease in patients with haemophilia treated with factor concentrates.

April 1979 Dr Kernoff describes NANBH as “a serious disease with long-term consequences”.

September 1980 international symposium in Glasgow considers risks of NANBH.

September 1980 memo from Dr Walford describes NANBH as a form of hepatitis that “can be rapidly fatal” or “can lead to progressive liver damage”.

July 1981 The British Medical Journal reports hepatitis as “the major complication of the modern treatment of haemophilia.”


Abbreviations

ALT alanine transaminase

NANBH non-A non-B Hepatitis (Hepatitis C)


Four themes predominate during the 1970s: a developing awareness that viruses other than Hepatitis A and Hepatitis B – in other words, non-A non-B Hepatitis (“NANBH”) (which would later come to be known as Hepatitis C) – were being transmitted through blood and blood products; the repercussions of introducing a screening test for Hepatitis B; the knowledge of the risks associated with the paid donation of blood for imported factor products; and the risks that pooled donations presented.

In February 1970, Professor Richard Titmuss published a book which was recognised in The New York Times as one of the seven most important non-fiction books to be read that year.[247] In his evidence to the Inquiry, Lord David Owen said he thought every doctor would have been aware of it.[248] The Gift Relationship compared the system of voluntary non-remunerated blood donation in those countries such as the UK, which saw the giving of blood for the treatment of others in need as an important social obligation, with systems such as those in the US, which relied substantially upon “donors” being paid to give their blood. One saw donating blood as giving the gift of life; the other saw it as buying and selling a commodity. There was a market in blood, and in particular in plasma. Pharmaceutical companies in the US and elsewhere[249] used plasma in particular to produce products which could be marketed worldwide. Unless precautions could be taken, a virus prevalent in the blood and plasma of even one member of the donor cohort could find its way into a blood product, and hence to a recipient in a country where the virus was less prevalent, if present at all. Where no test is available to identify and exclude a virus, any single donation might carry that virus. It is more likely to do so where the donation is sold for private benefit, rather than given altruistically. Moreover, there is a commercial incentive to buy necessary raw materials as cheaply as practicable. The raw material for blood products was human in origin. In stark contrast to sourcing blood from unpaid donors, who will come from any part of society, rich or poor alike, US pharmaceutical companies bought plasma from cheap sources, including “down-and-outs” in particular in New York and San Francisco, prisoners, and those living in poorer countries such as Mexico, Belize and Lesotho.[250]

President Richard Nixon directed his officials to recommend an improved blood collection and distribution system to reduce reliance on commercial blood banks, which were described as often accepting “blood from such donors as derelicts and drug addicts who may be the transmitters of such diseases as hepatitis, syphilis and malaria. A study made two years ago indicated that 30,000 Americans contract hepatitis each year through transfusions of contaminated blood with 1,500 of them dying from the effects of the disease.”[251]

In his book, Professor Titmuss drew attention to a series of studies demonstrating the risk of hepatitis incurred by those who received transfusions of whole blood or blood products, to support his case that a system relying on voluntary non-remunerated donations was very much safer not only in theory but in reality.[252] Very shortly after he published his book, Professor Joseph Garrott Allen of Stanford University confirmed exactly that same message: the hepatitis risk posed by blood commercially sourced was markedly higher than the risk from blood and plasma sourced from truly voluntary donors.[253] Nor was this a new warning: he had drawn attention to it as long ago as September 1959,[254] and had returned to the theme in several following publications.[255] Professor Garrott Allen emphasised also the seriousness of transfusion hepatitis: “The numbers of patients with transfusion hepatitis … who will be able to show disability or who will die of this disease, will be approximately 0.9% of the total transfused.”[256]

When Professor Titmuss wrote, he did not distinguish between the different viruses which together caused serum hepatitis. It is reasonable to think that until the early 1970s scientists, doctors, politicians, administrators, and manufacturers of blood products assumed that the inflammation of the liver which gave the disease its name was caused by just one virus.[257] It may be that they thought that, with the discovery of the Australia antigen in 1965 and the identification of the Hepatitis B virus by 1967, the virus had been found. In 1970 it was shown to be possible to screen blood donations for the presence of Hepatitis B. The Report of the Advisory Group on Testing for the Presence of Australia (Hepatitis-Associated) Antigen and its Antibody noted that: “Knowledge of all aspects of Australia (hepatitis-associated) antigen is accumulating very rapidly” and that “Although the hepatitis agent may be less widely dispersed in the UK than in some other countries, the institution of testing donations for Australia antigen should reduce the incidence of serum hepatitis, which is the most serious complication of transfusion and so avoid suffering and disablement and even death.”[258]

By October 1971, the Blood Products Laboratory was screening plasma for Hepatitis B.[259] Though the tests were imprecise,[260] it was not unreasonable to think that the cause of post-transfusion hepatitis had been identified and the risk of infection had been reduced in consequence. It was hoped that once the tests were improved the blood supply (and products made from tested plasma) would be relatively free of any significant risk of post-transfusion hepatitis. This was not to be. Any such optimism soon began to evaporate, starting from 1972,[261] and growing throughout the mid 1970s, when it was increasingly reported that hepatitis was occurring after transfusion, yet when tested the patient was suffering neither from Hepatitis A nor Hepatitis B.[262]

Of particular note was the report by Dr Alfred Prince and others in The Lancet in August 1974 that an agent other than Hepatitis B was the cause of 71% of cases of post-transfusion hepatitis: “The data suggest that a large proportion of long-incubation post-transfusion hepatitis is unrelated to hepatitis B and that control of post-transfusion hepatitis will require identification of a hepatitis virus(es) type C.” As to the potential significance of this virus:

“The fact that non-B hepatitis cases are less frequently associated with serious acute illness does not imply that such cases are of lesser importance. Long-term complications of acute hepatitis-B infection, such as chronic hepatitis, cirrhosis, and hepatoma, have been reported to follow mild anicteric infections more frequently than severe icteric cases; consideration must thus also be given to the possibility that non-B hepatitis may play a role in the aetiology of some forms of chronic liver disease.”[263]

The Lancet was one of the journals most widely read by clinicians in the UK: no clinician dealing with transfusions had any reason to be unaware of this conclusion.

Though outbreaks of Hepatitis B continued in the 1970s,[264] the majority of infections after transfusion were of this third type. Not knowing whether it was one virus (which might have justified the label “Hepatitis C”) or more than one, science opted for the label “non-A non-B Hepatitis”. Cases of serum hepatitis which were undoubtedly Hepatitis B continued to occur, because the tests used to screen donations were imprecise and viruses “slipped through the net”, but they formed a smaller part of serum hepatitis. Serum hepatitis itself continued, now principally in the form of NANBH.[265]

On 6 January 1975, Professor Garrott Allen wrote to Dr William d’A Maycock (the consultant advisor to the CMO[266] in respect of the Blood Transfusion Service) to attempt to persuade him to advise against the continued importation of some factor concentrates from the US to the UK. He said that one commercial product which “as you know” was sourced “100 percent from Skid-row derelicts” was extraordinarily hazardous – a 50-90% rate of hepatitis developed from its use. He added that “The other imponderable which has troubled most of us is the ineffectiveness in screening for the HB antigen … This failure, of course, dates back to at least 1971, and suggests that half, if not more, of the cases of posttransfusion hepatitis are caused by an agent other than Hepatitis A or B.”[267]

Two other significant warnings were given in the course of 1975: in May the World Health Organization urged the development of national blood services based on voluntary non-remunerated blood donation and advised self-sufficiency for all nations,[268] and in December Granada’s World in Action broadcast two television documentaries, Blood Money, which highlighted the high risk of hepatitis in blood sourced commercially from prisons and “skid row” paid donors and used to make coagulation products.[269] It was obvious that there was a continuing risk of serious hepatitis from the use of commercial products, which on two bases were more dangerous than domestic products: (a) the source material came from donor populations where the underlying rate of hepatitis infection was between five and twenty times greater than that in the UK; and (b) the risk was amplified by the much larger pool sizes used in commercial manufacture.
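The arithmetic of pooling helps to explain point (b). The short calculation below is not drawn from the evidence before the Inquiry; it is a minimal illustrative sketch, assuming only that each donation in a pool independently carries some small probability of being infectious, with the probability and pool sizes chosen purely for illustration.

```latex
% Illustrative sketch only: p and N are assumed values, not figures from the evidence.
% If each donation independently carries probability p of being infectious, a pool made
% from N donations contains at least one infectious donation with probability
P(\text{pool infectious}) = 1 - (1 - p)^{N}
% For example, taking p = 0.001 (1 infectious donation in 1,000):
%   a pool of N = 10 donations is infectious with probability 1 - 0.999^{10} (about 1%);
%   a pool of N = 20,000 donations with probability 1 - 0.999^{20000} (about 1 - 2x10^{-9}),
% a virtual certainty. Larger pools therefore greatly amplify the risk from any one donor.
```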

A way of avoiding the need to rely upon the purchase of commercial products, and the risks they brought with them, was to achieve self-sufficiency (producing enough of the necessary product domestically to avoid any need for importation). Since Self-Sufficiency has a chapter of its own, only a short sketch of it is needed at this stage.

The desirability of self-sufficiency had been recognised for over seven years by this stage. One of the most authoritative medical voices of the late 1960s in the field of haemophilia, Dr Rosemary Biggs, made a prediction in August 1967 that within the “next year or two” very large amounts of commercial product would become available in the US. “When this material comes on to the market we shall be obliged to buy it at a very high cost”.[270] Self-sufficiency in blood supply had been a policy objective since the Second World War. Blood products were forming an increasingly important part of this supply by 1970. The goal of self-sufficiency remained, given impetus by Professor Titmuss’ book, national pride, and the belief, which it is clear Dr Biggs shared, that purchasing supplies from abroad would be more expensive than producing blood products domestically. This was in addition to the knowledge that if self-sufficiency were achieved the risk to any recipient of blood or a blood product would be derived from the domestic population, and not from a country where there might be different viruses, or different diseases, transmissible through blood.

Dr Owen was himself convinced of the need for self-sufficiency. He made a pledge to Parliament on a number of occasions in 1975 that the Government would fund self-sufficiency in blood and blood products.[271] He intended that it should be achieved by mid 1977 if not earlier, so that the importation of the riskier commercial products into the UK need not continue. However, self-sufficiency was not achieved for well over a decade.[272] In that period,[273] risks eventuated from blood products which had in the main been imported. Suffering and deaths resulted in the population of the UK, as was both foreseeable and foreseen.

Hepatitis B often revealed itself clinically by a yellowing of the skin: it was “icteric”. NANBH was less likely to be icteric. Liver function tests showed consistently elevated enzyme levels (in particular of alanine transaminase (“ALT”)) in Hepatitis B infection, but generally levels which fluctuated and were less elevated in NANBH. Hepatitis B was often symptomatic in its acute phase (“acute” means occurring within six months of infection: it does not refer to the severity of the symptoms caused by the infection), whereas NANBH was more rarely so – though more likely to result in a persistent chronic infection (“chronic”[274] means lasting for more than the first six months of infection).

For some time a number of doctors held the view that NANBH was a mild or benign disease. The view that the disease was mild rested centrally on assertion and/or wishful thinking[275] rather than evidence, and was reached by a comparison with Hepatitis B in the acute phase rather than on epidemiological studies. Such evidence as there was for it amounted to noticing that the skin often did not yellow, as it usually did with Hepatitis B; that the extent to which liver function tests showed high levels of ALT or aspartate aminotransferase (“AST”)[276] was usually less than in the case of Hepatitis B; and that the symptoms arising in the first six months after a transfusion thought to be causative were usually less significant. This represented a serious and collective failure of judgement amongst the many who asserted it. At its heart lay an unjustified assumption and the asking of the wrong questions.

The unjustified assumption was that most or all of the serious effects of what had previously been known as serum hepatitis were attributable to the recently identified Hepatitis B.

In the 30 years since the ending of the Second World War, post-transfusion hepatitis had been noted sometimes to have a short-lived, acute, flu-like phase, and then to remain apparently asymptomatic for several years. Though the damage it caused, and its presence, were potentially detectable by liver biopsy, there might be little obvious reason to conduct such a test. Indeed, to do so in a person with a bleeding disorder might be risky, for some bleeding would be bound to occur and steps to control this might not work well.[277] Eventually, however, it was known that in many cases increasing liver dysfunction might appear, leading to cirrhosis and sometimes to cancer. That is why “serum hepatitis” was regarded as a serious disease. It was sufficiently so that it was thought that the costs in time and resources involved in screening all blood donations for the presence of Hepatitis B were justified, even though the tests initially used were able to identify only around one third of the infections.

By the mid 1970s, it was understood that serum hepatitis had at least two contributory viral causes. Unless there was good evidence to show that Hepatitis B on its own had caused almost all the serious effects previously attributed globally to “serum hepatitis”,[278] NANBH viruses simply could not be assumed to be any less harmful. The observed effects of serum hepatitis might just as well have been caused by the major[279] viral component[280] of serum hepatitis (NANBH) as by what now seemed the lesser (Hepatitis B). There could have been no proper confidence that NANBH would not have serious long-term consequences which would emerge only after an extended period of chronicity. After all, hepatitis by definition involves some damage to the liver. Damage to the liver is potentially serious.

It is not the judgement of enlightened hindsight, but a realistic appraisal to say that clinicians and researchers should have been alert at that time to what was at least a risk of chronicity, and receptive to reports that the long-term consequence of hepatitis was serious, earlier than was the case.[281] Some were. But by no means all.

The question that should have been asked, particularly given what was now at least a 25-year history of familiarity with serum hepatitis, was whether NANBH might have long-term consequences: put another way, whether the risk that it might do so could be excluded. Dr Pier Mannucci, a highly respected haematologist, noted in 1975, after considering liver disease in people with haemophilia which had produced no overt symptoms, the possibility “that repeated and prolonged contact with the infective agent(s) may cause chronic liver damage not associated with overt illness”.[282] Just a few months later Dr Robert Purcell, Dr Harvey Alter (later to be awarded the Nobel Prize for his work in this area) and Dr Jules Dienstag commented further that “Although type non-A, non-B hepatitis is associated with less severe acute illness than type B disease, as judged by frequency of jaundice and magnitude of SGPT [serum glutamic-pyruvic transaminase] elevations, the long-term prognosis for the two diseases may be similar” and “chronic non-A, non-B hepatitis is not necessarily a benign infection and may be the cause of a significant proportion of chronic hepatitis not identifiable as type B disease”;[283] and in their turn Dr Jay Hoofnagle and others, just a short while later again, added: “Until the nature of this [NANBH] virus and its disease is elucidated, it is important to consider human blood and pooled plasma products as potentially infectious. At the present time, fibrinogen, AHF [antihemophilic factor] and Factor IX concentrates remain ‘high-risk’ plasma products”.[284]

The alert might have been sounded yet more clearly by the report of an outbreak of hepatitis at the haemophilia centre in Bournemouth after the administration of Hemofil, a commercial factor concentrate, which was the subject of a report in The Lancet in August 1975.[285] Subsequently, Dr John Craske (of the Public Health Laboratory Service) and Dr Peter Kirk (at Lord Mayor Treloar’s College, Hampshire) produced a retrospective survey of Hemofil-associated hepatitis. They concluded that the incidence of chronic consequences due to NANBH “is at present unknown”, and that “Further follow up will be required to assess the incidence of chronic liver disease after non-B hepatitis”.[286]

By 1978 there were a number of reports showing that NANBH was linked to persistent liver damage.[287] Dr Hoofnagle and others wrote in 1977 that:

“Several clinical and epidemiological features of non-A, non-B hepatitis have become clear from studies such as the present one. First, non-A, non-B hepatitis closely resembles type B hepatitis. The incubation period, the clinical symptoms and signs, and the potential for chronicity appear to be similar to type B hepatitis … Undoubtedly, what was once referred to as ‘serum hepatitis’ included both type B and non-A, non-B hepatitis. Second, [most cases of] non-A, non-B hepatitis … have been described in association with transfusion, intravenous drug use, or serum inoculation … Third, non-A, non-B hepatitis appears to be associated with a chronic carrier state and chronic liver disease … Finally, non-A, non-B hepatitis appears to be common. Three of the five infectious donors studied here transmitted this non-A, non-B hepatitis.”[288]

To like effect, Dr Alter, also in 1977, emphasised that “Although non-A, non-B hepatitis is, on the average, less acutely severe than type B hepatitis, it can cause severe acute disease and, more disturbing, it appears to have considerable propensity to progress to chronic hepatitis.” He urged that the major thrust of post-transfusion hepatitis research be directed at developing methods for detecting the virus or developing a method of viral inactivation.[289]

In September 1978 The Lancet published a paper by Dr Eric Preston and colleagues in Sheffield, reporting on the “systematic screening of forty-seven haemophiliacs in Sheffield” which revealed abnormal liver function tests in 77% of the patients, “with a tendency for these abnormalities to persist.” Liver biopsies on eight symptom-free patients demonstrated a “wide spectrum of chronic liver disease … including chronic aggressive hepatitis and cirrhosis”, which the authors attributed to treatment with factor concentrates. NANBH rather than Hepatitis B was thought to be an “important factor” in four of the eight patients.[290]

In his oral evidence, Dr Mark Winter said that this paper “blew out of the water instantly the idea that this [NANBH] was nothing to worry about because their study showed, as did other studies, that most of these patients had very significant chronic liver disease”. It was something haemophilia doctors could not simply ignore.[291] He described doctors as having been unwilling to think that NANBH might be a problem because factor concentrate had brought “such spectacular benefits”: it was this reluctance to face the facts as portrayed in scientific journals that had prevented earlier acceptance of the seriousness of the problem.[292] There had, in his view, been a “golden interval” after screening for Hepatitis B first began, during which the dangers of NANBH had not yet fully been understood, and it seemed as if Hepatitis B was going to be a diminishing threat as screening tests improved. It ended, as he saw it, with the publication by Dr Preston and others.[293]

In January 1979 Dr David Dane, of the Middlesex Hospital, after whom the “Dane particle” which had enabled successful screening tests for Hepatitis B was named, wrote to Dr Sheila Waiter at the Department of Health and Social Security (“DHSS”) about NANBH, with the observation that “If one or more of these viruses is responsible for the abnormal livers which are evidently common among haemophiliacs then chronic liver disease due to these viruses might also be found among other transfused individuals.”[294]

By February 1979 the Medical Research Council had been told by the chief scientist of the DHSS that non-A non-B Hepatitis was “being given high priority by the Department”.[295] An ad hoc meeting was convened (on 12 March 1979) at which Dr Craske noted that studies in people with haemophilia showed that NANBH might severely damage the liver. He reported that US and German workers said that up to 40% of NANBH infections progressed to chronic liver disease.[296] Professor Arie Zuckerman told the meeting that “much non-A non-B associated PTH [post-transfusion hepatitis] might be anicteric, and that the risk of progression to chronic liver disease remained, however mild the initial infection”.

Concern about the dangers of hepatitis was not confined to the text of scientific journals. At least one commercial supplier thought them sufficiently serious to justify a price premium for a product which had less risk of transmitting hepatitis. Source plasma was thought far more likely to be infected if it came from the US than plasma sourced elsewhere: and it was reported in April 1979 to the UK Haemophilia Centre Directors’ Organisation reference centre directors that Immuno Ltd (based in Vienna) now sold their Kryobulin Factor 8 concentrate at two prices. The cheaper was that which had been manufactured from US plasma.[297]

At the end of April 1979 Dr Peter Kernoff wrote to Dr Brian Colvin, in the latter’s capacity as secretary of the North East Thames Region Association of Haematologists, observing that NHS concentrates were to be preferred to commercial concentrates because of:

“the growing awareness of the probability that commercial concentrates have a higher risk of transmitting non-A non-B hepatitis than NHS material. This is a serious disease with long-term consequences which, as far as is known, is at present much less common in the U.K. than in those parts of the world – particularly the U.S.A. – where donor blood for commercial concentrates is collected … The only medium to long-term solution to these problems is for the NHS to markedly increase production of factor VIII.”[298]

The Lancet added to the concerns being expressed about the long-term consequences of NANBH in a report in May 1979. It spoke of NANBH being related to a “high frequency of persistent hepatic dysfunction”.[299] In a report to the DHSS for 1978-79, Professor Zuckerman recorded the conclusion of research at the University of London that “until blood donors can be specifically screened for the virus(es) of non-A, non-B hepatitis, it would seem wise to restrict the use of blood concentrates to life-threatening situations.”[300]

The wisdom of this course was echoed on 16 May 1979 by the North East Thames Regional Association of Haematologists Haemophilia Working Party when it issued guidelines on the screening and investigation of hepatic disease in patients with congenital coagulation disorders. They said: “Despite the generally mild nature of acute non-A non-B hepatitis it seems very possible that there may be serious long-term sequelae and the acute disease may sometimes be fatal.”[301] This drew an important distinction between acute NANBH and chronic NANBH: there is a suspicion that those clinicians who at the time described NANBH as “generally mild” were thinking of the acute phase – there could be little doubt that the chronic phase (which would follow in all bar some 20% of infections)[302] was potentially anything but mild.

In July 1980 the findings of liver biopsies undertaken at Sheffield Children’s Hospital were published. Five boys with haemophilia whose liver function was persistently abnormal underwent biopsies: in each case the biopsy “confirmed underlying chronic liver damage, ranging from chronic persistent hepatitis to chronic aggressive hepatitis with early cirrhosis.” Only one had serological evidence of previous Hepatitis B infection.[303]

An international symposium on “Unresolved problems in Haemophilia” was held in Glasgow in September 1980. During it, Professor Peter Scheuer from the Royal Free Hospital said that the American literature suggested that when compared with hepatitis B, “the non-A, non-B tends to be more symptomless. Secondly, the incidence of serious sequelae is likely to be higher than the B.” When speaking of “symptomless” he was plainly talking of the acute phase: his “serious sequelae” shows that. Dr (later Professor) Howard Thomas and Dr (later Professor) David Triger (hepatologists) agreed, for their part, that “we are just building up trouble” and “it is in 10 years time that we shall see the problems. Bearing in mind the proportion of the patients that are infected, or have persistent abnormal liver function tests, anything from 60 to 80 per cent, it will be an enormous problem when it happens.”[304] Professor Thomas told the Inquiry that the messages from the symposium, the 1978 Sheffield study and other publications from 1981 and 1982 “were all in accord really.”[305]

Because the symposium was attended by leading lights in the treatment of haemophilia and hepatitis in the UK,[306] and followed on from the annual meeting of haemophilia centre directors, there is strong reason to suppose that haemophilia centre directors were either aware, or would rapidly be made aware, of the views of the experts at the symposium. There is no good reason after that why they should have thought that NANBH was benign.

By 1980 Dr Diana Walford at the DHSS was confident enough of the position to write a memo during the course of which she said: “I must emphasise that 90% of all post-transfusion (and blood-product infusion) hepatitis in the USA and elsewhere is caused by non-A, non-B hepatitis viruses which (unlike hepatitis B) cannot, at present, be detected by testing donor blood. This form of hepatitis can be rapidly fatal … or can lead to progressive liver damage. It can also result in a chronic carrier state, thus increasing the ‘pool’ of these viruses in the community.”[307] Notwithstanding the contents of the memo from Dr Walford, Lord Cullen, speaking in the House of Lords on behalf of the Government in February 1981, said that “There is a danger that Factor VIII, which has to be injected into haemophiliacs, can have in it a strain of hepatitis, and at the moment there is no way of testing for these strains. That is the one product as to whose freedom from infection we cannot be absolutely certain. However, every effort is made to see that it is not infected, and although occasionally something may happen, it is not of a serious nature.”[308]

Only a matter of days after Dr Walford’s memo, Dr Craske told a meeting of UK Haemophilia Centre Directors that: “Large pool concentrates appeared to give a higher risk of hepatitis than small pooled concentrates and Dr Craske felt that increased usage of small pooled concentrates would help to reduce the incidence of hepatitis in the haemophilic population.”[309]

At the very start of the 1980s, therefore, it was clear that hepatitis in consequence of a blood transfusion, or receipt of a blood product to treat haemophilia, carried with it a serious risk of long-term consequences.[310] It was known that this could not be ascribed simply to Hepatitis B infection, for that had been progressively screened out from the start of the 1970s (though it still remained as a risk and researchers were alert to the need to exclude it).[311]

It was clear that after transfusions (or the receipt of blood products) liver damage or cirrhosis sometimes developed. It was clear that the cause in a number of cases was certainly not Hepatitis B. It was thus clear that there was some other cause – probably non-A non-B hepatitis. Thus when, after screening, it was apparent that hepatitis was still being transmitted and that it was not Hepatitis B, no assumption ought to have been made that it was less serious unless there was convincing proof that this was the case.[312] It should have been assumed there was such a risk, rather than the opposite, unless it had become clear that the assumption was unjustified. It never was. It never could be. Those suggesting it was the case were not taking the careful, methodical approach of fitting the theory to the observed facts, but rather wishing the world to be other than it truly was. They asked the wrong question. Instead of asking whether there was sufficient evidence that NANBH was a serious disease and/or as serious a disease as Hepatitis B, they should have asked whether they could reasonably exclude the risk that it might cause the long-term consequences that serum hepatitis had been shown to cause. They might have added whether they could reasonably ignore the risks to which a number of well-regarded researchers had referred in various prestigious publications.

The view that NANBH was relatively benign and non-progressive nonetheless remained difficult to shift in the early 1980s. Professor Charles Hay, whose personal view by 1984/5 if not earlier was that NANBH was neither benign nor non-progressive, nonetheless thought that the benign view was the generally held opinion.[313] He described it in evidence as “the consensus view”.[314] This has been frequently said, but it does not fit very closely with the contemporaneous evidence. That was to the effect that acute NANBH was less severe than acute Hepatitis B: but that in the longer term there was significant evidence accumulating that a large proportion would develop active hepatitis leading potentially to cirrhosis. It was not just the view of a handful of researchers or clinicians. Thus Armour Pharmaceutical published Plasma Perspectives in July 1981 to inform readers of the current state of knowledge, saying: “Studies of the histopathological sequelae of acute non-A, non-B infections indicate that chronic liver damage, which may be severe, may occur in as many as 40-50% of the patients whose infection is associated with blood transfusion or with treatment by haemodialysis.”[315] In that same month the editorial in The British Medical Journal was devoted to post-transfusion hepatitis, referring to it as remaining “the major complication of the modern treatment of haemophilia.” Some reference was made in evidence to a paper by Dr Richard Stevens and others, headed “Liver disease in haemophiliacs: an overstated problem?” What the article did not suggest was that NANBH was a mild disease, since it concluded that the results found were similar to reports in larger studies where 16% (ie roughly one in every six people) had suffered chronic active hepatitis and cirrhosis.[316] The very fact that the article was termed “an overstated problem?” (when recording results consistent with between one sixth and one fifth of infected persons going on to develop active liver disease) suggests that the authors understood the general view at the time to be that, indeed, it was to be taken seriously.

There were some reasons for wondering if NANBH did cause serious deterioration – in 1981 Stirling et al in “Liver function in Edinburgh haemophiliacs: a five-year follow-up” compared results before introduction of NHS concentrates in 1974 with those in 1979. Liver function in those treated with Factor 8 concentrate had deteriorated; those treated with cryoprecipitate had not. The article commented: “there does not [at present] appear to be sufficient evidence of any serious deterioration in liver function from NHS concentrate to limit its current use for patients on home treatment, for whom the convenience of the product is all important. It would seem reasonable, however, that patients in hospital should whenever possible receive non-pooled cryoprecipitate instead.”[317]

However, an editorial in the British Medical Journal in July 1981 chillingly observed: “in some cases early death from liver disease might prove to be the price paid by haemophiliacs for the improved quality of life afforded by the easy availability of clotting-factor concentrates.” It spoke of three practices which might reduce the risks, focussing on “the risks of collecting plasma from paid as opposed to volunteer donors; the optimum size of the plasma pool; and attempts at removing the several viruses of hepatitis from blood products.” As to the first of these, the British Medical Journal said the rate of infection from paid donor blood was six to seven times that from volunteers; the rate of infection dropped by 75% in a hospital which changed from paid to volunteer blood.[318] As to the second, Dr Stirling’s report on Scottish patients showed that use of cryoprecipitate did not affect liver function by contrast with commercial concentrate.[319] As to the third, “three recent reports [suggested] that viral contamination may be removed by specific processing by chemicals, ultraviolet light, or heating.”[320]

It could have added that the “optimal use of blood and its components” would reduce the risk of NANBH.[321]

Since 1980 it has often been said, almost by way of apology or excuse for the failure of many to realise soon enough the risks posed by NANBH, that what was labelled either NANBH or Hepatitis C was a “new” or “newly discovered” virus, as if it had emerged for the first time in the mid 1970s, or later when the label Hepatitis C was first adopted. That is simply wrong. It was in truth almost as old as the hills.[322] Its genetic code was first established in 1988, a test for its presence was developed in 1989, and it was first labelled Hepatitis C shortly afterwards: so it became recognisable, and could be screened for, only then. It was thus “new” only in the sense of being newly coded and freshly labelled, but it was not new in the same sense that HIV, or more recently COVID-19, have been. Its consequences, wrapped up with those of Hepatitis B to similar effect and known as serum hepatitis, had been known of for a long time, before either Hepatitis B or C was identifiable (or labelled) as such.

3.8 Knowledge of the Risks of AIDS

This chapter examines the progression of the knowledge of AIDS between 1981 and 1984.


Key dates

5 June 1981 CDC reports a cluster of people suffering from immune system failure, with a further 36 cases in July and 70 more in August.

December 1981 CDC reports cases in intravenous drug users.

March 1982 infectious agent as the cause of AIDS is the leading hypothesis for CDC.

June 1982 AIDS discussed at Second International Symposium on Infections in the Immunocompromised Host, Stirling.

9 July 1982 CDC director alerts all US haemophilia centres to a new disease and requests notification of all new cases.

16 July 1982 CDC reports three confirmed cases of people with haemophilia who had developed AIDS; Dr Gunson alerts DHSS.

August 1982 CDSC introduces surveillance for AIDS.

3 October 1982 US National Hemophilia Foundation urges pharmaceutical companies not to use plasma from gay men, intravenous drug users or people from Haiti.

November 1982 report from Dr Craske indicates likeliest cause of AIDS is infectious agent.

November 1982 Observer reports “major speculation” that AIDS virus is “carried in the blood”.

December 1982 MMWR reports the “San Francisco baby” case, the death of the three patients from the July 1982 MMWR and more cases of AIDS in haemophilia patients.

13 January 1983 New England Journal of Medicine: “If the use of cryoprecipitate will minimize this risk, the current home-infusion program needs to be revised”.

19 January 1983 discussion of AIDS at UKHCDO Hepatitis Working Party meeting.

24 January 1983 meeting at Heathrow between haemophilia doctors and Immuno.

29 January 1983 Lancet reports that bloodborne agent seems likely cause of AIDS.

3 February 1983 New Scientist suggests “prime suspect” is blood borne virus.

7 March 1983 letter from Dr Evatt (CDC) to Professor Bloom says epidemic is evolving with a frightening pace.

26 April 1983 Professor Bloom reports probable case of AIDS to CDSC.

1 May 1983 Articles in Observer and Mail on Sunday describe risk of AIDS.

4 May 1983 Professor Bloom’s letter published by the Haemophilia Society.

9 May 1983 paper from Dr Galbraith (director, CDSC) recommends suspending importation of US blood products.

20 May 1983 isolation of a viral particle (then called “LAV”) in Paris by Dr Luc Montagnier. It later turned out to be the same virus as HTLV-3 and HIV.

August 1983 first death from AIDS in the UK of a person with haemophilia.

December 1983 The British Medical Journal: no evidence any factor product is AIDS-free.

23 April 1984 Dr Robert Gallo announces he has found the virus which causes AIDS.


Abbreviations

CDC Centers for Disease Control, US

CDSC Communicable Disease Surveillance Centre


AIDS was unrecognised in the Western world until 5 June 1981, when the Centers for Disease Control and Prevention (“CDC”) in the US reported a cluster of five cases of people suffering from a failure of their immune systems, allowing a form of pneumonia (pneumocystis carinii pneumonia (“PCP”)) to develop. The cluster was in the gay community. Its discovery has elements of a detective story. The CDC had control over the supply of pentamidine. This was a drug used specifically to treat PCP. There was not much call for it across the US – until reports arrived of a spike in demand for it in Los Angeles. So the CDC investigated: Why there? Why all at once? What they found was that those who needed pentamidine were all men in early middle age. They had been previously healthy. They had no clinically apparent, underlying immunodeficiency. Yet their immune systems had failed to prevent PCP taking hold. The alarm bells sounded. There was initially a widespread view that what caused this must be something associated with “a gay lifestyle”, for apart from age and living in Los Angeles this was the one factor which they had in common.[323]

The CDC inquired further. Four weeks later it reported in the Morbidity and Mortality Weekly Report (“MMWR”) that it had found approximately 10 more such cases, again amongst gay men, and in addition 26 cases of gay men who had developed Kaposi’s sarcoma (“KS”), a skin lesion, which had been diagnosed within the previous two and a half years.[324]

By August 1981, an additional 70 cases of PCP and KS amongst gay men were reported; and by December 1981 the same failure of the immune system which gave rise to these characteristic opportunistic infections was seen in intravenous drug users.[325]

By March 1982, the possibility that the cause of AIDS was an infectious agent had developed as a leading hypothesis: the CDC investigators then reported as much to the Food and Drug Administration (“FDA”).[326] The CDC had experience of the way in which Hepatitis B was transmitted by blood, and by sex. It saw the same epidemic pattern emerging when, in the MMWR of 16 July 1982, it reported three cases which had been confirmed in people with haemophilia who had developed AIDS.[327]

This report had not come completely out of the blue. William Srigley of Cutter Pharmaceutical was aware before then (“very early”) that a homosexual donor who was hepatitis positive risked transmitting AIDS.[328] Dr Henry Kingdon, vice president and general manager at Hyland Division, recorded in January 1983 that they had been closely monitoring AIDS since December 1981.[329] Thus, in early 1982 at least two pharmaceutical companies could see that if the cause of AIDS was an infectious agent then blood – and therefore blood products – might transmit it. Their actions show they were not certain that an infectious agent could be ruled out. Moreover, the finding of three people with haemophilia who had developed AIDS was what the CDC expected having itself thought that an infectious agent was likely: for they saw the disease spreading in the same way as an earlier epidemic of Hepatitis B had done. Dr Bruce Evatt’s first-hand account shows that the first report of signs of AIDS in a person with haemophilia came in early 1982, and that the question then was what characteristic people with haemophilia, Haitian immigrants to the US, intravenous drug users and male homosexuals shared, for it seemed there would have to be a circumstance common to all.[330]

News of the spread of AIDS to people with haemophilia, and no other apparent risk factor, predated 16 July by some days. The CDC director and Assistant Surgeon General of the US wrote to all haemophilia centres to alert them to this on 9 July 1982, and to ask that they report any case of PCP amongst their patients.[331] On 14 July the National Hemophilia Foundation in the US alerted its members that there was a risk to people with haemophilia.[332] Dr Harold Gunson in the UK was alerted to this development, shortly before 16 July 1982, in his capacity as consultant adviser on blood transfusion to the Chief Medical Officer. The Department of Health and Social Security (“DHSS”) thus had knowledge of it even before readers of the MMWR around the world were told of it by that publication.[333]

Although the similarities with the way Hepatitis B had spread suggested a viral aetiology, the precise cause of AIDS remained elusive. Although the exponential growth from month to month was suggestive of an epidemic, no infectious agent had yet been discovered. A number of possibilities were advanced to explain why the immune systems of those who later developed AIDS had begun to fail. The first five victims identified were all linked by the fact that they were gay.[334] Some suggested that the use of amyl nitrite poppers, which were regularly used in the gay community to heighten sexual experience, might have had a role to play.[335]

The “S” in “AIDS” stands for “syndrome”. A syndrome is a combination of a number of characteristic illnesses: none is sufficient on its own for a clinician to diagnose with any confidence a failure of the immune system from any particular cause. It was noted that those people with haemophilia who appeared to have developed the syndrome did not, usually, show signs of KS as part of their mix of illnesses. It was suggested that the failure of their immune system might have been because it had slowly been worn down by having to cope with an excess of foreign protein.[336] This was because the rapid expansion of home therapy with factor concentrates, and the sporadic giving of it prophylactically to protect an individual from future bleeds rather than simply to respond to a current bleed, meant that their bodies had been exposed to a much greater range of proteins than patients had previously experienced when the treatment regime involved mainly cryoprecipitate. The factor concentrates supplied were not “pure” (in the sense of being free of any protein other than the factor of interest). They contained “foreign” fibrinogen and fibronectin in particular. So, the exponential growth in demand implied that individual immune systems had a far greater load of “foreign” protein to cope with, and some could not manage it. Other suggestions were also advanced, such as a link with cytomegalovirus, though with less general support.

By June 1982 what was known about AIDS was already a principal subject of international scientific concern. In that month, the Second International Symposium on Infections in the Immunocompromised Host was held in Stirling, Scotland. Few haematologists attended, though Dr Ian Hann, and (he thought) at least one representative of the Scottish National Blood Service, did.[337] The meeting was shocked by developing information about AIDS.[338] Dr Hann left the conference realising that it might be relevant to haemophilia patients, and later was to comment that what he heard was “a bombshell” in terms of the nature and severity of the new disease. Though the central focus of the conference was not haematology, his evidence was that it filtered through – he called it “part of the burgeoning knowledge that began to explode at that time.”[339]

By the end of July 1982 the US Public Health Service had set up a Committee on Opportunistic Infections in People with Hemophilia, to consider whether the use of certain blood products placed them at risk of developing AIDS. Three of the leading pharmaceutical producers attended an open meeting of the Committee together with representatives of the CDC.[340] The Committee concluded that AIDS was probably caused by an infectious agent, but it was unclear whether people with haemophilia were at risk. It recommended conducting studies, and that techniques be developed urgently to reduce the risk of infection from Factor 8.[341] Though it was not yet certain that AIDS was caused by an infectious agent, this was a clear recognition that at least a risk existed that this was the case.

It was certainly the view taken by the journal Science, which in August 1982 carried an article headed “New Disease Baffles Medical Community”. It reported that more than 470 people in the US had AIDS, that almost half had died, and that an infectious agent appeared the likeliest cause. Ominously, it noted that the signpost “opportunistic” infections were first noticed in mid 1979[342] – ominous, because it suggested that if the cause was an infectious agent it had a long incubation period.

By September 1982, the US Committee had concluded that there was not enough data to suggest that immediate action should be taken in respect of licensed blood products.[343] However, on 3 October 1982 the US National Hemophilia Foundation urged pharmaceutical companies to stop using plasma collected from gay men, intravenous drug users and people who had recently been in Haiti for the manufacture of coagulation products as a “precautionary measure”.[344]

Pausing there, it is clear that in late 1982 steps were being suggested in response to the threat of AIDS to reduce the risk that factor concentrates might have been transmitting it.[345]

The debate in the US did not go unnoticed in the UK.[346] There had been the Stirling conference, mentioned above, in June 1982. As early as 16 July, Stanley Godfrey of the DHSS wrote an internal memo alerting the Medicines Division that he had been warned there might be adverse publicity about the risks of using US Factor 8 concentrates: it was being said that “400 haemophiliacs in the USA have exhibited signs of the virus” and that the licences of certain manufacturers of imported blood products might have to be revoked.[347] The Communicable Disease Surveillance Centre (“CDSC”) took steps to introduce national surveillance for AIDS in August 1982.[348]

On 5 November 1982 Dr John Craske of the Public Health Laboratory Service wrote a short report about the AIDS outbreak. He put forward three possible causes only to discount the first two (the use of amyl nitrite and the immunosuppressive effects of cytomegalovirus (“CMV”)). The third was an infectious agent. Of note, he concluded that there was a delay between symptoms emerging and diagnosis, and that as symptoms began to develop they might not be specific. He also concluded that mortality after the first development of symptoms was high (around 50%).[349]

Given this history, there can be little doubt that possibly by March 1982, and certainly from July 1982 onward, it was known in the UK to both some clinicians and some within government that there was a real risk that blood, and blood products in particular, might transmit the cause of AIDS. A wider audience was informed through The Observer, which in November 1982 reported that people with haemophilia in the US were suffering from AIDS and that there was a “major speculation” that the AIDS virus was “carried in the blood”.[350] Though there was no certainty as yet, the risk that this was the case was clear.[351]

In December 1982, a report was received of a baby who had received multiple transfusions shortly after birth, who had developed the symptoms of AIDS and died aged two. One of the donors whose blood was transfused into the baby had, since the transfusion, been identified as having contracted AIDS.[352] Since plainly a baby could not have had a homosexual lifestyle, nor have been regularly exposed to an overload of foreign protein, nor have used amyl nitrite, and was not born with a primary immune deficiency,[353] this – “the San Francisco baby case” – was powerful evidence in support of the infectious agent theory. The same MMWR disclosed that the three individuals whose cases had been reported in the 16 July 1982 MMWR had since died, and that four additional cases of AIDS in haemophilia patients had been identified, together with a further “highly suspect” case of a seven year old child with severe haemophilia. All had received Factor 8 concentrates and two of the five had died.[354]

The reality of the risk as it was there to be seen is underscored by the fact that on 29 December 1982, Edward Cutter, of Cutter Pharmaceutical, advised in an internal memo that his firm should thereafter include an AIDS warning in the product leaflets which accompanied its factor concentrate product.[355]

By the end of 1982, Dr Joseph Smith, the director of the National Institute for Biological Standards and Control had formed the view that the cause of AIDS was almost certainly a virus;[356] and according to Dr Charles Rizza, all UK Haemophilia Centre Directors’ Organisation (“UKHCDO”) directors knew there was a real risk that AIDS could be transmitted by an infectious agent carried by blood products.[357]

In the Netherlands, the Central Laboratory of the Blood Transfusion Services was convinced by the end of 1982 that people with bleeding disorders were at risk and started to coordinate a response with the Netherlands Haemophilia Society, haemophilia clinicians and the Netherlands National Institute for Public Health and Environmental Protection.[358] The medical director of the Central Laboratory, Professor Willem van Aken, was asked about this decision by the Lindsay Tribunal and said: “We struggled at that time … and in the end, we thought that it would be absolutely utterly irresponsible not to do it if there was a serious risk. So of course we agreed that at that time the evidence was still limited but on the other hand, there was strong indications that there was indeed a significant risk.”[359] He also said that at the end of 1982 the Central Laboratory recognised that they needed to take steps to exclude high-risk blood donors.[360]

A viral cause was further implicated on 7 January 1983, when the MMWR reported that two female sexual partners of males with AIDS and four female sexual partners of intravenous drug users had developed either AIDS, or PCP, or opportunistic infections typical of AIDS: in short, the cause was highly likely to be viral, transmissible by sex as well as blood, and similar to Hepatitis B.[361]

On the same day Alpha Pharmaceutical issued a press release, which said: “The evidence suggests, although it does not absolutely prove, that a virus or other disease agent was transmitted to them in the Factor VIII concentrate, derived from pooled human plasma”.[362]

The New England Journal of Medicine, a prestigious, widely read US medical journal, carried an editorial in its 13 January 1983 edition which stated: “The fact that haemophiliacs are at risk for AIDS is becoming clear. If the use of cryoprecipitate will minimize this risk, the current home-infusion program needs to be revised.”[363]

A Hepatitis Working Party which had been set up some time earlier by the UKHCDO met within a week of The New England Journal of Medicine editorial. This and another article in the same edition of The New England Journal of Medicine were discussed.[364] Dr Craske reported on what he had found out from the CDC: in the US there were now ten cases of AIDS in people with haemophilia of whom five had died. All had been treated with Factor 8 concentrate. The youngest was seven. “It seemed possible that factor VIII or other blood products administered to these patients might be implicated.” The CDC was working on the hypothesis that AIDS was caused by an infective agent, possibly a virus which attacked T-cells. This gained further support from the fact that there had been three cases of AIDS following transfusion of whole blood or platelets (one of which was that of the “San Francisco baby”). The incubation period ranged from six months to three years.[365]

On 24 January 1983, 21 haemophilia doctors, together with Professor Arie Zuckerman and Dr Craske, met a team from Immuno Ltd of Vienna, led by Dr Hans Eibl. The meeting, at Heathrow, lasted most of the day. It was chaired by Professor Arthur Bloom. Though the focus of the meeting was centrally on whether Immuno had found a chemical method which might inactivate non-A non-B Hepatitis and Hepatitis B in blood products, the afternoon session was given up to a discussion about AIDS.[366] No one appears to have questioned that it was likely to result from the transmission of an infectious agent (though there was some question whether there might be more than one).[367] Importantly for understanding the risks if factor concentrates were responsible, Dr Craske informed the group that:

  1. the disease was “intractable” (ie it could not be treated if it arose);
  2. up to December 1982, 45% of those suffering from it had died;
  3. ten people with haemophilia in the US had been affected, of whom five had died;
  4. it appeared to have an incubation period of between six months and two years.[368]

The meeting was also told that studies reported in The New England Journal of Medicine showed that people with haemophilia who were currently without symptoms but who had received factor concentrates were more likely to have abnormalities of the T-cells of their immune system than those who had received cryoprecipitate.[369]

An article in The Lancet of 29 January 1983 concluded that a blood-borne agent seemed likely to be the cause of AIDS.[370] The New Scientist of 3 February 1983 reported that the “prime suspect” was a blood-borne virus.[371] On 4 February 1983 The Journal of the American Medical Association reported under medical news both that: “During the past year, evidence has accumulated to suggest that the lethal and mysterious acquired immunodeficiency syndrome (AIDS) can be spread by infusion of blood and blood products”, and that “all major suppliers currently have licenses pending with the FDA to market new preparation methods that may result in lowered infectivity, such as heat-treated Factor VIII concentrate products.”[372] The Deutsches Ärzteblatt (German Medical Gazette), distributed to all physicians in Germany, carried a report on 18 February 1983 from the Robert Koch Institute in Berlin that AIDS appeared to be caused by an unknown infectious agent transmitted through blood and blood products.[373]

By this time, Dutch patients had been advised that “AIDS is a very serious condition with a long incubation time. It is not yet possible to detect carriers or persons in which the disease is in the early stage. This makes it necessary to develop a policy so that the risk for haemophilia patients to be contaminated by this disease is as small as possible.”[374] Dutch physicians treating haemophilia patients agreed to use only cryoprecipitate in children under four and newly diagnosed patients, desmopressin (“DDAVP”) for patients with mild or moderate Haemophilia A or if necessary cryoprecipitate, but for all other patients to consider cryoprecipitate as the treatment of choice followed by locally produced (small pool) Factor 8 and Factor 9 concentrates.[375]

On 4 March 1983 the CDC reported that blood products or blood appeared responsible for AIDS among haemophilia patients who required clotting factor replacement. It noted that the first signs of AIDS might take two to three years to emerge after exposure to a possible transmissible agent, and that 60% of those who had a diagnosis of AIDS for more than one year died. It also observed that it looked as if the agent was transmitted only by sex or blood, and that person-to-person contact did not transmit the causative agent.[376]

The Canadian Hemophilia Society Medical and Scientific Advisory Committee advised in late March 1983 that people with haemophilia switch from Factor 8 concentrate to cryoprecipitate whenever feasible.[377]

By April in the Netherlands, the Dutch Red Cross had secured agreement that high-risk donors from the homosexual community would not donate blood.[378]

The risk that Factor 8 concentrates might transmit the cause of AIDS was recognised by the FDA: it recommended that all blood and plasmapheresis centres should inform donors about AIDS and ask donors in high-risk groups (Haitians, men who have sex with men, intravenous drug users, those with signs of AIDS, and the sexual partners of each of these) not to donate. Products made from donations collected before this had to be labelled to indicate this.[379]

Some haemophilia doctors in the UK began to check patients for signs of AIDS as a routine measure: the fact that they did so shows that they must already have recognised that receiving factor concentrate therapy carried a real risk of leading to AIDS.[380] The evidence is that those directly engaged in the process by which concentrates were derived from blood (fractionators) thought it likely that AIDS was caused by a blood-borne virus.[381]

Professor Bloom wrote to Dr Evatt of the CDC to seek information about AIDS in the US and on 7 March 1983 Dr Evatt replied. He told Professor Bloom that the AIDS epidemic was evolving “with a frightening pace”: there were now 13 confirmed AIDS cases in the US among people with haemophilia (one with Haemophilia B), and a further five highly suspect cases under investigation; the clinical course was rapid after the appearance of an opportunistic infection; all had had Factor 8 concentrates and AIDS had developed in people both with mild and severe haemophilia. The ages ranged from 7 to 62 years. Preliminary data suggested that half the population of people with haemophilia in the US had T-cell abnormalities, and 13% were markedly abnormal, as in AIDS patients. So far as transfusion in the absence of haemophilia was concerned, approximately 12 patients had developed AIDS. Half were male, half female.[382] There is no evidence that Professor Bloom circulated the letter at the time. Given what he was to go on to say, it is to be inferred that he did not.

Dr Evatt’s letter probably informed Professor Bloom’s remarks when he raised AIDS at the end of a meeting of the Central Blood Laboratories Authority (“CBLA”) on 23 March 1983, though his words appear to have been an inaccurate summary of what he had been told. Dr Richard Lane, director of the Blood Products Laboratory (“BPL”) recorded:

“Professor Bloom drew to the attention of the CBLA … the problems that are becoming associated with blood transfusion and blood product administration with the increasing incidence of reported AIDS cases which continues to gain momentum in the United States on a monthly basis. The high mortality in reported cases is a cause for concern and is a primary factor behind what is described as the American over-reaction to the problem. The aetiological factor or factors remain unknown.”[383]

Dr Lane told his colleagues at BPL: “patients … are evidently concerned and resistance against the use of imported American coagulation factor concentrates is becoming apparent. Equally, there is a likelihood that a return to cryoprecipitate as a desirable form of treatment may become irresistible, whether logical or not”. He noted that many regional transfusion centres had not made cryoprecipitate for some time and so wanted to discuss “strategic alternatives at BPL for manufacturing small pool freeze dried cryoprecipitate to offset the requirement for manufacturing at [local] level.”[384] However, production of small pool freeze-dried cryoprecipitate by BPL was ruled out “on logistic production considerations” and BPL adopted a “wait and see” policy with continued manufacture and also research into viral inactivation.[385]

Since the previous autumn, it had been known that the number of people known to be suffering from AIDS in the US doubled roughly every six months.[386] The disease was recognised as being epidemic in form. Because of the limited chances of survival if it developed, public concern was heightened. Though this chapter, and this Report, focus centrally on transmissibility by blood and blood products, it is important to see actions (or failures to act) in the context that a much larger number of people than those infected through blood and blood products suffered from AIDS in the UK. The Terry Higgins Trust had already been set up, and was on its way to becoming “reborn as a formal organisation”, known as the Terrence Higgins Trust, in August 1983; the Scottish AIDS Monitor was also set up that summer.[387] UK citizens had already died of AIDS.[388]

It was at the end of April and the beginning of May 1983 that concern about AIDS and its causes intensified in the UK. On 25 April 1983 the BBC screened an hour-long Horizon programme, Killer in the Village, which identified “the 4 Hs” as the major risk groups – “Homosexual males, IV Drug (Heroin) users, Haitians and Haemophiliacs”.[389]

The press had also taken note. There was a flurry of reports in mainstream publications at the start of May 1983. The Observer said that the AIDS disease, “characterised by a collapse in the body’s ability to fight infection, is a medical time-bomb for Britain. Although only 15 cases have been reported since it first crossed the Atlantic in December 1981, many more people could be harbouring it, for the incubation period is up to three years.” It drew attention to a suspected case in a patient with haemophilia in Cardiff.[390] The Mail on Sunday “revealed exclusively” that two cases of AIDS in people with haemophilia treated in the UK with plasma imported from the US were already suspected.[391] The following day The Guardian again said that there were two cases of AIDS amongst people with haemophilia in the UK,[392] the Daily Mail referred to the possibility that Britain was importing blood products from the US “contaminated with the killer homosexual disease AIDS”, and the Daily Express called AIDS “The new killer-disease”.[393]

Despite all this, on 4 May 1983 a statement from Professor Bloom, who was in charge of the Cardiff Haemophilia Centre, and chair of the UKHCDO, and who had been asked by the Haemophilia Society for his view on the media furore, was sent to Haemophilia Society members. He said that the “cause of AIDS is quite unknown and it has not been proven to result from transmission of a specific infective agent in blood products”, that the number of AIDS cases was small, that “in spite of inaccurate statements in the press” he was unaware of any proven case in “our own haemophilic population”, and that none had been reported from Germany despite the fact that it consumed more commercial factor concentrate than any other European country. He appeared to suggest that the licensing system for blood products was such that any significant risk would have been picked up, and permission for those products to be distributed any further would be refused.[394] He advised no change to therapy with factor concentrates.

Saying that the number of AIDS cases was small was a surprising thing to write without any qualification: it may have been true that the number of people with haemophilia known (so far) to have developed AIDS was small, but the popular press were already reflecting a considerable concern that a deadly unidentified disease was growing amongst the general population in epidemic form.[395] If this was the case in the US, it was likely also to be the case in the UK; if it was the case in relation to the general population, it was likely to be the case in respect of people with bleeding disorders.[396]

This was a surprising letter for Professor Bloom to write for other reasons too. The press had reported that there were two probable cases of AIDS, one of whom was in Cardiff. As far as the Cardiff case was concerned, the report was true, and Professor Bloom had reported the case a week earlier as a probable one of AIDS to the CDSC.[397] Further, the first German person with haemophilia and AIDS died in 1982.[398] Reports of two German patients with AIDS became known during 1983, by late April.[399] Third, Professor Bloom ought to have realised that the cases in which the infection had become sufficiently developed to show itself as AIDS were potentially only the tip of a very much larger iceberg: The Observer had made this very point in terms clearly understandable by a layman, let alone a scientist.[400] He ought to have realised this not just because of the clear terms in which the Sunday newspaper spoke, but also because of the particular knowledge he personally had been given. He had written to Dr Evatt at the CDC earlier in 1983, to ask about the spread of AIDS in people with haemophilia. In his reply Dr Evatt gave him the grim details set out above.[401] What Dr Evatt was saying should not have come as any great surprise: it repeated the essential messages which Professor Bloom and his colleagues had been given on 24 January in their meeting with Immuno at Heathrow.[402] It was wholly in line with the report Dr Craske had given to the UKHCDO Hepatitis Working Party on 19 January – a meeting to which Professor Bloom had specifically been invited.[403]

Further, it is to be inferred from the evidence given to the Inquiry that Professor Bloom may have been saying one thing for public consumption, whilst taking a different tack within his own haemophilia centre. This inference arises from four facts. First, in January 1983 a medical registrar in Professor Bloom’s team in Cardiff responded to a patient with haemophilia who, prompted by his knowledge of what the New Scientist was publishing, had raised with the registrar whether taking commercial factor product risked AIDS. Rather than disagree with his patient, say there was no good evidence that that was the case, and tell him that he should continue to take commercial factor concentrates, the registrar recorded that he was “well-read”.[404] The inference is that whatever may have been said for general public consumption by the leader of the unit, the risk that AIDS might be transmitted by commercial factor product was regarded within that unit as a real and serious risk. Second, on 18 May 1983 – only two weeks after his letter to the Haemophilia Society – Haemophilia Treatment Policy Guidelines for use in Cardiff were issued. They advised using DDAVP, cryoprecipitate and only NHS factor concentrates for children and those with mild haemophilia, and even when it came to those with severe haemophilia advised using only NHS concentrate for those who had never received commercial concentrate “where possible” and cryoprecipitate for in-patient treatment “where feasible”.[405] This can only have been on the basis that imported factor concentrates created such a risk of AIDS that they should not be used unless there was no alternative.[406] Third, he had already written a letter on 18 February 1983 to his patients saying in respect of AIDS that “the occurrence of these illnesses has been extremely uncommon and has been confined to the United States” which not only was wrong, but then suggested action which can only have been on the basis that he did consider there was a real risk rather than there being nothing to worry about.[407] Fourth, and possibly most compelling of all in revealing Professor Bloom’s two-faced approach, is that he had informed the Public Health Laboratory Service that one of his patients probably had AIDS whilst suggesting to the Haemophilia Society (and in a number of meetings) that there was no proven case of AIDS in the UK population.[408]

If what Professor Bloom had said had been faithful to the facts, and he had advised the Haemophilia Society that although it had not finally been established that there was a real risk that taking factor concentrates risked contracting AIDS, the likelihood of the risk becoming a certainty appeared to be growing stronger by the month, it is not difficult to see that the events that followed might have taken a different turn.

Dr Joseph Smith had been alerted, probably by the FDA’s request in March 1983 to the American fractionators not to accept plasma from high-risk donors, to a need to discuss AIDS in order to advise the licensing authority. Whether it was this, or press reporting that was the cause, word appears to have got round that there was a possibility that the UK might move to suspend the importation of commercial concentrates, at least in the short term. On 4 May, the day of Professor Bloom’s letter, the Association of Scientific, Technical and Managerial Staffs whose members staffed the NHS blood product facilities in Elstree, Oxford and Edinburgh called for a ban on the importation of blood products because they derived from sources likely to be rife with AIDS.[409]

Five days after Professor Bloom’s letter, Dr Spence Galbraith, who was the director of the CDSC for England and Wales, wrote a paper entitled Action on AIDS. He sent it to the DHSS. He said: “I have reviewed the literature and come to the conclusion that all blood products made from blood donated in the USA after 1978[410] should be withdrawn from use until the risk of AIDS transmission by these products has been clarified.”[411]

This view – that the risk from importation of commercial factor product was so serious that it should be suspended – proved controversial. On the very same day as Dr Galbraith wrote to the DHSS, the co-ordinator of the Haemophilia Society[412] wrote to the Medical Advisory Panel of the Society to say that “a group of us will be meeting with Geoffrey Finsberg … [to seek, inter alia, an] assurance that there will be no immediate ban on the importation of US blood products.”[413]

A special meeting of UKHCDO reference centre directors was convened on 13 May 1983. Though it considered it was “circumspect” to use only NHS materials to treat young children and people with mild haemophilia, it minuted that there was “as yet, insufficient evidence to warrant restriction of the use of imported concentrates in other patients in view of the immense benefits of therapy. The situation shall be kept under constant review.”[414] Dr Diana Walford from the DHSS attended the meeting. In a memo of the same day she recorded that she regarded Dr Galbraith’s recommendation as “premature in relation to the evidence and unbalanced in that it does not take into account the risks to haemophiliacs of withdrawing a major source of their FVIII supplies”, and plainly preferred the conclusion of the UKHCDO directors.[415]

The background was this. Measures were being taken in the US to reduce the risk posed by the concentrates made there. The FDA had asked pharmaceutical companies in March 1983 to take steps to ensure that donors in risk groups (sometimes referred to as “the 4 Hs” – homosexual males, people from Haiti, intravenous drug users (the “H” stands for heroin addicts) and people with haemophilia)[416] avoid donation. It had asked those commercial companies to label their products to indicate whether they had been produced from plasma collected before these recommendations were implemented by the industry.[417] Secondly, also in March, the FDA had licensed Hyland Laboratories to market dry-heated concentrate.[418] Though the risk which the heating process had been intended to address was that of hepatitis, it was widely thought that if the cause of AIDS was indeed a virus, it too might be inactivated by the heat treatment. These two measures indicated that in the US there was not only known to be a risk, but it was a risk of such magnitude as to merit a regulatory response. By way of comment, the principle in play should have been to ask – was there a real risk of cause and effect? If yes, then can we be certain that that risk has been reduced to a level at which no further precautions are needed? The principle was not – can we be certain there is cause and effect, and is there a chance these precautions might lessen the chances of the effect?

The voices urging the avoidance of the risk from imported concentrates increased in volume and authority. The Council of Europe Committee of Experts on Blood Transfusion and Immunohaematology was due to advise the Council of Ministers who were meeting in June. Prior to its meeting in May, it sought details from each member state as to the number of AIDS cases there had been. The answer as at 28 April was that Spain reported three cases in people with haemophilia, Germany two, Austria one suspected case, and Finland one suspected retrospectively after the person’s death.[419] In May the Committee concluded that although “Absolute proof that AIDS is caused by a transmissible infectious agent is not yet available” nonetheless “the consensus in the Committee was that it should be regarded as such and that a recommendation should be made to the Council of Ministers at the meeting in June to take necessary steps to minimise the transmission of AIDS by the transfusion of blood products.” It advised avoiding the use of coagulation factors prepared from large plasma pools except when such a product was specifically indicated for medical reasons, and it advised informing physicians and “selected recipients, such as haemophiliacs” of the “potential health hazards of haemotherapy and the possibilities of minimising these risks”, together with pursuing the “rapid and full implementation” of recommendations made earlier urging a need to attain self-sufficiency in blood product production from voluntary non-remunerated donors.[420]

The directors of the Scottish National Blood Transfusion Service prepared a leaflet regarding the risk of AIDS, identifying people with haemophilia amongst the classes of those who could get AIDS:[421] in other words, accepting that there was a risk of transmission by blood products.

On 23 June 1983 the Council of Europe followed the recommendations of the Committee of Experts.[422]

By July, Transfusion International, a journal directed to transfusionists, observed in an editorial by an eminent physician that:

“There is relatively strong evidence indicating that [AIDS] may be transmitted by blood. In the United States, it is reported that eleven haemophiliacs have contracted AIDS, and additional haemophiliacs with AIDS have been observed in Europe. There is a suspicion that commercial Factor VIII concentrate, prepared from large pools of US plasma, has been the source of infection. Although there is no absolute proof that blood really does transmit AIDS infection, there is one case where the causal relationship is highly suggestive.” [The editorial went on to describe the case of the San Francisco baby.][423]

It was on 13 July that the question whether the UK should suspend the importation of foreign concentrates was to come for formal consideration, when the Biological Sub-Committee of the Committee on the Safety of Medicines met. In advance, Dr Leslie Keith Fowler of the Medicines Division prepared a paper for the Committee to consider. He dated interest in KS and PCP infections as having started “three or four years ago” (thus 1979-80), described AIDS as “clearly a transmissible condition”, but thought that “repeated exposure to allogeneic sperm together with cytomegalovirus (CMV) over a prolonged period impairs T cell mediated immunity, allowing reactivation of Epstein Barr virus” was the likeliest cause. He concluded that AIDS might be a “function of the concentrate itself rather than a specific agent”. This contribution shows that there was as yet no unanimity as to the cause of AIDS, though it has to be said that his particular theory was very much his own. It was undoubtedly heterodox. However, he also said, on firmer ground, that “The media concept of a ticking time bomb is a very real one for haemophiliacs and no effort should be spared to protect them by all practical means.”[424]

The Sub-Committee decided not to recommend that imports be suspended. The correctness of this decision is considered in the chapter on Regulation of Commercial Factor Concentrates: for the purposes of this chapter, however, the first conclusion it reached was: “The cause of AIDS is unknown, but an infectious aetiology seems likely. A previously unrecognised or new agent may be responsible, but repeated exposure to, or reactivation of, known agents, (eg CMV, EBV) may be involved.”[425]

Also in July 1983 Dr Michael Rodell, who was the vice president (regulatory and technical affairs) of Armour Pharmaceuticals and representing the Pharmaceutical Manufacturers Association, told a standing committee advising the US Secretary of Health and Human Services and the Commissioner of the FDA that on average persons who were paid for their plasma had it collected between 40 and 60 times per year.[426] At that rate, and given the pool sizes used for the manufacture of concentrates in the US, his presentation suggested that four infected persons could contaminate the entire world supply of Factor 8 concentrate.[427] It is clear that pharmaceutical companies regarded the risk that factor concentrates might transmit AIDS as real and substantial.[428]

From certainty that there was a risk that AIDS was transmitted by blood to certainty that it was transmitted: mid 1983 to early 1984

Well before the autumn of 1983 the frontrunner among the postulated causes of AIDS had been an infectious agent. The strength of evidence and opinion supporting this hypothesis kept on increasing thereafter. The isolation of a viral particle (then called “LAV”) by Dr Luc Montagnier and his team in Paris on 20 May 1983[429] did not attract significant attention at the time, but was later said to have made it clear scientifically that the cause of AIDS was indeed a virus,[430] though there remained some suggestions until early 1984 that injection of foreign proteins could be a causal or contributory factor. Though there were many who were alert to the publication on 20 May 1983 of his findings, there were few wider references to its importance.

In mid August 1983, the Medical Research Council produced a Brief on AIDS in which it said: “Over the last six months the possibility of transmission by blood transfusion has become increasingly apparent … An infectious agent has therefore been implicated which is sexually, parenterally and even perinatally transmitted with a long asymptomatic infectious incubation period … Transmission of AIDS by transfusion and blood products may become a serious problem.”[431]

Underscoring these last few words was the first known death in the UK of a person with haemophilia with AIDS. He died in August 1983, having been unwell since he received commercial concentrate some months earlier.[432] Nonetheless, when UKHCDO held its annual general meeting on 17 October, and despite the death, which was widely reported in the national press,[433] Professor Bloom felt able to respond to a suggestion that patients should revert to having cryoprecipitate by saying that “he felt that there was no need for patients to stop using the commercial concentrates because at present there was no proof that the commercial concentrates were the cause of AIDS.”[434]

By December 1983, an article in The British Medical Journal commented that there was no evidence that any factor product, whether made from purchased or voluntarily donated plasma, was free from the risk of transmitting AIDS.[435] By way of comment, if the question had been put in this form at the start of 1983 – “is there any evidence that any product is free of the risk of transmitting AIDS?” – the answer would have been the same. None. The question should have been asked in that form once it became clear that there was a real risk that concentrates might transmit the cause: it should have been asked before the end of 1982.

On 12 January 1984 The New England Journal of Medicine published a report of a study of 18 suspected cases of AIDS associated with transfusion. None of the recipients possessed any risk factor other than the receipt of blood components. The conclusion of The New England Journal of Medicine was that blood components could transmit AIDS, that exposure to one infected unit might result in transmission, and that symptomless donors could be infectious.[436] Dr Thomas Zuck, who was later to be the director of the blood and blood products division of the FDA, told the Krever Commission that the effect of the study was to “put the whole medical community and perhaps the world on notice that AIDS is transmitted by blood transfusions.”[437]

By March 1984 all blood products manufacturers in the US warned of the risk of contracting AIDS from using their products.[438]

One of the seminal dates was that of the identification of the virus which led to AIDS. Though Dr Montagnier had reported in May 1983 that he had isolated a viral particle in association with AIDS,[439] when Dr Robert Gallo in the US held a press conference on 23 April 1984, and published on 4 May, claiming he had found the virus which caused AIDS, terming it “HTLV-III”,[440] the scientific and medical community took notice. Indeed, there was an element of theatre about the announcement – the US Secretary of Health and Human Services came to a press conference at which Dr Gallo’s discovery was announced, and predicted that the test would be widely available within six months to screen blood to exclude the virus.[441] In fact, Dr Gallo had identified exactly that which Dr Montagnier had called LAV. However, from then on the overwhelming consensus was that the cause of AIDS was infection with HTLV-III (LAV), which was later renamed HIV.

Commentary

In summary, the risk that blood transfusion or use of factor concentrates could cause AIDS (and hepatitis, mainly non-A non-B Hepatitis) was apparent in mid 1982. All those involved in treating patients with blood or blood products either knew, or should have known, of the risk by the end of 1982. It became increasingly apparent as a serious risk until it came to be regarded as a near certainty in April 1984. This risk (that transfusion or factor concentrates could transmit the cause of AIDS) had to be addressed when it first became recognised as a real risk, by the end of 1982 or January 1983 at the latest; all the more so when it became a near certainty, and all the quicker since the context was known to be a developing epidemic.

This chapter provides an account of how knowledge of AIDS arose and developed. Subsequent chapters in this Report demonstrate the reluctance of clinicians, and because of that, of the Haemophilia Society, to accept that a particular style of preparation of a product which they had strongly favoured might after all be more deeply harmful than alternatives which they had largely discounted. Later chapters demonstrate, too, elementary failures of analysis – conflating risk with incidence, failing to recognise the significance of long incubation periods, and somehow being reassured by the idea that the disease was confined to the US, rather than seeing the fact it was epidemic in the US as a red flag giving advance warning of what might happen here. There was a failure to ask the right questions: given sufficient material to show there was a real risk, was there any evidence that the products concerned were free of it? There was a lack of any sense of urgency. There was insufficient regard to the experience, knowledge and wisdom developing elsewhere in the world among those supplied with US concentrates. And there was, over and above all, a search for certainty of cause-and-effect as if that were necessary before action should be taken.

On 3 May 1984, Professor Robin Weiss of University College London wrote in Nature that: “There no longer seems to be any doubt that AIDS is caused by an infectious agent”: the overload of foreign protein theory did not explain why single transfusions could transmit AIDS, nor how infants acquired infections from affected mothers. He commented that: “ELISA [enzyme-linked immunosorbent assay] screening tests for blood banks are urgently needed”. The word he used was “urgently”.[442]

This chapter, then, ends with an overdue call for urgency. How far that call for a screening test to be developed in the UK was answered is for a later chapter.[443]

3.9 Organisation of the Blood Services

This chapter reviews the structure and functions of the blood services as they developed throughout the UK and considers the impact of the way in which the blood services were organised.


Key dates

1940 SNBTS is set up.

September 1946 Ministry of Health creates National Blood Transfusion Service for England and Wales.

July 1948 With the establishment of the NHS, RTCs are managed by regional hospital boards instead of nationally; SNBTS becomes the responsibility of the Secretary of State for Scotland.

1970, 1973 Regional transfusion directors advocate a national and centrally financed and administered blood service.

1974 NHS reorganisation results in RTCs coming under the control of regional health authorities; the Common Services Agency takes over responsibility for SNBTS.

1975 Central Committee for NBTS created.

1980 Dr Tovey proposes a plan for the reorganisation of the NBTS under a central coordinating committee. DHSS establishes Advisory Committee on the NBTS.

1987 A report on the organisation of blood services recommends reform.

1988 National Directorate is set up to coordinate blood services in England and Wales. Despite name, has no executive control.

1991 Welsh Health Common Services Authority manages services in Wales.

1993 National Blood Authority is established.

1994 NIBTS is established.

1999 WBS is established.


People

Dr John Cash national medical director, Scotland (1979 - 1997)

Dr Harold Gunson director, NBTS (1988 - 1993)

Dr William d’A Maycock consultant transfusion adviser to the CMO (until 1978)

Dr Morris McClelland director, Northern Ireland Blood Transfusion Service (1980 - 1994)

Dr Tony Napier medical director, Welsh Regional Blood Transfusion Service (1977 - 1998)

Dr Angela Robinson medical director, National Blood Authority (1994 - 2007)

Dr Geoffrey Tovey consultant adviser to the CMO (1978 - 1981)


Abbreviations

NBA National Blood Authority

NBTS National Blood Transfusion Service

NIBTS Northern Ireland Blood Transfusion Service

RTC Regional Transfusion Centre

SNBTS Scottish National Blood Transfusion Service

WBS Welsh Blood Service


England and Wales

As explained in the chapter on Blood and Transfusion and repeated here simply for easy readability, the Army Blood Supply Depot was established by the War Office at the start of World War Two. It operated from Southmead in Bristol. As well as meeting the requirements of the armed forces it supplied civilian hospitals in South West England. Treating London civilians in the war was to be the work of four blood depots established in 1939. By 1940, regional blood depots had also been established close to large district hospitals in the major cities and these later became known as regional transfusion centres (“RTC”).[444] The Scottish National Blood Service began on 11 January 1940, funded independently of its later English counterparts.[445]

As for blood products, the Treasury War Emergency Committee decided in 1941 to finance two facilities in the UK for the preparation of freeze-dried human plasma. One of these was established in an underground site at the Royal Infirmary of Edinburgh. The other was set up at Cambridge, under the auspices of the Medical Research Council.

By 1943, in England and Wales, it was recognised that the blood supply system had reached such a scale that national management was required.[446] The National Blood Transfusion Service (“NBTS”) for England and Wales was thus created by the Ministry of Health on 26 September 1946. There were 11 regional transfusion centres within the Service, including what were by then 2 London centres.[447] This later became 14 regional transfusion centres. It was managed directly by the Ministry of Health for a year and three quarters, advised by Dr William d’A Maycock, and a committee (which had no executive power). However, following the establishment of the National Health Service (“NHS”) under legislation which provided for regionalised medical services, the Ministry of Health no longer exercised unitary national control. Instead, regional hospital boards exercised functions with respect to the administration of hospital and specialist services in their respective regions.[448] These included blood services. Twelve such regional hospital boards were established, and the management of regional transfusion centres was transferred from the Ministry of Health to these regional hospital boards. Each regional transfusion centre was managed by a regional transfusion director who was medically qualified and was appointed by and accountable to the regional hospital board.[449]

Having been run under central executive control for a brief period until 5 July 1948 as a unitary National Blood Service, it then became an amalgamation of autonomous regional services. The history of the next 46 years was one in which many concerned in collecting voluntary altruistic donations to supply blood and produce blood products sought to avoid the consequences that regional control brought in its wake. In particular, those who sought to facilitate a supply of blood products from the fractionation centres[450] argued, in increasingly strenuous terms, for a truly national system rather than one which was regionally funded, regionally directed, regionally controlled, and responsive in particular to the needs of the individual regions rather than to the nation as a whole.

Disquiet at the problems this fractured control created for a national service was recorded as early as 1954, with a sense that the service was being starved of the cash it needed to equip itself properly.[451] Nothing materially changed. By June 1961, the future of the transfusion service was the subject of discussion at a meeting of regional transfusion directors. Dr Neville Goodman, the Deputy Chief Medical Officer, chaired this discussion, and began by saying that he thought the NBTS “tended to be rather isolated from the rest of the N.H.S.”[452] The views expressed at the meeting were not unanimous, save in one respect in which several attendees raised a concern. That was that a national effort needed to be made to improve research and development. They thought that as a consequence of the regionalised arrangements this was being sadly neglected. No dissenting voice was recorded on this point.[453]

Divergences in practice remained between regional transfusion centres. Efforts to standardise the medical selection of donors and other functions were only ever partially successful (according to Dr Harold Gunson, later to be director of the NBTS, and there is no reason to doubt his conclusion in this regard) as there was no obligation on regional transfusion directors to adopt national policy, particularly where it came into conflict with regional priorities.[454]

In the early 1970s restructuring of the NHS was under active discussion. At a special meeting between regional transfusion directors and Department of Health and Social Security (“DHSS”) representatives, held in 1970 to consider a green paper on the future structure of the NHS, the directors advanced a case for the reintroduction of a national service. They complained that the existing structure of independent regional transfusion centres was leading to fragmented administration. The meeting included:

  1. a unanimous rejection by the directors of a proposal that the NBTS should be administered by regional health councils, since they would lack executive authority;
  2. a unanimous rejection of the suitability of area health boards to administer the NBTS, for their focus differed substantially from that of the respective regional transfusion centres; and
  3. a positive acceptance that there should be a centrally financed and administered blood service which would allow national planning, performance of specialised functions and improved efficiency. It was noted that “many difficulties had arisen from the fact that administration and financing of the service were the responsibility of 13 different authorities.”[455]

This was followed up by the directors in January 1973, when they presented a document to the Standing Medical Advisory Committee.[456] It began with the words:

“In the light of the proposed reorganisation of the National Health Service, Directors of the Regional Transfusion Centres and the two associated laboratories (Blood Group Reference Laboratory and Blood Products Laboratory) have considered the present and future structure of the National Blood Transfusion Service. They are unanimously against administration of the Centres by Regional Authorities. They strongly recommend that the National Blood Transfusion Service be reorganised round a unified system of central administration.”

In its detail the document complained that the proposal for regional control, contained in the draft statute about to be enacted, was not based on any expert opinion, but “purely in pursuit of the principle of devolution.”[457]

The National Health Service Reorganisation Act 1973[458] replaced regional hospital boards with regional health authorities. Regional transfusion centres remained under local management despite the views the regional transfusion directors had expressed. No national executive was established to control blood services policy or management.

Regional transfusion directors remained sceptical about the effectiveness of this structure. An ad hoc committee was set up to consider their concerns, and whether any change should be made in the organisation of the blood transfusion services in England and Wales. This committee, though recognising the force of many reasons for centralising administration of the national blood services, was unconvinced that responsibility for administration and providing services should be taken away from regional health authorities. Nonetheless, a degree of central coordination was thought desirable: and a Central Committee for the National Blood Transfusion Service was proposed.[459] This was created in 1975. It was to keep under review the operation of the blood service in England and Wales, including Blood Products Laboratory (“BPL”) and the Blood Group Reference Laboratory, and to advise the DHSS on the development of the blood service. This committee first met in June 1975.[460]

In April 1974, guidance was sought by the regional transfusion directors from the DHSS on the relationship between the regional transfusion centres and regional health authorities: such guidance was generally lacking.[461] By October 1974 they were still waiting. While they waited, the position was of such concern that it merited an editorial in The British Medical Journal in July 1974.[462]

The background to this editorial was that some had recently begun to argue that a voluntary system of donations did not, and could not, produce enough plasma to produce sufficient blood products domestically to satisfy the needs of patients. Hence, they argued, the gap needed to be filled by imports of blood products manufactured elsewhere. The imports were from the US, though the editorial claimed that paid donors in the US were not the only sources, and that the professional donors came “not from Leeds, Liverpool, or London, but from people in such countries as Puerto Rico, Chile, and Columbia.”[463]

In the eyes of The British Medical Journal, the reason for insufficiency of plasma was not that the public were to blame for not being sufficiently forthcoming in donating, as this argument appeared to suggest. Rather, the editorial suggested that the “shortage” of blood (allowing entry to the UK of the products made by commercial pharmaceutical companies to remedy that shortage) was not a real one but was to be ascribed to:

“the quality of management (or lack of it) which has led to a steady decline in the British Blood Transfusion Service since the late 1950s. There has been no effective national planning; the regional and protein fractionation centres now lack sufficient staff, accommodation, equipment and the basic organisational units to do the job. Moreover, the medical staff in the centres are often geographically and administratively isolated from the care of patients. The remedy, then, is … an urgent appraisal (for the first time) of a national policy for the procurement and eventual distribution of a natural resource which, unlike oil, will be still readily available in 100 years’ time.”[464]

The view of the author of the editorial appears to have been shared by Dr Maycock, whose view for many years had been that “An efficient modern blood transfusion service does not really fit at all into the pattern of the NHS.”[465] It was also and importantly shared by the then Minister of State for Health, Dr David Owen. In a parliamentary written answer within a week of the publication of the editorial he recorded that the production of Factor 8 in the NHS had increased significantly in the last few years, but that “Further increases will depend on the extent to which regional health authorities are able to expand facilities in transfusion centres for the production of plasma, from which factor VIII is derived.”[466] In short, he did not blame the public for being unwilling to come forward to donate: rather, he saw the solution in regions providing the necessary finance to expand the capabilities of regional transfusion centres in England and Wales to process their donations. It lay in the hands of the regional health authorities to enable BPL and Plasma Fractionation Laboratory (“PFL”) to work effectively in the national interest, by supplying them with greater amounts of plasma.[467]

Dr Owen was right, the author of the editorial was right, and the majority[468] of regional transfusion directors were right, in thinking that it would not have been difficult to persuade people to come forward to donate in greater numbers. Some 4-5% of the general population donated repeatedly. Subsequent evidence shows that if a real need was perceived, as during the Gulf War, this proportion would rise markedly; and almost every transfusion director who was asked about this in oral evidence confirmed that they would have expected little difficulty in increasing the number of donations if they had been required to do so.[469] Insofar as there was a brake on this, it came from the limited availability of finance in some regions, and from a need for capital investment in premises, plant and equipment, and for higher staffing levels, which often went unmet. These are points expressed as conclusions here: they are considered more fully below.

Proposals for reorganisation to improve the national service by the provision of central finance, and management through a committee with executive powers, remained an active concern. In 1977 the NBTS submitted to the Royal Commission on the NHS that the NBTS had assumed an increasingly national role which had suffered from constraints arising from regional development, inadequate central coordination and financing, and a poor integration of the activities of regional transfusion centres.[470] The current structure of the transfusion service was described as a “loose confederation of 14 Regional Transfusion Centres, independently financed, each providing services which vary considerably from Region to Region, and three central laboratories financed by the DHSS.”[471] The Royal Commission noted that blood transfusion was one of the services provided by regional health authorities but “common services like these do not require a regional tier.”[472]

The structure of the blood services across the UK was linked to the supply to BPL in England (and Protein Fractionation Centre (“PFC”) in Scotland, though Scottish National Blood Transfusion Service (“SNBTS”) was and is distinct from NBTS) of sufficient plasma to enable them to utilise production facilities to the full. At the same time as these oft-repeated concerns were being expressed about the difficulties caused by the fact that transfusion centres in England and Wales were administered and funded locally (regional health authorities had little incentive to provide plasma for fractionation if those health authorities were not themselves going to see a benefit for patients within their own region), BPL itself was suffering from an extended period of underinvestment. There was an accompanying lack of funds and incentive for research to be conducted into producing safer products.[473]

Until 1978 the same person was in a position of central influence in respect of both the NBTS, and the BPL (then operated by the Lister Institute) at Elstree: Dr Maycock. He was both consultant advisor on blood transfusion to the Chief Medical Officer (“CMO”), and in that capacity had some oversight[474] of the NBTS, and the director of BPL.[475] His influence was wider still: he was an honorary consultant in blood transfusion and resuscitation to the War Office, later the Ministry of Defence; had been president of the British Society for Haematology; and was a recognised teacher of pathology in the University of London.[476] He was the dominant figure in the field of blood transfusion. His standing astride both the sourcing and supply and manufacturing bodies ensured a degree of coordination between BPL and NBTS, both of which could in addition potentially benefit from the influence he could bring to bear.

Dr Maycock retired in 1978. When that happened, his roles passed into different hands. Dr Geoffrey Tovey succeeded him as consultant advisor to the CMO; Dr Richard Lane replaced him as the director of BPL. This coincided with the Lister Institute running out of the funds necessary to continue operations at Elstree (known as “BPL”, “Lister” or “Elstree”) under its own auspices. The facilities at Elstree, sadly in need of upgrading given a lack of recent investment, and which had never been designed to provide accommodation for large-scale fractionation of blood products,[477] were now to be funded jointly by the DHSS and North West Thames Regional Health Authority.[478] Though production facilities in England had never formally been part of the same service as the NBTS, and the making of blood products had always been funded separately since 1948, they might have been seen as two aspects working together under one NHS to obtain and supply blood and blood products to those in need of them. They were now not only funded differently, but very clearly managed separately, though still both within the NHS. Though there is little evidence of direct conflict between BPL (producing blood products) and the NBTS (obtaining fresh blood and plasma from donations, supplying hospitals with blood, plasma and blood derivatives, and supplying BPL with plasma),[479] this separation cannot have helped the formulation of a unitary response to these challenges.

Separation had a wider impact, too, in respect of the UK as a whole. Even whilst he was only director designate of BPL and not yet in post, Dr Lane began expressing his view that England (together with Wales) should look after itself, and BPL should be upgraded and expanded to achieve this. He opposed what had been Dr Maycock’s and the government’s previously favoured approach, with which Dr (later Professor) John Cash (Dr Maycock’s equivalent in Scotland) had enthusiastically agreed, which was to develop two production units of roughly equal capacities, one in England and the other in Scotland, to meet the needs of the UK as a whole. Though PFC in Scotland had been provided with new buildings on a new site in Edinburgh, and was designed to operate there on a 24-hour continuous process on weekdays, with a view to processing plasma harvested from the four most northerly regions in England as well as the Scottish regions,[480] Dr Lane’s voice was persuasive in ensuring the processing of northern plasma to make factor concentrates in Scotland never happened.[481] Further detail of this is given in the chapter on Self-Sufficiency.

In February 1980 Dr Tovey took up the cudgels which regional transfusion directors had previously wielded when he proposed a plan for the reorganisation of the NBTS. He complained that increased national demand had only highlighted the “constraints arising from regional development, inadequate central co-ordination and financing and poor integration of activities of Regional Transfusion Centres.”[482]

He was clear that the major defects were unlikely to be overcome without an Act of Parliament creating a committee or board with executive powers. As an “interim” measure “and as a matter of urgency”[483] he suggested that there should be a central coordinating committee for the NBTS including the chairman of the joint management committee for the central laboratories of the NBTS (which had been established shortly beforehand), the consultant advisor in blood transfusion, three regional transfusion directors, directors of the central laboratories, a regional medical officer, a regional treasurer, a representative of the joint committee, and representation from the DHSS. He suggested the committee formulate and coordinate national policy, and through its membership achieve implementation of that policy at central and regional levels. The three regional transfusion director members were expected to be the chairs of the northern, eastern, and western divisions which Dr Tovey had established within the service.

It was also apparent from what he wrote that he saw a need for closer liaison between the NBTS and the SNBTS, extending beyond effective coordination between BPL and PFC at Liberton. He commented: “My colleagues and I have agreed full national co-ordination will not be accomplished and a maximally cost-effective Service obtained until the NBTS and the Scottish NBTS are welded into a truly national Blood Transfusion Service, embracing all four corners of the United Kingdom.”[484]

The DHSS adopted his proposal (though a truly national, in the sense of “UK-wide”, service was not for it to establish since the Advisory Committee spoke for England and Wales alone) and established a new Advisory Committee on the NBTS.[485] It was, however, to be chaired by a DHSS representative. Its terms of reference were “To advise the DHSS and the Welsh Office on the co-ordination of (i) the development and work of Regional Transfusion Centres, and the Central Blood Laboratories in England and Wales; (ii) as necessary – the English and Welsh Blood Transfusion Service with that of Scotland.”[486]

A number of national[487] policies were pursued by the Advisory Committee. However, difficulties in achieving national standardisation persisted, and Dr Gunson (who succeeded Dr Tovey as consultant adviser in 1981) was later to highlight that the inconsistent and inadequate supply of plasma to BPL, and the difficulties in implementing HIV testing were examples of problems caused by a lack of an effective national policy for decision-making.[488]

In 1982, a special health authority (the Central Blood Laboratories Authority: “CBLA”)[489] was established to manage the central laboratories. Its remit included BPL at Elstree, the Blood Group Reference Laboratory, and PFL in Oxford. It did not include PFC in Scotland. In the same year the blood transfusion research committee of the Medical Research Council, which had met since 1939, was disbanded. This severed the last formal link between the national blood service and the Medical Research Council, which had once been responsible, in large part, for much of the transfusion service.[490]

By the start of 1983, therefore, the UK service was still one in which there were a number of autonomous parts: the regional transfusion centres, organised in England and Wales into three districts but each remaining under the financial and administrative control of its respective region, and the services in Scotland[491] and Northern Ireland. The Advisory Committee on the NBTS was advisory only, and remained unable to exercise executive control over the service as a whole.

The directors of all the regional transfusion centres in England and Wales repeatedly noted that if the supply of plasma to BPL was to be sufficient to support the goal of self-sufficiency, national management was required. The DHSS was aware too that research and development were duplicated unnecessarily across regional transfusion centres, BPL, and PFC due to a lack of central coordination; and that successful introduction of heat treatment to inactivate viruses in blood products would benefit from such central coordination.[492] By 1986 it was agreed that the organisation of the NBTS should be investigated by DHSS Central Management Services. A report followed in October 1987, which reported “wide variations in policy, procedures and functions between the regional transfusion centres with no real evaluation of performance and little effective co-ordination either between the regional centres or between the centres and CBLA.”[493] There were four major problems: the absence of useful and reliable management information; the inability of London regions to meet the needs of their hospitals; the lack of coordination between RTCs and between RTCs and the CBLA; and “Inefficiency”.[494] It identified three alternatives: (a) maintain the present system with the introduction of reliable management information; (b) create a new special health authority to manage the system centrally; or (c) retain the regional health authorities’ management of RTCs, but with formal national coordination of their work.[495] The DHSS favoured the third option. Accordingly, a National Directorate, intended formally to co-ordinate the blood transfusion services[496] in England and Wales, was set up as from 28 July 1988, funded by the Department of Health.

The National Directorate (despite its title) still lacked executive authority, which remained with the regional transfusion centres and their regional health authorities. Dr Gunson, the first national director, described it as operating by “persuasion” rather than executive power. Any policy proposal which involved the use of additional resources by regional transfusion centres was liable to create difficulty, since their budgets remained controlled by regional health authorities. He reported that the National Directorate did achieve some things: the interregional transfer of blood; the establishment of a management information system; quality assurance at RTCs, supported by audit; and “Improved blood donor recruitment and retention”. Nonetheless, he said the National Directorate was “overtaken by national events.”[497]

The NHS and Community Care Act 1990 had the effect that RTCs had to handle their operating costs through reimbursement for products and services. This inevitably led to RTCs working closely with the hospitals receiving their products and services. The National Directorate was marginalised: “the lack of executive authority became crucial.”[498] In June 1990 this led Dr Gunson to propose a fully national service with central executive management.[499] He argued that this would increase the efficiency, supply and quality of blood products. At the same time, Professor Cash, who headed the SNBTS, continued to call for more effective integration of the blood transfusion services on both sides of the border, as well as further finance for research and development in blood products from voluntary British donors. Though the Government had indicated that it respected clinical freedom, Professor Cash thought that such an approach frustrated efforts towards self-sufficiency.[500]

Individual directors at regional transfusion centres, including Drs Marcela Contreras, Fereydoun Ala and Ian Fraser wrote in support of Professor Cash, recording their concerns about the limited authority of the National Directorate, which did not have the power to formulate national policy in the light of the independent management of the RTCs.[501]

In 1991 Ernst and Young conducted a structural review of the NBTS. Their report favoured a central contracting authority (unifying the National Directorate and the CBLA) rather than a centrally managed service with direct line management. The latter was more costly.[502]

A special health authority as a “contracting authority” was therefore proposed. It was to be a central NBTS body which would agree operating contracts with regional transfusion centres as a means of managing the national service (a “contract rather than persuasion” model), which left regional transfusion centres with local autonomy. As discussions proceeded over the next 18 months it became apparent that it was impractical for the new National Blood Authority (“NBA”) to contract directly with hospitals. Instead, the conclusion was that it should become a central strategic management authority, which should directly manage both BPL and the regional transfusion centres along with the Blood Group Reference Laboratory.[503] From 1 April 1993 the Department of Health finally established a single body with executive authority along these lines, the National Blood Authority.[504] It was established as a special health authority.

Thus central control was finally restored 46 years after the last date on which the blood transfusion system, and the production of blood products, had been under unitary national control in England, a period during which there had been repeated calls for its reintroduction.[505] In those 46 years it is probable that the way in which the service was diversely organised and regionally funded, with an inbuilt reluctance to fund and direct centrally, created unnecessary shortfalls in the supply of plasma to, and in the productive capacity of, BPL, and rendered the achievement of self-sufficiency in the supply of coagulation concentrates much more difficult than it needed to have been. The service also failed to concentrate funds spent on research, so that research might have been better equipped and resourced: funding for research was provided regionally, and was thus both spread more thinly and liable to duplicate, rather than supplement, effort in other regions.

As the history set out above shows, warnings to this effect were not heeded until too late.

Coordination between England and Wales, on the one hand, and Scotland, on the other, such that there could be one unitary service was never achieved, though at an individual level there was often significant cooperation. Dr Lane’s treating England and Wales as a single unit, which was influential in the abandonment of the idea of providing for the UK as a whole, meant that PFC was not developed as it might have been to ensure sufficient supplies of product to the North of the UK and Northern Ireland while BPL concentrated on Wales, the South and the Midlands of England.

From the formation of the National Blood Authority to the present day

In May 1994 Dr Angela Robinson replaced Dr Gunson as national director of the NBA. Following consultation, the NBA created three administrative zones (London and the South East; Midlands and the South West; and Northern, administered respectively from North London, Bristol and Leeds).

In 2005 the NBA was replaced by NHS Blood and Transplant (“NHSBT”), a special health authority. The functions of both the NBA and NHSBT were collecting, screening, processing and supplying blood, blood products and plasma for the purposes of the health services. NHSBT was also to supply stem cells and other tissues, and to facilitate and secure the provision of services to assist tissue and organ transplantation.

On 1 January 2011, BPL was transferred from NHSBT to become a Department of Health owned limited company: Bioproducts Laboratory Ltd (“BPLL”). This was technically a wholly owned subsidiary of Plasma Resources UK Ltd (“PRUK”), 100% owned and managed by the Department of Health.

In 2013 private sector investment in PRUK was permitted and on 18 July that year Bain purchased an 80% stake. BPLL was thus no longer under full public control. Full privatisation was achieved in 2017, when BPL’s current owners, Creat Group Corporation, acquired Bio Products Laboratory Holdings Ltd (of which BPL Ltd is a wholly owned subsidiary) from Bain and the Department of Health; BPLL is now a privately owned company.[506]

Scotland

The origins of the Scottish blood transfusion service echoed the way in which it began in England. A single man – in this case Mr Copland, a dentist – inspired a “group of walking donors” to give transfusions of their blood to the Royal Infirmary in Edinburgh. In 1936, the Lord Provost of Edinburgh formed a committee to take over responsibility for the service. By the outbreak of War on 3 September 1939, small blood banks had been set up in Edinburgh and Glasgow.[507] These individual initiatives led to the setting up of the SNBTS on 11 January 1940. The SNBTS consisted of five regional blood transfusion centres: Edinburgh and South East Scotland; Glasgow and West of Scotland; Dundee and East Scotland; Aberdeen and North East Scotland; Inverness and North Scotland. It was managed by a charity, the Scottish National Blood Transfusion Association (“SNBTA”). There was a national organiser, initially Mr Copland.[508]

Following the creation of the NHS in 1948 the blood service became the responsibility of the Secretary of State for Scotland. The Secretary of State took over the premises and equipment of the SNBTA, which remained an independent charitable body with an agreement with the Secretary of State underpinning its work to provide the service.[509]

By 1953 the unit set up in 1943 to produce dried plasma in support of the war effort had been expanded to handle plasma fractionation.[510] It operated in premises on the site of the Royal Infirmary in central Edinburgh. In the early 1970s it moved to a new building constructed to house it at Liberton, on the south side of the city, no longer sharing a building with the Royal Infirmary, and was named the Protein Fractionation Centre (“PFC”). As the name suggests, it was specifically designed for the fractionation of plasma, by contrast with its English equivalent.[511]

The agreement between the SNBTA and the Secretary of State formally ceased on 31 March 1974. In 1974, too, PFC was completed.[512] The Common Services Agency (“CSA”)[513] took over responsibility for the service, which thereafter was formally known as the SNBTS.[514] The CSA was overseen by the Scottish Home and Health Department (“SHHD”) and later its successors: until devolution in 1999 the CSA was administered by the Scottish Office; since then it has been answerable to the Scottish Parliament.

Administration by the CSA, which had responsibility for a range of services other than blood supplies, led to complaints that its management committee no longer had, or had available to it, independent specialist advice about blood transfusion. Directors of the five RTCs continued to press their point, and in April 1978 this led to the establishment of a sub-committee of the CSA to be known as the Blood Transfusion Service Sub-Committee. This included specialists in clinical medicine, laboratory medicine, a medical officer from SHHD, a representative of donor interests, and the national medical director of the service who had the right to attend or be represented at each meeting.[515] Following this, in 1978, Professor Cash was appointed as the national medical director. A national headquarters research laboratory, national reagents unit, and national quality unit were also created around this time.

Although there was no formal liaison committee between SNBTS and NBTS until January 1989, SNBTS and the Belfast regional transfusion directors participated in national meetings of the regional transfusion directors in England and Wales and, reciprocally, a regional transfusion director from south of the border, or a representative of the NBTS National Directorate once that had been formed, attended some meetings of SNBTS directors.[516] The Advisory Committee on the NBTS dealt with matters concerning England and Wales but the SHHD, Welsh Office and DHSS Northern Ireland were represented by observers (as well as, in the case of Scotland, by Professor Cash) who would participate fully in its discussions.[517]

In 1990 the position of general manager was created – renamed national director in 1996 – and the national medical director (Professor Cash) became the national medical scientific director. The SNBTS management board included both the general manager/national director and the national medical scientific director, and the five regional transfusion directors amongst others. It was restructured at the start of the millennium, to move away from the previous regional structure toward a national functional structure. Since then, all blood donor services have been managed nationally, whereas before they were managed regionally. Regional transfusion directors thereafter became known as clinical directors; a national quality directorate was formed, a director of operations was created to manage donor services, manufacturing and logistics, and the number of blood processing and testing units was reduced to two.[518]

The governance arrangements through the CSA were strengthened in 2003, when the SNBTS director became an executive director of the CSA board. It remained and remains responsible for the provision of supplies of human blood for transfusion and related services, but in 2008 the production of blood fractions was removed from the functions of the CSA.[519] Wider organisational change occurred in 2012/13, in which divisions of the CSA[520] were consolidated into strategic business units, and support services were centralised. SNBTS was considered of sufficient size and specialty to retain its own identity; in 2013 its “board” was renamed the Senior Management Group, chaired by the SNBTS national director, and the Medical and Scientific Committee was renamed the Clinical Governance and Safety Group, and was chaired by the SNBTS medical director. SNBTS was thereafter organised in a number of directorates: donor and transplant services; blood manufacturing, tissues, cells and advanced therapeutics; patient services, quality assurance and advanced regulatory compliance; strategy, planning and performance.[521]

There have been no further significant changes to structure since then.[522]

Throughout the relevant period all funds have been provided centrally, rather than from regional budgets as was the case in England and Wales.[523]

Northern Ireland

With the start of the National Health Service, the Northern Ireland Blood Transfusion Service (“NIBTS”) became the responsibility of the Northern Ireland Hospitals’ Authority. In 1953 a new headquarters was established in Belfast.[524] In many ways, it was ahead of its time: in 1953 it started to use a mobile donation unit.

The blood transfusion laboratories were at Royal Victoria Hospital. In 1961 they moved to the Belfast City Hospital.

In 1973, when the NHS was reorganised, it merged in Northern Ireland with the broader social care system. It became known as the Health and Personal Social Services, and later the Health and Social Care (“HSC”) system. At that time, and indeed from 1972 until 1999, the system was managed by the UK government through the Northern Ireland Office: public and social policy decisions were taken at Westminster, and communicated through the Secretary of State at the Northern Ireland Office, who answered directly to the UK government. In general, policy and strategy in health and social services developed in Northern Ireland to mirror English policy decisions.[525]

On 1 June 1994 the Northern Ireland Blood Transfusion Service (Special Agency) was established as a special health and social care agency.[526] Its functions were to ensure that all hospitals and other clinical units in Northern Ireland were provided with adequate supplies of blood and blood products and that those “comply with all current national standards of safety and efficacy”.[527] It was to assess and anticipate the needs for blood and blood products in Northern Ireland, recruit and maintain adequate numbers of healthy, voluntary, non-remunerated donors, ensure the health and safety of donors during contact with the NIBTS, and to provide counselling to those found to have abnormalities during routine screening. It was also to provide “an education and advisory service on the utilisation of blood and blood products by clinicians.”[528] A purpose-built facility opened at the Belfast City Hospital site in 1995. The NIBTS moved there, and it remains its headquarters.[529]

The director of the service between 1969 and 1980 was Colonel Field, followed from June 1980 to May 1994 by Dr Morris McClelland.[530] From June 1994, when NIBTS was created, he became known as the chief executive and medical director. He stepped down in 2009.[531]

Until the early 1980s no plasma was sent from Northern Ireland to the mainland of the UK for fractionation. However, Northern Ireland did receive some blood products manufactured by BPL, as well as purchasing commercial blood products. In the early 1980s it began sending plasma to Scotland for fractionation at PFC.[532]

Wales

In 1946 the NBTS was formally established in England and Wales. The first blood banking service in Wales opened in the same year, in Newport Road, Cardiff. At that time it was under the supervision of the Emergency Medical Service.[533]

In 1948 the Welsh Hospital Board was formed as one of the regional hospital boards established under the National Health Service Act 1946. The board, known up to 1961 as the Welsh [Regional] Hospital Board, was responsible to the Welsh Board of Health. It had oversight of the South Wales RTC, and the Liverpool Regional Hospital Board managed the supplies for North Wales through the Mersey RTC. The Cardiff regional transfusion director initially reported to the Welsh Hospital Board and then, from 1974 when regional health authorities were established in England, to the Welsh Office.[534]

Essentially this arrangement continued until 1991 when the management passed to the Welsh Health Common Services Authority (“WHCSA”).[535] When the National Blood Authority was set up in England, NBTS (Wales) did not become part of it but remained under the control of the Welsh Office; management remained with WHCSA. A new purpose-built building for the South Wales RTC was completed in 1996, and the service moved from Rhydlafar to the new site at Talbot Green, Pontyclun, in 1997 where it remains.[536]

With devolution and the establishment of the Welsh Government, the Welsh Blood Service (“WBS”) was created in 1999, with the director reporting to the Velindre NHS Trust.[537] It was not until 2016 that management responsibility for the blood service in North Wales moved to the WBS. Until then, blood service policies for South and Mid Wales were agreed and funded by the Velindre Trust and the Welsh Government and for North Wales by the NBA and then from 2005 NHSBT.[538]

For most of the time with which the Inquiry is centrally concerned, Dr Tony Napier was director of the RTC in Cardiff. He commented on the disadvantages of the system of regional autonomy (which in practical terms affected South Wales):

“They [the South Glamorgan Health Authority] had their own problems in funding services, but it was always a struggle … If the big issues that we’re concerned with are … to be self-sufficient in terms of England and Wales … then the accountability discussions and the funding discussions should have been taking place centrally with whatever authority was set up to do that … if you were to have a sort of analogy, in World War II times, you would never expect enough Spitfires to be made if the responsibility for doing that was … handed out to different parts of the United Kingdom. It has to be an efficient, centrally managed operation. And the arrangements of accountability being held locally, the sort of accountability discussions didn’t … greatly concentrate on the issues that were important for the service itself: safety, self-sufficiency and suchlike.”[539]

His perspective on BPL, and the ability of the service to deliver self-sufficiency, is also worth recounting:

“Q. … in terms of the obstacles to achieving self-sufficiency, do you have any perspective on the relative significance of, on the one hand, the ability to supply enough plasma and, on the other, the ability of BPL to fractionate? …

A. … I think it could have been run better. I was never completely certain how much the limitations in capacity at BPL played a factor in this. I think we got different signals at different times. Sometimes it appeared that they were our target to support self-sufficiency and at other times we seemed to get messages that there were operational problems which might constrain that.”[540]

Commentary

The way in which the transfusion services and fractionation plants[541] were organised played a part in the failure of the UK as a whole to achieve the level of safety in blood and blood products it could have done.

Thus the problems of a lack of coordination, highlighted by Dr Napier’s “Spitfire” analogy, and the lack of any one person or body exercising executive authority contributed to the failure of the UK as a whole to achieve the goal of self-sufficiency in blood products to which many had aspired even before Dr Owen made it official government policy. A lack of coordination between England (and Wales and Northern Ireland) and Scotland from the early to mid 1970s onwards added to this.[542] Part of the reasoning which led Dr Lane to persuade others that English plasma should not be sent to Scotland for fractionation, even if it came back to the supplying regions as manufactured blood products, was that the regions had to pay out of their own funding for this. If the services had been operated as initially envisaged in wartime[543] they would have been run and financed to serve the interests of the UK as a whole.

As for NBTS in England and Wales, supplies of plasma for fractionation could not be mandated centrally, and the willingness of regions to use their resources to provide it to BPL was dependent not only on the good will but also the resources available to the region at the time, balanced against other pressing health needs arising locally. The Royal Commission on the National Health Service reported in 1979 that common services such as blood transfusion did not require a regional tier, and pointed to the Common Services Agency in Scotland (with similar arrangements for Wales and Northern Ireland) which provided services to the health boards “without exercising the monitoring role of RHAs in England”.[544]

The separation of a controlling influence over both the transfusion and production parts of the service which occurred when Dr Maycock retired, to be replaced by Drs Lane and Tovey, exacerbated the problems caused by regional autonomy in provision.

The lack of centralised funding left research the poor orphan of the service so far as resources were concerned.

As the following chapters will explain, it also contributed to a failure to achieve as much plasmapheresis as would probably have resulted had there been one national point of executive control, in turn leading to less plasma for fractionation, and a greater waste of blood donations.[545] It similarly made it more difficult to mount an effective coordinated national approach to achieving a greater use of packed red blood cells, and to introduce measures to help minimise the use of transfusions where possible. When it came to a need to pass information on to donors (through measures such as the donor leaflets produced in response to the HIV epidemic), to screen donors consistently, and to facilitate the screening of donations, a nationally coordinated response would likely have achieved more than was done: such a response was difficult to achieve under the system as it stood.

3.10 Regional Transfusion Centres

This chapter describes the way in which RTCs were organised, their facilities and funding and their role in the purchase and distribution of blood products. It also describes the components which they produced, their approach to donor recruitment, arrangements for the supply of plasma to BPL and steps taken to obtain more plasma.


Key dates

February 1972 report by Professor Cash “The Principles of Effective and Safe Transfusion” promotes use of red cell concentrates to help achieve self-sufficiency.

April 1981 “pro rata system” introduced: amount of concentrate returned to an RTC now reflects the amount of plasma provided to BPL by that RTC.

Autumn 1982 SAG-M available to make red cell concentrate easier to administer: reducing demand for whole blood, increasing blood available for plasma fractionation.

1982 Belfast RTC starts sending plasma to PFC for fractionation.

April 1989 cross-charging system introduced: RTCs sell plasma to BPL and buy back factor products.

1989 National Provision of Donors Committee established.


People

Dr Tony Napier medical director, Welsh Regional Blood Transfusion Service

Professor John Cash medical director, SNBTS (1979-1997)

Dr Harold Gunson Manchester RTD (1980-1988), Director of National Directorate for the NBTS (1988-1993)

Dr Morris McClelland Northern Ireland RTD (1980-1994)


Abbreviations

BPL Blood Products Laboratory

RHA Regional Health Authority

RMO Regional Medical Officer

RTC Regional Transfusion Centre

RTD Regional Transfusion Director

SAG-M mixture of saline, adenine, glucose and mannitol


Regional transfusion centres and their functions

England and Wales

The main functions of the 14 regional transfusion centres (“RTCs”),[546] each of which had its own regional transfusion director, were summarised in a 1987 NHS management consultancy study:

“RTC Functions

    2.13 There is no comprehensive and common definition of what the functions of an RTC are or should be.[547] There is a range of ‘core’ functions common to all centres namely:

      2.13.1 collecting blood;

      2.13.2 testing the blood collected;[548]

      2.13.3 separating blood into components and freezing harvested plasma for transmission to BPL [Blood Products Laboratory] for fractionation;

      2.13.4 issuing blood and products to hospitals;[549]

      2.13.5 providing a reference service to hospitals on grouping problems;

      2.13.6 providing a source of medical advice on transfusion and product related problems.[550]

Each RTC, as the name suggests, served its own region. Each developed in accordance with (a) the services required by the hospitals they served,[551] and (b) the adequacy of the funding provided by the regional health authority (“RHA”).[552] Many of the RTCs were in old, cramped, inadequate buildings, which curtailed what they could do.[553] Some were inadequately staffed.[554] This meant that only some of the RTCs carried out functions such as cross-matching and blood-grouping, acting as a regional reference centre for the hospitals in the event of transfusion problems[555] and autologous transfusion.[556]

RTCs were funded by, and thus accountable to, their RHAs,[557] with medical and scientific oversight from the regional medical officer (“RMO”)[558] or the regional medical director. The relationship between the RTC and the RHA and RMO varied according to the region. Some RTCs had very little to do with their RHA whereas some had close working relationships. In addition to oversight from their RHAs, RTCs were also subject to inspections from the Medicines Inspectorate. During the 1970s these were informal inspections; in a letter dated 18 May 1981, Dr Diana Walford wrote that “the Medicines Inspectors will be starting their formal inspections of Regional Transfusion Centres later this year.”[559]

In Wales, where the South Glamorgan Health Authority was responsible for the Cardiff RTC after 1974, Dr Tony Napier, who was the regional transfusion director, described “reasonably harmonious and efficient arrangements”.[560]

A common arrangement was one where hospitals within the region had their own blood banks, which were supplied by the RTC. This could lead to a disconnect between those using the blood (at the hospital) and those who were concerned with securing future blood supplies (the RTC), which may in part explain the sluggish change of practice from the over-enthusiastic use of whole blood for transfusion to both the more general use of concentrated red blood cells and a more sparing approach to giving transfusions.

So far as Wales was concerned, the Welsh Hospital Board was responsible for the Cardiff RTC from 1948 until 1974, when the Board’s functions were assumed by the Secretary of State for Wales who established area health authorities, including the South Glamorgan Health Authority. The regional transfusion director reported to the Welsh Office until 1982 when this passed to the South Glamorgan Health Authority. In his evidence, Dr Napier said “My personal medical accountability was to the Chief Medical Officer … in Wales but, in terms of the operational logistics of the service, it was under the care of South Glamorgan Health Authority.”[561]

Scotland

There were five RTCs in Scotland.[562] They were part of what became the Scottish National Blood Transfusion Service (“SNBTS”), run by its medical director Professor John Cash from 1979,[563] which had its own headquarters.

One of the main differences between the RTCs in Scotland and the RTCs in England and Wales was that, as well as the functions of an RTC, four out of the five Scottish RTCs (all except the West of Scotland) carried out blood banking for the hospitals in which they were based. This meant that “if a doctor required blood for patients, the request would go to the transfusion service … The transfusion service blood bank laboratory would carry out the compatibility tests to ensure that they provided blood that was matched to that patient’s blood type, et cetera, and they would then, depending on the details of the request, they would either hold it for the patient, or they would send it to the hospital destination where it was to be transfused.”[564]

All five of the RTCs (and the National Headquarters) were centrally funded. Despite this, Professor Cash did not have any power to compel the RTCs to adopt any particular policy and so any decisions that were to be implemented nationally had to be reached by consensus.[565] It follows from this that the Scottish RTCs did not all provide the same services as each other. The RTCs, like their English counterparts, suffered from poor facilities[566] and inadequate staffing which in some instances limited what they could offer.[567]

The Common Services Agency and the health boards took the view that the RTCs were not entitled to claim Crown Immunity, which, had they been, would have meant that they need not be inspected and would not require licences.[568] Accordingly the Scottish RTCs subjected themselves to formal inspections by the Medicines Inspectors in order to obtain manufacturing licences. Inspections started in January 1980 and all centres had been inspected by March 1982.[569]

Northern Ireland

There was only one transfusion centre in Northern Ireland, based in Belfast. The Eastern Health and Social Services Board (“EHSSB”) held the budget for it and provided some management support. Thus while the EHSSB was not involved in the day-to-day running of the RTC, if the RTC wanted (for example) to recruit someone, it had to seek authorisation from the EHSSB.

The first external inspection of the RTC in Belfast was a 1981 visit by the Protein Fractionation Centre (“PFC”).[570] The first inspection by the Medicines Inspectorate of the RTC was December 1982. Thereafter, inspections took place about every two years.[571]

The granting of a manufacturing licence to the Northern Ireland Blood Transfusion Service (“NIBTS”) was delayed due to the inadequacy of the premises. Indeed, this was a crucial factor in securing the eventual funding for a new NIBTS headquarters unit. The service relocated to the new (current) centre in 1995, and was granted a manufacturing licence once this had been inspected.[572]

The role of the RTCs in purchase and distribution of factor concentrates

England and Wales

Most of the RTCs (prior at least to the introduction of cross-charging)[573] provided plasma to the Blood Products Laboratory (“BPL”) for fractionation, received BPL’s fractionated products back, and then distributed them.[574] Practice as to how they disseminated BPL’s products within their region differed.[575]

Practice also varied widely between RTCs as to whether or not they became involved in the purchase of commercial Factor 8 for the region. Some had no role at all,[576] some performed a warehousing function, ordering what the haemophilia centre directors told them to and then distributing to the relevant haemophilia centres and hospitals,[577] whereas others had a central role in their purchase.[578]

There were differences of practice too as to who paid for the commercial products: in some areas the RHAs funded this directly[579] whereas in others they did so indirectly: the cost came out of an RTC budget.[580]

The system of obtaining and funding the provision of factor concentrates to patients had consequences. In 1989 there was concern at BPL that there was a lack of uptake of 8Y,[581] and that commercial concentrates might be being used instead. This seemed to at least one haemophilia centre director[582] to be a consequence of the awkward arrangements for ordering 8Y (three months ahead, via the RTC) and obtaining it (indirectly, through the RTC, rather than directly to the hospital). Dr Harold Gunson explained that this was because the Blood Transfusion Service held the budget for Factor 8 on behalf of the RHAs.

Scotland

Prior to the PFC taking over the ordering of products nationally in the late 1980s, the different RTCs had different arrangements for the purchase and supply of commercial concentrates, and the distribution of PFC concentrates.[583]

Northern Ireland

Despite the fact that from mid 1982 Dr Morris McClelland attended an annual meeting with Dr Elizabeth Mayne to try to coordinate supplies of Factor 8 with usage and demand,[584] it seems that the Belfast RTC had no role in the purchase of commercial product (albeit that from 1985 the cost of the commercial products came out of the RTC’s budget). However, both BPL at first and then PFC thereafter did supply the Factor 8 allocated to Northern Ireland to the Belfast RTC, who would then in turn supply it to Dr Mayne.[585]

Targets and donor recruitment

Increasing demand by clinicians for blood products[586] throughout the 1970s and most of the 1980s was accompanied by increasing demands for blood components from the blood services throughout the UK. Each RTC in the UK had two central targets to meet: one to satisfy the hospitals in the region it served; the second to cater for the demands of the two fractionators – BPL in England and PFC in Scotland. Its first aim was to collect the donations of blood the RTC needed to meet the demands of the hospitals in its area for whole blood and labile blood components.[587] These products the RTCs prepared and issued themselves. Its second aim was to meet the plasma needs of the fractionators, not just for the production of Factors 8 and 9 but also for immunoglobulins, albumin, and other blood products.

In order to achieve these aims, the RTCs had, of course, to ensure that they had sufficient donors.

Blood itself had a short shelf-life even in the presence of an appropriate anticoagulant, and could neither be heated nor frozen. It had to be used, at first within three weeks, later within five or sometimes six. Accordingly, arranging supply was always complicated by the need to provide for a constant flow of donations to meet the anticipated needs for blood to be transfused, from season to season. It is a testament to the abilities of the blood services that they have managed this with as little panic and concern about shortfalls as there has been, though there have been “moments”.

England and Wales

As early as 1964, there were misgivings amongst regional transfusion directors about the ability to recruit the donors necessary for the expansion of the National Blood Transfusion Service (“NBTS”). “Many Directors felt that N.B.T.S. was little known or taken for granted and that national publicity … and appeals by well-known personalities were wanted.”[588] They felt that “Annual recruitment of donors at the rate maintained over recent years was not enough; the rate should increase with the constantly growing demand for blood.”[589] To achieve this, it was decided that “An increasing proportion of donors had now to be recruited from less socially conscious members of the public, and more intensive and costly publicity was necessary in order to reach them.”[590]

At a regional transfusion directors’ meeting in 1965, Dr William d’A Maycock presented a short paper on the future development of regional transfusion centres, the aim of which was to attempt “to reach a better estimate of the blood needed in the next decade.” The information suggested that “in Canada, Australia and U.S.A. the number of donations collected annually was at present proportionally considerably greater than in U.K.” and on that ground it might be expected that the demands for blood would continue to rise: however, the graphs presented at the meeting suggested that the amount used per patient had plateaued, and there was concern about being able to use all the blood which one set of estimates suggested.[591]

Despite this early focus on the national picture of donor numbers, until 1994, when the National Blood Authority started a national campaign,[592] campaigns were run at a regional level,[593] it being the responsibility of each centre to decide how best to recruit and keep new donors.[594] In particular, prior to the establishment of the National Directorate in 1988, centralised efforts to develop national advertising materials and strategies paid for from central government funds had limited success. Funding for recruitment campaigns was often an issue.[595]

Donor numbers did, on the whole, increase year on year. The fear of AIDS had some adverse effect on this. There was an initial decrease in the numbers of donors but in November 1985 Dr Gunson felt able to report that this drop had righted itself.[596] There is nonetheless some evidence that donors continued into the 1990s to link giving blood and getting HIV: the Yorkshire RTC experienced a sharp decline in donor attendance in the late 1980s, and undertook some research which showed that a number of donors felt that by donating they were at risk of being infected with HIV. This led to the RTC mounting a multi-media campaign including adverts on buses, at railway stations and in leaflets at doctors’ and dentists’ waiting rooms to reassure potential donors that there was no risk of infection involved in giving blood.[597]

Donor numbers were also affected by the makeup of the donor populations. Some people were reluctant to give blood for cultural and socio-economic reasons.[598]

In February 1989 moves were made to centralise recruitment and retention of donors by forming a National Provision of Donors Committee.[599] It was made up of representatives of RTCs from England and Wales, with observers from the Scottish and Northern Irish blood services. One of its successes was to get the National Directorate to use a national phone number for potential donors to call to make an appointment. This centralised system had a positive impact on the blood supply. At a meeting of the Committee on 10 October 1990 it was recorded that: “A comparison with 1989 reveals that stocks are 40% higher and the bleed rate is up by 3%. If this rate is maintained the NBTS will collect 63,000 more donations this year over last year.”[600]

Keeping donors happy and returning to give blood was good not only for retention of donors, but also for their recruitment. It was widely believed that a happy donor would be more likely to become an ambassador for the blood service and thus help generate new donors.[601] Regional transfusion directors described how they had programmes to reward loyal donors with certificates and even gifts.[602]

Targets for whole blood and packed red cells

RTCs produced a range of labile blood components for the hospitals in their area, including whole blood, red cell concentrates and cryoprecipitate. They set their own targets for these products each year in consultation with their haematological colleagues in the hospitals, and the RHA as the funding body.[603]

While the planning for this aspect of an RTC’s work may have been relatively straightforward, the execution of it was not always so. For some RTCs blood shortages were not uncommon. North West Thames RTC was one such centre, despite, according to Professor Dame Marcela Contreras, collecting “more than anybody else per thousand population.”[604] The blood shortages at North West Thames RTC were addressed in part by the RTC contracting with the Oxford RTC to collect donations on its behalf.[605]

Why did some RTCs experience shortages, while others did not? Some RTCs faced issues particular to them.[606] Others suffered shortages due to lack of staff and diminished budgets.[607] Professor Contreras’ explanation for the problems in the North London RTC was that the Centre served a number of London teaching hospitals, which made great demands for blood.[608]

In May 1985, Professor Cash was invited by both North East Thames and North West Thames RTCs to examine the problems that they faced in supplying sufficient blood to hospitals within their regions. In the course of his investigation, he also visited the South West Thames RHA. He reported in March 1986 that there were “very grave problems in maintaining the supply of blood and blood products throughout the London (Home Counties) area” and that “If no collective action is taken then within 3 years the matter will become one of genuine public concern and alarm.” Professor Cash described “acute shortages” occurring every day and clinical requests frequently not being met.[609]

The position he described had “arisen as a result of a long period of neglect with regard to management, primarily at RHA but also at RTC levels”, and he looked for a “plan to reorganise the London Transfusion Services as a whole so that they are … co-ordinated into an operationally cohesive consortium.”[610] He identified that North East Thames RTC (Brentwood) had untapped potential to increase donations; that North West Thames RTC (Edgware) was servicing some London hospitals which ought to have been served by Brentwood; and that it would benefit from an extension of the area from which it collected donations towards Oxford.

A particular problem was that of making provision for private hospitals, which were disproportionately concentrated in London.

It is unsurprising that some RTCs were able to collect more donations than they actually required[611] while some RTCs were unable to collect sufficient donations to meet their needs – unsurprising because the characteristics of the donor population were different in each area,[612] as were the requirements for blood and its components. What is perhaps surprising however, is that there was no national system in place for RTCs who collected more than they required to help out RTCs suffering from acute shortages. Professor Cash, discussing the blood transfusion services in England and Wales as a whole, was extremely critical of a system which made it “possible, and on many occasions, for severe shortages of blood to arise in one part of the country while less than 10 miles away (in another region) the regional health authority is dismantling part of its blood collection programme because of sustained excesses.”[613] This criticism was not accepted at the time by Professor Contreras who accused Professor Cash of “gross exaggeration”.[614]

Without a national system to redistribute donations, there was no pressure on RTCs to exceed their own regional targets in order to assist other RTCs.[615] The need for a national system is highlighted by one remarkable finding made by Professor Cash. He recognised that in Hampstead, Bloomsbury and Islington there were a large number of private (haematological) patients from outside the UK. Because the RTC could on occasion not supply enough blood from its own resources, the private hospitals concerned obtained blood supplies from Europe. He added: “Of no less interest has been the finding that on occasions this European blood has been transferred from the private sector to NHS hospitals that cannot get sufficient supplies from their local RTC. It is my understanding that this latter feature may contravene the Medicines Act.”[616] A failure to organise blood supply on a national basis had in this instance breached an article of faith for blood transfusionists in the UK, by permitting the entry of blood from abroad. Further, reliance would necessarily be placed by the RTC on the private hospitals to satisfy themselves as to the safety of any source from which the blood had been taken, for it seems unlikely that if blood was needed urgently to meet a temporary regional shortfall the NHS hospital using it would carry out those checks.

Once Dr Gunson had been appointed the national director in October 1988, he set up a programme to coordinate the stocks of blood and its components in England and Wales on a daily basis. Each RTC would fax daily stock lists to Dr Moore at the National Directorate, so that “if anybody was short of anything centrally they would know who could bail them out, which was much more efficient than just ringing round, which was what would have gone on before, choosing your nearest neighbour first, of course.”[617]

While this clearly made a difference in plugging short-term crises,[618] arising ad hoc, it was not a strategic approach.

Plasma for BPL in England and Wales

Setting targets

The setting and meeting of targets for the plasma to be sent to BPL for fractionation was altogether more complex.

Unlike the targets for blood and blood components, which could be supplied directly from local transfusion centres, plasma targets were suggested to RTCs by BPL and the National Directorate, without much input from the regional transfusion directors.[619] As to how the targets were arrived at, the evidence suggests that this changed over time.

The starting position was for BPL to work out the amount of plasma it considered it needed to meet the needs of the populations of England and Wales (and Northern Ireland up until 1982 when they began sending their plasma to PFC in Scotland) for factor concentrates and other fractionated blood products. That overall target was set in consultation with (in particular) the Department of Health and Social Security (“DHSS”) and the haemophilia centre directors.[620] The overall target was then divided between the RTCs on the basis of the size of the population of the area covered by the RTC.[621]

There were three issues with this way of organising plasma collection:

  1. The target was not needs based.
  2. The RTC could not expect to receive back an amount of concentrate proportionate to the plasma it had collected and provided to BPL.
  3. Neither BPL nor the NBTS could require regions to fulfil their targets: they had no power to do so. They could only request.

The system was thus one of taking plasma from a region in proportion to its population, but delivering the product back in proportion to the numbers who had been treated in the last year, for which there were records. Yet the system for funding this approach (which was national in scope, but not obligatory) was regionalised. This meant, for example, that an RHA which funded its RTC to collect large quantities of plasma for supply to BPL could find itself in effect subsidising areas of the country which did not produce as much plasma.[622] Yet it could not be required to collect as much. It is therefore unsurprising that RHAs were unwilling to meet requests by regional transfusion directors for money to finance ambitious plasma targets.

This was not a system that was conducive to achieving self-sufficiency.[623]

In April 1981 a new system was therefore introduced whereby the amount of concentrate that was returned to an RTC reflected the amount of plasma provided to BPL by that RTC.[624] This was known as the “pro rata system”.

While the pro rata system went some way to addressing the problems identified above, it still had a number of flaws:

  1. The targets that were set for each RTC by BPL and Dr Gunson (and, after July 1988 when it was established, by the National Directorate) were not based on the amount of concentrate a region needed to achieve regional self-sufficiency in factor products. They were based instead on the population of the region. So for example in 1984/1985 the Trent RTC was seemingly able to meet its plasma target (and the finances to do so had been agreed by the RHA) but it was considering reducing the amount of plasma it was going to send to BPL “since both P.P.F. and Factor VIII would be produced at a level greater than for regional needs”.[625]
  2. The pro rata system did not provide RTCs with certainty as to the amount of factor products they would receive back from BPL in any given year. Although there was a correlation between the amount of plasma an RTC provided to BPL and the amount of concentrate they received back, what they received back was dependent on two other factors. The first of these was the amount of plasma other RTCs provided. This was because BPL worked out the allocation of concentrates by dividing the total quantity of plasma supplied by all of the RTCs and then allocating concentrates in proportion to the amount of plasma each centre had contributed to that total. Thus in some years when the Northern RTC sent the same amount of plasma to BPL as it had done in previous years, it actually received 50% less product back from BPL, because other RTCs had increased their production that year, and the Northern RTC had thus supplied a smaller proportion of the total plasma.[626] The second was that the system also had to accommodate very particular demands: thus the demand for supply from Treloar’s was so great that, if it were to be satisfied out of the allocation to the Wessex region, it would impoverish the supply to other patients in that region. Accordingly exceptions (such as Treloar’s) had to be made. This in turn reduced the certainty of return for an RHA financing the supply of plasma to BPL.
  3. Since the targets were set by BPL rather than drawn up by the RTCs, some RTCs pushed back against them because they did not agree they were set at the right level.[627]

A scheme which involved the setting of plasma targets for BPL on a national basis while trying to get regional bodies to fund the necessary work was fundamentally flawed and was ultimately unsuccessful in achieving self-sufficiency.[628] As Dr Gunson noted, even the more committed regions would only provide finance for plasma supply in respect of their own populations.[629]

In April 1989 a cross-charging system was introduced. This fundamentally changed the way that the RTCs were funded and blood and blood products were paid for. RTCs were now expected to sell their plasma to BPL, and then buy back the product they needed from BPL once it had been fractionated, selling those BPL products on (along with the products the RTCs were producing such as red cell concentrates and cryoprecipitate) to the hospitals in their region.[630] Thus RTCs were no longer fully funded by their RHAs. However, even the introduction of cross-charging did not remedy issues in the system that acted as a disincentive for RTCs to increase their plasma offering to BPL. This was because the Department of Health set the price BPL paid the RTCs for plasma without reference to the actual cost to the RTCs of collecting that plasma.[631] Thus the cost to many RTCs of obtaining the plasma exceeded what BPL would pay for it, and RHAs had to subsidise the collection of plasma to be supplied to BPL.[632]

Steps taken to meet targets: England and Wales

The drive to provide more plasma to BPL led to RTCs changing their practices in three key ways.

First of all, RTCs aimed to wean hospitals off their reliance on whole blood for transfusions where what a patient needed was the red blood cells in the donation. If the red blood cells and the plasma in a donation were separated, the plasma component could then be sent to BPL, and the red blood cells used for therapy. If platelets were required for a patient, they could also be separated out, leaving plasma and red blood cells.

Red cell concentrates were increasingly used; whereas in 1975, 90% of transfusions in England and Wales were of whole blood, and 10% red cell concentrates, by 1985 around 50% of blood issued for transfusion consisted of the red cell component.[633]

Regional transfusion directors had a role in trying to persuade their clinical colleagues to make the switch.[634] Different centres managed to achieve a reduction in the use of whole blood at different times. Scotland was well ahead of England throughout. As for England, the North London (Edgware) RTC was one of those centres which had early success. By contrast, in 1986 Dr Lloyd noted that the Northern area was one of the biggest users of whole blood in the country.[635] Likewise, as late as 1990 the East Anglian RTC was still issuing as much as 40% of its blood as whole blood.[636]

Secondly, as the chapter on Self-Sufficiency explains, an additive solution called SAG-M[637] was added to blood from (probably) the autumn of 1982 to make red cell concentrate easier to administer and thereby reduce demand for whole blood, increasing the amount of plasma from donated blood available for fractionation. However, the introduction of SAG-M was initially slowed because it was regionally and not nationally funded. It appears that some RHAs were reluctant to finance the additional up-front costs.[638] A report from January 1984 stated that there had been difficulties in obtaining funds from RHAs in the latter part of 1983 and several regional transfusion directors had cancelled orders of SAG-M packs.[639]

Thirdly, RTCs had to consider whether to obtain plasma by way of plasmapheresis. Again, this is addressed in the chapter on Self-Sufficiency. All the RTCs had some form of plasmapheresis programme by the end of the 1980s or beginning of the 1990s, save for the North East Thames RTC.[640]

Self-defeatingly, the South West RTC in 1975 increased its plasma offering to BPL by getting agreement from treating haematologists in its region to restrict the amount of cryoprecipitate (produced from locally sourced plasma) being infused in favour of the use of Hemofil.[641]

There were a number of barriers to RTCs reaching their plasma targets, quite apart from the systemic problems set out above. For example, the Yorkshire RTC failed to meet its targets in 1989 because of an industrial dispute, an increased use of cryoprecipitate and fresh frozen plasma locally, and the loss of 10,000 donations.[642] This resulted in their target being revised down the following year.[643]

Dr Lloyd in his written and oral evidence identified three barriers to the Northern RTC meeting its plasma targets. First (and prior to 1985, when a new centre opened) was the limited and outdated nature of the facilities at the RTC. Second was the belief of those at the RTC that the hospitals needed whole blood, rather than red cells (and in any event they did not have the bags and the funds to use SAG-M which would have increased the plasma they could have collected in this way). The third was “prior to 1988 the RHA’s approach to funding plasma collection.”[644] In a document he produced in 1989 he mooted some different reasons for this third barrier:

  1. the preference of the Newcastle Haemophilia Centre for commercial Factor 8, which may have led to an increased Factor 8 budget for the Haemophilia Centre and so no additional funding for the RTC;[645]
  2. the RHA may have been unhappy about the way the RTC ran and considered that money was better spent on commercial products; and
  3. his predecessor may have advised the RHA that BPL had insufficient capacity to process the plasma in any event.[646]

Northern Ireland

Targets for donors

Blood shortages were not unknown in Northern Ireland. There were occasions when supplies to hospitals had to be rationed or elective surgery had to be postponed.[647] Certainly in the late 1970s and early 1980s the closure of many factories made it more challenging to meet targets[648] since factory donation sessions were relatively easy to organise and generally very well attended by would-be donors.

The Northern Ireland RTC used telemarketing, which involved eligible donors being telephoned immediately prior to donor sessions.[649] As in England and Wales, media advertising in newspapers, on radio and on television was used to try to encourage more donors.

As to plasmapheresis, in the 1970s there was occasional collection by manual plasmapheresis for the collection of special reagents when donors or antenatal patients were identified as having a particularly valuable antibody. Later, collections were made by machine pheresis, and by 1990 NIBTS was collecting over 3000 plasma donations per annum in this way. This was the maximum throughput that could have been achieved in the (old) NIBTS building and represented about 10% of the total amount of fresh frozen plasma which was sent to PFC in Scotland.[650]

The Belfast RTC was one of the earlier centres to adopt SAG-M. By 1986 it had increased its output of red cell concentrates from something like 20% of the units sent out for transfusion, to 75-80%, in the space of about three or four years.[651]

Targets for plasma in Northern Ireland

Prior to 1982, the Northern Ireland RTC sent its plasma to BPL for fractionation. Due to difficulties shipping fresh or frozen plasma it was able to send only liquid plasma which was “time expired” – that is, it had deteriorated over time such that (for instance) the Factor 8 activity in it had ceased. Liquid time-expired plasma therefore could not be used to make Factor 8 concentrates.[652] Yet the haemophilia centre needed to provide treatment to its patients and, before the pro rata system, received a small amount of Factor 8 from BPL, probably in the region of about 200,000 units per year.[653]

Following the introduction of the pro rata system and from the end of 1982, all Northern Irish plasma was sent to PFC in Edinburgh.[654] Arrangements were made to transport it in a frozen state, so that it did not denature. There was however a slight delay in making the arrangements for this since the testing regime to which Northern Irish plasma had been subject was not identical to that adopted in Scottish centres, and there was concern about mixing Northern Irish and Scottish sourced plasma in the same pools.

Steps taken to achieve targets in Northern Ireland

Dr Morris McClelland told the Inquiry that “Education of clinical users of blood/ red cells and persuasion towards the use of red cell concentrates instead of whole blood was a key part of the strategy towards achieving self-sufficiency.” The rapid uptake of red cell concentrates from 20% of the Belfast output to 75-80% in the space of three to four years, was in his view, testament to the success of their work.[655]

Scotland

Targets for donors

Voluntary organisers from the local community helped to recruit donors. This approach was described by Professor Stanislaw Urbaniak as being “highly successful in maintaining a ‘repeat donor’ base of regular donors”.[656] Dr Brian McClelland described the role of volunteers being phased out in the 1980s in South East Scotland and an increasing role for radio and television advertising.[657] Dr Jack Gillon, the consultant in charge of the donor programme working with Dr McClelland, said this was effective at reversing dips in donor attendances in the 1980s and 1990s.[658]

There does not appear to have been an initial drop-off in donors during the years that AIDS leaflets were first being provided to donors. On the contrary, attendances and donations increased up to 1985 in South East Scotland.[659] There was however a sharp drop-off in donor numbers between around 1985 and 1987, which Dr McClelland and Dr Gillon attributed to the effects of AIDS messaging.[660] The position was sufficiently concerning that Professor Cash estimated that if the trend continued “sometime in mid 1988 the demand for products will exceed plasma supply.”[661] In response to this, a national media campaign to promote blood donation was agreed and at a meeting in April 1988, Professor Cash confirmed that donor attendances had increased in the first quarter of 1988.[662] He was able to report at the end of 1988-89 that “we now have some reason to believe that the decline [of the preceding 18 months] has been checked and certainly in some parts of Scotland it appears to have been reversed.” Professor Cash went on to say: “We cannot be content to rest on our laurels for we believe we need to increase our blood collection programme by a further 40,000 donations per annum to meet the many and varied needs of patients in the 1990s … the effort required to stand still seems to be greater than it was in the 1960s and 1970s.”[663]

As part of this initiative, Professor Cash secured the appointment of a national donor programme manager in Scotland.[664]

Supply of whole blood and blood components in Scotland

The Scottish transfusion centres did not seem to suffer difficulties of supply in the way described in England and Northern Ireland. In fact, the South East Scotland Transfusion Centre in Edinburgh would sometimes send their surplus red cells to an English RTC to address their shortages.[665] Professor Urbaniak said that where North East Scotland could contribute “over and above” national targets based on population, they received “the funding required to facilitate this”.[666]

Targets for plasma in Scotland

It was a national objective for Scotland to be self-sufficient in plasma products.[667] Scotland was generally successful in this, as set out in the chapter on Self-Sufficiency.

A national target for plasma collection for Factor 8 was arrived at by the Scottish haemophilia directors estimating the number of international units of Factor 8 that would be required. This was then converted by the SNBTS into the weight of plasma in kilograms required to meet this, assuming a yield of approximately 200 international units of Factor 8 per kilogram. That target was then shared out between the five Scottish transfusion centres on the basis of population with some adjustments made for historical production.[668]
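By way of a purely illustrative worked example (the figures below are hypothetical and chosen only to show the arithmetic; they are not drawn from the evidence), the conversion took the form:

\[
\text{plasma required (kg)} \approx \frac{\text{estimated Factor 8 requirement (IU)}}{200\ \text{IU per kg}}
\]

so that a hypothetical national requirement of 10,000,000 international units would translate into roughly 50,000 kg of plasma, which would then be apportioned between the five transfusion centres broadly in line with their populations.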

Dr McClelland’s recollection is that the SE Scotland RTC was hitting its targets fairly regularly and that generally, all the centres met theirs, save for the Glasgow/West of Scotland RTC which sometimes struggled.[669]

Steps taken to meet targets in Scotland

SNBTS took steps to maximise the use of blood components similar to those described above for England and Wales: encouraging the use of red cell concentrates over whole blood, using SAG-M, and adopting plasmapheresis. In 1974 the DHSS understood that 30-40% of donations in Scotland were issued as red cell concentrates, compared to less than 10% in England and Wales. By 1976, 46% of donations in Scotland were issued as red cell concentrates and by 1982, 60% of donations were being processed into red cell concentrates with 220ml plasma removed.[670]

From early on, Professor Cash considered that encouraging the use of red cell concentrates was an important part of the strategy to achieve self-sufficiency. In February 1972, he published a report entitled “The Principles of Effective and Safe Transfusion”, in which he urged clinicians to use red cell concentrates whenever patients did not require whole blood. He argued that the practice of giving whole blood had led to the wastage of several thousand litres of plasma and billions of platelets, and the alternative of using red cell concentrates was simpler and safer. In his opinion, the practice of using whole blood had become entrenched because SNBTS had, up to this point, seen no reason to conserve plasma and so had not made red cell concentrates available. This was not helped by the isolation of Scottish RTCs “from the bedside”, which meant it was difficult effectively to encourage the use of red cell concentrates. Professor Cash argued that clinicians and SNBTS ought to see routine whole blood transfusion as a “thoughtless habit”, and should not shy away from the realities of efficiency, which involved the optimal use of raw material.[671] In August 1980 he wrote to The British Medical Journal with a letter headed “Factor VIII supply and demand” criticising doctors who treated patients with whole blood rather than red cell concentrates, which he claimed resulted in thousands of litres of fresh plasma a year being diverted away from producing Factor 8 in the UK.[672]

The evidence on plasmapheresis is mixed. Professor Urbaniak said that with the drive for self-sufficiency in Factor 8, the target for North East Scotland was “to produce as much as we could, within logistical and financial constraints. This we did, producing more plasma per head of population than other Scottish centres, because of our stable donor population.”[673] Dr McClelland’s evidence was that South East Scotland initially had three plasmapheresis beds in the early 1980s for the collection of plasma from donors who had high levels of antibody either to tetanus toxin or to the rhesus antigen to make specific immunoglobulin products, but then started a small automatic plasmapheresis programme. He would have been “keen” to expand it and submitted business plans but there was no enthusiasm at SNBTS or the Scottish Home and Health Department to pursue it.[674]

Commentary

This chapter has described a system of collecting whole blood or plasma from voluntary donors and then dealing with it in two different ways. Blood went for transfusion or local production of cryoprecipitate or fresh frozen plasma for therapy in response to regional needs. Plasma, either obtained from whole blood, or as plasma following plasmapheresis, went in England and Wales to BPL, and in Scotland to PFC in Edinburgh, in response to national needs.

In fulfilling these tasks, the RTCs were custodians of a valuable natural resource (blood and plasma). As part of the careful husbandry of blood which this necessitated, they had a role in educating or persuading clinicians using blood for transfusion to do so less often, and when doing so to use it in lesser quantities.

The fact that the system was regionally funded and regionally controlled in England and Wales led to problems and tensions within the service. So far as blood was concerned, there was a tension between the supply of it and the clinicians’ demand for it. This rarely caused problems on its own, but when demand for plasma for fractionation rose, this demand, together with that of the clinicians, had to be balanced against the available supply.

The system was inefficient when it used whole blood for transfusion where only red blood cells were required, wasting the plasma, and also when whole blood was used as the source of plasma for fractionation or cryoprecipitate, wasting the red blood cell component. This chapter has also described how efforts were made to persuade clinicians to change their prescribing habits, and to adopt the use of SAG-M to enable the red blood cell component to be more easily used. Education and persuasion were ultimately reasonably successful. But this success could and should have been achieved more quickly than it was. Delay in this had an effect on the health of people with bleeding disorders. It meant that self-sufficiency was more difficult to achieve and more people were likely to be put at risk.

Nonetheless there was general success in matching supply with demand so far as blood for transfusion was concerned. The UK was almost entirely self-sufficient in its supply of blood for this purpose. There almost never needed to be any reliance on imported blood, as such.[675] This is in many ways a remarkable achievement, especially in days when “just-in-time” manufacturing and distribution was a thing of the future: for blood is a short-life product, which had to be obtained from willing donors in sufficient quantity to meet what would inevitably be a fluctuating demand. In any part of the country, emergencies might arise which would impose a strain upon the regional resource. It could however have been improved if there had been a more systematic way of meeting a shortfall in one region by providing blood from a region in surplus. This only proved possible when the National Directorate began in the late 1980s.[676]

This chapter has shown that some of the tensions would have been eased, and quicker progress towards self-sufficiency would have been achieved, if NBTS had been organised on a national basis, centrally funded.

This is not simply a conclusion of hindsight. The inadequacies of a system of regional autonomy beyond the late 1960s were memorably described within the DHSS by Thomas Dutton in 1976 in part of a report where he said “As long as the collection, testing and despatch of whole blood was the predominant occupation of blood transfusion centres they were able to function as independent regional units which were largely self-sufficient except in times of emergency. The adequacy of independent self-sufficient regional units was however greatly reduced with the introduction of component therapy on a large scale.”[677]

It took too long to remedy this. Responsibility is shared between those responsible for setting up the system as it was, those who resisted repeated calls for change (successive governments), and the many in the medical profession who insisted on using whole blood when it was unnecessary (and wasteful) to do so.

Finally, although interventions by a Chief Medical Officer should be limited in number if they are to be at their most effective, there is a powerful case here that the Chief Medical Officer should have made it clear that, unless there were good reasons not to do so in a particular case, the red blood cell component should be used in transfusions rather than whole blood. This would probably have had the effect of accelerating the change of prescribing habits which was desirable.

3.11 Response to Risk by the Blood Services

This chapter considers the blood services’ knowledge of viral risks and their response to those risks over time. It discusses in particular donor selection and donor screening, the organisation of donor sessions, the role of the blood services in educating clinicians about the responsible use of blood and the production of AIDS donor leaflets.


Key dates

1946 first guidance on donor screening Medical Examination and Care of Donors.

1952 WHO Expert Committee on Hepatitis recommends preventative measures to reduce the risks of serum hepatitis.

September 1973 RTDs discuss the continued collection of blood from prisoners.

May 1975 CMO advises that collection of blood from prisoners can continue.

1976 ISBT advises against collecting blood from prisoners and from donors with history of viral hepatitis at any time.

November 1976 DHSS circular allows those with history of hepatitis or jaundice to donate provided no symptoms in last 12 months and a negative HBsAg test.

February 1983 DHSS issues circular on record keeping and stock control arrangements.

May 1983 first substantive discussions on AIDS at RTDs’ meeting and at SNBTS meeting.

1 September 1983 first AIDS donor leaflet available for distribution.

March 1984 collection of blood from prisoners in Scotland ceases; end of 1984 collection of blood from prisons in England and Wales ends.

February 1985 second AIDS donor leaflet available for distribution.


People

Professor John Cash medical director, SNBTS (1979 - 1997)

Dr Marcela Contreras medical director, North London Blood Transfusion Centre (1984 - 1995) and executive director London and South East Zone, NBA (1995 - 1999)

Dr Harold Gunson director, NBTS (1988 - 1993)

Dr Patricia Hewitt lead consultant for Transfusion Microbiology, London and South East Zone (1995 - 2000) and national lead consultant, NBA (later NHSBT) (2000 - 2005)

Dr Brian McClelland Edinburgh & South East Scotland RTD (1979 - 2001)

Dr Morris McClelland Northern Ireland RTD (1980 - 1994)

Dr Tony Napier medical director, Welsh Regional Blood Transfusion Service (1977 - 1998)

Dr William Wagstaff Sheffield RTD (1974 - 1994)


Abbreviations

BPL Blood Products Laboratory

CMO Chief Medical Officer

ISBT International Society of Blood Transfusion

NBA National Blood Authority

NBTS National Blood Transfusion Service

RTC Regional Transfusion Centre

RTD Regional Transfusion Director

SNBTS Scottish National Blood Transfusion Service


Addressing the risk

The risks that blood transfusion or use of factor concentrates could lead to serum hepatitis (or, as it became known in the late 1960s, Hepatitis B and, in the early to mid 1970s, non-A non-B Hepatitis (“NANBH”)) were known before the Second World War, and became indisputable during it.

What the preceding chapters have shown is that the risk that blood transfusion or use of factor concentrates could cause AIDS was known in mid 1982, and became increasingly apparent as a serious risk until it came to be regarded as a near certainty in April 1984, after a press conference in the US at which it was announced that Robert Gallo had discovered a virus which was the cause of AIDS.[678]

Yet although it was strongly suspected before April 1984 that AIDS had a viral cause, the microbiological configurations of both HTLV-3/HIV and also NANBH/Hepatitis C (as they turned out to be) were unknown.

How best could such a risk have been reduced, or even avoided, when no one could either test definitively for the disease (they had, rather, to wait for symptoms to emerge) or screen what was being transfused to eliminate it (because there was no definitive test)?

The answer to this vital question is addressed in this chapter.

The first comment to be made is that the difficulties can be overstated. The problem is not a new one. For centuries, humanity has struggled with strange disease and found ways of warding it off. It has done so without needing to understand precisely how it was caused. Knowledge is elusive: there is almost always more of it we do not yet know. What is important in ensuring public health is having enough knowledge to understand that there is a risk (and therefore to begin to search for ways which prove effective in warding it off) rather than to have certainty of the precise cause and effect. An effective public health system does not demand certainty before responding: though more knowledge may hone the response, it must not shrug and say “it’s not quite clear what’s happening here. It looks as if X causes Y, but we can’t be sure; and we’re certainly not sure how it can”, for such defeatism leads, as the word itself signals, to defeat. Thus, in the past we have been successful in recognising that something in the water supply caused cholera; that exposing people to cowpox protects against smallpox; that pasteurisation destroys the prospect of milk transmitting tuberculosis; that taking measures against insects will halt the diseases they carry – and so on.

In short, a lack of knowledge about the exact nature of the infective agent was no barrier to steps that could be taken to reduce the risks of the infection being caused.

The risk of contracting AIDS or hepatitis, whether by transfusion in the course of surgery or treatment for illness, or because of treatment with blood products, could be avoided if the transfusion or treatment simply did not take place at all. That is important to remember, given that single transfusions of blood carried a risk that the recipient might contract these diseases as a result, and there is considerable material to show that blood transfusions were often given unnecessarily and/or too many units of blood were given at any one time.

Otherwise, some element of risk almost certainly remained.[679] However, though a risk might not be eradicated completely, the risks which had been apparent since 1939 when the transfusion services were first set up[680] could be reduced.

How? An answer which has stood the tests of time was given by an Expert Committee on Hepatitis of the World Health Organization (“WHO”) in 1952.[681] In their report, the committee of experts dealt with the prevention of the spread of serum hepatitis “by Human Blood and its Products”.[682] After the general comment that national health authorities should alert the medical profession in their countries “to the dangers of transmitting hepatitis by transfusion of plasma and whole blood, and also by the use of certain blood derivatives, and should advise that plasma, particularly large-pool plasma, should not be used unless the advantages likely to be gained by its transfusion outweigh the risk of transmitting the disease”,[683] it went on to identify the “preventive measures” that might be taken.[684] These were:

  1. the selection of blood donors[685]
  2. the control of pool size[686]
  3. the treatment of plasma[687]
  4. the maintenance of records[688]
  5. reporting[689]

The Expert Committee’s report, and the measures it identified, acts as a blueprint for what follows.

The voluntary donor

A general measure which would help to ensure that the blood given to a recipient was as free of infection as could be expected was to ensure that it was given by people who had no incentive in making the donation other than to do good to a fellow human being (the “voluntary non-remunerated donor” principle). It was a true donation. By contrast, in many other countries blood and plasma were purchased from the “donor”. This was not in truth a donation, though it has been called that: it was a sale.[690] The motive in selling it was to make money, rather than to do good.

The blood services in the UK adopted the voluntary non-remunerated donor principle. Though some participants have reported that they were told by hospital staff that they had received “American blood”, the Inquiry has found no evidence to confirm that this was the case (except on one occasion, footnoted below).[691] If it had been, there would have been at least some records. There would have been some recorded or remembered need for it to have been imported.[692] There is neither, since the blood transfusion service was able to satisfy all the transfusion needs for whole blood or packed red blood cells (save in the case, very occasionally, of an exceptionally rare blood group, which was then sourced from elsewhere) – that is, with the exception of plasma. “Self-sufficiency” as a concept has sometimes been understood to apply to “blood and” blood products. In the context of this Report it refers to blood products alone.

So important was the voluntary non-remunerated donor principle that even when universal screening of blood for the presence of the Hepatitis B antigen was introduced, the prestigious US physician Dr Harvey Alter said that nonetheless “By far, the single most significant measure for the reduction of posttransfusion hepatitis is the total exclusion of the commercial donor.”[693]

Beyond that, in line with what the Expert Committee assembled by the WHO had identified in 1952, the lines of defence the UK had against hepatitis (and later, HIV) viruses, before universal screening for all of these viruses was not only available but effective, were as outlined below. Since they were the only tools available, it was incumbent on the blood services to apply them with rigour.

Donor selection

Even though reliance on voluntary donations was of huge importance, the risks could be reduced further. Careful recruitment of voluntary blood donors (“donor selection”) reduced those risks that were greater in some groups than others. This involved excluding people not because they were themselves individually shown to be a higher risk, but because they belonged to a group identified as giving rise to one, or were placed in a situation which experience suggested was liable to generate more risk. This is, in effect, a process which excludes donors on a “group” or “collective” basis.

Donor screening, and then the screening of donations

At an individual, rather than at a group level, “donor screening” could be conducted. It differs from “donor selection” because it focuses on the person who comes to make a donation as an individual, rather than as a member of a group. Would-be donors could have been asked questions to exclude those whose recent medical or social history suggested that they might pose a particular risk: for instance, those who had recently suffered a bout of hepatitis, who were currently running a temperature, who had a persistent medical condition which might indicate a disease, who had raised glands, or who had recently returned from a place where diseases such as malaria or dengue fever were rife.

It was possible, too, for “donor (or donation) testing” to occur – a donor could have their blood screened to see if it carried any virus or parasite which could be identified, though, generally, testing the individual donation after it had been given, rather than the donor beforehand, was the preferred course.

Surrogate testing

Where there was no test specific to a particular virus, because it had not yet been identified sufficiently to enable a test to be formulated, it could be possible to test for another virus or blood marker which appeared to be linked to the virus of interest. This (“surrogate testing”) would indicate a real risk (though not the probability) of infection, or at least that the donor or their donation was of a “higher risk”. The link might exist because it was indicative of a certain lifestyle: for instance, it might be thought that some people whose medical history was typical of many who have lived a “skid row” existence, whose blood test suggested they had been in contact with Hepatitis B, might be more likely than others to have come into contact with NANBH or the virus (if it was one[694]) which caused AIDS; the same could be true also of those whose liver function tests showed elevated levels of alanine transaminase (“ALT”) or aspartate transaminase (“AST”).[695]

Treatment of plasma

The steps mentioned could be performed irrespective of whether the purpose of taking the blood donation was to transfuse it as whole blood, or to separate the red blood cells and platelets from the plasma, and use the latter to make a blood product. The protective measures already mentioned would apply to both. However, further steps (such as physically heating donations of blood components or chemically treating them) almost entirely relate to blood products, for they involve taking steps which will damage red cells, platelets or whole blood intended for transfusion. Red blood cells, and therefore “whole blood”, cannot be heated, for to do so destroys the red cells. Platelets too are susceptible to heat. It is not generally possible to treat red blood cells or whole blood, as opposed to plasma, with a chemical, or solvent detergent, which might inactivate a virus. Another theoretical possibility – irradiation – was shown in the early 1950s to have no effect in inactivating serum hepatitis. The viral inactivation of blood products is the subject of the chapter on Viral Inactivation.

Quarantine

The fourth and fifth measures emphasised by the WHO in 1952 were the keeping of records, and reporting. A report from a donor that they had suffered jaundice, or begun to show signs which might be indicative of AIDS, could only lead to the withdrawal of their donation from the system if records had been carefully kept (records) and reporting of any symptoms arising after donation had been encouraged (reporting): but it would also require that the blood donation or any pool of plasma to which it had contributed had not as yet been distributed for use. This leads to the question of whether, and to what extent, blood might be quarantined.

Assuming reasonably careful donor selection, donor screening and donation screening (and any surrogate test if that is adopted as a system) it is possible to quarantine blood or blood products, and simply not issue blood which has been bagged ready for transfusion. If it has been issued already to a transfusion centre, ready for use, it may be recalled. These actions can follow if information reaches the transfusion centre that a donor may have given blood during a period when the donor has been incubating a virus, although displaying no symptoms of this at the time of the donation. For this reason, whole blood is not generally issued immediately for use.

Quarantine is not a wholly satisfactory solution to the problem of late-materialising infections. It can last no longer than the shelf-life of donated whole blood. Throughout most of the period with which the Inquiry is concerned, this was 21 days. That has now been extended to 35 days with the use of more advanced anticoagulant treatment. In contrast, plasma (included in the plasma portion of “time-expired” whole blood[696]) lasts for several months and, if freshly frozen, up to three years. Ideally, both are better used fresh – like any natural product their quality gradually declines over the period within which they may safely be used.

Recall

Product recall relates principally to blood products made from plasma: the tracing of donors to ensure that they remain free of disease and have not contributed to a batch during an infective incubation period is more practicable than if the donation was to be used as whole blood. Recall will apply in particular to as yet unused batches of products or the batches which have been shown or are suspected to be infected.

Know your donor

At the start of the 1970s, a major challenge was that of hepatitis being transmitted by blood. As long ago as 1952, the Expert Committee on Hepatitis of the WHO had, as set out above, identified, amongst measures to be taken to combat the risk of a transfusion leading to liver disease, the need to select donors. The Inquiry heard in evidence that Dr David Dane (who discovered the “Dane particle” which allowed Hepatitis B to be identified and screening tests developed) repeatedly drilled into his students that they should “know [their] donors”:[697]

“you must know your donor, you can’t know anything about … things which shouldn’t be in the blood … unless you know something about the donor: who they are, what they do, where they do it, how often they do it, who they do whatever it is with whoever they’re doing it … unless you know your donor you won’t know what transmission of agents they are at risk from … If you don’t know your donor, you’re relying entirely on testing.”[698]

The process of selecting a donor involves choosing whom to approach to give blood, and screening those who answer the call. This chapter examines the approaches of the blood services to these two processes, and whether they effectively reduced risk as far as they reasonably could and should have done.

The context was the overall requirement for donations to be made in sufficient quantity to meet clinical need. According to a publication for the WHO an estimate frequently quoted was: “in countries with fully operational public health and blood transfusion services, the need for blood can be met if about 2% of the population are regular blood donors.” However, the development of open heart surgery, renal dialysis and other procedures requiring transfusion meant that by 1971 it was considered probable that this figure was too low.[699]

Who was approached to be a donor?

The considerable advantages of the voluntary, non-remunerated donor are set out earlier in this chapter. Reliance was typically placed on these advantages in inviting members of the public generally to come forward to give their blood freely.

Some may wonder why the words “voluntary” and “non-remunerated” appear together in the usual description of UK donors, and in the advice of the WHO; they may think that surely they express the same idea. That is not entirely true.

In one sense it can be said that where soldiers were lined up at a barracks to donate blood, for no financial reward, their donations were voluntary: but one only has to picture a regimental sergeant major (“RSM”) ordering the process to see that it may not truly be voluntary, since if (for instance) the soldier concerned, being a male, had engaged in sex with another man it would be improbable he would admit it with all the adverse consequences that doing so would probably bring for him; and he would be unlikely to defy an order of the RSM to give the donation.

Other pressures might similarly mean that a would-be donor could not easily refuse to donate without the refusal causing personal difficulty or embarrassment. The prisoner may be subject to the pressures in a penal institution to give blood to order, rather than by their own unfettered will; the worker in a large factory may find it awkward to refuse to donate, or, if interviewed by the person in charge of the donor session, to give them reason to think that their blood may be a higher risk to an end user than that of co-workers. Explaining why they did not on that occasion give blood, when it would be clear to the other workers coming forward that they had not, could create problems for their future relationships at work. Regional transfusion centres (“RTCs”) needed to be alert to the possibility of pressures such as these.

High-risk donors

Quite apart from the possibility of such pressures resulting in some donations being accepted which would not have been had the position of the donor been fully appreciated by those taking their blood, some groups who posed a higher risk to eventual recipients were not only accepted but targeted to provide donations.

Prisoners

Prisons were regarded as a valuable source of blood donations from the 1950s until the 1970s. A 1971 publication described prisons as one of the institutions from which initial steps to form a panel of donors could best be taken, along with the armed forces, the police, large industrial or commercial undertakings, universities and social or religious foundations.[700] However, following the introduction of the screening of donors for Hepatitis B in the UK from late 1970, it became clear that there was a markedly higher incidence of Hepatitis B amongst prisoners than there was in the general population.[701] Indeed, in the US the American Red Cross stopped collecting blood from prisons on 1 July 1971 because the incidence of hepatitis was ten times greater among prisoners than among voluntary unpaid donors.[702]

Despite the knowledge of some of the regional transfusion directors (“RTDs”) about the higher incidence of hepatitis amongst prisoners, all RTCs still collected donations from prisons, borstals or equivalent institutions in October 1971.[703]

On 26 September 1973 National Blood Transfusion Service (“NBTS”) directors discussed whether they should stop collecting blood from prisons in England and Wales. Seven directors (Sheffield,[704] Edgware,[705] Brentwood, Cambridge,[706] Tooting,[707] Cardiff[708] and Birmingham[709]) voted to stop prison sessions and seven voted to continue (Newcastle, Leeds, Oxford, Bristol, Manchester, Liverpool and Wessex) on the grounds that Hepatitis B screening gave “adequate protection”[710] and the statistical significance of the data suggesting a higher incidence among prisoners compared to new donors from the general public should be examined.[711] The meeting agreed that if it were decided to discontinue bleeding prisoners, the Department of Health and Social Security (“DHSS”) should inform the Home Office before any local action was taken.

Following further research by the Advisory Group on Testing for the Presence of Hepatitis B Surface Antigen (“The Maycock Group”) on the continued collection of blood from donors in prisons, Dr Henry Yellowlees, Chief Medical Officer (“CMO”), wrote to all regional medical officers in England on 1 May 1975:

“There is a relatively high risk of hepatitis B being transmitted by the blood of prisoners. But there is probably an equally high risk in other groups of the population, eg drug addicts, who are not so easily identified in advance as prisoners, if they can be identified at all. The advice we have received is that it is not necessary to discontinue the collection of blood at prisons and similar institutions provided all donations are subjected to one of the more sensitive tests.”[712]

By way of comment, the idea that because there is a risk from one source, you may ignore the fact that there is a risk from a second source, makes no sense at all if the object is to avoid all risk as far as reasonably possible. The logic was clearly faulty. The object should have been to eliminate both risks as far as that could reasonably be done – and the first step in achieving that would be to cease seeking donations from prisons. The advice that it was unnecessary to discontinue donations looked at the wrong question: it should have been whether it was safer to do so. And the reason that it was “not necessary” was based on the efficacy of the latest tests for Hepatitis B. Not only did this entirely miss the possibility of NANBH, which by now was known, and the problem of “window period” infections, but it also assumed a reliability in detecting infective units of blood which the tests at this stage simply did not have. In short, the CMO missed an important opportunity to ensure that safety was the primary consideration and instead permitted – indeed, it could be said, encouraged – the continuation of a dangerous practice.

His letter left the question of continuing to take blood from prisons up to the transfusion directors. They had been split on the issue in 1973, partly because of uncertainty about the reliability of the statistics. That uncertainty could no longer be a reason given the opening sentence of Dr Yellowlees’ letter, for he appears there to accept that he is satisfied that the risk is “relatively high”.[713] The tide of evidence, and sense of proper practice, was turning against the continuation of prison sessions. Though some transfusion directors acted reasonably quickly to stop such sessions, others did not.

I note that the Scottish National Blood Transfusion Service (“SNBTS”) in their closing submissions considered that there was probably an overreliance on Hepatitis B surface antigen (“HBsAg”) testing to provide safety for donations from prisons, due to an underappreciation of the concept of the window period and of the risk of transmission and severity of NANBH. As is clear from what is said above, I agree: my comments which go further, about the CMO, are of course in respect of the CMO in England, about whose particular actions I would not expect SNBTS to express an opinion.

Returning to the factual narrative: in 1976 the International Society of Blood Transfusion published guidance advising against collecting blood from prisoners,[714] and in 1981 a further study was carried out which demonstrated that in the west of Scotland HBsAg positive donations in the male prison donor population were almost five times higher than in the general male donor population.[715]

It was not simply the high incidence of hepatitis that made prisoners unsuitable as blood donors. There were real concerns that they might not always be true volunteers, and that the environment in which they were donating was not conducive to them telling the truth about any risk factors they might have (such as being an intravenous drug abuser or having a viral infection).

Despite this, as late as the early 1980s, it was still Home Office policy to encourage prisoners to become donors as it was believed to help with their rehabilitation.[716]

A survey conducted by Dr Ewa Brookes of the East of Scotland Blood Transfusion Service on 13 September 1983 showed that Wessex, Bristol, Cardiff, Liverpool, and Leeds were still conducting donor sessions within prisons and that Oxford, Newcastle and Birmingham had only stopped that year.[717]

When understanding developed in the UK that HIV was transmitted by blood, that there was a relationship between hepatitis carrier status and AIDS, and that there was an “extremely high incidence of hepatitis carrier status” amongst the prison population, continued donations from prisons could be seen as fraught with risk. The realisation of a growing epidemic of AIDS signalled the beginning of the end for prison sessions for those RTCs still holding sessions in corrective institutions.[718]

The collection of blood from closed prisons and borstals in England and Wales finally stopped at the end of 1984, and from the last open prison in 1986.[719]

By way of comment, the predominant consideration in acquiring and supplying blood for transfusion should have been safety – of the blood, principally, for those who would receive it as a transfusion, or test or handle it in its journey from vein to vein, as well as the safety of the donor. Between it becoming clear to transfusion directors in late 1973 that there was probably a higher risk of hepatitis amongst prisoners,[720] as it was to at least seven of the fourteen directors at that meeting, and confirmation of the reality of that higher risk by the opening sentence of the CMO’s observations in his letter of 1 May 1975, they should have begun to phase out prison sessions. They were inhibited in part from doing so by the attitude of the Home Office (then the government department responsible for prisons) seeking to advance the rehabilitation of prisoners. However, between taking a step which would improve the safety of donated blood and minimise damage to the public health, on the one hand, and taking a step which might help contribute to the rehabilitation of a prisoner (only in a very few cases likely to contribute to public health) on the other, there should have been little contest. Blood safety and public health should have been predominant. I do not blame the Home Office for advancing its own cause: but the blood services should have resisted earlier and more effectively than they did, for they had the better case. Prison donations continued for much too long.

Scotland

In Scotland, blood was collected from prisons between 1957 and March 1984.[721]

Similar debates occurred about whether prison collections should be stopped because of the increased risk of hepatitis. The decision, as in England and Wales, appears to have been left up to the individual transfusion directors.

Views as to the social utility of collecting blood from prisoners were also held in Scotland. For example, Dr John Wallace, director of the Glasgow and West of Scotland Transfusion Centre, wrote in 1977 in the textbook Blood Transfusion for Clinicians that as “the incidence of HBs antigenaemia among male prisoners in Scotland is less than 1 per cent using the most sensitive techniques of testing … it is socially and psychologically undesirable to exclude prisoners … acceptance of prisoners helps to rehabilitate, and some of these volunteers become regular donors after release.”[722]

On 29 March 1983 Dr (later Professor) John Cash reported that the Medicines Inspectorate had commented adversely on the practice of collecting blood in prisons and borstal institutions, and he invited directors to comment on the practices in each region and to give their view on the Medicines Inspectorate’s criticism. It was reported by the directors present that sessions were held in penal institutions in all regions, although Dr Brookes and Dr Stanislaw Urbaniak intended to review the situation in their regions. The directors were unable to agree on future policy at that meeting.[723]

By December 1983, Dr Brookes told the SNBTS directors meeting that “the only Scottish region to continue holding sessions” was now Glasgow,[724] which held its final session on 25 March 1984.[725]

I have already noted the views of the SNBTS expressed in final submissions, and accept them. I would add only that the risks of NANBH were, for similar reasons, likely to be higher in penal institutions, and to take donations from them risked not only Hepatitis B, known to be present at higher rates, but those risks too, and (as should have been appreciated after the start of 1983) a greater risk of the putative virus which caused AIDS as well.

Northern Ireland

In Northern Ireland, prison sessions took place across the country and the last session occurred in Belfast on 26 October 1983.[726] In his evidence to the Inquiry Dr Morris McClelland was asked: “Looking back now, and having regard not only to the fact that prisoners might be regarded as being a higher risk group, but also the fact that they may be less well placed to give candid answers to questions, may be less truly voluntary as donors, do you think that prison donations should have stopped long before October 1983?” He responded: “I think there is an argument for that, yes. I think there is an argument that they should have.”[727] He was frank in acknowledging this. He was right to do so. Prison donations in Northern Ireland should have stopped long before October 1983.

Armed forces

England and Wales

The armed forces were another source of blood donors throughout the 1970s and 1980s for a number of RTCs, for example Sheffield (formerly Trent).[728] Despite this cohort of donors giving rise to risks not far removed from those seen in the prison donor population, in particular the fact that they were in an environment that made it difficult to tell the truth about their personal risk factors, such as being gay, or being or having been an intravenous drug user,[729] there was less concern expressed by RTDs about donor sessions being held at military bases.[730]

Scotland

Blood was also collected from the armed forces in Scotland. These sessions were less important to the service than the prison sessions (approximately 0.2% of donations were collected from the military). SNBTS has told the Inquiry that they are not aware of evidence that military personnel based in Scotland were at higher risk of Hepatitis B, HIV or Hepatitis C infection than the general Scottish or UK donor population.[731]

As in England and Wales, the Scottish transfusion directors did not consider this cohort to give rise to any special risks, albeit Dr Jack Gillon expressed a concern about the enthusiasm of the officers in charge of the blood collection in wanting the sessions to be a huge success. He recalled returning from a session “really quite concerned because I couldn’t be 100 percent sure that all of those donors were truly volunteers. I just had a feeling at the back of my mind that it was ‘You, you and you’ and I’d spoken to one or two of the donors personally. I think some of them did have reservations about answering the questions, and I came away from that session really feeling very uneasy.”[732] Dr Gillon also recalled that it was difficult to find an area private enough to be sure that the donor was telling them everything that they wanted to say.

Northern Ireland

In Northern Ireland, collections from the armed forces made “quite a significant contribution, especially in the 70s continuing in the 80s.” Dr Morris McClelland recalled “Quite a lot of sessions.” This is perhaps unsurprising because of the large army presence in Northern Ireland as a result of the Troubles. These sessions were, in the evidence of Dr Morris McClelland, a valuable source of donors at a time when the blood service was struggling to maintain the blood supplies.[733]

Unlike the RTDs from the rest of the UK, Dr Morris McClelland gave evidence that he was aware that “there was a higher incidence of hepatitis B among army donors, certainly”. He said that was his experience. However, he was unsure how much consideration he would have given to this, given how important these donors were.[734]

Workplace sessions

Workplace sessions were another important source of donors throughout the 1970s and 1980s.[735] Sessions took place at factories and other large workplaces during the normal working day. These sessions were planned by the donor management departments at the regional transfusion centres.[736]

The decline of heavy industry in the 1980s reduced these donor sessions, particularly in the more industrialised regions such as the North East of England.[737] Similarly, in Northern Ireland the largest factories had been a very valuable source of donors. Their closure, as well as the Troubles, had an adverse impact on donations.[738]

While there is no evidence to suggest that there were higher levels of blood-borne viruses in these donor populations,[739] given the context in which sessions were organised, there were risks that employees were (or at least felt) pressured to donate, and may have found it difficult to admit to risk factors that made them unsuitable to be a donor. Little consideration appears to have been given to this by any of the RTDs in the UK.[740] Much might depend on how a donor session was best to be organised: the subject of the next section in this chapter.

Donor sessions and how they were arranged

RTCs held donor sessions in a mixture of static and community sessions. The static sessions were those that took place in bespoke clinics.[741] In larger regions, smaller centres were set up across the region to make it easier for donors to give blood.[742] Plasmapheresis sessions took place in static clinics because plasmapheresis machines could not be easily moved.[743] For most centres, the majority of their donor sessions took place in the community and while practice varied from RTC to RTC, most centres tried to hold sessions in every part of the region they covered.[744] Sometimes sessions were held in community or church halls, sometimes at a place of work (such as a factory), sometimes in a public building such as a library and sometimes in a prison or a military establishment. Increasingly RTCs had mobile centres (“blood mobiles”)[745] that could be parked in a car park, and donors bled within the vehicle. Some RTCs used blood buses to transport donors to sessions.[746]

The number of sessions held by each centre depended on that region’s needs. Thus, for example, by 1988 the North London Blood Transfusion Centre had three static centres open between five and six days a week, together with twenty-three community sessions a week and five sessions undertaken in the blood mobile vehicle.[747]

Apart from casual or new donors attending drop-in centres, donors were invited by call-up cards. These were often postcards (North London for example used this system).[748] Other centres invited donors by letter: this was more expensive, but gave them the opportunity to provide leaflets which a donor could read in advance of a session.[749] These leaflets generally asked donors not to attend unless they could satisfy the criteria set out (in practice, that meant not being excluded by any one of a range of conditions, illnesses, or circumstances).[750]

As a general rule of thumb, between 100 and 200 donors would be bled at each session.[751] Given the voluntary nature of the donation, it was obviously important that the donor was not left to wait in a long queue before being bled. All of this meant that there was pressure on those running the sessions to process donors as quickly as possible.

Each donor would be booked in by a member of clerical staff who should have had a copy of the most up-to-date national guidelines produced by the RTDs.[752] Donors would be given written material on arrival, which set out the eligibility criteria. It was then the role of the clerk to ascertain (in the first instance at least) whether a particular donor was eligible or should be refused.[753] Although practices varied from centre to centre as to how this was done,[754] the only centre where these questions were asked orally rather than by presenting them in a written form appears to have been the Glasgow RTC.[755]

All donors would be asked to sign a form affirming that they had read and understood the information provided to them, albeit in the West of Scotland, the donors were asked to sign a separate register.[756]

The presence of some of the health conditions listed in the guidelines meant an immediate refusal. Some required referral to the medical officer for determination.

Given that the process of establishing whether a donor was eligible in the main involved either a non-medically qualified clerk asking the donor a list of pre-prepared questions about her or his health or the donor reading a leaflet about the eligibility criteria,[757] it was important that the written material was clear. However, the form that the donors were asked to sign, the NBTS 110, stated that donors must be asked whether they had ever had a number of infectious diseases including “jaundice”, but did not mention hepatitis on the revised form issued in 1985, despite the fact that at that time a donor would be ineligible to donate if she or he had had hepatitis in the previous 12 months.[758]

A medical officer[759] was to be in attendance. They would deal with any queries from the clerk (or the donor) about a particular donor’s eligibility. Such queries would in some circumstances have required the donor to be asked further questions about their health. In some cases, the query would be referred to the RTD for a decision to be made at a later date.

The fact that the majority of sessions were held in open spaces in community buildings, such as church halls and workplaces, meant that there was often, if not usually, little opportunity for private discussion between the donor and the clerk or medical officer. Thus, questions about whether or not the donor might have an infectious disease would often in effect be taking place in public,[760] and might easily be overheard. Some centres took screens so that donors could at least be separated from the main area, albeit presumably the conversation that gave rise to the need for privacy would have happened in the main area.[761] But as Dr Lorna Williamson candidly admitted, “a rural setting is difficult for people who don’t want others in the village to know about their personal lives.”[762] The same could of course be said of a session that took place in a workplace, a military barracks or a prison. Following a visit to the New York Blood Center in 1984, Dr (later Professor Dame) Marcela Contreras developed an innovative system which was not adopted elsewhere. The North London Blood Transfusion Centre operated a system like a polling booth in which a donor would tick “Yes” or “No” to a questionnaire asking them whether they belonged to seven risk groups and post it in privacy into something like a ballot box, as a result of which a donation could be withdrawn if necessary.[763]

Different centres had different procedures for dealing with donors whom they considered (or suspected) were unsuitable to donate. At the Northern RTC for example, they had a book which was used to make notes about any particular donor that the clerical staff or the nursing staff felt warranted further investigation.[764] In South East Scotland there was, for a short period, a procedure where if staff had concerns about a particular donor, the donation would be taken and marked “? infective donation”. The donation was then not used.[765]

Even where donors were being asked to declare health conditions as a result of reading a leaflet rather than being orally questioned, most RTCs had no mechanism by which donors could declare themselves ineligible to donate, without losing face in front of other donors (and so perhaps fuelling speculation about the reason for their ineligibility). This problem became particularly acute when the blood service introduced leaflets setting out eligibility criteria for donors in response to the threat of HIV and AIDS, given the significant stigma associated with those infections. The exception to this was the North London Blood Transfusion Centre under the directorship of Dr Contreras, who in addition to other measures,[766] had the “polling booth” system already described.[767] This allowed donors who felt under pressure to donate (because for example they had come with work colleagues), to do the right thing and declare their health conditions without losing face.

The various processes by which blood for transfusion was obtained from donors left open several possibilities: that a donor might not realise they should not be donating blood, or might feel compelled, having come to a donation session, to “go through with it” rather than withdraw. Donors had no easy way of alerting the donation team taking their blood to the possibility it might pose a risk. Opportunities to explore any issues with the medical professional supervising the session would be limited not just by time but by circumstance, for privacy could not be ensured. The constraints within which the blood services operated were not of their choosing, but there always remained loopholes through which an infected donation might slip. In better, or better resourced and equipped, sessions this might not have happened, or at least the possibility might have been reduced. It is surprising that more innovative approaches such as those implemented in North London were not adopted to provide donors with greater privacy in which they could say if their blood might pose a risk.

Donor screening

Donor screening is the process of assessing the risks and suitability of the individual donor who attends to make a donation.

From the early years of the blood service in England and Wales, there were attempts made by RTDs to set national standards for the selection of donors. They drew up guidelines. The first version, dated 1946, was entitled the Medical Examination and Care of Donors.[768] These guidelines were of course not binding on regional transfusion centres, for the reasons set out in the chapter on the Organisation of the Blood Services. The guidelines were updated on an ad hoc basis.

Initially donors in all four nations had to be between the ages of 18 and 65. By 10 October 1990, the North West and North East Thames RTCs had agreed that if donors had given blood at least four times in the previous five years, including once in the previous twelve months, they could continue to give blood for up to a further five years.[769] This became national policy in England and Wales (although not in Scotland) in December 1993.[770]

By the middle of 1990 in Scotland the SNBTS were recruiting donors from the age of 17.[771] The policy was described by Mairi Thornton, the National Donor Services Manager, as “an unqualified success.”[772] In Northern Ireland, the lower age limit for donors was reduced to seventeen and a half with parental consent.[773]

In general, whole blood donors were called every six months but some centres believed that this was too conservative and encouraged more frequent donations. Dr Huw Lloyd told the Inquiry that once every six months “was generally less frequent than was considered acceptable”[774] and by 1988/89 leaflets were amended to make it possible for donors to give blood every seventeen weeks.[775] At the National Management Committee meeting of 16 April 1991 a decision was taken that a minimum interval of 12 weeks between donations for male donors would be accepted as a national standard.[776]

There was an emphasis on donors being healthy, or in “normal health”. This was primarily to be assessed by the donor him/herself who was said to be “the best judge”.[777] The blood services relied on donors answering the questions they were asked truthfully. This made it all the more important to set the donor sessions up in such a way that donors could refrain from donating without loss of face or worse in front of work colleagues, employers or members of their community.[778]

As the 1946 edition of “Medical Examination and Care of Donors” frankly admitted, the superficial physical medical examination carried out during the session was “in general, so incomplete and unrevealing that it is in most cases not of great value.”[779] Nevertheless it was claimed that “The experienced doctor can detect at a glance the potentially unsuitable donor” and this language was modified only slightly in later editions.[780] It was acknowledged by some of the RTDs who gave evidence to this Inquiry that this is simply not so.[781]

Criteria for excluding donors with a history of hepatitis or jaundice

England, Wales and Northern Ireland[782]

The guidelines set out a number of different health conditions or lifestyle markers that would result in a donor being either rejected as a donor, or deferred for a period of time. From as early as 1945, when the English and Welsh RTDs began meeting, they expressed concerns about the transmission of jaundice to those receiving blood and blood products. As a result, the 1946 guidelines required first-time donors to be asked specifically whether they had had jaundice within the previous six months. The inference was that a donor who had a history of jaundice more than six months previously might be bled.[783]

On 8 August 1952 Dr William d’A Maycock, who had been made aware that the WHO’s Expert Committee on Hepatitis was about to recommend the exclusion of donors with a history of jaundice “at any time”, wrote to the RTDs recommending that the blood service should begin excluding “at once” any donor who gave a history of jaundice.[784] A decision was formally made by the RTDs to accept this recommendation at their meeting on 1 October 1952.[785]

The criteria for the deferral or rejection of a donor with a history of hepatitis or jaundice were revisited from time to time by the RTDs but remained largely unchanged[786] until November 1976 when the DHSS issued a circular which set out the recommendations of the Advisory Group on Testing for the Presence of Hepatitis B Surface Antigen and its Antibody that: “The practice of permanently excluding from the panel donors with a history of jaundice may be discontinued provided that HBsAg is not detected by reverse passive haemagglutination [(“RPHA”)] (or a test of equal sensitivity) and that the donor has not suffered from hepatitis or jaundice during the previous 12 months (paragraph 18).”[787]

This led the RTDs to update the Memorandum on the Selection, Medical Examination and Care of Donors in 1977. This updated version provided that individuals who gave a history of jaundice or hepatitis could be accepted as donors as long as they had not suffered from jaundice or hepatitis in the previous 12 months, had not been in close contact with hepatitis or received a transfusion of blood or blood products in the previous six months, and as long as their blood gave a negative reaction for the presence of HBsAg when tested by reverse passive haemagglutination (“RPHA”) or radioimmunoassay (“RIA”).[788]

While the donor selection criteria were subject to a number of further updates,[789] this particular provision remained unchanged[790] until after the introduction of Hepatitis C screening in the blood service.[791] The relaxation in 1977 was based upon a belief that tests for Hepatitis B would offer protection[792] (though these were notoriously poor at identifying all cases of infection when first introduced and, though improved, were still imprecise after the introduction of the RPHA test) and did not consider the possibility that donors who had been positive for Hepatitis B might be at increased risk of suffering from NANBH. The link between the lifestyle choices which often led to an increased risk of the former and the chances of suffering from NANBH was the basis for suggestions, made not long after this, that a surrogate test for NANBH in donations involved identifying those who had markers for Hepatitis B.[793] The relaxation should have been reconsidered much earlier than it was.

The evidence of Dr Colin Entwistle as to how this particular part of the guidelines came into being is informative. He was chair of the working committee tasked with producing these guidelines.[794] He nevertheless gave evidence that he himself did not agree with the practice of allowing those with a history of jaundice or hepatitis more than 12 months previously to donate. The guidelines were, according to him, “a common agreed policy which everyone can agree to”, in other words they should not be seen as a record of best practice.[795] This is open to the comment that the search for a policy all could sign up to can become a search for the lowest common denominator.[796] It ought to be the case that documents such as this were reflective of up-to-date and developing knowledge, so as to provide a lead – and it is plain from what Dr Entwistle had to say that it was not. I do not criticise him for this, for it is in the nature of working parties that some views must yield to others, but the result is not one which the working party should have adopted.

It is worth noting that both the DHSS circular and the subsequent publications of the Memoranda on the Care and Selection of Donors in 1983 and 1985[797] were at odds with two important international documents in so far as the acceptance of donations from those who had previously suffered from jaundice was concerned. First, the International Society of Blood Transfusion, in its 1976 publication Criteria for the Selection of Blood Donors, recommended that any prospective donor who gave a history of viral hepatitis at any time should be excluded (save where the history was within the first months of life).[798] Second, the equivalent American Red Cross leaflet envisaged the permanent deferral of persons with a past history of viral hepatitis, and was appended to the recommendations of the Committee of Ministers to the Council of Europe in June 1983 as an example of good practice for national blood transfusion services wishing to draw up their own leaflets.[799]

It is particularly concerning that the 1983 and 1985 repetitions of this approach came at times when there was widespread concern about keeping blood safe from being a carrier of the suspected (and latterly known) viral cause of AIDS. The introduction of a surrogate test for the cause of AIDS which was under consideration in 1983 aimed to use Hepatitis B markers as indicative of certain lifestyle patterns which it was thought would be more conducive to viral infections, such as the putative viral cause of AIDS. It is inconsistent on the one hand to accept evidence of having had hepatitis as a reason to screen out donations because of a possibly increased risk they might cause AIDS, and on the other to maintain criteria that at an individual level permitted the continued donation of blood by someone who fell in that category.

Other exclusion criteria relevant to the risk of hepatitis

By at least 1960 the guidelines also included a provision that those who had been transfused with blood or plasma within the last six months were to be deferred.[800] Dr Tony Napier in his oral evidence explained that the rationale for this was the potential opportunity for the transmission of hepatitis and that by six months “it would become apparent whether infection had taken place or not.”[801]

From 1977 those who were suspected of or admitted to illicit drug-taking were to be debarred.[802] However, it is unclear how those who might have fallen into this category would be identified (unless there were needle track marks in the arm from which the donation was to be taken), or even what attempts might typically be made to identify them. The form which donors were asked to sign made no reference to such drug use as an exclusionary factor. Dr Patricia Hewitt could not recall any measures in place (other than in relation to the AIDS leaflets)[803] to prevent those with a history of intravenous drug use from giving blood and did not recall whether any consideration was given expressly to formulating leaflets directed at those who might have used intravenous drugs.[804] Dr Morris McClelland thought it would be based “on interview and the general assessment” but could not remember “that there would have been very much specific beyond that asked of donors that might have uncovered that kind of thing.”[805]

Scotland

Like England, Scotland had, up until at least 1980, rejected all those with a history of hepatitis, but by March 1980[806] was excluding only those blood donors with a history of jaundice within the previous 12 months. This was primarily because Scotland followed the English selection criteria,[807] which was unsurprising given that the working party responsible for drafting the guidelines had a representative of the Scottish RTDs on it.

In 1982 Dr Brian McClelland and his team at the South East Scotland Transfusion Centre at Edinburgh drew up their own guidelines entitled Guide to Selection of Blood Donors.[808] This stated that those with either jaundice or hepatitis should be deferred for a year, that any donor who knew she or he was a carrier for serum hepatitis should be put off service, and otherwise the guidance was “at their first donation 1 year after their recovery record ‘Hepatitis’ on donor’s name slip and inform Hepatitis lab.”[809] Dr Brian McClelland in his oral evidence said he could not remember why this was the case but thought it was so that the donation could be taken out of circulation.[810] If this was indeed the practice, it is not clear why the guidelines did not make it clear to the donor attendant (who could in turn inform the donor) that in fact a history of hepatitis debarred individuals from donating.

These A-Z guidelines identified some conditions for which the advice was to defer and some for which the advice was that the donor be regarded as permanently unfit to donate. In relation to drug abuse, however, the stipulated action was that the doctor or nurse should be consulted.[811] This was accompanied by advice that at least six months should have elapsed from the use of parenteral drugs before a donor could donate because of the risk of serum hepatitis, and that there should be borne in mind “the possibility that the history given by these donors regarding the abuse of drugs may be unreliable.”[812] In other words, the guidelines did not provide for the automatic exclusion or deferment of those with a past history of drug misuse. Dr Brian McClelland in oral evidence thought that donor attendants would defer such donors but could not say when permanent deferral became the practice, noting that: “It was always a problematical thing to judge and we did discover that there were people who told us they were not drug abusers, who turned out to be hepatitis C positive, and it transpired on full questioning that they had injected drugs perhaps once 20 years ago and did not consider themselves to be drug abusers. But I cannot give you an answer to your specific question.”[813]

In April 1986 the Scottish centre directors agreed that SNBTS should produce its own selection criteria for donors, based on South East Scotland’s A-Z document.[814] SNBTS published Guidance for the Selection, Medical Examination and Care of Blood Donors in November 1987.[815] This allowed:

UK as a whole

The blood services of the UK came together in 1989 to publish, under the UK Blood Transfusion Service/National Institute for Biological Standards and Control (“UKBTS/NIBSC”) Liaison Group, the first edition of what became known as the “Red Book”.[817] The criterion with respect to hepatitis and jaundice (they were separately listed) was that a donor should be allowed to donate 12 months after recovery.[818] This left open the possibility that carriers of Hepatitis C might continue to donate. That was unwise since, in the case of a chronic infection persisting after an initial acute phase, everything depended on how “recovery” was defined in practice.

It was not until the second edition of the Red Book was published in 1993,[819] by which time Hepatitis C screening had been introduced throughout the UK’s blood services, that the guidelines explicitly excluded those infected with Hepatitis C from donating.[820]

Commentary

Because of the generality with which groups are defined, it is almost inevitable that individual – indeed, most – members of some groups may not be infective, though the group as a whole shows a higher prevalence of disease than the general population. These individuals may feel blamed for the attributes of others, and regard spurning a donation which they wished to give to benefit fellow humans as being churlish, discriminatory or worse. In the 1970s and 1980s gay communities were groups which, viewed overall, had within them a higher prevalence of infection. The same was certainly true of those who took (or had ever taken) drugs intravenously. Those in prison, viewed overall, were a group in which there was a markedly increased prevalence of infection. It has been suggested that the same was true of members of the armed services. Merely to be a member of such a group did not mean that an individual was infected. Nonetheless, in the absence of a reliable test for either HIV or NANBH, it reduced the risk generally to decline donations from anyone who was a member of such a group.

Unfortunately, blood continued to be taken from those in prison in the UK until the end of 1984, long after it had been recognised that prisoners as a group were much more likely to suffer from hepatitis than those in free society. Whilst sourcing blood from serving prisoners persisted, the risk was increased, perhaps only by a little, but undoubtedly. The moral argument that prisoners should have the opportunity of being altruistic in donating blood, as part of the rehabilitative process, has much to recommend it. It echoes the rationale for having the sentencing option of requiring community service as one way in which a convicted person may be helped to live a better life in future. Where there is a serious threat to health, though, which can be reduced by avoiding taking donations from higher-risk groups, the altruism of the would-be donor is better served by standing aside from donation,[821] rather than giving blood, and the system as a whole should not ask such a donor to provide it, for to do so both compromises the health of the general population and suggests, erroneously, that the moral thing to do is to donate rather than to refrain.

The voluntary donation system helped the process of donor selection. In general, donors would not want to give a blood donation if they felt they might be creating a disproportionate risk of harm by doing so: spreading disease, rather than saving lives. Accordingly, information sheets advising would-be donors whether they should defer donating for the time being were of critical importance in helping the system to identify the most appropriate donors – or, put the other way round, to avoid donations from those who might be the source of increased risk. A significant emphasis was thus placed on “donor leaflets” in the fight against the spread of (first) hepatitis and (second) the cause of AIDS.

The Expert Committee on Hepatitis of the WHO, in 1952, identified not just donor selection but donor screening as necessary steps in seeking to reduce the impact of serum hepatitis. Both hold good, too, for seeking to reduce the impact of AIDS. Just as is the case with NANBH/Hepatitis C, so too HIV infection may have a long phase when few if any symptoms are apparent. Care needs to be taken, therefore, not to limit donor exclusions to those who are visibly suffering, or report significant symptoms. Taking personal history is a relevant part of donor screening. It is plain that this is best done by personal donor interviews, coupled with carefully worded donor questionnaires – if time, space, and resources permit it. They did not always do so. In his evidence Dr Brian McClelland spoke of the attempts made in Edinburgh and the South East of Scotland to introduce a comprehensive questionnaire (of especial value given the cramped nature of many of the places where donor sessions were held) which had to be abandoned due to insufficient staff; the best that could be achieved was a personal interview of all new donors, finally introduced in 1992.[822] This evidence shows that it would not necessarily have been easy to introduce direct confidential questioning of the private lives of donors: but it is regrettable that it was not done, at least in some form, when AIDS first broke, even if it were some combination of a personal questionnaire and a face-to-face encounter informing a donor of the need to signify[823] if their blood should be used for research purposes rather than transfusion because they were or had been associated with a group at higher risk of infection.[824]

Efforts to persuade treating clinicians to change their practice

The role of the RTCs in educating clinicians to minimise their use of blood

England and Wales

In 1982, the DHSS Central Management Services published a report called Blood: Record Keeping and Stock Control.[825] This made a number of recommendations as to the role of the RTCs in promoting good practice in blood transfusion, including “economies in blood usage”.[826]

The DHSS issued a circular on 28 February 1983 to (amongst others) regional administrators which referred to Blood transfusion: Record-keeping and Stock Control Arrangements. This stated:

“To facilitate a Regional review of policies, it is suggested that RMOs [regional medical officers] should convene regular meetings between their Regional Transfusion Directors (RTDs) and the consultants responsible for the hospital blood banks in their Regions to consider matters such as current and future requirements for blood, the scope for economies in blood usage, the proportion of plasma-reduced blood to be supplied, the use of ad hoc deliveries and the amount of stock which becomes time-expired in blood banks. The meetings should also provide the forum for the exchange of ideas as to what constitutes ‘good practice’ in the Region with regard to blood supplies.”[827]

In June 1989 Dr Harold Gunson produced a report for the NBTS Co-ordinating Committee in which he stated that the NBTS saw its professional role as encouraging the minimum use of blood and blood products consistent with clinical need and patient safety. He encouraged RTCs to “question atypical and abnormally high orders” and issue some products on a “case by case basis”. He also envisaged the RTCs performing a monitoring role, which would only be effective if the RTCs remained the sole suppliers of Blood Products Laboratory (“BPL”) products to hospitals in their regions.[828]

The extent to which individual regional transfusion directors considered they had a role in educating their colleagues on the use of blood varied from RTC to RTC and also developed over time:

Of particular note was Professor Contreras’ oral and written evidence, which made it clear that she considered it to be a key role of the RTCs to educate clinicians about the responsible and ethical use of blood. In so doing she:

  1. Introduced a joint transfusion medicine consultant post, half funded by the RTC, with the consultant spending the other half of their time in the hospital blood bank.
  2. Started hospital transfusion committees for education in transfusion medicine and to monitor blood component usage.
  3. Performed audits of the usage of red cells, fresh frozen plasma and platelets and showed that there was a great deal of unnecessary transfusion.
  4. Wrote a number of publications and gave numerous lectures regarding the risks of blood transfusion and measures to increase safety.
  5. Organised meetings on transfusion-transmitted infections, to educate and update the medical community.[835]

At the other extreme, Dr Entwistle, the director of the Oxford Centre, did not consider the education of his colleagues on the use of blood to be part of his role.[836] This was despite the fact that the haematologists in the Oxford region used to meet every three months or so.[837]

Scotland

The SNBTS directors examined the DHSS publication Blood: Record Keeping and Stock Control and produced modified recommendations applicable to Scotland, starting with a recommendation that RTCs “accept a formal responsibility for encouraging good practice” in the hospital blood banks they supplied.[838]

Dr Gamal Gabra recalled that Dr Wallace, then director for the West of Scotland, was an early promoter of using blood components rather than whole blood and that clinicians “gradually” accepted this approach.[839] Professor Urbaniak set up a Hospital Transfusion Committee in the Aberdeen Royal Infirmary to monitor usage and compliance with maximum surgical blood ordering schedules for each operation and procedure that might require a transfusion, apart from those with risk of massive rapid blood loss. As a result, the use of whole blood rather than red cell concentrates reduced and the total number of transfusions per operation was reduced.[840] Dr Boulton explained in his oral evidence how during his time in Edinburgh he did audits of blood usage around the hospital, working closely with cardiac surgeons to achieve a more rational use of red cell concentrates with less emphasis on the freshness of the blood.[841]

Northern Ireland

Dr Morris McClelland considered that he had a role in persuading colleagues to use less whole blood and more red cell concentrates; that “The most effective route of influence” was the haematologists and laboratory staff in charge of hospital blood banks who in turn influenced the clinicians in each speciality.[842] He told the Inquiry that he and Dr Chitra Bharucha took every opportunity to influence these staff who, in turn, were in a position to influence the clinical users of blood in each speciality. The rapid uptake of red cell concentrates from 20% of the Belfast output to 75-80% in the space of three to four years was, in his view, testament to the success of their work.[843]

The role of the RTCs in persuading clinicians to use safer products

One of the obvious steps that treating clinicians could (and should) have taken once it was understood that HIV was a blood-borne infection was to reduce all patients’ exposure to blood and blood products as far as reasonably practicable consistent with providing treatment. This could be done in two ways. First, less product could be given.[844] Second, products that had a lower chance of being infected with HIV could be prescribed, namely single-donor products or small pooled products.[845]

RTDs were asked about the extent to which they played a role in persuading treating clinicians (whatever their specialty) to take either of these steps.

In England and Wales

They did not, on the whole, consider that they had a role in trying to influence treating clinicians to prescribe one product over another on the grounds of safety. A principal reason for this was respect for the clinical freedom of doctors.[846] Two exceptions to this were both directors of the Trent RTC. Dr Charles Bowley sought to make a case for using better quality cryoprecipitate instead of imported factor concentrates.[847] His successor, Dr William Wagstaff, gave oral evidence that he tried to persuade haemophilia clinicians to revert to cryoprecipitate and keep away from commercial products as a result of HIV/AIDS. However, this had no impact on their prescribing policies.[848]

All RTDs who were in post in the years 1982-1984 before blood products were heat treated against HIV gave evidence that had they been asked to increase their production of cryoprecipitate (as a much lower-risk product) during the mid 1980s, they would have been able to do so, and quickly.[849] They were all clear that no such request was made of them by treating clinicians and so no steps were taken by RTDs to achieve this.[850]

Northern Ireland

Dr Morris McClelland, like his colleagues in England and Wales, did not consider it to be his place to question the prescribing practice of Dr Elizabeth Mayne,[851] the Belfast haemophilia director, stating: “I wouldn’t have seen that it was appropriate for -- that I could really influence such change in prescribing patterns.”[852]

Like the RTCs on the mainland, Belfast RTC would have been able to increase its production of cryoprecipitate to 20,000 packs per annum if they had been asked to. No such request was made.[853]

Scotland

Dr Cash was not afraid to express his views about the circumstances in which different blood products should be used. For example:

  1. In 1976 he authored an article published in The British Medical Journal in which he stated that cryoprecipitate was suitable for home treatment.[854]
  2. At a meeting in January 1981 of the SNBTS directors and haemophilia centre directors, he is recorded as emphasising “the important part cryoprecipitate could play in haemophilia treatment” and he suggested considering it for home therapy. The haemophilia directors are reported as not being in favour.[855]
  3. In March 1981 he raised concerns about the amount of commercial product being used at a meeting of the Haemophilia and Blood Transfusion Working Group.[856]
  4. In February 1984 at a meeting of SNBTS directors and haemophilia centre directors he was recorded as asking “members to consider whether, given the present SNBTS production level of factor VIII concentrates, it was necessary to purchase commercially unless exceptionally a superior product was available.” Importantly he also recommended reducing the number of batch exposures per patient per year.[857]

Following a visit to New York in October/November 1983 and his participation in the WHO meeting on AIDS in November 1983, Dr Brian McClelland prepared a paper entitled “Acquired Immune Deficiency Syndrome and Transfusion” which he circulated to SNBTS and his haemophilia director colleagues. Unsurprisingly, the paper focuses on the steps SNBTS could take to reduce the risk of AIDS getting into the blood supply in Scotland. However, it does contain a section entitled Measures to Promote the Safe Use of Existing Blood Products which makes a number of proposals including that “The use of single donor or small pool cryoprecipitate for haemophilia therapy should be reassessed. In particular, the extent to which requirements of good manufacturing practice limit the production of small pool freeze-dried cryoprecipitate should be re-examined and the costs of this product estimated in relation to intermediate factor VIII concentrate.”[858]

The extent to which the views of Professor Cash and Dr Brian McClelland influenced the prescribing practices of haemophilia clinicians is doubtful.[859] There does appear to have been a modest response to the threat of AIDS from the director of the Aberdeen Haemophilia Centre, Dr Bruce Bennett. Following discussion with Dr Bennett, Dr Urbaniak increased the production of cryoprecipitate from 153 units to 425 units between 1983 and 1986.[860]

Dr Gabra in his oral evidence stated that the Glasgow RTC could not have increased its production of cryoprecipitate had it been asked, because it would have had to establish the facilities to do so.[861]

Donation screening

The responses of all four blood services acting as one, nationally, to screen blood donations are set out in the chapters on HIV Screening and Hepatitis C Screening.

Other steps taken in response to an infected donor or donation

Once a donation had reacted to a screening test (“a reactive donation”), or a report had been made to the RTC that gave rise to a suspicion that a particular donation had caused a transfusion-transmitted infection to the end recipient (from “an implicated donor”), then RTCs had to decide what, if anything, they should do.

This section considers what if any steps the blood services took to:

  1. Investigate reactive donations and implicated donors to determine whether other products might be infected.
  2. Quarantine donations from any implicated donors and any products made from them, pending screening being carried out.
  3. Screen previous donations from implicated donors to assess their infective status.
  4. Recall any existing products made from the donations of an infected donor.
  5. Inform BPL that they had been provided with plasma from an infected (or implicated) donor.
  6. Participate in any attempts to recall potentially infected products.

It is clear that none of these steps could be taken without the RTC having accurate and searchable records on all of their donors, and the fate of all of their donations. The issue of records and record-keeping is addressed below.

This section does not address:

The evidence from the RTCs was that they had reasonably robust processes for dealing with a reactive donation. The first step was immediately to quarantine the reactive donation while further confirmatory testing was carried out. This was certainly the practice at the North London RTC.[862] In the event that the confirmatory tests were positive, the second stage in the process was to trace:

  1. any components made from the donation that might still be at the RTC, and destroy them; and
  2. any components made from the donation that had been despatched to hospitals, and inform the blood bank at the hospital so that any existing components could be located and destroyed.[863]

This appears to have been fairly standard practice, and was for example something that the Yorkshire RTC did.[864] The third stage was to identify whether any plasma from the donor had gone to BPL, and if so, inform BPL of that.[865] There is certainly plenty of evidence of RTCs making reports to BPL of infected donations having been provided to them.[866] RTCs could expect BPL, after investigation, to notify them of the batch numbers of any fractionated products to which the potentially infected donations had contributed, and ask the RTCs for their assistance in finding out where those batches went, so that the patients could be followed up.[867]

What is less clear from the evidence is the extent to which, prior to the formal lookback programmes,[868] individual RTCs investigated whether previous donations from an infected donor might have been infected and if so, what happened to them. Certainly until the formal lookback programmes were instituted, there was no obligation on the RTCs to undertake this work.

The expectations on RTCs in England to notify and then counsel donors who were found to be positive for HIV were clearly set out in the CMO’s “Dear Doctor” letter to all doctors in England dated 1 October 1985. This stated “donors will be interviewed and counselled about the significance of test results by senior NBTS medical staff who have received training in counselling.”[869] Thus for example, the North London RTC itself always counselled donors who tested positive for HIV.[870]

There was however no clear guidance to RTCs setting out their obligations to inform and counsel donors found to be positive for Hepatitis B or Hepatitis C. Thus, unsurprisingly, practice differed:

Indeed, donors who were found positive in the three-centre trial of Hepatitis C screening kits prior to the introduction of Hepatitis C screening in September 1991 were not to be informed, followed up or counselled.[875]

The actions of SNBTS in 1984 in response to the news from Dr Christopher Ludlam on 25 October of that year, that six of his patients from the Edinburgh Haemophilia Centre had developed antibodies to HTLV-III after exposure to the Protein Fractionation Centre (“PFC”) batch 023110090 (“the PFC batch”), are worth setting out in a little detail. Initially a decision was made by Dr Brian McClelland and Dr Cash not to recall the PFC batch.[876] By 2 November Dr Ludlam had confirmed to Dr Brian McClelland that in fact 16 of his patients had developed the antibody, and all of them (or possibly 15 of them) had been exposed to the PFC batch. On 3 November Dr Brian McClelland and Dr Boulton contacted all the Scottish RTCs and the Northern Irish RTC to recall any of the batch that was still being held.[877] Dr Brian McClelland told the Inquiry in his oral evidence that: “As it emerged, there was very little of that batch to recall, apart from a few units that were still in the Aberdeen Transfusion Centre. All the rest of it had been transfused some time previously.”[878] No other batches were recalled.[879] The donation which had infected the batch was not identified.[880]

Records and record-keeping

From as early as the 1940s, the importance of record-keeping was emphasised as an essential part of any strategy to reduce the risks of transmission of infections such as hepatitis.[881] It is not surprising therefore that:

  1. The maintenance of accurate records was numbered as one of the key preventative measures in the 1952 report of the World Health Organization’s Expert Committee on Hepatitis.[882]
  2. The 1973 version of Notes on Transfusion emphasised the importance of accurate recording for “the protection of the patient.”[883]
  3. The importance of record-keeping was emphasised by the DHSS in its circular from March 1984: Blood Transfusion: Record-Keeping and Stock Control Arrangements.[884]

One of the consequences of a system of RTCs operating as separate fiefdoms[885] was that each centre had different record-keeping systems. While there were attempts made by the RTDs to standardise some key documents,[886] local practices developed.

Of course during the 1970s and into the 1980s, RTC records were on paper. Keeping accurate records was a challenge. Records about the donor and their donation history were generated at donor sessions in the community,[887] while records in relation to the donation itself were generated in the main at the RTC during the processing phase. Clearly the two types of records had to be reconciled and, importantly, be searchable, so that any donation (together with all its component parts) could be traced, and information obtained as to where it had been sent.

Added to this, some RTCs kept the records of donors who tested positive for HIV and hepatitis separately from their other records, presumably to ensure their confidentiality.[888]

One of the major problems the RTCs had in tracing the eventual fate of the products that they supplied to hospitals was that, unlike in Scotland, where four of the five transfusion centres also carried out blood banking for the hospitals in which they were located, the English and Welsh RTCs had no access to the records of the hospitals and haemophilia centres that received their products. Thus once a labile component, or even a vial of Factor 8 produced by BPL, left the RTC, the RTC had to rely on the information provided to it from the recipient hospital to learn its fate. The Inquiry heard a wealth of evidence to suggest that hospitals were not complying with their obligations to keep accurate records of the transfusion of blood and blood products “to enable each unit of blood to be traced from donation to disposal” as required in the 1983 guidance.[889] The Inquiry has repeatedly heard evidence from individuals who have had a blood transfusion with no record at all being made in their medical records, and countless more where the serial numbers of the units transfused were not recorded in the patient’s records.

Professor Contreras considered the problem to stem not from the hospital blood bank, but from the record-keeping of the clinicians actually receiving the blood from the blood bank.[890]

Even once records were computerised, the RTC computer systems remained separate and were not necessarily compatible with one another – they did not communicate one with the other.[891] In England in 1995 a national computer system was introduced, but the system was divided into the three separate zones that were created at that time, and records initially could not be shared across the zones.[892] It was not until July 2008 that a truly national system was developed.[893] In Scotland, a national computer system (DOBBIN) was introduced in the late 1980s, but the records of each of the five RTCs were segregated one from the other and could not be shared.[894] It was not until 1997/1998 that records could be shared between and across the RTCs.[895]

This meant that where a donor had been rejected by one transfusion centre, there would be nothing stopping them from trying to donate at another since checks on new donors to see whether they had already been rejected by another RTC were not possible.[896] Some attempts were made to remedy this obvious deficiency in the system. Dr Jean Harrison in her statement recalls the consultants responsible for microbiology sharing details of those donors who were unsuitable to donate.[897] Her RTC had a system of “blacklisted” donors – but this operated only within the RTC.[898]

As well as paper records, RTCs began keeping samples of donations, thus making tracing and lookbacks more effective. For example:

Length of time records were kept

Prior to 1992, the recommendation in the first edition of the Red Book was that records should be kept for 15 years.[903] In 1992 Dr Lloyd, Dr Alan Beal and Mr Tony Martina produced a report entitled Record Storage Report for the National Blood Transfusion Service in England and Wales for the National Directorate. This recommended that donor and donation records and policy and management records, as well as records directly linked to donor and donation records such as QA reports, should be kept for 30 years.[904]

Reporting for public health purposes

England and Wales

Hepatitis

In 1946 a system was devised of reporting cases of serum hepatitis to Dr Maycock.[905]

At a meeting of RTDs in November 1973[906] it was agreed that:

  1. If reports of adverse reactions concerning blood and blood products were to be made to the Committee on Safety of Medicines Adverse Reactions Sub-Committee, such reports were best made by RTDs because they almost always heard of and investigated serious reactions associated with blood and blood products.[907] In making those reports, it was not essential that the Yellow Card scheme was used, providing the name of the doctor in charge of the patient was reported.[908]
  2. Cases of serum hepatitis should continue to be reported to Dr Maycock.[909]

By 1980 the system for reporting transfusion reactions was for Dr John Barbara to collate known cases and send a report annually to the Communicable Disease Surveillance Centre of the Public Health Laboratory Service.[910] It appears that these reports did not simply record the data from the North London Blood Transfusion Centre: there is evidence of other RTCs sharing information with Dr Barbara concerning the number of donors found to have HBsAg, so that he could report it for the national survey.[911]

HIV

On 29 January 1985 the Expert Advisory Group on AIDS agreed unanimously that statutory notification of HIV was not required and that an informal approach was to be preferred.[912]

This informal approach involved RTDs reporting incidents of HIV-infected donors to Dr Gunson.[913]

Scotland

At a meeting of the SNBTS directors on 20 June 1985 it was agreed that a system for reporting AIDS cases to the Communicable Diseases (Scotland) Unit should be established.[914] At the meeting on 19 November 1985, there was agreement that the Unit’s form should be used.[915]

Northern Ireland

The focus of information sharing at the Belfast RTC was donor selection. The Belfast RTC encouraged hospital clinicians and GPs to report to it cases of transfusion-associated hepatitis but rarely received reports from GPs.[916] Dr Bharucha recalled only two blood donors who tested positive for HIV; both were followed up and did not donate again.[917]

The response of the blood services to the emergence of AIDS

England and Wales

In early July 1982, probably before the Centers for Disease Control and Prevention (“CDC”) published its report that people with haemophilia in the US had contracted AIDS after receiving factor concentrates,[918] the NIBSC[919] was alerted to reports that plasma from homosexual drug-takers “contains a sort of virus” which might lie undetected but “when used for Factor VIII … becomes active again” and that “It seems that 400 haemophiliacs in the USA have exhibited signs of the virus.” Dr Joseph Smith at the NIBSC informed Dr Gunson in his capacity as consultant adviser in blood transfusion. Dr Gunson then in turn alerted civil servants at the DHSS to this.[920] This happened on, or possibly before, 16 July 1982.[921] However, months would elapse before AIDS became the subject of collective discussion by the RTDs in England and Wales or by the SNBTS directors.

The RTDs held one of their regular meetings on 20 September 1982 but there was no discussion of AIDS.[922] Nor was the threat of AIDS considered at the inaugural meeting of the UK Working Party on Transfusion-Associated Hepatitis on 27 September 1982, a meeting attended by Dr Gunson and several RTDs.[923] It was not raised at the next meeting of RTDs on 14 January 1983,[924] nor at the meeting of the Advisory Committee of the NBTS on 10 January 1983.[925] A brief mention came, finally, at the second meeting of the UK Working Party on Transfusion-Associated Hepatitis on 18 January 1983.[926] However, whilst Dr John Craske “summarised the current situation and mentioned the involvement of homosexuals”, and the minutes recorded that in the US “it is recommended that homosexuals with AIDS be deferred from donating blood or organs”, there was no consideration of the UK following suit. It was not until the Working Party’s third meeting on 20 April 1983 that the possibility of taking action was mooted for the first time and in the following, rather tentative, terms: “Dr Gunson asked members of the working party to bear the topic in mind and consider the possibility of producing a pamphlet for donors illustrating the AIDS risk groups. He was aware that this might have adverse repercussions for donor recruitment etc.”[927] However, the potential effects on arrangements to supply plasma to BPL came under consideration. The Working Party anticipated that the uptake of cryoprecipitate would rise, particularly for those who had not previously received concentrates[928] and that this would mean a drop in supply to BPL.

A week later, at the meeting of the Central Blood Laboratories Authority on 27 April, Dr Gunson reported that the RTDs “had considered all the American literature on this subject, and at the next meeting of their Committee it would be recommended that no further measures be taken, apart from those already being carried out.”[929] As no specific measures had yet been taken by RTDs (and indeed as the subject had not yet been discussed at a meeting of RTDs), the reference to measures “already being carried out” is likely to refer merely to existing donor screening practice.

AIDS was finally discussed at the regular meeting of RTDs on 18 May 1983, some ten months after the Morbidity and Mortality Weekly Report had reported three cases of suspected AIDS in people with haemophilia in the US.[930] Four options were identified by Dr Gunson for consideration by RTDs:

  1. questioning donors at sessions
  2. discontinuing sessions in areas of high-risk donors
  3. pamphlets explaining AIDS to donors
  4. publications in newspapers[931]

A pamphlet prepared by Dr Brian McClelland was also considered.[932] The RTDs rejected options (1) and (2).[933] Instead it was agreed that contact would be made with the Gay Society stating that, until more was known, homosexuals should be asked not to donate blood, and that Dr Tom Davies and Dr Barbara would draw up a leaflet on AIDS which would be circulated to RTDs for comment.[934] It appeared to be understood at that stage that this was something that required relatively quick action: the minutes record the hope that the leaflet could be ready for printing in six weeks and that Dr Diana Walford would try and have it printed through the DHSS as quickly as possible.[935]

Following the meeting Dr Wagstaff wrote to RTDs on 7 June 1983 to ask for feedback on questions raised by senior staff within the DHSS, who were said to be “a little perturbed that the low key approach being recommended by the NBTS is at odds with the more aggressive measures being taken in the United States”.[936]

Dr Wagstaff sent the proposed leaflet to RTDs on 6 July 1983. His covering letter explained that most RTDs strongly felt that the approach to donors should be as low key as possible and were reluctant to hand the leaflet to every donor or send it out as part of the call-up material, but that a small number of RTDs might be asked to run a kind of trial by posting or handing out the leaflets. He recorded the general opinion that the “illness notice” be amended to include unexpected loss of weight and whether the person was in good health or had needed to see a doctor recently.[937]

It was not until 1 September 1983 that the AIDS leaflet was finally published.[938] The reasons for that delay, and the limitations of the leaflet’s wording, are explored in the chapter on Role of Government: Response to Risk. However, it would have been open to RTDs to take their own local action and produce their own leaflets in the meantime. The Birmingham RTC did so,[939] as did Edinburgh,[940] but most centres waited for the DHSS’s leaflet.

When the RTDs met again on 22 September 1983, Dr Wagstaff reported that centres had been encouraged to use differing methods of distribution. The three methods being used were: posting of leaflets with call-up cards, handing leaflets to donors, and making leaflets available at sessions for donors to pick up. The DHSS requested feedback on donor reaction to the leaflet by the end of November at the latest.[941]

A few days later, at the meeting of the Working Party on Transfusion-Associated Hepatitis, it was agreed to minute, and bring to the attention of the RTDs, the Working Party’s preference for a uniform approach to the use of the leaflets.[942]

The next regular meeting of RTDs took place on 25 January 1984, at which it was reported that no offence had been caused to donors by the introduction of the AIDS leaflet at sessions.[943]

The Advisory Committee to the NBTS discussed the AIDS leaflet at its 10 April 1984 meeting. The six-month trial of the leaflet was now complete and the survey of RTDs showed little adverse comment. The DHSS now proposed to prepare, in consultation with RTDs, a revised version of the leaflet for submission to ministers. Dr Keith Rogers suggested that RTDs should adopt “a more aggressive approach” to discourage high-risk donors from giving blood. The Committee recommended that although the method of distribution during the trial period had been left to the discretion of RTDs, ministers should now consider whether the revised leaflet should be sent with the call-up cards in all regions.[944] At the meeting of RTDs the following day, the importance of discouraging high-risk groups from being blood donors was stressed, as was the importance of sessional medical officers being alert in ensuring the fitness of potential donors.[945]

At the meeting of RTDs on 11 July 1984 it was reported that the Divisions had sent comments on the proposed revised AIDS leaflet to Dr Alison Smithies and that the draft would be revised.[946] There was no discussion at the next meeting in October.[947] By the time of the meeting of the Advisory Committee to the NBTS on 8 November 1984 it was reported that ministers had accepted the recommendation of a uniform system of distribution and that the leaflets would shortly be distributed to RTCs for issue individually to every donor. The Committee “advised on the particular problems of getting the leaflet to new donors, and in its use at industrial sessions, and also noted that there would be some cost implications for Centres using a card call-up system.”[948]

A paper prepared for the Working Group on AIDS meeting on 27 November 1984 suggested that it was possible to go further than the use of the leaflets and discourage high-risk donors from giving blood by organising (as was said to have been done in some centres) “more intensive interviewing of donors”, which might be particularly appropriate in areas where it was known that there was an increased population of homosexuals and drug abusers.[949] However, the Working Group was not in favour of closer questioning of donors “to see if they were homosexual etc”, concluding that the leaflet was sufficient: “There was concern that too close a questioning might be counterproductive”, according to Dr Michael Abrams’ report on the meeting to Dr Harris.[950] Notes by one attendee at the meeting reported on the questionnaire being trialled at the North London RTC, offering donors the chance to elect for their blood to be used for research; it was also noted that Dr Contreras and Dr Richard Tedder had met with “London Gay Reps” the previous week and “Got strong message that some homosexuals are still continuing to donate.”[951]

By December 1984 the promised revised leaflet had not yet been made available although, according to a meeting of Western Division NBTS consultants on 7 December, four centres had produced their own.[952]

At their January 1985 meeting, the RTDs expressed anger at the continuing lack of the revised leaflet (as well as frustration at a lack of information regarding progress in relation to AIDS more generally). Dr Gunson reported that the new leaflet would be available on 1 February and that it was expected that a “positive approach to its distribution” would be insisted upon.[953] There was also some discussion about an AIDS poster for donor sessions, which Dr Smithies was said to be proceeding with.

The second version of the AIDS leaflet was at last made available for distribution in England and Wales from 1 February 1985.[954] The reasons for the delay in the production of the second leaflet, and the significance of the changes in the leaflet’s wording, are explored in the chapter on Role of Government: Response to Risk.

In the meantime the Expert Advisory Group on AIDS (“EAGA”) held its first meeting on 29 January 1985, with Dr Abrams in the chair and the CMO, Dr Donald Acheson, in attendance for part of the meeting. It concluded that the blood donor leaflet (likely to have been a reference to the revised leaflet about to be introduced) was not “sufficiently forceful” and needed some redrafting, “particularly with regard to its objective of persuading homosexuals not to donate blood. Consideration should be given to the introduction of some means by which the ‘closetted’ [sic] homosexual – possibly faced at a visit to a NBTS Centre with advice not to give blood – could unobtrusively withdraw from the system.”[955] It seems, therefore, that even before publication of the second leaflet it was recognised that it did not go far enough. However, there was no consideration of this at the next meetings of RTDs on 17 April 1985 and 10 July 1985.[956]

A third version of the AIDS leaflet was available from September 1985.[957] When, in early October 1985, the Eastern Division consultants in the NBTS met, the new AIDS leaflet was criticised and it was recorded that “most Centres were sending an explanatory letter in addition.” The most worrying aspect of the leaflet was said to be that it implied that the anti-HTLV-III test was a test for AIDS.[958] The focus of discussion at the next meeting of RTDs on 9 October 1985 was the imminent introduction (on 14 October) of screening and there was no further discussion regarding the leaflets or their use.[959]

Scotland and Northern Ireland

Neither the SNBTS directors’ regular meeting on 14 September 1982 nor the directors’ next meeting on 14 December 1982 included any discussion of AIDS.[960] AIDS was briefly considered at a meeting on 21 January 1983 between SNBTS directors and haemophilia centre directors in Scotland: Dr Cash drew attention to recent articles in the US and UK (and circulated a Morbidity and Mortality Weekly Report extract) and Dr Ludlam referred to a letter and questionnaire being sent to UK haemophilia centre directors.[961] There was, however, no exploration of any possible action for the transfusion service by way of risk reduction.

The SNBTS directors met again on 29 March 1983. Although consideration was given to the collection of blood in prisons and borstals, the trigger for this was the Medicines Inspectorate’s criticisms of the practice, rather than any particular concern regarding AIDS risks, which (surprisingly) did not feature in the meeting at all.[962]

SNBTS directors finally discussed the implications of AIDS for blood donation at a meeting on 24 May 1983 of the SNBTS Co-ordinating Group. Dr Ruthven Mitchell had introduced into the health questionnaire to donors a question inviting those who were worried about AIDS to consult the doctor at the session. Dr Urbaniak had decided not to do anything locally, “his view being that once a donor had entered the session it was too late to make an approach and the problem was minor in NE Scotland”.[963] Dr Brian McClelland had prepared a leaflet (which he tabled), detailing those donors who should refrain from donating blood.[964] Dr Cash reported on what had been discussed at the recent meeting of RTDs and agreed to contact Dr Barbara for information about the proposed leaflet, after which he would arrange a meeting with the Scottish Home and Health Department (“SHHD”) to discuss the provision of information which directors could use if they wished.[965]

At the SNBTS directors’ meeting on 14 June 1983, attended also by Dr Gunson, the latter explained that he had edited the leaflet currently being considered for England and Wales, adopting a question and answer format, after DHSS officials expressed the view that it would not have the required impact, and that he had identified homosexual men (especially those with multiple partners), drug addicts and anyone who had sexual contact with an AIDS sufferer as donors whom the blood service would prefer not to see for the time being. Dr Brian McClelland explained that he had amended his leaflet following discussion with the Scottish Homosexual Rights Group.[966] The meeting acknowledged that if the purpose of the leaflet was to deter donors it would have to be issued before they attended a donor session. There was also discussion on how best to deter certain donors without causing offence to others, with the suggestion that a wide audience could be addressed through radio and television (although this suggestion does not appear to have been acted upon).[967] Dr Brian McClelland told the Inquiry that he made clear to the other directors that the leaflet was “public property”; ie that it could be used by them.[968]

By August 1983 the expectation was that SNBTS would use the leaflet that was being produced by the DHSS and would be published on 1 September. The Co-ordinating Group of SNBTS met on 30 August 1983 and its minutes note that each RTC now had a supply of AIDS leaflets pending the lifting of the embargo on their release throughout the UK by the Minister for Health, and that Dr Cash had written to each RTD to explain that the method of issue was at their discretion.[969]

In Scotland a range of different methods of using this leaflet was deployed:[970] in the north the leaflets were on display with other publicity leaflets at donor sessions and in plasmapheresis rooms;[971] in the North East they were available at all mobile and fixed site sessions;[972] and in the East they were “on display at the clerking desk” and anyone requesting information was referred to the medical officer on duty. In the West the leaflets were available on request with the medical officer at sessions and Dr Mitchell had incorporated into his “health notice” the question: “Have you heard about AIDS? If you wish to know more you may ask the Medical Officer at the session in confidence or your General Practitioner or write to the Transfusion Director.”[973] In the South East the leaflets were made available at the donor sessions.[974] In Belfast, Dr Morris McClelland had not as of mid September received the leaflets but would make them available at donor sessions once he did.[975]

The SNBTS directors discussed the methods of distributing leaflets again at their meeting on 8 December 1983, where it was agreed that a more active approach would now be “acceptable”.[976] It was felt that each donor should receive a copy and that the health questionnaire should include the question: “Have you read and understood the leaflet on AIDS?” It was also considered that the leaflet should be revised and the plan within SNBTS was to do so without waiting for the DHSS. It was agreed that no further action would be taken until a revised leaflet had been issued and Dr Brian McClelland agreed to produce a revised version for consideration by the Scottish directors.[977]

At a meeting of SNBTS directors and haemophilia centre directors on 2 February 1984 there was a discussion about the effectiveness of the leaflet; it was felt that “some modifications might be made” and it was emphasised that the leaflet must, in the absence of a screening test, be given to all prospective donors.[978] Dr Brian McClelland’s revised draft was considered at the SNBTS directors’ meeting on 13 March 1984, which recorded agreement that a leaflet should be sent once to each blood donor as an enclosure with the call-up letter; directors undertook to send Dr Brian McClelland comments on the draft within two weeks. The minutes noted that “While the leaflet had been mailed to all blood donors in some English Transfusion Regions, in Scotland it had been made available at donor sessions and at some STD [sexually transmitted disease] clinics and the Scottish Directors felt their position would be strengthened by mailing to all blood donors.”[979]

Further comments (in addition to those received since the last meeting) on Dr Brian McClelland’s draft leaflet were made at the SNBTS directors’ meeting on 12 June 1984 and it was finalised.[980]

In Scotland, whilst the text of the revised SNBTS leaflet had been agreed, it appears that the leaflets had not yet been ordered as at 20 November 1984. It was, however, decided that each Scottish RTC would incorporate into their health questionnaire the words: “I have read the SNBTS AIDS leaflet (Important Message to Blood Donors) and confirm that, to the best of my knowledge, I am not in one of the defined transfusion-related risk groups.” It was also agreed that the leaflet must be distributed: with the call-up letter; at sessions to every donor who attended; to the organisers of workplace and college/university sessions; with the registration book to new donors; and (after donation) to the home address of donors who, not having been called to a session, attended nonetheless. The overall aim was that “as many donors as possible should have seen the message before attending a session.”[981]

Dr Brian McClelland told the other SNBTS directors at their meeting on 11 December 1984 about a leaflet from the Terrence Higgins Trust with a “clear explanation” and said it might be necessary to redraft the SNBTS leaflet again.[982]

The SNBTS Co-ordinating Group met on 19 February 1985. Dr Brian McClelland was proposing to amend the item he had added to the health questionnaire to read: “If you think there is any reason why your blood should NOT be used for transfusion, please tick this box and you will not be questioned further”, which he intended to run on a trial basis for two weeks.[983] It was agreed to await his experience. He was also now intending to mail all donors; Dr Whitrow expected to do the same but Dr Mitchell “repeated the impossibility in his region of writing to every donor.”[984] At the SNBTS directors’ meeting on 27 February 1985 it was agreed that the leaflets should be mailed to all active donors “wherever possible”, although it was recognised that some centres might have great difficulty in achieving it.[985]

1986 and 1987

At the meeting of RTDs on 9 July 1986 Dr Smithies asked RTDs to consider “again” the effectiveness of measures to exclude high-risk donors, in particular an arrangement which would allow donors who had reached a point in the sessional procedure where it was impossible for them to withdraw without embarrassment, to sign their blood away for research or some other purpose. The minutes record the following contributions in response: “Dr Harrison pointed out that staff would be very reluctant to bleed such donors. Dr Hewitt argued that such donors will give blood anyway and that the risk to the staff is therefore no greater than what exists at the moment. It was felt that to designate such donations for ‘research’ might encourage donors from high risk groups to attend and donate.” No specific action appears to have been agreed. There was also discussion of the latest iteration of the AIDS leaflet.[986]

EAGA acknowledged at its September 1986 meeting that people in high-risk groups were still coming forward to donate blood.[987]

By October 1986 the revised (fourth) leaflet had been printed and distributed to RTCs, with the expectation that it be distributed with donor call-ups.[988]

In January 1987 the RTDs at their regular meeting agreed that those who had engaged in prostitution should be added to the list of risk groups on the next AIDS leaflet. Dr Smithies envisaged another reprint within four months, and it was agreed that RTDs could see and comment on the final draft of the next leaflet, “though Dr Smithies pointed out that final wording could be influenced by the publicity section at DHSS.”[989]

The April 1987 meeting of the Eastern Division of NBTS consultants reported that the North London RTC had a confidential unit exclusion form for the donor to complete so that a high-risk donation could be excluded. This was said to be working well but required additional staff. The RTC also excluded prostitutes (male and female) and their contacts. Both Cambridge and South London RTCs indicated that they would find this system difficult to introduce and relied on the self-exclusion of donors.[990]

The issue of allowing donors a means of opting out was raised by Dr Smithies again at the meeting of RTDs in April 1987. She reminded RTDs that the discussion at the previous meeting had concluded without examining the opportunity for donors to opt out at some point during the donation process. All divisional chairs reported that this had been discussed “and felt to be difficult, complicated and probably unworkable.” The draft of the new (fifth) AIDS leaflet was circulated with Dr Roger Moore asking for comments by the end of April.[991]

At EAGA’s May 1987 meeting, Dr Smithies reported that the last leaflet had been issued in September 1986 and that the time had come for a revision. She sought advice that the leaflet correctly stated all those at risk from AIDS who should not give blood. It was agreed that a further risk group (“people who know they are infected”) should be included.[992]

A further (fifth) version of the DHSS leaflet was printed for distribution in July 1987.[993]

The regional transfusion directors’ evidence to the Inquiry

The absence of a uniform approach is apparent from the evidence given to the Inquiry by RTDs across the UK.[994]

Dr Morris McClelland did not give any serious consideration to producing a leaflet for Belfast: he was aware that a national leaflet was in the pipeline and thought it was appropriate to follow the national approach.[995] Once the leaflet had been received (some time after 13 September 1983[996]), the method of distribution, until late 1984, was simply to display the leaflets at all sessions. In late 1984 the centre began to hand a leaflet to each individual donor, although it was noted that donors often had insufficient time in practice to read it properly before donating and a few had shown resentment. Dr Morris McClelland recognised the difficulty for any donor to exclude themselves at a donor session and the desirability of sending the leaflet to each donor with the call-up letter. This was not possible, as the centre used postcards and did not have the clerical capacity to send enclosed leaflets.[997] Northern Ireland was “quite a conservative society” and Dr Morris McClelland felt there might be some merit in a gradual approach to introducing the leaflet.[998] He thought an amended questionnaire with a question along the lines of “Have you read the AIDS leaflet?” might have been introduced in around 1985/1986.[999]

Dr Napier, the RTD for Cardiff, attended a Welsh Office meeting on 4 May 1983, at which the view was recorded that, given the very low reported incidence of AIDS in the UK, “we might be confident that we are not collecting potentially contaminated blood.”[1000] He was reported in the Western Mail as saying that no links between AIDS and blood transfusion had been proved, and that “we do not take blood from anyone who is harbouring any sort of infective problem, and prospective donors are always asked about their medical history.”[1001] Cardiff’s method of distribution was to make the leaflets available at sessions; the leaflets stayed within sessions and were reused.[1002] They did not have the facility to incorporate the leaflet with the postal call-up process. At a Welsh Office meeting with the Chief Medical Officer for Wales on 19 November 1984, it was acknowledged that it was “still thought unsafe to rely upon this [the leaflet] as the sole means of weeding out the homosexual population from amongst potential blood donors” and noted that “The matter of a more detailed questionnaire could usefully be pursued.”[1003] A question asking the potential donor whether they had read the AIDS leaflet was in use or about to be introduced as at December 1984.[1004] By January 1985, Cardiff’s arrangements were that a leaflet was placed on each chair in the waiting area, and all donors were asked to sign that they believed themselves medically fit and had read the leaflet.[1005]

Dr Wagstaff, at Trent, took the more proactive option, once the leaflet was available in September 1983, of sending the leaflets out to donors with the call-up cards and handing them out at sessions. He thought the leaflets did have the effect of putting off high-risk donors.[1006] Dr Wagstaff did not know why it took so long for the second version of the leaflet to be produced, but did not give any active consideration to the introduction of additional measures in the Trent region in the meantime.[1007]

Dr Entwistle was aware from 1982 that infection with the AIDS virus was associated with transmission via blood in some form.[1008] However at Oxford (and in common with most RTCs), in the period leading up to the production of the first AIDS leaflet in September 1983, no particular steps were in place, beyond the usual standard processes, to screen out donors at high risk of AIDS.[1009] The leaflets were then made available on display at sessions but not sent out with the call-up cards. Dr Entwistle suggested this was “not the most appropriate way, not least because that would not cater for the walk-in donors.”[1010] As at March 1984 the Centre held a stock of 13,000 leaflets, had issued approximately 1,000, and had a “now negligible” rate of usage per month, meaning that donors were not picking up and removing the leaflets from where they were displayed.[1011]

In North London RTC a slightly different approach was taken. On 23 May 1983, shortly after the first meeting of RTDs to discuss AIDS, Dr Davies wrote to colleagues at the RTC anticipating that a pamphlet would soon be available but that in the meantime “there must be no questioning of donors about their private lives.”[1012] Once the national leaflet had been produced, it was made available at sessions: Dr Contreras told the Inquiry that it could not be sent with call-up cards because they were sent as postcards, and that the service did not have the staffing to give it individually to donors. Instead it was left on the chairs in the waiting area: “the donor attendant said, well, some of them read it and some of them don’t.”[1013] However, in mid 1984 Dr Contreras and Dr Barbara visited the New York Blood Center to find out how they were dealing with high-risk donors, and learned that a self-exclusion questionnaire was used which the donor could answer in confidence in a cubicle. On their return they introduced a trial of a self-exclusion questionnaire in North London in July 1984.[1014] By October of that year, fed up with waiting for the revised leaflet to be produced by the DHSS, they had overprinted their existing leaflet so that the words “Practising homosexuals” replaced “having had many sexual partners”. They also produced their own additional leaflet titled “Some Reasons Why You Should Not Give Blood” which set out a list of reasons as to why someone should not give blood “but emphasising the AIDS risk.”[1015] North London was not, however, able to fully comply with the requirement in the DHSS circular issued with the second leaflet, which required that the revised leaflet be brought to the attention of each donor on an individual basis: the call-up system using postcards meant that the leaflet could not be sent out at the same time: “We educated our donor attendants and receptionists on handing the leaflet, but I cannot say with certainty that it was handed to every single donor, I’m afraid.”[1016] The confidential exclusion questionnaire that had been trialled in July 1984 was rolled out across North London’s remaining donor sessions by July 1985.[1017]

Commentary

The blood services in the UK were slow – too slow – to react to the threat of AIDS. In the detail of the chapter, the starkness of the chronology may be lost. It is this.

The underlying context was that of a growing epidemic. AIDS was known about in 1981. It was largely in the US, but (as things which start in the US often do) it had reached the UK – the first reported death from AIDS in the UK came in December 1981. It had become well enough known about in the UK by mid 1982 to justify a charity to support its victims (the Terry Higgins Trust).[1018] It was thus generally to be appreciated that it was not a disease restricted purely to the US.

Against this background, on 16 July 1982 the blood services – in England and Wales – and the Government became aware that its cause might well be blood borne. The only defence against this immediately available was careful donor selection and donor screening.

It was not until 18 May 1983 that there was any discussion amongst the directors of the regional blood services which together formed the NBTS in England and Wales about AIDS and its implications for donor selection and screening.

It was not until 1 September 1983 that a leaflet was produced for a six-month trial period.

It was not until a further year and five months later (1 February 1985) that the initial leaflet was revised, because the first lacked sufficient strength: and it was recognised even before it was distributed as not going far enough. Yet it took until September 1985 for a further revision.

As for the blood service in Scotland, SNBTS was probably aware of the potential risks at much the same time in 1982. It was slightly behind England in discussing what best to do: its first recorded discussion was on 24 May 1983.

By contrast with England, though, in the West of Scotland questions were already being asked on the donor questionnaire; and for Edinburgh and the South East, Dr Brian McClelland had already prepared a draft leaflet aimed at deterring donors whose donations might be risky.

In December 1983 it was thought that a more active approach was acceptable, but it was not until 13 March 1984 that a revised draft was considered. It does not appear to have been circulated until later in the year or early the next.

Four features in particular may explain (but not excuse) this history of delay.

Firstly, Dr Brian McClelland’s sense – which is borne out by the narrative set out above – was that most transfusion personnel “took some time to realise the seriousness of this infection” and that there was “some reluctance in some quarters, including within the UK transfusion services to acknowledge the gravity of what was emerging.”[1019] He thought that there was a belief in the transfusion community, particularly those who were engaged with the donor and donation side of it, “that blood donors were a particularly worthy and meritorious population … the thought that they could, in any sense, in the public mind be linked with behaviour which was certainly subject of enormous prejudice … there was a deep-seated reluctance to accept that there was any kind of link”.[1020]

There was also overconfidence in the voluntary donor system as a protection against infection. It is not always easy to question cherished notions: and it was right to have confidence that a system of voluntary non-remunerated donors offering their blood was and would be safer than the alternatives. However, “safer” does not mean “safe”. A moment’s reflection might have told those in the transfusion system that the voluntary nature of donations had not prevented the transmission of hepatitis at any time since the services were set up in the 1940s and that the greatest safety that could be achieved in the absence of a totally reliable screening test depended on the selection of “good” donors, and the rejection of those who were a greater risk.

Secondly, there was the same confusion of incidence and risk as bedevilled clinical responses to AIDS. The fact that someone might think themself to be in good health because they are not symptomatic does not mean that they are not a carrier for the disease in question. Where a disease has a long incubation period before it shows itself, the appearance of symptoms may be too late. For instance, Dr Napier in Wales observed “we do not take blood from anyone who is harbouring any sort of infective problem”,[1021] which relies on the appearance of symptoms rather than on a donor’s association with a risky cohort, as a result of which the donor may, unknown to them, be incubating an infection.

Thirdly, there was the lack of a “centralised approach by the UK Transfusion Services.”[1022] There were multiple forums in which issues were being raised – the meetings of the RTDs in England and Wales, the meetings of SNBTS directors in Scotland, the SNBTS Co-ordinating Group, the UK Working Party on Transfusion-Associated Hepatitis, the CBLA, the Advisory Committee to the NBTS, the Central Blood Laboratories Authority’s Working Group on AIDS – but no single body with responsibility (and resourcing) to “address the problems of AIDS in relation to transfusion in the UK.”[1023] The primary responsibility for this lies not with the blood services but with central government and is explored elsewhere in this Report,[1024] but the blood services could and should have addressed this by setting up (as they did in relation to hepatitis in September 1982) a UK-wide working group the sole focus of which could have been AIDS and the steps that could be taken to reduce the risk to transfusion recipients.

Fourthly, it is undoubtedly the case both that the DHSS wanted to take the lead and that, having done so, there were significant delays on its part. This no doubt made it more difficult for the blood services to have in place uniform and robust measures.

Nonetheless, it remains the case that – whatever the DHSS was (or was not) doing – it was incumbent upon the blood services to take whatever action could be identified to reduce the risks of AIDS transmission to transfusion recipients. They did not do so.

The failure on the part of the blood services to give any kind of collective consideration to what could be done prior to May 1983, when the matter was properly considered for the first time by the RTDs and by SNBTS, was wrong. It should have been obvious that testing was inevitably still some way off. Deterring or preventing those in high-risk groups from donating should therefore have been a priority. It was not treated as such.

Few centres took proactive steps in advance of receipt of the first AIDS leaflet in September 1983. That some centres (eg Birmingham, Edinburgh) did use leaflets of their own devising rather than simply wait for the DHSS shows that all centres could (and should) have done so. Although the possibility of using posters was raised by more than one RTD,[1025] there is no evidence that this simple, but potentially effective, measure was implemented across the UK.[1026]

A much more active approach to distribution of the leaflets should have been adopted from the outset, so as to ensure that the information within the leaflets came to the attention of all donors. Distribution with the call-up cards was one measure which should have been more widely adopted.[1027] Another way was to provide the information directly to all donors at the sessions. Thus by January 1985 the South London Centre was providing “Dear Donor” letters to all donors, giving them time to read the letters and asking them to confirm that they had. The letter, from the RTD Dr Rogers, identified the high-risk groups and then continued with a personal plea:

“My message to those in the above groups is ‘FOR GOD’S SAKE DO NOT GIVE BLOOD TODAY – YOUR BLOOD MIGHT KILL SOMEBODY’. If you would rather we did not know that you are in a risk group then tell the nurse at the bed you think you may be developing a cold, a sore throat, or feel unwell TODAY – anything – BUT DON’T GIVE BLOOD … Your co-operation will be as important as actually giving blood and we thank you.”[1028]

Practical measures to ensure that all donors understood the risks and were helped to take the right course of action if they were in an at-risk group could have been implemented by all centres in 1983 but were not.

That centres could find a way of alerting donors to the risks is clear, because this is what centres were required to, and for the most part did, do following the ministerial circular in February 1985. Thus, for example, at Brentwood a system was instituted with effect from 1 February of handing the leaflet to each donor as they registered and telling them that it was “most important” to read the leaflet.[1029] In East Anglia, where previously the old leaflet had been made available on display at sessions, the routine was “stiffened up” so as to ensure that all donors were made aware of the contents of the new leaflet.[1030] In North London, the “Some Reasons Why You Should Not Give Blood” additional leaflet was devised to overcome the reluctance of donors to pick up and read a leaflet on AIDS.[1031]

AIDS: what could have been done?

On 4 January 1983 a meeting lasting eight hours in Atlanta brought together representatives of the National Hemophilia Foundation, American Red Cross, various blood bankers, the National Gay Task Force, some state health departments, the Pharmaceutical Manufacturers Association, the CDC, Food and Drug Administration (“FDA”) and National Institutes of Health (“NIH”). On occasion the argument grew heated: Dr Donald (Don) Francis of the CDC at one stage banged on the table, asking how many deaths there had to be before meaningful action was taken. The US Assistant Surgeon General circulated a summary report of the meeting afterwards which conveyed these differences in more measured terms: “A consensus was reached that it would be desirable to exclude high risk donors to reduce the risk of AIDS transmission via blood and blood products. However, no consensus was reached as to the best method of doing this.” It recorded that participants had: “differing perceptions of: 1. The likelihood that AIDS is caused by a transmissible agent; 2. The risk of AIDS from blood donation (both whole blood and pooled plasma); and 3. The best approach for establishing altered guidelines for blood donation, donor screening or testing and donor restriction.” He called at the end of this note for each public health agency to provide candidate sets of recommendations for the prevention of AIDS in patients with haemophilia and for the other recipients of blood and blood products to Dr Jeffrey Koplan, assistant director for Public Health Practice at the CDC, with a view to the agencies developing a uniform set of recommendations.[1032]

Dr Francis took up the invitation and replied, to Dr Koplan as asked, on 6 January 1983. The recommendations he made are worth repeating here in full. They show that the views this Inquiry has reached as to what might have been done are not simply a product of hindsight – a submission to the Inquiry being that hindsight should be avoided – but demonstrate that similar views were set out, prospectively, in clear terms by an expert in public health. He wrote:

“I think the following recommendations should be promulgated by CDC with hoped for, but not essential, agreement of FDA:

  1. Funding. An additional 10 million dollars should be put forth to expand epidemiologic, etiologic, and clinical studies of AIDS.
  2. Whole blood and plasma collection. All blood and plasma donors should be deferred if:
    1. They are IV drug users (already in place).
    2. They are sexually (heterosexual or homosexual) promiscuous (more than an average of 2 different people per month for the previous 2 years).
    3. They have had sexual (heterosexual or homosexual) contact with someone who is sexually promiscuous or an IV drug user in the past 2 years.
    4. They have lived in Haiti in the past 5 years.
    5. They have a serologic test positive for anti-HBc.
    6. There is good evidence that this will eliminate over 3/4 of AIDS ‘infected’ donors. It will also defer about 5% of U.S. blood donors and add about $5 to each unit of blood and plasma. These seem to be small prices for preventing a serious disease and a potentially dangerous panic.

  3. Factor VIII use.
  4. Only small pool (less than 100 donors) concentrate or cryoprecipitate be used on hemophiliacs starting immediately (after supplies become available). This recommendation should stand until either: 1) knowledge of AIDS permits more accurate recommendations or 2) plasma becomes available which has been collected using the previously stated donor deferral.

I understand that these recommendations will be controversial and that there will be objections by industry and blood bankers. I think we should get comments from these groups and should keep them informed of our to-be-published recommendations. However, to wait for their approval of our recommendations will only endanger the public’s health.”[1033]

This letter should make uncomfortable reading for those who were in decision-making roles in the UK in 1983 to 1985, including those in the blood services. The UK was in a position to learn from the US experience. It should not have regarded AIDS as a purely US problem, but rather one which was highly likely to spread to these shores: contact between UK and US citizens in person was frequent. The US experience was available to inform reactions in the UK. Though it is right to note that the US did not change the pool sizes it used, and did not mandate the terms of the exclusions suggested by Dr Francis across the whole country, the letter shows what could and should have been under urgent consideration here. When this letter, with its clarity of recommendations and sense of urgency, is compared to the length of time spent in the UK refining the niceties of text for donor leaflets, and the sluggish nature of the initial response in the UK to the threat of AIDS, it is clear that the benefit of hindsight is not required to find the UK wanting, as I do.

Concluding words

Doing more, more quickly, would have saved some infections. Acting sufficiently, and promptly, was not just the responsibility of ministers, the DHSS and the SHHD, though it was in part theirs, as will be discussed later. It was also the responsibility of the blood services. Sadly, though some within the services did more, more quickly, this was not true overall: in general the blood services did too little, too late.

3.12 Regulation of Commercial Factor Concentrates

This chapter explores the system for the licensing of commercial factor concentrates and in particular whether the decisions to grant licences and thus permit the importation of these products in the 1970s exposed UK patients to unnecessary risks. It examines in detail the July 1983 decision of the Committee on Safety of Medicines not to prevent further imports and the failure to keep that decision under review. It also looks at the extent to which the Committee on Safety of Medicines discharged its function of communicating the risks of hepatitis or AIDS from blood products to healthcare professionals.


Key dates

1970 First use of commercial concentrates in the UK on a named patient basis.

October 1972 Inspection of Hyland by Dr Duncan Thomas on behalf of the DHSS.

10 January 1973 CSM advises the grant of a licence for Hemofil.

February 1973 CSM advises the grant of a licence for Kryobulin.

22 May 1975 Licence granted for Profilate.

12 November 1975 CSM(B) advises the grant of a licence for Factorate.

9 December 1975 DHSS Divisional Management Group meeting following World in Action documentary Blood Money.

21 January 1976 Meeting with Dr Owen at which Factorate application is discussed.

22 January 1976 CSM advises the grant of a licence for Koate.

28 March 1978 Kryobulin licensed as red packs (European plasma) and cheaper blue packs (US plasma).

1983–1984 Licences for heat-treated Hemofil and Factorate are refused.

24 March 1983 FDA recommendation of steps including avoiding use of high risk plasma in US blood products.

9 May 1983 Paper from Dr Galbraith recommends suspending importation of US blood products.

13 July 1983 CSM(B) decides not to recommend action to stop continued importation of commercial concentrates; recommendation endorsed by CSM.

28 September 1983 Confirmation that import of factor products made from higher risk pre-March plasma will continue.

22 November 1984 CSM requests Licensing Authority to invite early applications for licence variations to permit distribution and sale of heat treated concentrates.


People

Dr R D Andrews senior medical officer, DHSS

Professor Arthur Bloom chair, UKHCDO

Dr Leslie Keith Fowler medical assessor, Medicines Division, DHSS

Dr Spence Galbraith director, Communicable Disease Surveillance Centre

Dr John Holgate medical assessor, DHSS

Dr Joseph Smith CSM(B) chair

Dr Duncan Thomas senior medical officer on the secretariat of the CSM(B)

Dr Diana Walford senior and later principal medical officer, DHSS


Abbreviations

CSM Committee on Safety of Medicines

CSM(B) Sub-Committee on Biological Products of the CSM

FDA Food and Drug Administration, US


Overview

This chapter explains the way in which the state controlled the supply of blood products manufactured abroad. After setting out the historical context, it considers (1) decisions to permit the importation and distribution of clotting factor concentrates made in the 1970s which exposed UK patients to unnecessary risks; (2) how a decision was taken in July 1983 not to ban further imports of factor concentrates, and whether this was flawed or appropriate; (3) similarly, whether the response to the importation into the UK (“dumping”) of factor concentrates which were manufactured from “riskier” plasma than that which was acceptable in the country of manufacture was sufficient and appropriate; and (4) whether any review of these decisions after mid 1983 was appropriate and if so when.

The first two of these decisions are of central importance in setting in train a series of events that led to widespread infections of hepatitis and HIV in people with bleeding disorders. Once the state considered the products sufficiently safe, all things considered, to be permitted access to the UK market, decisions as to whether to use them became dependent on a range of issues. These included convenience, finance, clinical preference and treatment policies, continuity of treatments, the “efficiencies” of production of NHS concentrates, and the way in which blood services were structured, rather than placing a premium on the safety of the patient.

I have concluded from the facts available that the decisions in 1973 to license the importation of commercial factor concentrates were wrong on grounds of safety; that the decisions in 1976 to license the importation of further factor concentrates paid insufficient attention to safety; that the decision of July 1983 was both wrong and flawed in the way it was reached; that in any event there is no evidence that the decision was kept under review as would have been appropriate; and that the approach to “dumping” was not justifiable.

Introduction

It has been known for centuries that traders selling “medicinal products” may expose consumers to risk. As early as 1540 legislation protected them against the grosser malpractices of traders. This empowered physicians of London to appoint four inspectors of goods sold by apothecaries.

In the 19th century the Pharmaceutical Society of Great Britain was established, and legislation was introduced to control the retail supply of poisons. In 1858 statutory provision was made for the publication of the British Pharmacopoeia. The Sale of Food and Drugs Act 1875[1034] helped control the adulteration of drugs. The regime did not yet extend to biological products, such as vaccines and sera, the quality of which could not be assayed by chemical analysis. This had to wait for the Therapeutic Substances Act of 1925 which provided for the licensing of, and control of the strength and quality of, a product. The Pharmacy and Poisons Act 1933, like later Acts dealing with antibiotics and certain other products, set out lists of medicines which could only be supplied on prescription.

Weaknesses in the system were exposed by the Thalidomide tragedy in the early 1960s. As a consequence, the need to see that drugs were safe to use and to ensure the protection of the public resulted in legislation being strengthened, ultimately by the passing of the Medicines Act which came into force in October 1968.[1035]

Figure 1. Policy process for the control of medicines, from a DHSS leaflet dated July 1977.[1036] The diagram shows the Medicines Division’s many interactions with different bodies, including providing preliminary assessments.

The Medicines Commission was established by the Act. Committees of independent experts were appointed by ministers on the advice of the Medicines Commission. One such committee was the Committee on Safety of Medicines (“CSM”), in effect a continuation of the former Committee on Safety of Drugs. Its role was to advise “the Licensing Authority” (the Secretaries of State for Health and Agriculture, together with the respective ministers for Health and Agriculture in Scotland, Wales and Northern Ireland, exercising their powers jointly). Its advice was concerned with the safety, quality and efficacy in respect of the human use of any substance or article to which any provision of the Medicines Act 1968 was applicable.

In practice, the advice of the CSM was accepted by the ministers who constituted the Licensing Authority at the time. Again, in practice, the functions of the Secretaries of State for Health of the UK as Licensing Authority under the Medicines Act 1968 were discharged on their behalf by the Medicines Division of the Department of Health and Social Security (“DHSS”) in London.[1037]

Dr Diana Walford, who began her work as a medical civil servant in 1976 by occupying a post in the Medicines Division, was asked in evidence what role the Medicines Division had in licensing. She had a clear recollection: “Oh, completely … it was manufacturing licences and product licences. That was their job.” She was asked about clinical trial certificates or clinical trial exemption certificates. She responded: “Well, it was Medicines Division. I mean, again, if somebody applied to the Medicines Division or the licensing authority, as they actually were deemed, for a clinical trial certificate or a clinical trial exemption certificate that was to be considered. It fell to be considered by Medicines Division, whether it needed to go entirely to a Committee on Safety of Medicines Meeting I think might have depended on the product.”[1038] In short, the Medicines Division was effectively in control of the process.

The CSM established a number of sub-committees. One of those was the Sub-Committee on Biological Products (“CSM(B)”). It worked on its own.[1039] The organisation of the sub-committees changed over time. However, there was always a sub-committee dealing with biologicals, and another dealing with adverse reactions.

When an application was made for a licence to distribute pharmaceutical products,[1040] it was considered in the first instance by the Medicines Division. A pharmacist and a doctor who worked for the Medicines Division would each assess the application, along with a toxicologist. The assessor(s) would produce a report. Where the reports related to blood products, they were usually submitted to the CSM(B). The CSM(B) in turn would consider the reports and formulate a recommendation. Its members would know the views of the Medicines Division staff, for their reports did not simply summarise any danger but expressed a view on it.[1041] The Sub-Committee considered the reports, then formulated their recommendations for the main Committee. That Committee’s role was advisory. Technically speaking, the CSM and CSM(B) advised the Licensing Authority, but in practice the Secretaries of State had delegated their decision-making powers to the Medicines Division. Thus the same body, the Medicines Division, two or three of whose assessors had first expressed their views in a report to the relevant Sub-Committee of the CSM, ultimately had to take a decision: a degree of circularity recognised by Professor Sir Michael Rawlins[1042] when he gave evidence.[1043] If the CSM felt unable to advise that a licence should be granted, it had to offer the applicant a chance to appear before it, or to make written representations. The same applied if it was thought that an existing licence should be revoked, suspended or varied. If the Committee still intended to advise against the grant of a licence, the applicant had a right of appeal to the Medicines Commission itself, and in some cases had the right to a further onward appeal.[1044]

Until Freedom of Information legislation came into force, the proceedings of the CSM (and the proceedings of its Sub-Committees) were kept confidential. The Medicines Division (which provided the secretariat) together with members of the Committee alone knew the reasons for the decisions made and what discussions there had actually been. For some time before the Freedom of Information Act 2000 the CSM had concerns about its secrecy of operation.[1045] It believed that disclosing the reasons underpinning its advice would have public health benefits. It would enable prescribers to understand more fully the reasons underlying dosage instructions, contraindications and warnings, and would enable those advising prescribers to offer a better service to their readers and clients. It would permit more informed decisions by NHS purchasers and providers. But the CSM, though believing this, had never itself pressed for openness and transparency in this way until the Freedom of Information legislation presented the opportunity. Commercially sensitive information, often included in an application, could remain confidential, and did: but it would have been in the public interest for other matters such as the risks posed by products and the extent of those risks to have been made known earlier than they were.[1046]

Given the passage of time since licensing decisions were made in the early and mid 1970s, the confidentiality which prevented detail of any discussion being known, and no one now being available from whom the Inquiry can seek an explanation, the evidence is incomplete at best. The quality of the decisions of the Licensing Authority to license blood products to be imported into and distributed within the UK, which were first taken in 1973, thus has to be assessed on the basis of evidence I wish had been fuller. Greater material is available concerning decisions in the early 1980s, but this too is still incomplete.

NIBSC and PHLS involvement in licensing

Two other bodies had some input into the decision-making processes of the CSM/CSM(B). These were the National Institute for Biological Standards and Control (“NIBSC”) and the Public Health Laboratory Service (“PHLS”). Neither had any formal role in respect of Scotland or Northern Ireland.

NIBSC began its work in 1972. As words in its full title (“Standards and Control”) suggest, it was not concerned directly with the criterion of safety: its remit was, and remains, concerned with controlling standards – the purity and potency of licensed biological products used in human medicine.[1047] Though these had some impact on safety, its primary focus (in terms of the three central statutory criteria which informed licensing) was on quality and efficacy.[1048] Its role in licensing was on an informal “on request” basis, if the formal advisory bodies – the CSM and CSM(B) – or Licensing Authority asked for it, in order to take advantage of individual scientific expertise.[1049] It had a more formal role in the control of biological products after licensing, such as when products were required under the terms of a licence to undergo a “batch release” process. This required manufacturers to submit to NIBSC, on a batch-to-batch basis, protocols describing the results of in-process tests made during the manufacture and, in most cases, samples[1050] of all such batches. If a batch failed this scrutiny that batch could not then be distributed in the UK – and for this reason an adverse decision was known as a “stop order”. Dr Trevor Barrowcliffe, who worked for NIBSC as a scientist and later senior scientist and head of haematology between 1974 and 2006, understood that all blood products had to undergo the “batch release” process as a condition of their being licensed.[1051] This did not however apply to unlicensed products supplied on a “named patient” basis.[1052]

The functions of NIBSC described above were provided for by statute after 1975, when the National Biological Standards Board was established.[1053] That board took over management of NIBSC.[1054] The relationship between NIBSC and CSM(B) was particularly close.[1055]

PHLS was responsible for providing a bacteriological service for the control of infectious diseases. It was thus more directly concerned with safety than was NIBSC, and less with quality and efficacy. Until more recent times, it operated 52 area and regional diagnostic laboratories in England and Wales, providing diagnostic services and support for outbreak investigation to local hospitals, public health authorities, and environmental health departments. Each laboratory provided surveillance data, sent microbiological samples for reference testing to the central PHLS units at Colindale, and took part in national investigations into infectious diseases.[1056] It had an epidemiological arm, based at Colindale with a Welsh unit at Cardiff,[1057] which kept human disease under surveillance (the Communicable Disease Surveillance Centre (“CDSC”)), and a research facility at Porton Down, the Centre of Applied Microbiology and Research (“CAMR”).

“Safety, efficacy and quality”

The Medicines Act 1968 required the Licensing Authority to take into particular consideration the safety of the product (which needs no further definition),[1058] its efficacy (ie how well it did what it was intended to do) and its quality (its manufacture in accordance with the specified method of manufacture, and the quality control which was exercised over this).[1059] The Licensing Authority was to disregard any question whether medicinal products of another description would or might be equally or more effective,[1060] but it was expressly provided that this did not apply when considering safety. In short, the Act itself permitted the Licensing Authority in deciding whether to recommend the grant of a licence to consider whether another product, even if equally or more efficacious, might be safer.

Every medicine is likely to have or carry a risk of side-effects. Some of those side-effects may be more severe than others. The fact that side-effects are caused by taking a medicine does not mean it is necessarily unsafe. As Professor Sir Michael Rawlins said, “safety is a balancing act between … what is being used to treat and what the adverse effects are. So if you’re treating some lethal condition, then you’re prepared to put up with more adverse reactions, perhaps, than something relatively mild. So it was a judgment of balancing safety and efficacy.”[1061]

The extent to which regard was had to licensing in other jurisdictions is not easy to determine. Professor Sir Michael Rawlins’ comments on the degree of weight given to licensing decisions in the US are particularly revealing. Asked “if something had gone through FDA[1062] approval, did that carry particular weight?” he replied: “Yes, it would indicate that a relatively strict licensing body had found it was appropriate.”[1063] He added “if a product had already been on the market in the United States [there] might be more information on what had happened once it had gone on the market, about safety in particular.” This shows his general view of the rigour of the FDA: that it was “relatively strict”. This view does not sit easily with much of the material about the FDA considered nearly 30 years ago in the Krever Inquiry into contaminated blood in Canada. The report of the Krever Inquiry found gaps in protection: for instance, licences were not required, nor were inspections conducted by the FDA, for most of the US blood centres that sold recovered plasma to fractionators. Thus commercial concentrates manufactured in the US were likely to contain material which came from sources which had neither been licensed nor inspected. When viral inactivation methods were claimed by manufacturers, “Neither the Food and Drug Administration in the United States nor the Department of Health and Social Security in the United Kingdom performed these studies. They were done by the pharmaceutical manufacturers themselves as part of the research and development involved in the manufacture and marketing of new drugs.”[1064]

A somewhat less favourable view than Professor Sir Michael Rawlins took of the rigour of the FDA’s processes was expressed by Dr Duncan Thomas. He was a senior medical officer on the secretariat of the CSM(B) between 1971 and 1974 (therefore during a period when the first licences for commercial distribution of Factor 8 were granted) and as such had a role in advising the CSM(B) whether it should recommend granting a licence.[1065] In his evidence he said:

“We would liaise closely with the FDA but this was not a rubber-stamping exercise. We still checked every application carefully and we would not be influenced or swayed by licensing within another jurisdiction. From my perspective, I recall that occasionally we were reluctant to accept evidence from the Americans where they said a product had already been licensed for a few years without causing problems and that we should take this into account. We would look at all the evidence carefully but we were not influenced by other jurisdictions’ licensing processes or decisions.”[1066]

Over 20 years after the first blood products were licensed for distribution in the UK, a committee to study HIV transmission through blood and blood products was set up in the US by the Institute of Medicine at the request of the Secretary of Health and Human Services. A number of the leading experts in various disciplines throughout the country undertook a two-year study and reported in July 1995.[1067] Their report was critical of the FDA in the way it had managed regulation of blood and blood products in the early 1980s, in particular finding that it had failed to take a proactive approach to regulation and had relied too heavily on the pharmaceutical industry. It noted that there was evidence that the agency did not adequately use its regulatory authority.[1068] The FDA had a blood products advisory committee which, though it contained members drawn from a variety of scientific disciplines, in the 1980s had a substantial membership drawn from those involved in blood banks and fractionators – representatives of the very bodies that were potentially subject to regulation – despite what the Institute of Medicine committee seems to have regarded as a potential conflict of interests.[1069]

The best reconciliation of these differing perspectives is that Dr Thomas was reflecting the approach as he knew it best, largely informed by his experience in the early 1970s, whereas Professor Sir Michael Rawlins was speaking of the approach as it developed in the 1980s. Given what was suggested by the Institute of Medicine study it would have been a mistake to place any determinative weight on the views taken by the FDA and (in particular) to assume that because it approved or licensed plasmapheresis centres and manufacturing plants these facilities operated as safely as they were supposed to operate. There can be little doubt about the impressions which British sources had reached about the probable quality of plasmapheresis centres. As described below, in 1972 on behalf of the CSM(B) Dr Thomas himself had inspected a plasmapheresis centre operated by Hyland. He returned to the UK less than impressed by what he had seen; and some three years later in 1975 a Granada World in Action documentary[1070] (with significant input from a well-recognised expert, Dr Arie Zuckerman[1071]) highlighted the same troubling picture. The apparent fact that those concerns had persisted for some three years in the case of some Hyland facilities suggests that the criticisms expressed by the Krever Report and the Institute of Medicine study may have been well-founded: but, more pertinently, that those concerns were there to be appreciated in the UK in the early to mid 1970s, if not for longer.

Exceptions to the need for a licence

Named patient basis

A licence was not required before a blood product could lawfully be sold in the UK in just two particular circumstances. First, a clinician could obtain supply of an unlicensed product on a “named patient basis”.[1072] The supply was intended for such treatment of a particular patient as their clinician thought was required.[1073] In such a case, if it were later proved that the product unreasonably did harm to the patient, the prescribing doctor could in theory be required to answer for their use of unlicensed products either in response to a complaint to the General Medical Council or in a legal action against the doctor under the law of negligence. Though neither was very likely to occur, it would be wrong nonetheless to say that there was no form of regulatory control which might have been exercised over its prescription simply because the CSM did not provide it.[1074]

Clinical trial exemption certificates

Second, a clinical trial exemption certificate might be applied for.[1075] This had the useful function of allowing products to be tested in use for human therapy, usually in anticipation that if the trial endorsed the benefits anticipated by the manufacturer an application to license the treatment would follow. Again, there was in principle a degree of regulatory control over this: ethical approval to conduct a trial would be required, and be subject to the local ethical committee’s decision; those people receiving treatment in the trial were meant to be told that it was a trial treatment, what its purpose was, what the potential downsides and advantages were, and (latterly) the manufacturer might be required, as a condition of the grant of exemption, to offer compensation to anyone who suffered unanticipated harm as a result of the treatment.

Applications for a licence for Factor 8 concentrates

The first use of commercially produced concentrate in the UK appears to have been under one of these exemptions: Kryobulin was first used in 1970.[1076] By 1972, improved factor concentrates became available, and clinicians began to use them more. They were not yet licensed. The named patient exemption permitted their use.

No doubt because the demand increased as a result of these initial supplies, both the makers of Kryobulin (through their UK-based company Serological Products Ltd[1077]) and of Hemofil (made by Hyland, based in California[1078]) applied in 1972 for licences to permit distribution more widely in the UK.

For each application, Dr Thomas provided a report of his assessment to the Sub-Committee, containing his recommendation. Although his recommendation did not have to be accepted, since the Sub-Committee was itself composed of experts drawn from different disciplines (including haematology) and could reach its own independent conclusions, it would in general accept the recommendation of its assessor, and that would in turn be accepted by the CSM itself. However, Dr Thomas observed that “licences were often given with conditions which demonstrated the CSM spotted deficiencies and assured they were rectified before the product was released on the market.”[1079]

1973: Hemofil

On 24 October 1972 Dr Thomas inspected manufacturing facilities of Hyland in California, preparatory to reporting to the CSM(B) on its application relating to Hemofil. He also inspected a blood bank in Los Angeles which was owned and operated by Hyland and supplied plasma to them. In his summary report he noted that there was a hepatitis hazard associated with the products Hyland manufactured, as a result of two things: the nature of the donors (who “do not inspire confidence”[1080]) and the “very large plasma pools”.[1081] He noted that “the firm make no attempt to disguise this potential hazard.”[1082] Indeed all bottles of Hemofil stated “the risk of transmitting hepatitis is present. No warranties are made or created. Warranties of fitness or merchantability are excluded”.[1083] According to Dr Tom Cleghorn[1084] (speaking to an investigative journalist, Michael Gillard, in October 1975), Hyland said that all their plasma was by then collected from the US mainland and was individually tested for Hepatitis B antigen by radioimmunoassay (“RIA”), but also admitted[1085] that plasma to manufacture initial supplies of Hemofil for the UK had not been individually RIA tested and had come from Puerto Rico, where they had a blood bank.[1086] It may be assumed that, if so, the CSM(B) was unaware of this and should have been told.

In his medical comment[1087] Dr Thomas wrote that Factor 8:

“represents a major advance in the care of patients with classical haemophilia. Such concentrates have enabled corrective orthopaedic procedures to be carried out, and for the first time there is a prospect of domiciliary treatment.[1088] The major disadvantage of currently available commercial preparations, such as HEMOFIL, is that they are prepared from very large plasma pools, and carry the risk of transmitting hepatitis virus … no attempt is made to disguise the risk of hepatitis, and it may be considered that the decision to use this material could be left to the individual clinician who can balance the potential hazard against the anticipated therapeutic benefit to the patient.”[1089]

It was on that basis that he assessed that the grant of a licence should be recommended.

CSM(B) recommended the grant of a product licence for Hemofil at its meeting on 10 January 1973. The minutes do not reveal any discussion regarding hepatitis risks.[1090] Given that Dr Thomas had drawn attention to these risks, specifically, and they were openly admitted in the application, this is surprising. The CSM accepted the recommendation from its Sub-Committee, and advised the grant of a product licence at its meeting on 25 January 1973. Though the practice was generally to accept recommendations from its specialist Sub-Committees, again no discussion regarding hepatitis risks is recorded in the minutes.[1091]

1973: Kryobulin

In respect of Serological Products Ltd’s application for a licence for Kryobulin, Dr Thomas reported that all donors were tested at each donation and any donor who had a history of pathological transaminase levels, or a positive hepatitis-associated antigen level, was permanently excluded from the donor programme. Despite these precautions, he commented that “the risk of transmission of serum hepatitis can only be diminished and not completely eliminated.” The manufacturers themselves made no secret of this. Dr Thomas reported that the product was prepared by “large scale fractionation” from plasma pools of 1,000 donors. The information from Serological Products Ltd was that the donors came from Austria and Germany. There was no suggestion that the donors were volunteers.[1092]

Dr Thomas observed that the factory had not been visited recently and that the Sub-Committee “may consider that the Austrian Authorities should be requested to carry out an inspection.”[1093] The CSM(B) considered the application, and Dr Thomas’ report, at its meeting on 10 January 1973. The minutes record that “The Sub-Committee was informed that the manufacturer of Kryobulin PL/0215/0003 had not been visited recently, but the licensing authority intended to ask the Austrian authorities to carry out an inspection on its behalf.”[1094] It is unclear from the minutes why the CSM(B) recommended the grant of the product licence without waiting for the inspection to take place and considering its findings.[1095]

The recommendation of the Sub-Committee was put before the CSM at the meeting in February 1973 and CSM advised that the product licence should be granted, which it was in March 1973.[1096] It was not until May 1973 that Austrian inspectors conducted an inspection of the manufacturing facilities,[1097] followed by an informal visit by the DHSS the following month.[1098] But Kryobulin was not actually marketed in the UK until July[1099] – and it is therefore to be assumed that the inspections proved satisfactory – satisfactory, that is, in the eyes of the inspectors: for when he spoke in September 1975 to Michael Gillard who was researching sources who could assist in his production of a factual report for Granada,[1100] Professor Zuckerman told him that he had visited Immuno in Vienna, and thought it a poor facility, which lacked medical back-up and had “dubious sources of plasma”.[1101]

Contemporary understanding of the risks of hepatitis

No reference was made in either of the reports produced by Dr Thomas, or in the minutes of the CSM(B) or CSM, to contemporary understanding of the relative safety of commercial products such as Kryobulin and Hemofil when compared with domestically produced concentrates or with cryoprecipitate. That understanding is such as to cause some puzzlement: why were the licence applications not rejected on safety grounds?

Though some of the relevant material is discussed elsewhere in this Report, it is convenient to draw it together here in order to place the licensing decisions in respect of Hemofil and Kryobulin in context.

First, in October 1970 Bio Products Laboratory (“BPL”) began to screen blood products for Hepatitis B antigen: by November 1971 all plasma received at BPL was screened.[1102] The tests then performed were imprecise.[1103] More sensitive screening tests[1104] only began to be used in 1975, and even they did not “catch” all infections by Hepatitis B. It seems likely that similar tests were being adopted in the US as well as in Austria (and less certainly in those places from which plasma was sourced for production in Vienna). Batch testing would form a further line of defence, albeit using a similar, imperfect, screening test. Although at the start of the 1970s many (perhaps most) will have assumed that post-transfusion hepatitis was caused by one virus, rather than by the two major ones later identified,[1105] there were early, authoritative indications that this assumption was wrong. Professor Joseph Garrott Allen wrote in 1970 that “it appears that at least two immunologically separate agents are capable of producing hepatitis from the transfusion of blood or the administration of many of its products”, and cited the work of Dr Saul Krugman in support.[1106] However, the risk of contamination of the blood used to make the products would be reduced by the testing available for one of those agents, and within the near future was likely to be reduced further as tests improved.

There were growing concerns about the source material (the blood and plasma) from which commercial products were made. It was considered that the risk of causing hepatitis from blood taken from paid donors was markedly higher than that from blood given by unpaid donors: not only had Professor Garrott Allen said as much in “Commercially Obtained Blood and Serum Hepatitis”[1107] but researchers at the US National Institutes of Health had found the same.[1108] Richard Titmuss had published his influential work The Gift Relationship in February 1970 to the same effect so far as risk was concerned.[1109] It was known by fractionators that the plasma used to make factor concentrates carried a risk of transmitting hepatitis, with patients who contracted it suffering serious liver disease and, in some cases, dying.

Alter et al[1110] and Grady et al[1111] (both published in 1972) showed that the hepatitis risk from commercial blood was markedly higher than that of blood from unpaid donors, but also that even after excluding Hepatitis B Ag positive donors, much post-transfusion hepatitis remained.

Professor Garrott Allen repeated his earlier message in The Epidemiology of Posttransfusion Hepatitis – Basic Blood and Plasma Tabulations in 1972.[1112]

In early 1972 President Richard Nixon delivered a Special Message to Congress on Health Care, calling for “a safe, fast and efficient blood collection and distribution system”. Hospital Week reported this, stating that authorities in the field regarded the present system as inadequate, pointing out that hospitals in many cases were forced to buy blood from commercial blood banks “which often accept blood from such donors as derelicts and drug addicts who may be the transmitters of such diseases as hepatitis, syphilis and malaria. A study made two years ago indicated that 30,000 Americans contract hepatitis each year through transfusions of contaminated blood with 1,500 of them dying from the effects of the diseases.”[1113] The message was clear that blood from paid donors was riskier than that from volunteer donors.

Importantly, insofar as the CSM(B) and CSM wished to consider whether another, safer product might be readily available to treat patients, leading luminaries in the field of clinical haemophilia therapy in the UK, Dr Rosemary Biggs and Dr Katharine Dormandy, in an international forum reported in Vox Sanguinis 1972[1114] clearly assessed that patients with haemophilia could be adequately maintained with the use of cryoprecipitate. A number of the contributors to the forum observed that major surgery, emergency surgery and even prophylaxis could be achieved by using it. A major advantage of cryoprecipitate was seen to be a lower risk of hepatitis. The reported discussion tended to suggest that Factor 8 would be superior to cryoprecipitate only when this hepatitis risk was addressed.

This was not the view of a select few in isolation. The forum was international in its perspective. Nor was it simply the view of an international symposium. Within the UK itself Dr Peter Jones had written to say “cryoprecipitate is now the product of choice in major surgery, allowing the potent but antigenic animal fractions and expensive human concentrate to be reserved for major complications, the emergency treatment of patients with FVIII inhibitors, or, in the case of concentrate, for prophylaxis.”[1115] He regarded cryoprecipitate as responsible for a remarkable change in life expectancy of those with haemophilia. In the US, Dr Carol Kasper and Shelly Kipnis published in the prestigious Journal of the American Medical Association in mid-1972. They concluded that, for older children and adults who had had little exposure to blood products, especially those with mild haemophilia, “single donor products are preferable.”[1116]

Articles such as these reflected a high level of concern amongst a range of different clinicians and researchers, many of whom were established leaders in their field, that the risks of hepatitis (which were serious) were increased considerably where donors were paid and where large pools were used for manufacture.

Another aspect of the context at this time is that haemophilia centre directors were aware that larger pools theoretically came with a greater risk of causing “clinical” hepatitis.[1117]

It is impossible to know if, and if so how far, material such as this was discussed in the CSM when it came to make its licensing decisions about the safety of Hemofil and Kryobulin. Dr Thomas’ reports make no specific reference to any of these or similar sources, but knowledge of their thrust may have been assumed by him without the need to spell it out, since he did express concern about the risks posed both by the donors and the size of pools used in manufacture. There is however no reference in Dr Thomas’ reports to the ready availability of cryoprecipitate which, as the clinical views of the time establish, was regarded by many as the product of choice, except for major surgery and where inhibitors had developed in the patient.[1118]

Although Professor Sir Michael Rawlins spoke of the balance which it is necessary to strike between the risks posed by products, on the one hand, and the advantages they bring through treatment of a difficult condition on the other, the opinions reflected in the extracts quoted above show that the general view was that cryoprecipitate provided a treatment which was safer than that offered by commercial concentrates. This was notwithstanding that commercial concentrates did have significant advantages over cryoprecipitate in terms of convenience of administration, increasing the quantity of clotting factor which could be administered without causing circulatory overload, giving a lower risk of creating inhibitors, making it easier to determine whether a given dose would be sufficient or not, facilitating home treatment without the need for storage at very low temperatures, and allowing high-dose regimes to manage inhibitors. These are significant advantages, especially in respect of convenience of use, but on analysis they have only a limited effect on safety.

The opinions would also suggest that NHS concentrates of the time were safer. This was for two reasons. First, they were made from very much smaller pools. Second, the donor population was voluntary. So far as was thought at the time, the general population from which these donors came was one in which the prevalence of serum hepatitis was less than that of the populations in the US from which paid donors came. These features were most probably known to members of the CSM(B). However, no reference is made in the papers to the fact that NHS concentrates were available (though it seems likely this was known to the CSM(B)) even if supply was limited. There is no sign that these virtues of NHS concentrates were discussed in the course of any examination of the relative safety of commercial as contrasted with NHS product; indeed there is no evidence that there was such an examination at all. Dr Thomas saw two features as giving rise to a particular concern about the risk of hepatitis, an infection and illness which he would have known could be serious: pool size, and paid donations. The first of these two particular concerns, both of which he expressed in the case of Hemofil, did not apply with the same force to the NHS product, made as it was from smaller pools; the second did not apply at all. If it was not actually clear, it should certainly have been clear from the information available at the time that NHS concentrate was thought to give rise to a lower risk of hepatitis.

Demand for increased factor products

What does not feature in Dr Thomas’ reports or in the decisions of the CSM(B) or CSM is another aspect of the context. Although some 80% of the supply of clotting factor replacements came in the form of cryoprecipitate, and in 1973 (viewed overall, and thus including a period of time after the licensing decisions had been made to permit importation for general distribution to haemophilia centres) the balance of 20% was in the form of factor concentrates, split roughly half-and-half between NHS and commercial products, there was an increasing demand from clinicians for a greater availability of concentrate, especially where it was freeze-dried.[1119] At the end of 1972, Professor Edward Blackburn[1120] wrote on behalf of haemophilia centre directors to the Chief Medical Officer (“CMO”).[1121] He said: “The Directors feel that there is an urgent need to increase supplies of Factor VIII Concentrate, in particular of the freezedried [sic] concentrate. Many feel that if a British preparation cannot be made available very shortly, the commercial preparations should be bought.”[1122] Though Dr William d’A Maycock’s view was that self-sufficiency should be the aim, he acknowledged that “at present insufficient freeze dried antihaemophilic globulin concentrate is made in the UK. There is thus considerable pressure from those who treat haemophiliacs for foreign commercial material to be bought. There is indeed, at present, the need to supplement the existing UK supply but the facilities for larger scale fractionation that will become available in England and Scotland should eliminate the need to use foreign commercial preparations or go a long way towards doing so.”[1123] There was thus a general sense among many clinicians that more freeze-dried concentrate should be available; that NHS-produced concentrate was preferable; but that any need to supplement supplies by importing foreign-made products, rather than, or alongside, augmenting supplies of cryoprecipitate, should[1124] be short-lived. There is no means now of knowing whether this was in the mind of the CSM(B): however, it can be noted that if it was, it fell far short of reflecting any sense that there was a crisis of supply which was imperilling the lives of patients.[1125]

Indeed, the fact that it was (not unreasonably) thought at the time that the need for any importation would be short-lived makes it all the more puzzling that, in the interim period between importation and the expected early date when sufficient concentrate would be available from domestic sources, a population of patients should be exposed to a risk when running it was not critical to their health, and when those who wished (after consultation with their treating doctors[1126]) to run that risk could in any event take advantage of the named patient exemption.

After the first two products (Hemofil and Kryobulin) had been licensed, the DHSS opened a “call-off” contract with Serological Products Ltd and Travenol Laboratories Ltd for the supply of up to five million international units of each product to haemophilia centres.[1127] Not only was this set at half the quantity that had initially been anticipated[1128] but in the first full year demand for the product was “disappointing”. Only 47% of the allocation of Hemofil[1129] was taken up; and a mere 8% of Kryobulin.[1130]

1975: Profilate

The licensing of Hemofil and Kryobulin was followed by that of Profilate in May 1975.[1131] The plasma used for Profilate was said to have been tested by RIA, a more sensitive test for the presence of Hepatitis B than the electrophoresis tests used in respect of Hemofil[1132] and Kryobulin, but one which still led the manufacturer to acknowledge there might be some risk of hepatitis.[1133] The medical assessor made no separate reference to this risk, but noted that there had been no inspection of the manufacturing facilities.[1134] A list of the centres from which plasma came was provided by the manufacturer. The centres were operated by two different companies. Yet nothing is said in the report about the approach of either to the selection of donors or their rejection rate.[1135] Nor does there seem to have been an inspection made of any of the donor centres on behalf of the CSM(B) before the decision to grant a licence subject to conditions.

However, there is nothing to suggest that the product, though it came with risk, was any less safe than Hemofil or Kryobulin.[1136] If licences had been refused for those two products on the grounds of safety, no doubt a more rigorous examination would have occurred in the case of Profilate; but once the decisions in 1973 were made as they were, it is difficult to see how the CSM could recommend that a licence be refused for Profilate on the grounds of hepatitis risk. It would have been inconsistent with its previous decisions to do so, unless circumstances had changed sufficiently in the meantime. Furthermore, in 1974 there had been considerable pressure from haemophilia clinicians to increase the supply of factor concentrates. Their aim was that this demand should be fulfilled by NHS products (which were likely to be safer, and may well have been cheaper), but this was only likely to be achieved after a while. In the meantime, they sought an increase in the supplies of commercially produced concentrates.[1137]

1976: Koate

A licence for Koate was applied for in October 1975.[1138] Dr R D Andrews had noted in his assessment report[1139] that the plasma for the product was supplied from 54 different firms under various ownership, including US State prisons, and “suffers from being prepared from multi-centre donations which cannot be properly controlled by inspection.” He pointed, however, to his understanding that each donation was RIA tested for HBV antigen and that there had been no reports attributing hepatitis to Koate since its introduction in the US in February 1974.[1140] The date of his report was 17 December 1975 – just after the second episode of the Granada World in Action documentary Blood Money, which concentrated in part on the poor quality of donors and in part on claims that the quality of the screening process was poor at centres where (presumably) FDA regulations were in force.

However, in his report, Dr Andrews said that in the past the CSM had recommended conditions, which included information about the number of donations and about the rejection of donors or donations, centre by centre. In early January 1976, the CSM(B) recommended the grant of a licence. It was subject to conditions. In particular, it sought information about the pool size, and the reasons for and rate of donor rejection on a centre-by-centre basis.[1141] The latter requirement appears to have been in order to form a view as to how rigorous the screening process was. One in which very few donors, if any, were rejected might give rise to concern. The information as to pool size would be supplied in February.[1142] None was given as to the rates of donor rejection.

The CSM considered the recommendation of its Sub-Committee on 22 January 1976. It confirmed it, including the condition as to information about the reasons for and rate of donor rejection.[1143]

On 2 February 1976, Dr Andrews wrote to Bayer UK Ltd[1144] to say that a licence for Koate would be granted subject to conditions.[1145]

Correspondence followed about those conditions. Bayer UK telexed Cutter in the US on 18 February about their acceptability. Five days later, Cutter responded. In answer to the proposed condition to require ongoing information as to the reasons for, and the rate of, rejection of donors or donations, centre by centre, Cutter said:

“We do not collect information of this nature. Such information would be of dubious value in evaluating a plasma derivative product’s safety or efficacy. In the manufacture of Koateᵀᴹ, all Source Plasma (Human) used as the starting material is collected and handled according to regulations described in Title 21 of the U.S. Code of Federal Regulations. Similarly, all plasmapheresis donors must be acceptable according to the criteria described in these regulations. All plasmapheresis centers from which our source material is obtained are licensed by the U.S. FDA. Thus, the FDA insures [sic] that all donors and units of Source Plasma (Human) are handled according to the regulations.”[1146]

The letter also objected to batches of Koate being subject to the batch release procedure. This was on the basis that each lot was subject to rigorous testing by both Cutter and the US Bureau of Biologics before being released to distribution. Accordingly, any “additional routine testing would be redundant”. This, it would appear, was a similar objection to that which was made to the request for donor information. In effect, it was being said that in each case reliance could be placed on the existence of the regulatory regime in the US. To argue that was to argue against lessons of experience that had already been learned in the DHSS: as Dr John Holgate put it in May 1975, “It is one thing to have regulations and another to learn of the enthusiasm with which they are carried through.”[1147]

The Bayer UK/Cutter communication was forwarded to Dr Andrews on 4 March 1976.[1148] There followed a series of communications between Dr Andrews and Bayer UK. These concerned outstanding issues about the conditions, in particular the question of whether Cutter would submit to the batch release procedure. NIBSC, which was due to conduct the batch testing, told Dr Andrews that the licence should be refused unless Bayer UK/Cutter agreed that it should occur.

It is, however, important for what follows to understand why the batch release system was insisted upon by NIBSC. NIBSC said on 18 March 1976 that they would not agree to the licensing of a product unless there were batch release procedures because “Experience with all such products so far licenced has revealed deficiencies in proper formulation of information on protocols, as well as unacceptable assay biometry and methods.”[1149] In short, reliance could not necessarily be placed on all being done elsewhere as it was supposed to be done.

On 20 April, Bayer UK/Cutter pushed back on batch release for testing, arguing that it was sufficient to rely upon the work done by the US Bureau of Biologics.

NIBSC offered to explain the quality control difficulties to a representative of Bayer UK, if it “should wish to send someone to NIBSC Hampstead”.[1150] It did. In the event, Bayer UK capitulated (as it had to do if the product were to be licensed).

The correspondence did not address any further question of the donor rejection rate. This is particularly surprising for two reasons. First, the apparent inability of Cutter to do that which the CSM(B) and CSM had both recommended does not seem to have been substantively considered within the Medicines Division of the DHSS. Nor did the Medicines Division ask the CSM(B) or CSM for further deliberation or recommendation on the point. Second, the DHSS continued to insist that there had to be a batch release process, and plainly did not think it satisfactory for Cutter to say that it would be sufficient for the UK licensing authorities to rely upon the fact that they might expect the regulatory authorities in the US to have given their approval. Nonetheless, they did not appear to insist in the same way on the observance of a condition as to donor rejections which both the CSM(B) and CSM had recommended.

Cutter then agreed with Speywood Laboratories Ltd that Speywood would import, sell and distribute Koate (and Konyne) on the UK market. Towards the end of August, Bayer UK withdrew its application for a product licence for Koate, since under that agreement the importation and marketing would be solely in the hands of Speywood. Accordingly, when the product licence was granted it was granted to Speywood. The licence contained no condition requiring the provision of ongoing data about donor rejections centre by centre, or at all.[1151]

1976: Factorate

In March 1975 an application was made for the importation and distribution of Factorate.[1152]

Dr Andrews’ report for the CSM(B)[1153] noted that there was a hepatitis risk associated with Factorate. He commented that it was not clear who supplied the donated plasma.

In November 1975, the CSM(B) recommended[1154] that a licence be granted, on conditions similar to those with which Cutter had been asked to comply in respect of Koate, including the provision of information about the rate of rejection of donors, centre by centre, together with confirmation that the only plasma which would be used would be that from donor centres in the US or in other certified countries.

The importance of the proposed condition relating to the rate of rejection of donors or donations was emphasised by observations of Dr Holgate at a Divisional Management Group meeting in the DHSS on 9 December 1975. This was the day after the screening of the second part of Granada’s Blood Money documentary.

The meeting considered the criticisms which the World in Action documentary had made of Travenol’s production of Factor 8 in the US. The minute of the discussion continues:

“The criticisms were in conformity with an inspection report carried out on behalf of the Division. The Minister of State had been briefed and was concerned about the supply of the Factor and about the hazards of using it. A similar product manufactured by Armour had recently been cleared by the CSM; [this is obviously a reference to Factorate] Supply Division were anxious that it should be licensed as it would be available at a lower price than the Travenol product. There was some doubt as to whether the collection of blood products for either product was satisfactory. Dr Holgate said that he doubted whether inspection of the American collecting centres would be useful. What was needed was to strengthen the requirements in the product licence, and to insist on returns from each collecting centre including the rate of rejection of donors or donations.”[1155]

The reference to there being doubt as to whether the collection of blood was satisfactory can relate only to safety. By contrast, being “anxious” to license the product relates only to price. This shows that there was a pressure to license not because a product was safer, but because it was cheaper.[1156] It is not clear why a product which might be unsafe was to be licensed. What is of particular note is Dr Holgate’s observation that the DHSS should “insist” on returns from each collecting centre including the rate of rejection of donors.

The potential lack of safety is emphasised by the two following sentences from the discussion: “Dr Holgate said that doctors would prefer the British products as being safer. Indeed, once a pure supply is available, doctors will want to use the product in situations in which the currently available Factors would be too great a risk.”[1157] Importation of commercial concentrate was thus being licensed when the DHSS believed it to be less safe than UK products.

The Granada programme led to a meeting at which the Minister of State for Health, Dr David Owen, was present. He asked to see any further applications for product licences to authorise the importation of Factor 8. Accordingly, a submission was prepared for him regarding the application from Armour. A related minute echoed the theme of cost addressed in the December meeting, saying: “I understand that Supply Division have received a ‘very favourable’ tender from the company for the supply of Factor VIII to haemophilia centres but, of course, action on this depends upon the granting of a product licence.”[1158] The author was part of the Medicines branch of the Medicines Division.[1159] The Medicines Division was, according to Dr Walford, effectively the Licensing Authority.[1160]

It is difficult to avoid an inference that at this time cost pressures were playing a role they should never have been permitted to play in licensing decisions. That inference is strengthened by the facts discussed in the next two paragraphs.

The submission itself contained the assessment that inspections were limited in their usefulness and it “seems best to assume that all blood products of this nature coming from the USA may be obtained from plasma taken under the worst circumstances and any protective measures should be achieved by other means.”[1161] Those means are not spelt out: but it is clear that civil servants considered that the sources of plasma were likely to give rise to a significant risk that the plasma itself would contain infections.

The submission was preparatory to a meeting on 21 January 1976 where the Factorate application was expressly considered by Dr Owen.[1162] The “worst case scenario” assumed by the submission resulted in additional conditions being proposed. One of those conditions was that “plasma will be obtained only from donor centres in the USA or in other countries specified in respect of which the licensing authority is satisfied as to the donation arrangements, being premises in respect of which you provide an undertaking that they may be inspected by or on behalf of the United Kingdom licensing authority.”[1163] Armour’s reply did not contain any such undertaking. It did say “We confirm that the plasma will be only from donor centres in the USA, and from USA sources.” It also responded regarding the rate of rejection of donors centre by centre by simply saying “the rejection rate at blood collection centres is below 1% for accepted donors.”[1164]

These responses leave a lot unsaid:

  1. What the donation arrangements actually were, so that the Licensing Authority could be satisfied with them (as had been asked).
  2. What were “USA sources” – they plainly were not donation centres, but the plasma had to come from somewhere. Presumably it was plasma obtained through a plasma broker, who would be the US source, and it could have come from human sources outside the US. Whether the plasma came from inside or outside the US, there was little possibility that the rate of rejection of would-be donors could be established, or the donation facilities inspected.
  3. Whether the rate of “below 1%” was true in every donation centre, or was an overall average, and why – if it was to be given at all – it could not be given centre by centre.
  4. A rate of “below 1% for accepted donors” does not define who an “accepted donor” is – a regular donor? Or one accepted according to some other criterion or standard?

There is no evidence that the Licensing Authority took any further steps to clarify these matters, as might be expected to exist had such steps been taken. If, as I infer from the lack of evidence, the Licensing Authority did not press Armour further, then given the assumption of the “worst case scenario” and the knowledge, by then established, that collection of plasma in the US invited rather than assuaged risks of infection, this was a failure of regulation.

It is when these decisions as to the licensing of Koate and Factorate are viewed in context that the need for a clear understanding of the reasoning of the CSM(B) as to safety becomes apparent. No material is now available to tell the Inquiry what it was. Arguably, however, circumstances had by now moved on from the times when licences were granted in respect of Hemofil, Kryobulin, and Profilate. 1975 in particular was a year in which concerns about the transmission of hepatitis by imported blood products mounted.

The context in 1975: hepatitis outbreaks

1975 started in the knowledge that in November 1974 there had been an outbreak of hepatitis amongst people with haemophilia in Bournemouth.[1165] This came under the spotlight. Of the 11 cases reported as constituting the outbreak, 7 had hepatitis which was neither Hepatitis A nor Hepatitis B. Dr John Craske spoke about this to the UK Haemophilia Centre Directors’ Organisation (“UKHCDO”).[1166] It is reasonable to think that what he said to them was to the same effect as he later wrote in respect of the same episode in The Lancet for August 1975.[1167] There he drew the attention of a wider audience than leading haemophilia clinicians to his concerns that commercial concentrates carried a much greater risk of transfusion hepatitis than did cryoprecipitate. He argued that their use should be reserved for life-threatening bleeds and major operations. (If so, this is open to the comment that the use of NHS concentrates would have carried lower risks than commercial concentrates, and that, if reserved for such incidents, they are unlikely to have been in short supply.)

At the turn of the year from 1974 to 1975 therefore, hepatitis transmitted by commercial concentrate had become an issue of some momentum, at least amongst those treating haemophilia patients.

Then, in January 1975, Dr Owen expressed his belief that it was vitally important that the NHS should become self-sufficient as soon as practicable in the production of Factor 8 including AHG (antihemophilic globulin) concentrate: “This will stop us being dependent on imports and make the best-known[1168] treatment more readily available to people suffering from haemophilia.”[1169]

Also in January 1975, Professor Garrott Allen (from Stanford University) wrote to Dr Maycock (since he was consultant advisor to the CMO).[1170] He expressed his concern about commercial blood products being purchased by Britain from the United States.[1171]

In August The Lancet article by Dr Craske about the Bournemouth outbreak was published. Hemofil was implicated.

1975: investigative journalism

Then on 1 December 1975 Granada screened its World in Action programme Blood Money – Part 1.[1172] This highlighted the risk of hepatitis sourced commercially from prisoners and “skid row” paid donors, and featured an investigation into the facilities of Hyland. Its findings were consistent with the description of those giving blood at such facilities given by Dr Thomas in 1972. It was followed by Part 2 a week later.[1173]

As noted above, this led to a rare ministerial intervention in the licensing process, and to a request that more stringent conditions be imposed in respect of Factorate.

Reverend Alan Tanner, chair of the Haemophilia Society, also described how in 1975 he was invited to see Dr Owen and that he “spoke very forcibly to Dr David Owen to let him know that we were not prepared to accept the risk of hepatitis coming from the blood products issued from the United States.”[1174] The meeting can be dated to 11 December 1975. According to the Society, “Dr Owen stated that in 1977 we would be fully self-sufficient in concentrates. The whole question of commercial concentrates, plasmapheresis, the regional structure of the BTS [National Blood Transfusion Service], and other matters were fully discussed.”[1175]

The following month Dr John Cash wrote to the British Medical Journal: “There is no doubt that the import into the United Kingdom of factor VIII concentrates derived from external sources, however well screened for hepatitis viruses, represents an unequivocal pathway by which the level of a potentially lethal infection into the whole community is being deliberately increased.”[1176] Though Dr Cash was given to expressing himself in forcible terms, his use of the word “deliberately” is striking. It must represent a view that those who were responsible for allowing the importation made a conscious choice to do so and, because they had the knowledge of what that choice was likely to lead to, were by that deliberate act knowingly facilitating its consequences. It may be that readers dismissed the expression as hyperbole; but at the least it indicates a strength of view amongst some of the professionals whose work related to the use of blood products about the (lack of) desirability of continued importation.

There is also evidence that the DHSS was indeed well aware of the dangers which the importation of commercial concentrates posed.[1177] The evidence for this comes from a sequence of letters in November and December 1975. Dr Theodore Cooper, US Assistant Secretary of Health, wrote to the UK CMO (then Dr Henry Yellowlees) following Granada’s production. He had been approached by Michael Gillard, the journalist investigating matters prior to the Granada broadcast, who had indicated that there was the feeling in the UK that the hepatitis warning in the leaflets accompanying antihemophilic factor concentrates was not strong enough. Dr Cooper commented: “We feel that the warning is quite direct and adequate; in fact, it is generally appreciated in the U.S. that every lot of this particular product is probably contaminated with hepatitis B virus.”[1178] Dr Sheila Waiter took on the task of responding to this. She noted that:

“while it is generally accepted that the benefits of having Hemofil available for the treatment of bleeding episodes far out-weigh the risk of acquiring hepatitis B nevertheless the statement that ‘every lot of this particular product is probably contaminated with hepatitis B virus’ will come as a surprise to many clinicians using the product, especially if the practice of issuing a warning on the label has been discontinued, as is indicated in the letter from Dr Cooper.”[1179]

She was dealing only with Hepatitis B. By early January 1976 Dr Maycock, to whom the correspondence dealing with the Assistant Secretary of Health’s letter had been copied, observed that the prevalence of hepatitis in the UK associated with UK blood and blood products had “long been smaller” than that in the US, but added (in chillingly prescient terms):

“However, until concentrate prepared from UK plasma is available, I would have said the benefits attaching to Hemofil and other similar concentrates of antihaemophilic factor, used with discrimination, outweigh the risk. There is always the problem of non-B hepatitis; some American authorities now say that this may account for 90% of transfusion associated hepatitis. This opens a new vista of complications.”[1180]

Set against this background, with concern about the detectability of hepatitis contained in plasma, the nature and motivation of the paid donors, reports on national television of how at least one company’s plasmapheresis centres operated, and a heightened concern to eliminate the need for reliance upon imported concentrate, the decisions to license two further commercial concentrates need some explanation. This is all the more so given that there needed to be reassurance as to the sources from whom plasma was collected. Yet there appears to have been no site examination of any of the collection centres.[1181] Within the DHSS there was doubt over collection practices. Yet conditions which were thought to be of real significance in ensuring as much safety as possible, and which it was said should be insisted upon – the provision of the rate of donor rejections in collecting centres, centre by centre, in respect of the Factorate application – were simply not followed up in respect of Koate and not provided in the specified detail for Factorate. There is, however, a hint that cost played a role it should not have done: for example, the minute in which the Supply Division were stated to be “anxious” that Factorate be licensed as it was cheaper than Hemofil.[1182]

These events all tend in one direction. On the other hand, at this stage testing of plasma used in commercial products from abroad had improved: hence the variation of the licences in respect of Hemofil and Kryobulin to permit the use of RIA testing, which was a better screening test than had originally been used. Thus it was to be expected that the risks of transmission of Hepatitis B, though not eliminated, would be reduced, and although NANBH was not directly identified by such a test, it too might have been reduced, since to an extent a test for Hepatitis B could be a surrogate for NANBH.[1183]

Though the dangers of making a judgement in hindsight must be borne in mind, the grants of the licences for Koate and for Factorate are highly questionable. There may have been reasons for what on the face of it appears to have been a decision to take a further risk which seems unnecessary, and which should have been seen as such at the time. Given the events just described, there may well have been discussion. However, on the evidence available to the Inquiry, if safety were to be the primary consideration as it should have been, they are decisions which are difficult to justify. Unfortunately, such documents as remain reveal nothing beyond Dr Andrews’ report which can assist in understanding the nature of those reasons. The evidence, such as it now is, suggests an absence of consideration of plainly relevant factors. The process appears lacking, and the decisions flawed.

1978: Kryobulin in blue and red packs

In November 1976 an application was made by Immuno to vary the product licence in respect of Kryobulin.[1184] The firm sought permission to use plasma from licensed plasmapheresis stations in the US in its manufacture, in addition to plasma from Europe. Kryobulin produced from the US was to be sold in a blue pack, Kryobulin produced from European sources in a red one. The former was said to be cheaper. In the minutes of a meeting of haemophilia reference centre directors on 6 April 1979, it was suggested that the implication was that the cheaper product carried the higher risk of plasma viral hepatitis and this worried some directors.[1185] Although a first version of the minutes recorded that Professor Ilsley Ingram[1186] had been in contact with Norman Berry, the managing director of Immuno Ltd, who had said they “aimed at making available to clinicians material which may carry less risk of transmitting hepatitis”,[1187] an amendment at Professor Ingram’s request was adopted at the next meeting. The amendment substituted that Norman Berry “had said that the American (‘blue’) material was offered for those who wished to take advantage of the lower American price, whereas the European (‘red’) material was still available for those who felt that it carried a lower risk of conveying hepatitis, although the Company regarded both products as equally safe.”[1188]

When the application to vary the licence was made the reason was set out in this way:

“It is possible to sell Factor VIII Concentrates produced from plasma of US origin at lower prices than European based material. Because of the preference in the UK market for this lower priced material, we also wish to make it available. Packs of Kryobulin from alternative source material will be of a clearly distinguishable colour e.g. blue as compared with present red. We will continue to make available European as well as the proposed new concentrate derived from American Plasma.”[1189]

Despite the apparent reasoning – to offer clinicians both a choice of product, depending on their views as to product safety, and a choice of price, to compete more effectively with cheaper US-made alternatives – when marketed neither the labels on the packaging nor the accompanying information sheets distinguished between them on the basis of the origin of the source plasma.[1190] Reference centre directors were however plainly aware of the difference, hence the discussion about it in their minutes of April, about which they would have been reminded in October 1979 when Professor Ingram corrected the record of the earlier discussion. However, neither patients nor other clinicians would necessarily have been aware. Nor would they have been alerted to the concerns of those reference centre directors who thought that the US-sourced product was cheaper because it carried more risk of hepatitis.

Internal documentation from Immuno AG, the manufacturing company, suggests a different reason from that given in the application for a licence. The documents are in German: an English translation is used here. The managing director of production at Immuno, Dr Otto Schwarz, is reported as having had a conversation with Norman Berry. A “Note for the Registration Department” states:

“In the future, two types of KRYOBULIN concentrate will be sold – KRYOBULIN 1 and KRYOBULIN 2. KRYOBULIN 1 = Made from European plasma (with a lower hepatitis risk) KRYOBULIN 2 = Made from US Licensed Source Plasma (proven to have a significantly higher hepatitis risk) KRYOBULIN 2 will be significantly cheaper than KRYOBULIN 1 because the British market will accept a higher risk of hepatitis for a lower-priced product. In the long-term, KRYOBULIN 1 will disappear from the British market.”[1191]

If, indeed, the concern expressed at the April meeting of reference centre directors was well-founded, it is troubling to note that sales of Kryobulin (blue pack), once licensed, exceeded those of Kryobulin (red pack) by 4.5 times.[1192] The last sentence in the quote above accurately predicted what would happen.

However, there is no evidence that the Licensing Authority was aware of the view expressed in the translation noted above. It was merely being asked to authorise Immuno to distribute a product sourced from US plasma, in the same way as it had licensed other products made by commercial rivals of Immuno from similar sources.

Though it might be argued that the Licensing Authority should have been more critically alert, and interrogated the reasoning more carefully, on balance it should not be criticised for permitting a less safe product (if it was) to be marketed in place of an existing product. To do so, it would have had to conclude that the plasma being used was less safe purely because it came from the US, to the extent that it should be banned. But the Licensing Authority had previously authorised products made from very similar sources, and continued to do so. It is difficult to see how it could have excluded blue pack Kryobulin without reviewing, and excluding, all those other products made from US-sourced plasma (or, indeed, made from plasma sourced from paid donors in any country).[1193] Kryobulin was no different from other commercial products in being made from the plasma of paid donors, from large pools. The time for refusing a licence because those two factors (paid donors and pool sizes) adversely affected safety was when Hemofil and Kryobulin were first licensed: so the “ship had sailed” and the authority was caught by the approach it had taken to its own earlier decisions.

However, those decisions should have been reviewed in the light of the developments in 1975 and early 1976 reported above: yet the decisions in respect of both Koate and Factorate left much to be desired, for the reasons given above. Though it is clear that the regulatory regime as a whole did not fulfil its functions in respect of the importation and licensing of blood products commercially produced, if the decision in respect of blue and red pack Kryobulin is examined on its own, then for the CSM(B) to recommend that there should be no variation of the licence it would have to have been aware of a reputable study[1194] showing that paid German and European donors were significantly less likely to be infected with hepatitis than donors in the US. It follows that to find that it should have made a different decision would be to require too much of it.

Commentary

There can be little doubt that although applications had to be judged against three criteria – safety, quality and efficacy – the paramount consideration among these three was that of safety. Three things reveal this. First, the inspiration for the governing legislation was what had been experienced with Thalidomide.[1195] To prevent a similar disaster happening again was primarily a question not of efficacy (whether the product “did what was claimed on the tin”), nor quality (consistency of manufacture, and ensuring that pharmacologically the product was as it was intended to be) but of safety. Second, the Medicines Act 1968 itself permitted consideration to be given to whether there was a product that might be safer which was already licensed, whereas it excluded an equivalent comparison in the cases of quality and efficacy. Third, the name of the recommendation-making committee was the Committee on Safety of Medicines. The name says it all.

Since safety is not necessarily an absolute, it involves balancing risk against benefit. However, where there are sufficient products already on the market, the introduction of a product which acts in a similar way, which does essentially the same job as they do, and is of similar quality means there is no or no sufficient benefit to set against the known risks to which that further product gives rise. When licences were given to Hemofil and Kryobulin in 1973 they did essentially the same job as NHS concentrates and locally produced cryoprecipitates, that of increasing Factor 8 levels in recipients. They acted in a broadly similar way.

Why, then, was the safety of recipients risked by permitting a more widespread distribution of commercial concentrates than was the case when they were imported on a named patient basis? Certainly, one of the reasons for the licensing of imported commercial products to treat bleeding disorders may have been to satisfy a desire of clinicians in the UK to use a product with the advantages which factor concentrates (both commercial and NHS) had. These are undoubted when it comes to convenience and ease of administration but, as observed above, have only a tangential bearing on safety. The major change in haemophilia therapy since the Second World War had been the introduction of cryoprecipitate from 1966 onwards. It was the product principally in use in 1973 when the first licensing decisions were made in respect of concentrate. It was thought desirable that there should be more factor concentrate to reap the advantages which factor concentrates gave. But given that those advantages were, on analysis, matters principally of convenience of administration,[1196] the price of having more factor concentrates made from large pools and paid donors was the acceptance of products manufactured in a way which significantly increased the risk to recipients that they would incur serious, and sometimes fatal, disease.

There is limited documentation, of which we now have sight, bearing upon the reasoning of the CSM(B) in 1973 and 1975. Before the Committee would have been the licence application, and the medical report by the medical assessor (in the cases of Hemofil and Kryobulin this was Dr Thomas; in the cases of Koate and Factorate, Dr Andrews). No other significant documentation has come to light. There is no record of the discussion. The Sub-Committee kept its deliberations under wraps. No one not a party to the discussion would have known of it. No witness who was present, and is still available to be asked, has a sufficient memory to fill in any of the details of what was said.

The supply of products on a named patient basis places an onus upon the medical practitioner to justify their choice of product should the need arise and if it appears they may have chosen badly. By contrast, once a product is licensed, practitioners choosing it for use in therapy are likely to consider that the fact of it being a licensed product is an indication of its safety, quality and efficacy. They will know that regulatory authorities whose primary concern is safety have authorised its marketing and distribution: the burden of justifying the choice of product is then some way towards being discharged. Summarising what one clinician ruefully said to the Inquiry: surely, if it was licensed, it was safe?[1197]

A possible basis for the decision made by the CSM(B)/CSM, at least as far as Hemofil is concerned in 1973, is Dr Thomas’ final argument in his report that “no attempt is made to disguise the risk of hepatitis, and it may be considered that the decision to use this material could be left to the individual clinician who can balance the potential hazard against the anticipated therapeutic benefit to the patient.”[1198] However, what was required was a judgement by the CSM as to the safety of the product. It was for it, not others, to draw the balance. If those words, however understandable they may seem, were adopted by the CSM in its reasoning (it, and not Dr Thomas, had responsibility for any decision) it would be accepting an approach which abdicated that responsibility. If the CSM were to take this approach, it would have to make it clear to the clinician (and ultimately the patient) that a decision as to overall safety risk/benefit was what was expected of the clinician: it could not be assumed from the fact of licensing. It could have considered that clinicians, some of whom might be quite junior, who were in practice likely to be exposed to pressures of time and emergency in a hospital setting, do not always have time and leisure to reflect and research overall risks and benefits. It might have reflected that an Act intended to assure the safety of medicinal products after the Thalidomide tragedy would not have been satisfied by a decision that it was for a clinician to determine the risks and benefits of prescribing that drug; but, rather, that there are some decisions as to safety which are best routinely[1199] taken out of clinicians’ hands. In short, if leaving it up to treating clinicians to decide on the safety of the product for use was the basis for the decision of the CSM to license it, it was unacceptable.

Given the belief at the time the initial licensing decisions were made in 1973 that commercially manufactured blood products sourced from paid donors and manufactured in large pools were less safe than either NHS concentrate (made from smaller pools of plasma derived from volunteers) or cryoprecipitate (less likely to transmit infection because of the very small pools or single-donor nature of the product, and also made from volunteers), which was a rational and evidenced viewpoint, and given the absence of any information showing that patient safety was at risk by reason of a failure to import factor concentrates,[1200] the decisions to license Hemofil and Kryobulin are not easy to understand.

The decision to license Profilate is justified only by the fact that it followed the licensing of Hemofil and Kryobulin. The decisions in respect of Koate and Factorate however occurred against a different background, especially since it was now becoming widely appreciated not only that testing for Hepatitis B was imprecise, but also that the majority of infections were of a form of hepatitis for which there was a supposed viral cause but no available test (neither Hepatitis A nor Hepatitis B but a virus or viruses which were blood borne: non-A non-B Hepatitis). Conditions were proposed which were aimed at reducing the risks of infection: but they were not insisted on as they should have been, and answers to requests for information could be unrevealing, and ambiguous.

It is surprising, given the changing context summarised above, the developing knowledge of the potential risks of non-A non-B Hepatitis, and the general view that factor concentrates were likely still to cause infections with Hepatitis B, that the renewal of licences (a process which occurred every five years[1201]) was not accompanied by detailed discussion of whether it was sufficiently safe for the products to remain licensed. There is no evidence that this detailed consideration took place.[1202]

February 1980 provides a somewhat curious example of an exception to this. Speywood had a product licence to distribute Koate (as set out above) following an agreement with the manufacturer, Cutter, to do so. That agreement expired. Cutter did not propose to renew it. Accordingly, in February 1980 Speywood obtained a variation to its product licence to allow it to: (a) sell its remaining stock of Koate for one year, and (b) import unlabelled vials of Factor 8 concentrate manufactured by Cutter for relabelling and sale under the name Humanate. Later documents show that Humanate was Koate re-labelled.[1203] However, NIBSC raised concerns that, since Speywood and Cutter were no longer in contract, they could not obtain details from Speywood as to the source of the plasma or the place or method of manufacture, with the result that the licence could no longer safely be continued. The Licensing Authority wrote to Speywood to vary the licence because:

“Humanate could no longer be regarded as a product which could safely be administered for the purposes indicated in that product licence since evidence of access to data relating to the original manufacture, as evidenced by the absence of protocol data relating to the source of donor blood and in process control, was now lacking … Without this evidence, there was no means of ensuring that the product had been manufactured under conditions which could be shown to minimise the risk to patients of contracting, for example, NON-A and NON-B hepatitis. The action which was proposed would be taken in respect of any product licence for a biological product under similar circumstances”.[1204]

Speywood exercised its right to argue an appeal before the CSM. Following the hearing, the Committee maintained its advice that “because of the risk to patients arising from lack of evidence as to the origins and provenance of the donor blood, the Committee were not satisfied as to the safety of the product”.[1205]

There is a contrast between the approach taken here – in essence, that if there was not full access to knowledge of “the origins and provenance” of donor blood the product would not be licensed on safety grounds – and the approach taken in respect of the licensing of, for example, Koate, where that information was not provided,[1206] but the licence was granted.

Reconstructing the past as it was inevitably has shortcomings, given the limited material available. As a result, I have no adequate information[1207] to explain a recommendation that it was on balance safe to license Hemofil and Kryobulin when they were first licensed. On the face of the applications, to use the products exposed patients to a real risk of hepatitis. The risks were, if anything, emphasised by the report of Dr Thomas in respect of Hemofil. Safer products were already in use (though it is unlikely that the practice of the CSM involved any detailed comparison of safety). The best insights into the reasoning now available are the reports of the medical assessor for both. The reports contain much to concern a reader in respect of safety, and nothing other than general convenience in use to balance against those concerns. Such reasoning as these reports offer, if adopted, would not be a proper basis for the decision, for no acceptable balance is struck; no other reason to justify licensing is apparent; and there is no discussion of the context in which safety fell to be assessed, that of safety concerns being expressed by respectable commentators.

I have concluded that on the evidence before the Inquiry, such as it is, there was evidence of a lack of safety. The risk was a serious one. Though safety is a balance, and not an absolute, there is no material now[1208] available which shows what it was that may have tipped the scales in favour of licensing. I have concluded that the decisions were wrong.

They led to the decisions about Profilate, Koate and Factorate, in respect of which (so far as the last two are concerned) there were further reasons[1209] to think the decisions flawed. The requirement was to give particular consideration to safety. The evidence available to the Inquiry suggests that safety was not put first. Having considered all the available evidence, I have ultimately to conclude that the decisions should not have been taken as they were: they should certainly not have been left to individual clinicians to take.[1210]

Licensing in the 1980s

In March 1981 members of the Haemophilia Centre Directors’ Organisation became alarmed by reports in the press about the importation of blood products into the UK. Their chair, Professor Arthur Bloom, expressed two concerns to Dr Holgate of the DHSS.[1211] The first was a concern that material produced by a US company had been sold “through brokerage or other means” to Speywood where it had been relabelled as a product of their own.[1212] A second concern was that it had been reported by Dr Geoffrey Savidge that a firm (Inter-Pharma) intended to market cut price Factor 8 obtained from Cutter and from Hyland, and he was concerned that the material was cleared for use and had passed the “normal control mechanisms”. This was especially so since the Hyland material might be high potency, rather than the cheaper intermediate variety which had “been marketed in the ethical way in the UK.” The UKHCDO were aware that it might be difficult to be sure of the exact origin of plasma used in any of the currently available concentrates.

Dr Holgate replied. He suggested that the Speywood issue was “being dealt with”, and that Inter-Pharma was necessarily subject to licence requirements. He stated (when responding to the Speywood issue): “As I am sure you are aware one of the cornerstones of our philosophy for the licensing of ‘biological’ products is to have detailed knowledge of and control over early stages of manufacture and in-process control – this including source material.”[1213]

In doing so, he made a bold claim. There is little, if any, evidence to back it up. Certainly, in the early 1970s Dr Thomas visited Hyland facilities in order to compile his report in respect of the licensing of Hemofil. However, no similar report was made to the CSM(B) when it was considering the applications for licences made later in the 1970s. Dr Walford also recalled a visit she had undertaken, when she worked in the Medicines Division, with two inspectors to the US, to examine two blood product manufacturing facilities on sites fairly close to each other. She recalls it because she ended up writing the report. In summary, she described it as “not a happy one” for the manufacturers. She said “We were very unhappy … about the facilities … [there was] the big clean area that was not a sterile area … and the toilets opened up into and off the clean area and people were just toing and froing and there was no changing of clothes, and so on, and that was not good practice, and we wrote it up, amongst other things.”[1214] As already mentioned, Professor Zuckerman visited Immuno facilities in Vienna shortly before 1976[1215] and was not impressed by what he saw. Dr Thomas had informally been permitted to see the Immuno factory not long after Kryobulin had first been licensed, and thought parts of what he saw were concerning.[1216] It is unclear how “control”[1217] was exercised over facilities such as these, in another country and subject to their own regulatory systems. It was also a surprising comment for Dr Holgate to make given that he had been minuted at a meeting in 1975 as saying that “shortage of funds inhibited visits to manufacturers’ premises abroad”.[1218]

It is difficult to think that there was anything approaching a comprehensive system of inspection and control of the sources of plasma used in manufacturing commercial blood products.[1219] Some was sourced from prisons.[1220] Some was, at least for a while, sourced from countries outside the US. There was a considerable trade in plasma; some was supplied to companies by plasma brokers. It is difficult to know where the original source was – yet it was this in respect of which the UKHCDO had sought some clarity. Occasional visits of the sort described by Dr Thomas and Dr Walford, echoed in the Granada World in Action programme to which Professor Zuckerman had contributed, do not, on the face of it, justify the claims that Dr Holgate was making – there should be far more evidence, such as correspondence showing that the DHSS had learned lessons from similar depressing reports (when and if they were made) and was taking manufacturers to task, then re-inspecting the facilities and processes to satisfy themselves that improvements had been made, under the threat of revoking licences or of imposing conditions to require those improvements. All that would be necessary before claims of the sort Dr Holgate made could be accepted.

This is, however, taking Dr Holgate’s claims as standing in isolation from their context. The context was a response to the particular situation regarding Speywood, and its marketing of Humanate. The Licensing Authority could and did exercise some degree of “control” through imposing conditions on the product licence, for example specifying the warnings that should be used and the use of international units as the measure, and requiring information about donor rejections,[1221] donor sources, detailed accounts of how the product was manufactured, and the results of tests done during and on completion of manufacture. All of that control was indirect: but it was backed up by the power to refuse a licence and/or batch release to a company that did not comply. An example was Speywood. It could not supply the information requested. In consequence, its product was not released onto the UK market.

As a general statement, Dr Holgate’s comment that the UK Licensing Authority had “detailed knowledge and control over early stages of manufacture and in-process control” may be criticised for overstating the degree of control exercised. The specific context of the letter to which he was responding may provide mitigation: but it remains the fact that only some inspection as such occurred, and the indirect controls described in the last paragraph depended centrally on knowledge derived from the manufacturers themselves.

The March 1983 watershed

I have set out in an earlier chapter the story of the developing knowledge of AIDS in the Western world, starting in June 1981. It is nonetheless necessary to set decisions taken in March 1983 by the FDA, and then in July 1983 by the CSM(B) and the CSM in context.

By early 1982, The New England Journal of Medicine carried an article suggesting that the recorded cases of the principal symptoms included in the syndrome (PCP and Kaposi’s sarcoma) might only represent the tip of the iceberg in terms of the prevalence of conditions associated with AIDS.[1222]

By August 1982, it had been reported to the World Congress of Blood Transfusion (Budapest) that it was suspected in the US that an infective agent in blood concentrates resulted in people with haemophilia dying of AIDS.[1223] Later that month, Alpha committed to stop using plasma collected from donors who were Hepatitis B surface antigen (“HBsAg”)[1224] positive in order to manufacture Factor 8 and Factor 9 concentrates, a move that recognised the possibility that there was a viral cause of AIDS which was transmissible by blood products.[1225] Dr Denis Donohue of the FDA asked the pharmaceutical companies to stop using blood collected from donors likely to have high levels of antibodies to hepatitis.[1226]

By September haemophilia doctors in the UK were alerted to the possibility (or risk, by another word) that whatever caused AIDS might be a virus, and that this could be transmitted by blood or blood products.[1227]

By September 1982 it was reported in the Morbidity and Mortality Weekly Report (“MMWR”) by the Centers for Disease Control (“CDC”) of the US that there had been 593 cases of AIDS, 243 of whom had died;[1228] by October 1982, it was reported that the CDC had been notified of 684 individuals who had been diagnosed with Kaposi’s sarcoma (“KS”) and/or serious opportunistic infections resulting from an acquired immune deficiency[1229] and that at least 260 (41%) had died;[1230] by December 1982 the MMWR figure was 788 cases.[1231] If the same mortality rate were to occur in people with haemophilia, then the risk of dying from AIDS would be far greater than from any other cause. It could also be seen that the number of cases was doubling numerically roughly every six months.

By November 1982 Dr Craske expressed his view that there were three possible causes of AIDS. He discounted the first two of these – the taking of amyl nitrite, and the immunosuppressive effects of CMV.[1232] He thought an infectious agent was the likeliest of the possible causes he identified.[1233]

By the end of 1982 – according to Dr Charles Rizza of the Oxford Haemophilia Centre – it was clear (and should therefore have been known by all UKHCDO directors) that there was a real risk that AIDS could be transmitted by an infectious agent carried by blood products.[1234] By now, too, it was becoming more and more apparent that this was actually the case, though certainty was still lacking: it was reported that a baby in San Francisco had developed symptoms of AIDS after receiving transfusions. One of those transfusions had come from a donor who was subsequently diagnosed with AIDS.[1235]

A viral cause was further implicated on 7 January 1983, when the MMWR noted that it had, since June 1981, received reports of 43 females who had developed pneumocystis pneumonia or other opportunistic infections typical of AIDS. Some had no risk factor other than being the steady partner of a man with AIDS or at high risk of it. If accurately reported, the cause was highly likely to be viral, transmissible by sex as well as blood, and similar in these respects to Hepatitis B.[1236]

On the same day Alpha issued a press release which said that: “The evidence suggests, although it does not absolutely prove, that a virus or other disease agent was transmitted to [haemophilia patients with AIDS] in the Factor VIII concentrate, derived from pooled human plasma”.[1237]

Further information emerged about the nature of the risk. Dr Craske told 21 haemophilia directors at a meeting in the Excelsior Hotel at Heathrow on 24 January 1983[1238] that:

  1. the disease was “intractable”.[1239]
  2. up to December 1982 in the US 45% of those suffering from it had died.
  3. ten people with haemophilia in the US had been affected of whom five had died, the youngest aged seven.
  4. there appeared to be an incubation period of between six months and two years.
  5. studies reported in The New England Journal of Medicine[1240] showed that people with haemophilia who were currently without symptoms but who had received factor concentrates had abnormalities of the T-cells of their immune system. By contrast, those who had received just cryoprecipitate had not.

The Lancet [1241] and New Scientist [1242] concurred that the prime suspect was a blood-borne virus.

Other countries in Europe started taking precautions: for instance, Dutch physicians treating haemophilia patients agreed to use only cryoprecipitate in children under four and suggested that all other patients should consider cryoprecipitate as the treatment of choice, followed by locally produced (smaller pool) Factor 8 and Factor 9 concentrates. The Dutch Association of Haemophilia Patients advised members that it was highly likely that Americans with haemophilia had been infected by using factor concentrates.[1243]

When the CDC reported in March that blood products or blood appeared responsible for AIDS among haemophilia patients requiring clotting factor replacement it also noted that the first signs of AIDS might take two to three years to emerge after exposure to a possible transmissible agent. It added that there was a fatality rate of more than 60% for those first diagnosed over one year previously.[1244]

Reviewing the position as it was in early 1983 leads to these two observations.

The US was the epicentre of the AIDS epidemic in the Western world. The earliest recorded case identified in the UK was in December 1981, two years after the first symptomatic case of AIDS was believed to have occurred in the US – but the person concerned had regularly travelled to the US. An outbreak in the UK was thus separated not only by geographical distance but also by time from that in the US: for those who chose to be alert to it, there was an “early warning” of what might yet be in the UK, coupled with a knowledge that AIDS had come to its shores. After all, Terrence Higgins had died of AIDS in July 1982, and a Trust in his honour had already been set up by his friends.[1245]

During 1982, concern had risen in the US that the epidemic had many features which were reminiscent of the previous epidemic of Hepatitis B, which in part had been spread by blood and blood products as well as by sex. Because it was now being reported to the CDC as arising in Haitian patients and intravenous drug abusers, who were regarded as less likely to use amyl nitrites or to have anal intercourse[1246] than male homosexuals, the CDC considered that this indicated that the cause was probably an infectious agent, transmissible by blood.[1247] This view was increasingly shared by European epidemiologists.[1248] More significantly still, commercial producers of blood products recognised that whatever the true cause might be they could not exclude the possibility that it was indeed an infectious agent transmitted by blood products. They started to consider the sources of the plasma, which they bought, to exclude riskier donations. Alpha issued a press statement on 7 January 1983 advising that there was a real risk that those who took its products might as a result develop the symptoms of AIDS.[1249]

The FDA decided against this background, augmented as it was by further comment in both the popular and medical press in 1983 and by the growing epidemic, to press commercial producers to take steps to avoid the use of blood from high-risk groups when preparing factor concentrates. The pressure fell short of making a formal regulation (though it is often, erroneously, referred to as such). It took the form, instead, of a recommendation, issued on 24 March 1983.[1250] The recommended steps included adopting standard procedures to quarantine, and dispose of, any products collected from donors known or suspected of having AIDS. They were to train personnel responsible for donor screening to recognise early signs of AIDS. Blood and plasma facilities were recommended to inform persons at increased risk of AIDS that they should stop donating. Those at increased risk were often referred to as “the four Hs” (homosexual males, heroin addicts (intravenous drug users), Haitians and those with haemophilia). In short, people within the “four Hs” were much less likely to be sources of blood or plasma after 24 March 1983. All products made for distribution after that date were to be made from plasma collected after 24 March 1983 in accordance with these recommendations, and products made from donations collected earlier were supposed to be labelled to indicate this.[1251]

Dr Joseph Smith was the director of NIBSC, and chair of the CSM(B).[1252] By late March 1983, Dr Smith had been made aware of letters recently released by the FDA drawing attention to the recommendations. Having been alerted by these letters, Dr Smith wrote to Dr Leslie Keith Fowler, a senior medical officer in the Medicines Division of the DHSS, on 28 March 1983. He came straight to the point: his opening words were “I think it would be advisable to consider, at a meeting of the CSM(B), the problem of AIDS in relation to licensed blood products.” He added that it would be “extremely helpful to secure the advice of Professor Arthur Bloom” and also the latest information on surveillance in the UK from the CDSC. Dr Smith drew attention in his letter to the fact that the US was taking steps to avoid the use of blood from high-risk groups. He asked if Dr Fowler could prepare a brief paper on which the discussion might be based “together possibly with a note that Spence Galbraith’s unit might be prevailed upon to prepare”.[1253]

In the event, only two documents were placed before the CSM(B) when it finally met on 13 July 1983:[1254] a paper from Dr Fowler and an annotated agenda prepared by Dr Smith that formed the basis for discussion. No paper from Dr Spence Galbraith or his unit was circulated and there is no evidence of Dr Galbraith being asked to prepare any such paper. Dr Galbraith had written, on 9 May 1983, to Dr Field of the DHSS setting out his view that the importation of concentrates should cease. The broader significance of that letter is considered elsewhere in this Report[1255] but for present purposes the key fact is that the letter was not provided to the CSM(B).

The second document prepared for the meeting of CSM(B) was a suggested “agenda” for the discussion on AIDS and blood products.[1256] This was prepared by Dr Smith and dated 28 June 1983. The agenda started with the assumption that “participants will be familiar with the problem and with at least a proportion of the many publications.” Headings for the discussion and a suggested first speaker were proposed. Somewhat unusually for an agenda, “brief possible conclusions” were also indicated, although Dr Smith added “doubtless these would be changed radically”.[1257] From the agenda and its suggested conclusions, it appears that even before the meeting of the CSM(B) took place, Dr Smith’s view (or, if the suggested conclusions were based on discussions with others, the views of those others)[1258] was that the very course under consideration could not be recommended: the suggested conclusion was “Impracticable on grounds of supply”.[1259]

Dr Fowler’s paper – on which he was not invited to speak, though it presumably was read by all attendees – took as a starting point that immune deficiencies were not new, and whilst accepting that AIDS was clearly a transmissible condition, queried whether it had a single causative agent,[1260] and appeared to think it questionable that factor concentrates posed any additional risk to people with haemophilia. He asked in the paper whether haemophilic AIDS might be a function of the concentrate itself, but then said: “one cannot ignore other views and hope the problem will go away.” He identified four potential responses: improving donor selection; using cryoprecipitate to minimise exposure to multiple donors; using labelling and stop orders to prevent dumping of US concentrates manufactured before the FDA recommendation; and potentially heat treatment in the future.[1261]

Dr Smith regarded Dr Fowler’s paper as representing the Medicines Division’s evaluation of the cause of the AIDS problem.[1262] This is almost certainly an inaccurate assessment, since Dr Walford was clear that the general and accepted view within the medical stream of the civil servants of the DHSS was that the likeliest cause of AIDS was an infective agent.[1263]

13 July 1983

The CSM(B) met on 13 July 1983.[1264] This was a special meeting of the CSM(B) rather than one of its routine scheduled meetings. The meeting was attended by nine members of the CSM(B) itself.[1265] The invited experts were Professor Bloom (UKHCDO chairman), Dr Craske (PHLS virologist), Dr Galbraith (director of CDSC), Dr Harold Gunson (CMO’s adviser on blood transfusion) and Dr Phillip Mortimer (PHLS virologist). Also present were Dr Holgate (medical assessor), Dr Purves (pharmaceutical assessor), Mr Morgan (secretary) and others from NIBSC and the DHSS, including Dr Walford.

The meeting began with a reminder from Dr Smith that the material those attending received was confidential and should not be disclosed outside the meeting. The minutes were succinct: Dr Smith told the Archer Inquiry that that was the practice in the Medicines Division.[1266] The upshot of the meeting was that it did not decide to take any action in response to the risks from imported factor concentrates.

The discussions that took place were not recorded in the minutes. Instead, the meeting’s conclusions were set out. They are not identical to, but are for the most part similar to, the suggested conclusions in Dr Smith’s annotated agenda.[1267]

The first issue to be discussed was the cause of AIDS, on which Dr Mortimer was to speak first. The conclusion was that “The cause of AIDS is unknown, but an infectious aetiology seems likely.” Although Dr Smith told both the Archer Inquiry and this Inquiry that he was almost certain by the end of 1982 that a new virus was the explanation, this was watered down between the possible conclusion in his annotated agenda and the minutes, perhaps influenced by Professor Bloom’s and Dr Fowler’s views.

Dr Galbraith was then to discuss epidemiology. From annotations to a copy of the agenda, this may have included a briefing on the numbers infected to date.[1268] The conclusion, though, is not consistent with Dr Galbraith’s earlier advice and looks more like Professor Bloom’s characterisation of the risk:

“Patients who repeatedly receive blood clotting-factor concentrates appear to be at risk, but the evidence so far available suggests that this risk is small. The risk appears to be greatest in the case of products derived from the blood of homosexuals and IV [intravenous] drug abusers resident in areas of high incidence (eg, New York and California), and in those who repeatedly receive concentrates in high dosage. Balanced against the risks of AIDS (and of other infections transmitted by blood products) are the benefits of their use; in the case of haemophilia they are life-saving.”[1269]

A first problem with these conclusions is that it is difficult to square this view of risk with, for instance, the views expressed by Dr Craske in the paper he wrote the previous autumn in which he drew attention to the fact both that there was a substantial delay in symptoms emerging and that mortality after first symptoms was high (around 50%).[1270] Nor does it appear that the Committee was informed of a letter Dr Evatt had sent to Professor Bloom on 7 March 1983, in which he described the AIDS epidemic as evolving at “a frightening pace … The incidence rate has been increasing in hemophiliacs and the epidemic curve paralays [sic] that of the total epidemic curve”.[1271]

The second problem is the belief that factor concentrates were life-saving. This was true in part, but it was not the unique preserve of commercial concentrates: it was true also of NHS factor concentrate and cryoprecipitate. Moreover, the phrase over-inflates the benefits of factor concentrates, which were used mostly in non-life-threatening circumstances. It is easy to see how this over-simplistic belief, that people with haemophilia would die without imported factor concentrates, would skew the outcome of the deliberation.

Continuing with the meeting, the annotated agenda suggested that Drs Gunson, Craske, Schild and Fowler would be invited to speak about screening tests and future heat treatment, but the main focus of the discussion came next.

Professor Bloom was invited to address the question of whether to withdraw (all) factor concentrates on grounds of safety. The conclusion had no subtleties:

The possibility was considered of withdrawing clotting factor concentrates from the market and replacing them with cryoprecipitate. It was concluded that this is not feasible in the UK on grounds of supply.[1272]

As for withdrawing US products,[1273] the CSM(B) concluded:

“this is not at present feasible on grounds of supply. Moreover, the perceived level of risk does not at present justify serious consideration of such a solution. Efforts are however being made to secure UK independence of foreign suppliers of clotting factor concentrates. This should reduce markedly, although not eliminate, the risks to recipients of these products, and the Sub-Committee strongly supports this aim. The Sub-Committee was also informed that the UK Haemophilia Centre Directors have adopted a policy for use of US Factor VIII in order to minimise risks as far as possible.”[1274]

Dr Walford in her statement[1275] said that she did not know the factual basis for the conclusion that the replacement of concentrates with cryoprecipitate was not feasible, but observed that Dr Richard Lane was at the meeting as a member of the CSM(B). This conclusion is consistent with reports which he had given to a CBLA meeting on 27 April 1983.

It is plausible that this is the case. Dr Lane had produced a report for the CBLA, dated 22 April 1983.[1276] This said that the potential of BPL to manufacture small pool freeze-dried cryoprecipitate in significant amounts as an alternative had been ruled out on logistic production considerations. His objection to the production of single-unit wet cryoprecipitate (which would be produced in regional transfusion centres) was that it would “seriously reduce the efficiency of the current plasma procurement programme to satisfy BPL targets for factor VIII concentrate”.

However, the production problems identified by Dr Lane were concerned with BPL’s ability to produce freeze-dried cryoprecipitate. They had nothing to do with the ability of regional transfusion centres to produce single-unit wet cryoprecipitate. The overwhelming evidence given to the Inquiry from directors of regional transfusion centres was that they would have been able to revert to the production of cryoprecipitate without great difficulty.[1277] Dr Lane’s objection was not one of feasibility, or logistics, so far as this was concerned: it was, rather, that it would reduce the amount of plasma available to BPL. Understandably, given his position as director of BPL, Dr Lane’s focus was on the impact on BPL and its activities.

As a result, it seems likely that the CSM(B)’s decision was based in part on a mistaken belief that cryoprecipitate could not be produced in large quantities.

Moreover, the position in respect of continued importation appears to have been based on a complete misunderstanding (at least that of its chair) of the position regarding self-sufficiency. Dr Smith thought it was going to be achieved imminently. He told the Inquiry that he had “the clear impression” that the UK’s self-sufficiency was expected to be achieved within a period of months (he thought about two further months).[1278] Indeed, he recalled asking the DHSS (probably Miss Zoe Spencer) after a couple of months about progress, and that she had remained positive about progress whilst telling him that self-sufficiency had not yet been achieved.[1279] This repeated his evidence to the Archer Inquiry which was that, from the discussions at the meeting, he had gained the clear impression that UK self-sufficiency was expected soon.[1280]

A view that self-sufficiency was going to be imminently achieved must inevitably have skewed the decision-making. Indeed, in Dr Smith’s written evidence, he described the Committee as having gained “some comfort”[1281] from this view of the position.

As for the Sub-Committee being informed that the UK haemophilia centre directors had adopted a policy in order to minimise risks as far as possible, the errors in this conclusion are demonstrated by questions and answers during the course of Dr Walford’s evidence:[1282]

“Q. ... This particular paragraph is all about risk, isn’t it?

A. Yes.

Q. It talks about the ‘level of risk not at present’ justifying withdrawing US product.

A. Yes.

Q. Then the next three sentences deal with risk, and they throw into the equation to assess what the risk actually is, the sense of when and if we become self-sufficient in making our own concentrate, then the risks of US concentrate will, by definition, fall away: we have our own risks, but we won’t have the USA’s risks.

A. Yes.

Q. The second part is that there has been a change of policy or a policy which ensures that the risk is kept as low as possible. In terms of the first of those two points, the context was that of what was possibly a transmissible agent that was thought to be the most likely cause -- you yourself thought it was the most likely cause?

A. Yes.

Q. – which, if it was something which had a long incubation period which, was understood at the time, might take a while to manifest itself in the way that epidemics do.

A. Yes.

Q. So the sense that in a year or two or three or four’s time there might be sufficient production domestically really had nothing to say about that risk, did it?

A. No, the risk as they were defining it was what they perceived as the risk at present time, yes.

Q. And the future production of self-sufficiency, enough quantities, unless it was very, very imminent –

A. Yes.

Q. – would have nothing to say on it?

A. No, it does appear to be another non sequitur.

Q. So it’s – yes, it’s another non sequitur.

A. Yes.

Q. The last looks like misinformation, the sense that there’s been a change when there hasn’t been a change.

A. Yes.

Q. Because the general message being sent out from – as I understand your view of – and at the moment it may well become my own view, of what is said in the letter of 24 June, is that the UKHCDO weak recommendations really were: business as usual.

A. Or slight improvement on business as usual, if you like, emphasising how best practice –

Q. It gives a nudge.

A. Best practice.”

Dr Walford there recognised there had been at least two “non-sequiturs” in the reasoning.

The CSM(B) reached three further conclusions about factor products. First, that it was advisable that factor products for use in the UK should be derived from plasma collected after the FDA recommendation as soon as that became feasible. Second, that manufacturers were working on viral inactivation[1283] and when licence applications were received, it would be important to examine not only improvements in safety but also clinical effectiveness. Third, that manufacturers should be stopped from making claims about the safety of heat-treated products when safety and effectiveness had not been established by the Licensing Authority.[1284]

What role did Dr Galbraith’s letter of 9 May 1983 play?[1285]

The short answer, it appears, is none, at least insofar as Dr Smith can recall. He told the Archer Inquiry that he had only recently seen Dr Galbraith’s letter – which he described as a “very good letter” – and that as far as he could remember it did not come to the CSM.[1286] In his statement to this Inquiry, he stated that neither he nor, he thought, members of the CSM(B) knew that Dr Galbraith had written to the DHSS, and that Dr Galbraith made no mention of it at the meeting.[1287] The contemporaneous documentation (summarised above) suggests that Dr Smith is right and that the Galbraith letter and paper did not reach the CSM(B).

Professor Christopher Bartlett gave written[1288] and oral[1289] evidence to the Archer Inquiry, effectively on behalf of and at the request of Dr Galbraith, whose health did not permit him to give evidence himself. In 1983, Professor Bartlett had been a consultant epidemiologist to the CDSC, reporting to Dr Galbraith, and became the Director of CDSC in 1988 when Dr Galbraith retired. His written statement, referring to Dr Galbraith’s letter and paper of 9 May 1983, explained that Dr Galbraith had sought his opinion on the final drafts and that he fully concurred with the conclusions and advice expressed in them. From his oral evidence it would appear that Professor Bartlett understood that Dr Galbraith repeated his advice orally to the CSM(B).[1290] Professor Bartlett commented further in his oral evidence that although there were, by May 1983, only a small proportion of recipients of Factor 8 concentrate who had developed AIDS “the risk may not have been small” and added that “I think this last piece of evidence about the risk is one where other experts at the time disagreed; they felt the risk was small. So there was a difference of opinion at that time and that came out in the meeting of the Sub Committee of Biologicals”.[1291]

Professor Bartlett had understood that the CSM(B) meeting was convened in response to Dr Galbraith’s letter.[1292] However, in the light of Dr Smith’s letter of 28 March 1983 that seems unlikely, for the letter predates Dr Galbraith’s contribution. Further, the fact that Dr Galbraith’s letter and paper do not appear to have been circulated to members of the CSM(B) or to other invited experts tends to confirm this.

Professor Bartlett recounts that he spoke to Dr Galbraith the day before he gave evidence to the Archer Inquiry and conveyed that Dr Galbraith’s reaction to the CSM(B) conclusions was one of being “completely bowled over.” Professor Bartlett reported his own reaction as being “dismayed at the time”.[1293]

Professor Stephen Palmer, who took up a position as CDSC’s first medical consultant epidemiologist in Wales in May 1983, has explained in his statement that in mid 1983 Dr Galbraith was unwell and not able to be present at CDSC.[1294] Dr Galbraith’s colleague would record on 5 September 1983 that he had been off sick with severe arthritis for the previous six weeks.[1295] This might, possibly, explain why Dr Galbraith’s views as expressed in strong terms in his paper to the DHSS do not appear to have been similarly expressed to the CSM(B).

Finally, it should be noted that both the Archer Inquiry and the Penrose Inquiry seemed to have proceeded on the (inaccurate) basis that Dr Galbraith’s statement had been considered and rejected by the CSM(B).[1296]

Did the Sub-Committee consider the Council of Europe Recommendation?

On 23 June 1983, the Council of Europe recommended the governments of member states “to take all necessary steps and measures with respect to the Acquired Immune Deficiency Syndrome and in particular: to avoid wherever possible the use of coagulation factor products prepared from large plasma pools; this is especially important for those countries where self-sufficiency in the production of such products has not yet been achieved”.[1297]

There is no evidence that this was drawn to the attention of the CSM(B). There is no evidence that it formed part of any discussion on 13 July, or later when the CSM itself decided to endorse the recommendation of the CSM(B).

Further observations

Professor Richard Tedder is a medical virologist and physician, and a member of the Royal College of Physicians. His early training was in zoology. In a talk he gave in Cardiff in 1984, he is recorded as expressing the view that “in veterinary medicine, products from one country would not get through incoming Customs of another country in the way that … concentrates have come into the human market for haemophiliacs in the UK”.[1298] In his written statement he added:

“Personally, I would have recalled, prevented or very strictly controlled the use of imported commercial blood products, especially those from the USA, which were known to have a significant risk over and above the expected. If the same was to occur with a British product, then clearly recall would be appropriate. At the Middlesex, we would only have used such products if it was the only option to avoid serious harm to a patient. That was David Dane’s teaching.”[1299]

In evidence, he amplified this: the use of prisoners as blood donors, particularly in the US, meant that the producer of the product could not “know their donor”, which Professor Tedder regarded as a cardinal principle.[1300] The decision of the CSM(B) does not seem to have been one which that virologist, speaking soon after the event, would himself have taken.

At its meeting on 21-22 July 1983 the CSM endorsed the recommendations of its Biologicals Sub-Committee. There appears to have been little by way of discussion, if any.[1301]

Professor Sir Michael Rawlins was a member of the CSM and was present at the meeting. He confirmed that it would be normal for the CSM not to have the papers that had been considered by the CSM(B), because the experts in the subject sat on the Sub-Committee rather than the full committee. He would, however, have expected the CSM(B) to examine and interrogate rigorously the evidence about the factor concentrates when reaching its decision. He expressed surprise that the CSM(B) did not keep the issue under review so as to look at it again actively after July 1983. He considered that Dr Galbraith’s letter and paper should at the very least have been provided to the CSM(B).[1302]

Although Sir Joseph Smith, in his written statement to the Inquiry, described the remit of the CSM and CSM(B) in relation to advice on the response to AIDS as relating mainly to dealing with applications for product licences or variations to them in respect to blood products, he also observed that there were occasions when the CSM(B) or the CSM considered broader issues relating to the safety of blood products in the context of AIDS. This led to advice, recommendations or “remarks” being conveyed to the Medicines Division of the DHSS for consideration by the Licensing Authority.[1303]

He thought that the CSM(B)’s meeting of 13 July 1983 was the most obvious example of this.

However, the Licensing Authority itself – the ministers – were not informed of the discussion of 13 July. Lord Glenarthur, who at the time had ministerial responsibility for policy relating to blood and blood products, was “completely unaware” of the CSM(B)’s meeting and deliberations.[1304] Lord Kenneth Clarke, who at the time was Minister of State for Health, did not think he was aware of the CSM(B)’s existence. He was, however, a “little surprised” that the issue was not drawn to the attention of Lord Glenarthur.[1305]

Commentary

If the decision at this meeting had been as Dr Galbraith had proposed in May and as the Council of Europe had recommended in June, the consequence would almost certainly have been that many lives would have been spared, and many others saved from the desperation of infection by HIV when it was untreatable – and, for that matter also from the great difficulties caused by infection with what became known as Hepatitis C and to a large extent Hepatitis B.[1306]

In retrospect, the decision was almost certainly wrong. Whether it is now legitimate to see the decision as flawed at the time it was made demands further analysis.

The CSM and CSM(B) had as their principal focus the safety of the public, and in that light whether medicines were safe to import for use. By the time they met, there was a well-established risk that factor concentrates, especially from the US, were unsafe so far as hepatitis was concerned, and it was known that this was a serious disease with significant long-term consequences.[1307]

There was now also a very real risk that use of the same products as gave rise to a significant risk of hepatitis could also cause AIDS. The fact that the Sub-Committee appreciated that there was indeed a risk of AIDS,[1308] and that this risk was higher in the case of imported concentrates made in the US from large pools of purchased plasma than it was for concentrates and other products made from smaller pools from unpaid donors in the UK, is underlined by the reference[1309] to the clinicians adopting a policy towards the use of US Factor 8 which would minimise risks. This is a reference to a policy adopted by the UKHCDO for use by its members.[1310] In short, the Sub-Committee recognised that cryoprecipitate was safe[1311] (otherwise it could not have been given preferentially to those considered the most vulnerable). Though there were risks from using cryoprecipitate – if made in small batches, there might be a risk of bacterial contamination, and there was a greater risk of causing inhibitors because cryoprecipitate would contain more proteins than simply the clotting factor protein of interest – these were outbalanced by a greater risk of viral infection or[1312] AIDS arising from the use of large pool concentrates commercially manufactured.

The real choice was not between stopping importation (and thereby requiring patients with haemophilia to go without treatment) on the one hand, and continuing it (and exposing them to the risks of contracting AIDS and non-A non-B Hepatitis as a necessary adjunct to the advantages factor concentrates brought) on the other. It was a choice of therapy. Underpinning the Sub-Committee’s reasoning was an acceptance that cryoprecipitate was generally safer than concentrate.

The critical issue, therefore, in evaluating whether the decision was justified is not one primarily of safety, but whether the decision to continue the importation of factor concentrate “on the ground of supply” was reasonable. If sufficient supplies of cryoprecipitate to meet the continuing need for people with haemophilia to treat significant bleeding could be assured,[1313] the logic of the meeting’s reasoning should have led to recognition that a safer course would have been to suspend the importation of factor concentrates, and to consider carefully whether concentrates made from domestic, volunteer sources of plasma (“NHS concentrates”) should continue to be used to the extent they were.[1314]

The balance which the Sub-Committee said it was drawing was between the risks of AIDS and other infections being transmitted by clotting concentrates and the risks of ceasing to use them. It said that “in the case of haemophilia they are life-saving”.[1315] But as has just been pointed out, so too was the use of cryoprecipitate, and cryoprecipitate was safer. The language used in the minutes appears aimed at persuading rather than at conveying a balanced picture.

In saying also that the evidence suggested that “the risk [of AIDS] was small”,[1316] it is unclear what approach the Sub-Committee was taking to risk. The magnitude of risk is a combination of the likelihood of it eventuating, coupled with its severity if it does. A small incidence of a fatal outcome is of high magnitude, and all the more so where any disease leading to death is incurable. In 1983 AIDS could not be treated. To say the risk was small was most probably to refer to an estimate of its likely incidence: the incidence would have to be very small indeed to be of a low enough magnitude to justify a decision which would expose some patients to inevitable death. The Sub-Committee must have thought there was only a very small chance that the infectious agent they thought probable would actually cause an infection, or it could not properly have decided as it did. It is, however, difficult to see a proper basis for this.
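
Put schematically – and with a figure that is an illustrative assumption only, not one drawn from the evidence – the point is that:

$$\text{magnitude of risk} \;\approx\; \text{likelihood of the harm occurring} \;\times\; \text{severity of the harm if it does.}$$

On that footing, even a likelihood as low as, say, 1 in 1,000 of contracting an untreatable and usually fatal disease is a risk of far greater magnitude than a near-certain but trivial harm; the likelihood would have to be smaller by orders of magnitude again before the overall magnitude could properly be described as small.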

Professor Bloom had apparently earlier assessed the size of the risk by reference to the number of cases of AIDS which had developed in the UK and Europe amongst people with haemophilia.[1317] Given the way he expressed himself in his letter to the Haemophilia Society about risks, he was committed to a view both that this was how to evaluate risk and that the risk was very small indeed. It is to be inferred that his contribution to the discussion at the Sub-Committee would have echoed this position. He was an influential figure, later to be described by a pharmaceutical company’s UK marketing manager as an “opinion former”.[1318]

Yet, a memo of 13 July 1983 from Dr Peter Foster to John Watt referred to the assertion (by others, but reflecting Professor Bloom’s expressed view) that the risk of contracting AIDS was one in one million. Dr Foster considered that this understated the risk by a factor of 100.[1319] Even this, revised by a factor of 100, does not display the caution necessary when dealing with a new viral infection.

Risk is to be assessed not by past experience of what has happened, but by an holistic assessment of what can potentially be foreseen as happening in the future, taking all the evidence into account. A reasonable worst case scenario should be the guide. Since it had been noted[1320] that it might take six months to two years for the first symptoms of AIDS to emerge (and some time after that for it to be confirmed as a case of the syndrome), since Dr Fowler had observed that the apparent long latency period meant that the seeds had been sown for disease in the future, and since Dr Evatt had confirmed that 50% of people with haemophilia in a study in the state of Georgia already had T-cell abnormalities and 13% were markedly abnormal,[1321] it is clear there was contemporaneously at least one reasonable view of what scenario might follow. Dr Evatt’s figures suggested that 13 in 100 people with haemophilia looked to be on the threshold of AIDS, and the 50% figure and the epidemic history of the disease showed that the figure of 13 in 100 was likely to increase.[1322]

Dr Evatt had told Professor Bloom in March, in a clinically factual letter, that “the incidence rate has been increasing in haemophiliacs, and the epidemic curve paralays [sic] that of the total epidemic curve”.[1323] That curve showed that from small numbers initially the total figures had reached 1,150 cases in the US by March, 40% of which had come in the previous four months, each month showing an increasing rate of incidence. The implication was chilling. Yet there is no evidence that Dr Evatt’s letter was mentioned at all to the Sub-Committee.

Moreover, Professor Bloom’s approach was a scientifically inept way to evaluate the risk given that it was known that AIDS had a latency period following infection. It focused on the wrong figure. The number of reported cases was bound to be only a fraction of the numbers of those already infected but not yet symptomatic. Once it was postulated that the cause was a virus transmitted by blood or blood products, the appropriate question was not one which focused on the number of infections which had already become apparent, but one which focused on the risk that the plasma pool from which a batch was made had been contaminated.[1324] If even only one contaminated donation had been made to the pool, there was no scientific basis for assuming that its effect would be diluted by the other donations.[1325]

To assess the risk, the chance that any single donation going into a pool was contaminated needed first to be assessed; that needed to be multiplied by the number of donations making up the pool; and the risk to people with haemophilia who had concentrates from multiple similar batches then had to be assessed in the light of the knowledge that they ran that risk as many times over as they had had batches. A risk assessed in this way was an order of magnitude greater than that which was assessed simply by reference to how many confirmed[1326] cases had already been reported. Focus needed to be on what was potentially in the pipeline, not what had already been seen to come out of the tap.
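
A minimal sketch of that calculation, using purely illustrative figures (the per-donation contamination probability $p$, the pool size $N$ and the number of batches $B$ are assumptions made for the purposes of the sketch, not figures drawn from the evidence), shows why counting reported cases understated the danger:

$$q \;=\; P(\text{pool contaminated}) \;=\; 1-(1-p)^{N}, \qquad P(\text{patient exposed over } B \text{ batches}) \;=\; 1-(1-q)^{B}.$$

On wholly illustrative values of $p = 1/100{,}000$ per donation, $N = 20{,}000$ donations per pool and $B = 10$ batches in a year, $q$ is roughly $0.18$ and the chance of a patient being exposed at least once is roughly $0.86$: a per-donation risk that looks negligible becomes close to a certainty for a heavily treated patient, long before reported cases could reflect it.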

Professor Bloom’s approach, as expressed in his letter to the Haemophilia Society, and the concise nature of the minutes of that Sub-Committee (which is apt to disguise almost as much as it reveals), suggest that his misdirected views[1327] as to the nature of the risk and the way it should lead to inaction may well have carried the day. If, as seems likely, the Sub-Committee was also aware that the Haemophilia Society was firm that it did not wish there to be a ban on the importation of US factor concentrate,[1328] this too must have had some influence. The members of the Sub-Committee may not have taken on board the fact that the Society’s view was in truth not a wholly independent view, but rather reflected the advice which it had sought from Professor Bloom,[1329] and he had given. If a view is expressed by one person to another it is seen as the view of just that one person. If, however, that other then relies on the view and expresses it to a third person, that third person may well conclude that two authoritative sources separately support the view, since that is how it will appear. And so views may spread, and gain acceptance, purely on the basis of apparent numerical support rather than merit.

Too cavalier a view was taken of the risks. Whilst there may still have been uncertainty about the precise scale of the risks, the reality of the risks was not properly appreciated. Dr Smith’s proposed conclusion as to whether an infectious agent was a cause was watered down, possibly because of the doubts expressed by Dr Fowler in what Dr Smith incorrectly took to be the authoritative view of the DHSS, coupled with Professor Bloom’s personal approach to risk. The analysis of risk was flawed. This is demonstrated within the minutes themselves by the approach the Sub-Committee took to further reliance on the purchase of commercial product from the US. In the corresponding minute, in apparent explanation of why the Sub-Committee thought the level of risk did not justify withdrawing US preparations from the UK, the view is expressed that the efforts being made to secure independence of foreign suppliers of concentrates “should reduce markedly, although not eliminate, the risks to recipients of these products”.[1330] This must have assumed that a long-term solution could be an answer to a challenge likely to materialise, if at all, within the short term. It simply was no answer to such a risk.

A risk of infection is just that – a risk. But if the event which was risked was actually to happen – especially where it is realistic to suppose it might, as was plainly there to be seen on 13 July 1983 so far as AIDS was concerned – the nature of the supposed risk must be clearly kept in mind in determining what may reduce it. The nature of the risk of AIDS, which it was well justified to fear for the reasons given above, was not of a slowly evolving, gradual increase in risk. It was of the rapid escalation which Dr Evatt’s figures implied. Its progression was likely to be epidemic. There would be no reason to feel sure that the UK would not copy the US in this. Where there is a long incubation period after first infection before symptoms become apparent, an increase in the appearance of those symptoms will evolve only after the period is over, but then do so at an epidemic rate – with the unsettling knowledge that it is now too late to avoid those infections which have increasingly, in the meantime, been incurred. Dr Evatt’s figures show, in short, that there was no time for long-term solutions if indeed the cause was a virus.

Dr Smith mistakenly understood – he thinks now from what Dr Lane and the DHSS representatives were saying – that self-sufficiency was expected soon, and would be achieved “within a period of months”.[1331] Indeed after only a couple of months had passed he asked the DHSS if it had by then been achieved. However, there could have been no realistic hope that “efforts … to secure independence of foreign suppliers”[1332] (“self-sufficiency” in other words) could be achieved in the immediate future. Dr Owen told Parliament in 1975 that the Government would fund national self-sufficiency in blood and blood products. By the time the Committee on Safety of Medicines met in July 1983, eight to nine years had passed. Yet England and Wales had failed to achieve what he had promised would occur. No explanation as to why this was so had been advanced to Parliament. The redevelopment of BPL was underway by July 1983 but was a long-term project. At the earliest, it might be able to increase production of concentrate derived from voluntary unremunerated donors in the UK, made from smaller pools, sometime in 1986.

It would seem therefore that a central figure on the Committee was labouring under a misconception. It was one which was highly material. If it had been right, then after a couple of months or so there would have been no need to continue to import foreign concentrates; the supply issues would have been solved. It is right to note that this (mistaken) view is not, as such, recorded in the (economically worded) minutes.[1333]

As it stands in the minutes, the reasoning is poor. It is an unacceptably casual approach to regard a long-term solution as an answer to a risk of immediate harm, however slowly signs of that harm might begin to emerge. The development of self-sufficiency might provide an alternative source of supply to that provided by importing products, but it could not be an answer to the risk of imported commercial concentrates causing AIDS: that risk would long since have either materialised or been shown to be a misplaced worry.

The appropriate response to what appears on the horizon as a threatening hurricane is to batten down the hatches, and if need be evacuate the coast, rather than leave such steps till later, even though the hurricane may change course and the risk evaporate. Letting it make landfall is leaving it too late.

In summary:

  1. The meeting was unusual in having suggested conclusions on the very issues to be discussed, which probably led the discussion towards drawing conclusions of that sort.
  2. The chair probably discussed these suggested conclusions beforehand with Dr Fowler, Professor Bloom and Dr Lane, each of whom had their own distinctive views. Neither Dr Fowler’s nor Professor Bloom’s stands up to logical analysis; Dr Lane’s inevitably reflected his perspective as a director of BPL, manufacturing blood products, and thus concerned with future supplies of plasma if much of it was diverted to local production of cryoprecipitate.
  3. Its approach to risk conflated incidence with risk.
  4. It regarded US factor concentrates as “life-saving” (true, in some cases – but so also were cryoprecipitate and NHS concentrates, and this was not acknowledged as a balanced view required).
  5. It did not recognise what the Inquiry knows, that it would not have been difficult to ensure adequate supplies of cryoprecipitate from regional transfusion centres for treatment.[1334] Nor did it take adequate steps to investigate the potential level of supply (despite the minutes reading that “supply” was a – indeed, was the – critical factor).
  6. As Dr Walford acknowledged, there were two non-sequiturs in its reasoning.
  7. The chair had a misguided view of the likelihood of self-sufficiency being achieved in the very near future.
  8. The meeting did not have a number of important documents before it: Dr Galbraith’s letter of 9 May; the Council of Europe recommendations; Dr Evatt’s letter to Professor Bloom – it only had Dr Fowler’s report, and an agenda with suggested conclusions.
  9. It seems likely that Professor Bloom’s views were overly influential.
  10. The CSM(B) appear to have relied on the assertion (presumably by Professor Bloom) that UKHCDO had adopted a policy for use of commercial Factor 8 “in order to minimise risks as far as possible”.[1335] There is no evidence that CSM(B) actually saw, or asked to see, the policy itself, and such a policy could not be described on any view as minimising risks as far as possible. Insofar as the CSM(B) assumed that in practice clinicians had a treatment policy that minimised risks as far as was possible, that assumption was simply fallacious.

Accordingly, I have concluded that the decision was on any view flawed in the way in which it was taken and in the reasoning that was advanced. There is no doubt – not only in retrospect – that it was also wrong. It was not universally accepted at the time.[1336] Accepting there are dangers of views reached in hindsight, it seems clear that if paramount consideration were to be given to patient safety it should have resulted in a stay on further imports of commercial concentrate, coupled with an increase in local supplies of cryoprecipitate, whilst reserving NHS concentrates for those occasions when their use was truly life-saving, appropriate to provide cover for elective surgery, or used for patients only after careful consideration between clinician and patient as to the balance of risk and remedy.[1337]

It is not asking too much of the Sub-Committee, nor relying too much on retrospectivity, to have reached this conclusion. I am fortified in this by a telling exchange between Counsel to the Inquiry and a member of the CSM (but not of the Sub-Committee) which on 22 July accepted the decision of the CSM(B) to make no recommendation about the continued importation of commercial factor concentrates. That member, Professor Sir Michael Rawlins, commented in oral evidence to this Inquiry: “Well, I think if we’d known then what we know about AIDS, that it’s caused by a virus … because at that point it was suspected it was an infectious agent. If we’d known it was caused by a virus, and if we’d known how frequently pool donors in the US … had the virus, then I think we would have done something different. At least I hope we would have done.”[1338] This was the evidence of someone who knew the difficulties and demands of regulation inside out: he plainly did not think it too much to have expected the CSM(B), and in turn the CSM, to have reached a different conclusion if those two matters had been understood.

The first of these – the idea that the cause was a virus – was already regarded as approaching certainty: it was clearly put in Dr Galbraith’s letter (which Professor Sir Michael Rawlins said he would have expected to be put before the CSM(B)), and the second – how frequently pools might contain contaminated donations – was also as good as stated by Dr Galbraith when he spoke of intravenous drug users and male homosexuals being frequent donors. If this second point had not been fully appreciated before the meeting of 19 July 1983 in the US at which the representative of the Pharmaceutical Manufacturers’ Association, Dr Michael Rodell, spoke, it was after that so dramatically clear that it could not be ignored:[1339] that, on average, persons who were paid for their plasma had it collected 40 to 60 times per year. Dr Rodell’s public presentation suggested that at that rate, and given the pool sizes used in the US, as few as four infected persons could contaminate the entire world supply of Factor 8 concentrate (his analysis was confined to concentrates made from US plasma).[1340]

“Dumping” of riskier product in the UK

The logical corollary of the FDA’s formal recommendation to pharmaceutical companies of 24 March 1983 that they should not fractionate plasma collected from “the four Hs” was that products produced from “pre-March plasma” should not be supplied, and those already in the supply chain should be withdrawn, on the grounds that supplying them would give rise to an increased risk of transmitting AIDS. Matters in the UK came to a head in early May.

In early May, as recounted above, there was press speculation about whether the Government might decide to ban further imports of commercial concentrates. On 9 May Dr Galbraith wrote his letter advising just this. Coincidentally Travenol wrote on the same day to Dr Walford advising her that “well before” the FDA’s 24 March recommendation, steps had been taken by Hyland to introduce donor screening procedures designed to eliminate high-risk donors.[1341] On the other hand, on the same date the Haemophilia Society planned a meeting with the junior minister, Geoffrey Finsberg, to seek his assurance that there would be no immediate ban on the importation of US blood products.[1342] There were thus voices pushing in the opposite direction to Dr Galbraith and the media.

Professor Bloom was alert to the risk that fractionators in the US and elsewhere who used US-sourced plasma might have stocks of concentrate prepared from plasma collected before 24 March, and wrote to Armour to express the worry that if such material was difficult to sell in the US it might be preferentially exported. He sought reassurances from Armour that this would not happen with their products.[1343]

Dr Walford[1344] took the similar view that there were likely to be “large stocks” of Factor 8 concentrates in the US prepared before the 24 March guidelines came into force. She considered it possible that concentrates made from the “safer” plasma might be retained for use in the US while these older stocks were dumped on export markets such as the UK. She asked if there was any way – perhaps by means of new labelling requirements – to prevent this.[1345] Dr Ronald Oliver[1346] responded by suggesting a meeting of officials to coordinate “our activities” so that “we safeguard our own supply position, and if possible obtain Factor VIII from the safest available sources.” He said “Ideally[1347] I suppose we would like to see any imported Factor VIII which is derived from American material to be manufactured after 24 March 1983.”[1348]

It was agreed (by Dr Fowler) that there was a serious possibility that US manufacturers “may try to ‘dump’ pre 24 March 1983 material on the UK market.” He thought, however, that pharmaceutical companies probably had “large supplies of Factor VIII in store waiting for batch clearance by NIBSC and it is very unlikely that material was made solely from post 24 March 1983 plasma.” He nonetheless suggested that “stop orders” might be used; and saw no reason why stating the date of the collection of the source plasma on the labels attached to the products should not be a requirement if it were thought desirable.[1349]

When the question whether new legal restrictions could be introduced to prevent “dumping” was discussed in a DHSS meeting on 3 June 1983, Zoe Spencer[1350] said this would “present significant practical difficulties” and suggested that informal discussions with the companies concerned were more likely to lead to successful control.[1351] Pharmaceutical companies were to be asked to confirm that all future supplies of coagulation factor concentrates to be sold in the UK would be manufactured in accordance with the FDA “Directive”.[1352]

Each firm contacted gave the assurance at the end of June 1983 that “future sales” would comply with FDA guidelines. However, two (Miles[1353] and Immuno) indicated that Factor 8 concentrates manufactured from plasma collected since March 1983 would not be available until August and September, respectively.[1354]

There remained uncertainty within the DHSS as to the expression “future sales.” It seemed vague. It spoke of sales rather than supplies. Dr Oliver’s minute noted that “in some instances we are still left with the backlog of uncertain plasma or of Factor VIII.”[1355]

About this time,[1356] an information sheet about AIDS prepared within the DHSS was passed to Ministers – Lord Glenarthur and John Patten.[1357] It was said that “The Department’s Medicines and Supply Divisions are endeavouring to ensure that there will be no dumping of high-risk plasma products on the UK market and are seeking various assurances from the manufacturers in relation to the quality of their products.”[1358]

Accordingly, when Lord Glenarthur’s office asked for a “fairly full draft letter” to respond to a parliamentary question asked by Baroness Masham, the draft response from the DHSS on 26 July 1983[1359] included the note for his consideration that “we have confirmed with American manufacturers that future supplies of Factor VIII for this country will be manufactured only from plasma collected in accordance with US Food and Drug Administration Regulations introduced in March this year. These were designed to exclude from plasma donation, donors from high risk groups.” This draft – using the word “supplies” rather than “sales”[1360] – was in accordance with the only recommendation which the CSM(B) had made on 13 July.

However, matters then took a rather different turn. This was in the light of developments in the US. On 19 July 1983, the Blood Products Advisory Committee of the FDA considered what to do about products which did not comply with the recommendations it had made to pharmaceutical companies back in March. Since the logical position was that public safety was served by these being observed – or, put the other way round, was at risk if they were not – it might be thought that the FDA would recommend, or require[1361] that product manufactured from plasma obtained before 24 March should be recalled. It did not do so. Instead, the Advisory Committee concluded that:

“a balance must be struck between theoretical risk of the product to recipients against the need for an uninterrupted supply of a life-sustaining therapy … it would be undesirable[1362] to distribute and use a lot of product which incorporated plasma from a donor with a definite diagnosis of AIDS. However, signs and symptoms suggestive of AIDS … would not be persuasive enough to dictate a recall of product … The consensus of the Committee was that the action to be taken for each incident of inclusion of plasma from a donor who might have AIDS into a product pool should be decided on a case-by-case basis.”[1363]

The FDA accepted this advice, deciding that the working policy of the Office of Biologics would be to evaluate the desirability of a recall on a case-by-case basis whenever a donor was found to have AIDS or strongly suspected to have AIDS, taking into account its judgement on the accuracy of the diagnosis, the timing of the occurrence of symptoms in relation to the time of donation, and the impact of a recall based on this.[1364]

It may seem surprising, now, to UK eyes that the FDA had thus not gone so far as to prohibit the continuing distribution of products made from pools to which someone who had AIDS, and had been confirmed to have AIDS, had contributed. It was later to be criticised authoritatively in the US for having failed to do so.[1365]

The result of the FDA meeting was reported by Dr Fowler in a memo of 28 July 1983.[1366]

On 2 August 1983 Mr Charles Wrigglesworth sent an internal memo to Dr Walford, attaching a minute of the US meeting. It asked for Dr Walford’s views on whether further action, if any, was now required in relation to the stocks of pre-March 1983 material; he suggested there was a conflict between the result of the FDA meeting and the recommendation of the CSM(B) of 13 July.[1367]

Dr Fowler was shown a copy of the minute, and took issue with the idea there was a conflict. He argued that there was none:

“Although the first sentence of CSM(B) recommendation (5) states the ideal situation, that only those products made from ‘post-March 1983’ plasma should be used in the UK, it goes on to explain why such a step is impractical because of the effect it would have on essential supplies. Although it is not specifically stated that ‘pre-March 83’ material should be used until adequate supplies of ‘post-March 83’ material are available, this is clearly implied in the full text as being the only practical approach to this difficult problem. The CSM(B) and FDA would thus seem to be in accord on this matter.”[1368]

Dr Walford’s response on 3 August posed a practical question: in effect, was the product to be supplied to the UK (or at least a substantial part of it) just as safe as it would be if it had been made from plasma collected after March 1983 because (as three US companies had asserted) measures at least as good as those requested by the FDA had already been put in place by them well before that date? If so, then the UK “need have no qualms”. However, she too did not feel there was a conflict, because she thought the CSM(B) had “carefully worded [the] recommendation” so as to express the view that “assured supply of material must take precedence over [the] implementation [of its recommendation]”.[1369]

Armour were quickly alert to the decision by a “small majority” at a DHSS meeting that included Professor Bloom that the DHSS would permit the use within the UK of factor concentrates produced from pre-March plasma – an internal memo from the UK representative to the parent company suggested it was “vitally important” that Armour sell the product as quickly as possible.[1370]

It transpired before the end of August that the DHSS only had details of precautions introduced by one pharmaceutical company prior to the FDA recommendations in March: it had not pursued information from the others. Mr Wrigglesworth noted that some of the “old” stock was in the UK and that some was still in the US but destined for the UK market. Nonetheless he queried whether any further action was required, given that a US Congressional hearing had accepted the outcome of the FDA meeting.[1371]

The position was thus no longer that there would be “no dumping”. Lord Glenarthur advised the Haemophilia Society: “Although future supplies of Factor VIII both for export and for use in America will be manufactured from plasma collected in accordance with these Regulations [a reference to the FDA recommendations of March], there is still a quantity of stock, some already in the UK and more in America awaiting shipment here, which has been made from pre-March plasma. The FDA has recently decided not to ban the use of similar stocks intended for the USA market because to do so would cause a crisis of supply. The same considerations apply here.”[1372]

He expressed this view to Clive Jenkins, general secretary of the Association of Scientific Technical and Managerial Staffs (“ASTMS”), in a letter[1373] which nonetheless began by saying “there is no conclusive evidence that AIDS is transmitted through blood products. Nevertheless we are taking all practicable measures to reduce any possible risks to recipients of blood and blood products”. There is plainly a tension between what he was saying in the letter would happen – that stock manufactured from plasma considered in the US to be a less safe source material would be allowed freely into the UK – and the passage which is highlighted.[1374] Moreover, it was being contemplated that, to accompany a leaflet which was under preparation by the DHSS to advise high-risk groups against donation in the UK, it should be said: “Meanwhile, as a safety precaution, the US Food and Drug Administration have introduced special requirements for plasma collection which are designed to exclude … plasma donation donors from high-risk groups.”[1375]

Nothing such as “but products already manufactured from such donors will continue to be imported, distributed and used in the UK for a while” was suggested as a qualification. This should have been made clear in the draft press statement.[1376]

Lord Glenarthur was asked in evidence about the draft of this letter prepared for him by civil servants. One exchange (of many[1377]) is particularly revealing:

“Q. I’m just asking about, as it were, the draftsman, thinking it appropriate to put in, in your mouth -- and I know you looked at the letter and approved it, but you did so partially on trust -- saying, ‘We’re doing everything we possibly can’ and then saying, ‘Well, actually, we’re not in this respect because we’re having to accept ‘dodgier stuff’ because there’s no alternative’.

A. Yes, that’s correct.”[1378]

Commentary

There was a marked change in the approach of those responsible for regulating the importation of blood products between the end of March 1983 and the end of August. Within five months the approach had changed from, in effect, “There shall be no dumping here” to “There shall be no restriction on dumping here.”

The reason for not wanting dumping was the worry that pre-March products would pose a greater risk of AIDS to the patient receiving them than post-March products. It is obvious that the FDA’s recommendation that firms should not accept donations of plasma from those in groups at high risk of AIDS was designed to reduce the risk of AIDS as far as practicable by making the sources of plasma safer.

It is thus equally obvious that to impose no restriction on the continued import of products made from such plasma sources put the British public at some additional risk.

Why, then, was there a U-turn? The reasons for it given by Dr Walford and Dr Fowler[1379] relate to a textual interpretation of Dr Smith’s draft of the minutes of the CSM(B) but there is no record of his having been asked if, when he recommended that the use of pre-March plasma be avoided, he was actually countenancing that its supply should continue for as long as there were stockpiles of previously manufactured products.[1380] If he had been asked, as he should have been, there might have been a clearer answer. From what he has subsequently said – that he expected self-sufficiency within a few months – it is difficult to think he would have said this, for there would then be no need to use commercial products, whenever produced.

Their grounds were simply that supply would be insufficient without reliance on the riskier product. The Inquiry has found no convincing evidence that this was the case;[1381] nor that there was detailed consideration of how much “post-March” plasma was available to augment the mix of NHS concentrate and cryoprecipitate which was available for treatment; nor any step “in between” which could have led to discussions between clinician and patient, such as obliging pharmaceutical companies to label, clearly, the date of the collection of the source plasma.[1382] It was however noted (when looking at August stock levels) that all of Miles’ pre-March 1983 stock and around 70% of the Alpha and Armour stock had been “collected in accordance with companies’ special precautions”.[1383]

Even if the importation of all commercial blood products should not have been suspended for a period, the decision to allow continued importation of pre-March 1983 plasma products is difficult to justify on the evidence available to the Inquiry.

There are two troubling aspects of the chronology of events set out above. The first is this. Since the FDA had decided not to ban products to which those with AIDS might have contributed, it seems to have been thought that that justified a similar response in the UK. It did not. Further, if any close attention had been paid to what was said at what was a public session in the US on 19 July 1983, the magnitude of the problem would have become apparent. Dr Rodell, a vice president (regulatory and technical affairs) of Armour, said that, on average, persons who were paid for their plasma had it collected 40 to 60 times per year.[1384] His presentation suggested that at that rate, and given the pool sizes used in the US, as few as four infected persons could contaminate the entire world supply of Factor 8 concentrate (his analysis was confined to concentrates made from US plasma).[1385]

This may sound, now, like an elegant plea to ban the products, or at least to ensure the strictest of precautions in manufacture, and measures such as reducing reliance upon regular paid donors and reducing pool sizes.

The context is different. He was arguing the pharmaceutical company position against recall of any infected batch, not simply giving an estimate which would undercut the entire commercial industry. His point was that if a donor was subsequently found to have AIDS, and the product was recalled, the likelihood was that his donations “could easily be represented in as many as 50 plasma pools in one year … 25 to 250 million AHF activity units could be affected, all in various stages of pooling, production and distribution.”[1386]

The comment was made that given the pharmaceutical manufacturers’ estimate of 800 million AHF activity units produced annually by the fractionation industry, the potential for serious disruption of AHF supply described by Dr Rodell “seems quite real”. In other words the need to continue supplying a product trumped any question of whether it was safe or (in effect) a poison.
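
The arithmetic underlying this claim can be checked against the figures quoted above. The sketch below is purely illustrative: it uses only the reported range of 25 to 250 million units per donor per year, the manufacturers’ estimate of 800 million units produced annually, and the suggestion of four infected donors; no other figures from the record are assumed.

```python
# Illustrative arithmetic only, using the figures quoted from Dr Rodell's
# presentation and the manufacturers' production estimate; no other data
# from the record are assumed.

units_affected_per_donor_low = 25_000_000    # AHF units one donor's collections
units_affected_per_donor_high = 250_000_000  # might reach across pools in a year
annual_industry_output = 800_000_000         # estimated annual AHF units produced
infected_donors = 4                          # "as few as four infected persons"

low = infected_donors * units_affected_per_donor_low
high = infected_donors * units_affected_per_donor_high

print(f"Four donors could touch between {low:,} and {high:,} units")
print(f"Estimated annual output: {annual_industry_output:,} units")
# At the upper end of the quoted range, 4 x 250 million = 1,000 million units,
# exceeding the estimated 800 million produced annually: hence the suggestion
# that as few as four infected persons could contaminate the entire supply.
```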

Quite simply, the presentation suggested that the methods of production used relied on pools which were so big that the products could not be recalled – the system was in effect too big to allow it to fail.

Though it is clear that Lord Glenarthur had been made aware that stocks of product made from pre-March plasma would not be banned in the UK, on supply grounds, and said he considered the decision justified,[1387] it is unclear that he was alerted to the basis of the decision in the US for not requiring recall of products to which a person with signs of AIDS had contributed. A decision to expose those British citizens who might be prescribed blood products to the kind of risk implied by that decision justified specific ministerial consideration. There is evidence that the DHSS knew of the decision by the US authorities (it was, after all, a basis for their accepting pre-March plasma products after having first decided they should not do so).[1388] It is unclear whether the DHSS was told of the detail of the public discussion which had led to it.[1389] On the one hand it would be surprising if an approach as disturbing as that suggested by Dr Rodell was not picked up on: it should have been, given the alarming implications, and Dr Fowler seems to have known enough of the discussion to have regarded it as “well aired”. But on the other there is no contemporaneous internal documentation of which the Inquiry is aware which specifically mentions it.[1390] The DHSS as a whole may have fallen short in this, though on the available evidence the Minister himself is not to blame. He could, and should, have been told of the point (if it had been realised within the DHSS), but could not reasonably be expected to discover it for himself.

The second troubling aspect is that although the increased risk posed by “pre-March” plasma products was appreciated, and was now to be taken on supply grounds, nothing was done to try to reduce this risk or help clinicians and patients to identify the additional risk to which they might be exposed. The possibility of imposing a requirement to state the date of collection of the source plasma was contemplated by the DHSS[1391] but this remained only a suggestion. It did not result in the DHSS doing anything. Whether this was because of the “practical difficulties” referred to by Zoe Spencer[1392] or otherwise is unclear. If the DHSS was not going to take action to prevent the importation of pre 24 March products, the least it could do was to take steps to mitigate the additional risk which it was permitting by:

  1. going down the route of a labelling requirement as contemplated by Dr Fowler; and/or
  2. making clear to haemophilia centre directors/UKHCDO the fact that pre-March 1983 products would continue to be available on the market for a further period, thus enabling haemophilia centre directors to take informed decisions about whether to use such products or not and/or to provide information to patients and the Haemophilia Society.[1393]

Although Lord Glenarthur wrote letters about “pre-March plasma products” to both the Haemophilia Society and Mr Jenkins of ASTMS, there is no evidence that information about the additional risks was circulated more widely.

Follow-up to the July decision

It is impracticable for a regulatory body constantly to review all its previous decisions, and whether acting in accordance with them remains appropriate. However, some decisions call out for active review. In particular, where a decision is taken against a background of uncertainty, and events are moving fast, such review is appropriate.

The CSM(B) recognised that its view of risk at the meeting of 13 July 1983 might well evolve. It recognised that knowledge of the cause of AIDS was yet to be firmly established. The issue was not necessarily one of banning the import of commercial concentrates for all time, but rather suspending imports until the balance of risk and advantage became clearer. The members were bound to be aware that what was an epidemic in the US might well become one in Europe, calling for further measures which might involve the variation or revocation of licences. It should have known that its decision was controversial. Its task was not simply to take a decision and move on, without reviewing whether the decision remained appropriate. The decision should have been kept under active review.

The Inquiry has sought to find evidence that there was any such review. Professor Sir Michael Rawlins expressed his surprise that there did not appear to have been any.[1394]

The Inquiry has kept looking. Nothing has been found to suggest there was any active review. I conclude it is likely there was none. There should have been.

Comparison with other decisions

Is there any evidence of a different (more precautionary) approach being taken to the risk of hepatitis caused by other products during the 1970s and early 1980s which might shed light on whether the decision-making in respect of Factor 8 products was lacking or not?

There is some.

Thus, in the same year as it declined to recommend the suspension of the importation of products which might carry the cause of AIDS – 1983 – the CSM(B) declined to recommend a licence for a blood product (Immuno’s “Tisseel Kit”) on various grounds, which included that there was inadequate evidence on the hazard of the transmission of hepatitis to the patient:

“Although the use of selective 100 donor pools, as proposed by the Company, would reduce the risks of transmitted infections, there can be no assurance that infecting agents such as Non-A, Non-B hepatitis will be excluded.”[1395]

Treating a risk of infection which could not be excluded as too great to permit the importation of a product is a very different approach from looking instead for evidence that a product actually did cause a serious disease, which was the approach taken when factor concentrates were being considered.

In another case (this time, in relation to a class of enzyme) the CSM declined to recommend a licence because: “(2) there was a need to consider whether there were alternative treatments for these conditions, (3) there were serious potential adverse effects (particularly hepatitis)”.[1396] The CSM noted that inadequate evidence had been given in relation to safety in respect of the transmission of infection, especially non-A non-B Hepatitis.

These two examples stand out because they show that the CSM had refused to recommend products for licences where there was a risk of hepatitis, and actively looked for evidence as to whether there were alternative treatments.[1397] This contrasts with its approach when considering the potential suspension of the importation of factor concentrates (where there was an acknowledged risk of hepatitis as well as a risk of an infective agent which caused AIDS) and the availability of alternative treatments (where there was a treatment acknowledged to be safer, cryoprecipitate, and no evidence that there was any adequate investigation of its availability).

Licensing of heat-treated products

Towards the end of the 1970s the risks of hepatitis[1398] being transmitted by blood products began to take greater prominence for commercial companies. What seems to have focussed attention in particular amongst the US pharmaceutical companies was knowledge that Behringwerke had developed a product intended to eliminate the risk of hepatitis. It began clinical trials of this product in 1978.[1399] Hyland/Travenol learned of this in February 1979.[1400] In May 1979, they heard that the clinical trials of the Behringwerke product were almost complete, and the results seemed promising.[1401] It prompted them into action. Not only Hyland, but Cutter,[1402] Alpha and Armour all appear to have made a determined effort to inactivate their own products. More about this is said in the chapter on Viral Inactivation.

Three companies developed a heat-treated product and had sufficiently successfully conducted clinical trials by 1982[1403] to apply in that year to the FDA for a licence. This was followed by a fourth company making applications in 1983.[1404]

In March 1983 the FDA granted a licence for Hemofil-T; in January 1984 it then licensed the Armour product and the Cutter pasteurisation product, followed by Cutter’s dry heat product in February 1984.[1405]

Applications followed to the Licensing Authority in the UK in 1983 in respect of Hemofil-T and Behringwerke’s product “Haemate P”. In 1984 they were followed by an application from Armour seeking a licence for heat-treated Factorate.[1406]

Both the application in respect of Hemofil and that in respect of Factorate were refused.[1407]

So far as Hemofil-T was concerned the CSM(B) said it was unable to recommend a product licence. The grounds included that “justification should be provided for the inclusion and choice of the heat treatment step.” The CSM(B) remarked that “Promotional letters making unjustified claims on improved safety margins in respect of infection and AIDS were seen by the Sub-Committee and strongly deprecated.”[1408]

The pharmaceutical assessor[1409] noted that there was anecdotal evidence that the heat treatment step had been “included to minimise the chance of transmission of AIDS: but, this assumes AIDS is a viral mediated infection. There is no evidence to confirm this.”[1410]

The medical assessor[1411] observed “The company have applied to vary their product licence for a conventional FVIII concentrate, by the addition to the manufacturing process of a heat treatment step. No reason is given for this, but it may be assumed that they have a reason because the proposed treatment destroys about 20% of the coagulant activity yield.” However, he then suggested that the reason was to be found in promotional letters sent by the company to regional transfusion directors and specialists in haemophilia. These letters promoted heat-treated Hemofil as being less likely to transmit viral infection, “with particular reference to hepatitis B, non-A non-B hepatitis and possibly AIDS.”[1412]

Given the references – to the “unjustified claims … in respect of infection and AIDS”, the “anecdotal evidence” and the promotional letters – a reason for including a heat treatment step in the process seems clear, even if not formally articulated by the manufacturer. Yet the application was rejected.[1413]

It was another year before Hyland/Travenol reapplied for a licence for the same product. By then it had been invited to apply using an abridged application process. In the meantime Hemofil-T had been licensed for sale in West Germany, Canada, Spain, Sweden, Belgium, and Ireland as well as the US and a licence to import it had been granted in the Netherlands.[1414]

The only other application to the UK Licensing Authority in 1983 was made by Behringwerke. A licence was granted, subject to a number of conditions. One was to provide satisfactory information on the heat treatment process; another not to make claims that transmission of Hepatitis B and non-A non-B Hepatitis had been “excluded”; and a third that there should be no reference to AIDS except to warn that blood products may transmit the syndrome.[1415] Of interest in the medical assessment is the view by the assessor, Dr Fowler, that “All anti-haemophilic factor (AHF) preparations can cause hepatitis, and most haemophiliacs will sooner or later get hepatitis. Some will die of it. Clearly anything that improves this situation would be desirable but it is important to distinguish which form of hepatitis poses the greatest threat.” He went on to identify the “greatest threat” as being non-A non-B Hepatitis which “is thought to account for ninety percent of all hepatitis due to blood and blood products and to be more likely than HB [Hepatitis B] to cause chronic liver disease.” He concluded that it was probable that Hepatitis B would be less likely with the heat-treated product than with comparable untreated products.[1416]

Although the licence was granted (in February 1985, though the recommendation had been to accept it with conditions in March 1984) the product was not introduced onto the UK market before 1987 for commercial reasons.[1417]

The third application was from Armour. It was declined in July 1984 on the grounds of safety, quality and efficacy.[1418]

Thus far the Licensing Authority had rejected applications to license heat-treated factor concentrates (with the exception of Behringwerke).

The situation changed on 26 October 1984. On that date, the CDC published in the MMWR series a report on their study conducted on a heat-treated product produced by Cutter. It recorded that the preliminary evidence of the effects of heat treatment was that it reduced the potential for transmission of the AIDS virus in blood clotting factor concentrate products. It suggested that the use of non-heat-treated concentrates should be limited thereafter.[1419]

Things moved quickly. Within a month Dr Smith, chair of the CSM(B), reported that heat treatment appeared to abolish detectable infectivity of the AIDS virus.[1420] As a result, the CSM requested[1421] that the Licensing Authority invite the companies concerned to make early applications for variations to their licences to permit the distribution and sale in the UK of Factor 8 products made by processes including a dry heat-treating step.[1422]

Four days later the Licensing Authority wrote to pharmaceutical companies encouraging them to use a dry heat treatment process, and to make early (abridged) application for a new product licence for concentrates manufactured by using it.[1423]

A summary of the way heat-treated products had been dealt with by the regulator since the first application was set out in an internal Cutter memo of 30 November 1984. It read:

“AIDS has finally come to the United Kingdom with a force that has caused a virtual panic in the Department of Health. For one year this department has blocked every application for registration of heat-treated factor VIII products [apart from Behringwerke, albeit with conditions, though it did not enter the market] and now in the space of one week they are in a panic responding to the newspaper demands for action concerning the AIDS risk to hemophiliacs. The action by the Department of Health comes after the announcement in the Sunday Mail that 2 hemophiliacs have died from AIDS.”[1424]

The summary then set out five headlines from 25 November, and one from 20 November, from The Mail, The Times, The News of the World and The Observer, before continuing: “Following these headlines the Department of Health has advised Cutter that every action will be taken to grant us registration by early December.

The summary is both concise and correct so far as evidence available to the Inquiry goes. (Though the Inquiry has no confirmatory evidence of the “early December” date having been given, it has none to the contrary).

For reasons which are unclear, despite Cutter apparently having been told that a licence would be approved in December, a licence was not granted until February 1985. In February 1985, but not before, the heat-treated products of Baxter, Alpha, Armour, Cutter and Immuno were all licensed.[1425] Quite why it took this long, given the urgency of the invitation, remains unknown. However, it did not stop clinicians taking advantage of the products (technically on a named patient basis).

It was not until March 1986, amid concerns that the use of Factorate could still give rise to infections despite heat treatment, that the DHSS collected data on the ability of the products to inactivate HIV.[1426]

Commentary

It is for an applicant for a licence to set out their case for being licensed. It is not for a Licensing Authority to make that case for them. If, therefore, there was good reason for introducing a heat treatment step into the process, it was in the first place for Hyland/Travenol to state it. However, by 1983 the CSM(B) had already determined (at its meeting of 13 July 1983 considered above) that the cause of AIDS was probably viral. There was an obvious reason, known to the Committee, why a manufacturer would wish to include a heat treatment step which resulted in a lower yield of Factor 8 activity per litre of plasma. The pharmaceutical assessor quoted anecdotal evidence that the step was intended to minimise the chance of transmission of AIDS. The medical assessment noted that no reason was given, but then went on to describe the reason as being that heat treatment made the product less likely to transmit viral infection, in particular the hepatitis viruses and possibly AIDS. In itself the decision was justifiable, and not unreasonable given a lack of detail of the methods being used to kill viruses and the uncertainties of their success.[1427] However, given the material available to the Committee it would have been a simple matter[1428] for the Committee to write and ask Hyland, prior to final consideration of whether to accept or refuse the application, why the heat treatment step had been introduced.

Nonetheless, the medical assessor was also concerned that the introduction of the heat-treated step involved the loss of 20% of coagulant activity. This meant that some of the Factor 8 would be degraded, and might then have a “toxic potential”.[1429] There was a general concern that a new process might bring with it new risks to the patients receiving the product. Taken in isolation this would be a real risk.

The decision not to recommend a licence was reached in a wider context of which account could have been taken. It was made in September 1983. This was only two months after the decision had been taken by the CSM(B) on 13 July, endorsed by the CSM later that month, not to recommend suspension of the importation of factor concentrates. That should have been kept under close review. As part of the discussion before the CSM(B) there had been positive reference to the likelihood of blood products being heated to protect against AIDS, assuming AIDS to be virally caused: here was an application in respect of just such a product, it might have been thought. It was also already known by September that on “grounds of supply” untreated products coming from the US made from “pre-March plasma” were to be permitted to enter the UK market, contrary to earlier assurances that they would be excluded because they were more likely to risk AIDS. It was also clear that the number of AIDS cases amongst people in the UK who did not have haemophilia was continuing to grow exponentially. Finally, the reality of the risk of transmission of the cause of AIDS to people with haemophilia by blood products had been highlighted by the death of the first person with haemophilia in the UK. He had most probably died in consequence of receiving commercial factor concentrates.

As Professor Sir Michael Rawlins observed, safety often involves a balancing exercise. Here it was the balance between the possible, but unknown, adverse effects of heat treatment against the highly likely, though not yet certain, fact that blood products transmitted a fatal disease. The balance did not fall against Behringwerke’s product, despite the fact that the yield was far less than that claimed for Hemofil-T, so the drawing of this balance does not appear consistent even prior to the events of autumn 1984.[1430] However, once those events occurred – the endorsement by the CDC of the effectiveness of heat treatment, and adverse press comment – the balance was held not only to favour licensing but effectively to fast-track it (despite all the reservations which had led to Hyland’s application failing a year earlier).

There is an apparent inconsistency between the approach taken in late 1984 and that taken in late 1983: the explanation is, however, the impact of growing information about the viral cause of AIDS, growing concern about its effects on society and its transmission, and a growing need to stop further progression in its tracks: coupled with an appreciation after October 1984 that (dry) heat treatment was likely to be effective.

When the approach of rejecting the applications to market heat-treated products in the early 1980s (on grounds of safety) is compared with the approach in the 1970s to licensing products which were both known to transmit hepatitis, and thought likely to do so to a greater extent than domestic coagulant therapies, there is however an irony. Hemofil and Kryobulin were first licensed in 1973 despite being less safe than products already in use. The heat-treated versions were rejected on grounds of safety, despite being aimed at reducing the risks of transmitting disease.

Finally, there is no explanation why, when Cutter was assured (it appears) that the aim was to have its product available by December 1984, licences should have waited until February, at least in the cases of Hyland (for Hemofil) and Cutter (for Koate) who applied in September and November of 1984, respectively. It is not unreasonable for applications made by Alpha and Armour in January 1985 not to have been granted before February,[1431] and I am not prepared to say that it was unreasonable that there should be delay in the grant of a licence for the Kryobulin TIM 2 product in respect of which Immuno made an application in December 1984.[1432] If, however, the invitations to resubmit licence applications were an invitation to push at what was effectively an open door, then, given that the decisions on them had consequences for safety, at least the first two could have been taken faster.

Information and warnings

The Medicines Commission was responsible for directing the British Pharmacopoeia Commission which published the British Pharmacopoeia.[1433] This sets out, product by product, the standards for human medicines and formulated products.

In a publication in July 1977, entitled The Control of Medicines in the United Kingdom, it is said that “A central theme in the British approach to the regulation of the marketing of medicinal products is a conviction that controls before marketing are not sufficient. However extensive the precautionary animal work, some effects may not be detectable until a large number of patients have received the drug … Great importance is attached to the monitoring of possible adverse reactions to medicinal products.”[1434]

For that purpose, the publication noted that the CSM maintained a Register of Adverse Reactions to which confidential reports about individual patients were made on a voluntary basis by members of the professions, usually on a specially designed “Yellow Card” issued to all doctors and dentists. Other sources – the Registrar General, the pharmaceutical industry, coroners – also contributed. 80 doctors were engaged part-time to investigate reports of individual reactions at the request of the CSM; summaries of the information obtained were provided routinely to those who reported adverse reactions, and “are available for certain classes of authorised person who may enquire about them”.[1435] No information was released which would identify a patient without written permission. This “Yellow Card scheme” has become very well known, which is part of its general success in alerting the authorities, the medical profession generally, and patients to potential problems.

Other means of communicating with the professions (and later the public) were to issue leaflets about adverse reactions; to provide “Dear Dr” letters from the chair of the CSM; to produce leaflets on “Current Problems”; and to publish articles in medical journals.[1436]

The Yellow Card scheme[1437] has limited relevance to the risks of hepatitis amongst people with haemophilia, since the risk was already known to practitioners, and it is more likely to be used for reactions which are closely linked in time to the causative injection or transfusion. Chronic hepatitis, for instance, may first show itself with relatively non-specific symptoms, easily attributed to other causes, and by the time the problem is understood to be serious, and the patient’s condition is worsening, it is unlikely to trigger a Yellow Card since so long will have passed between the effect now manifesting itself and its cause several months or (more probably) years earlier. Nonetheless, so far as acute hepatitis[1438] is concerned it might suggest that certain products appeared to produce more infection than others. Similarly, AIDS symptoms were likely to arise some time after the injection/transfusion which caused them to develop, and it might not be easy for either patient or clinician immediately to have spotted that there was a possible relationship between the two which would justify an adverse reaction report. In those cases where the patients themselves suspected that there may be a link, the Yellow Card scheme was of little use to them between 1970 and 1995 because reporting was limited to medical practitioners during that period.[1439] Between 1970 and 1995 there were 144 Yellow Card reports relating to blood products,[1440] 63 of which occurred between 1977 and 1981. Approximately half did not relate to reactions involving the blood and lymphatic system, the liver, or “infections and infestations”.[1441] None related to HIV or AIDS.

There was an established process of investigating reports of serious adverse reactions to see if there was a link between effect and postulated cause, and to decide whether to notify the professions of it.[1442]

After hearing from Professor Sir Michael Rawlins, the Inquiry sought evidence of the information (if any) which was disseminated regarding the risks of factor products. The reply on behalf of the Medicines and Healthcare products Regulatory Agency (“MHRA”) states that no record exists of any communication between 1970 and 2000 in respect of the risks of factor products.[1443] There was some (limited) evidence that hepatitis was reported through the Yellow Card scheme after the use of factor products. However, there is no evidence (despite the efforts of the MHRA to trawl through potential sources) that the full investigative process which might then have followed, to understand and if necessary report the risks, actually took place. This was despite the fact that on the evidence hepatitis is a serious condition when it occurs. I am bound to conclude that there was no such investigation.

In conclusion, the CSM had as one of its functions the job of communicating particular risks of medicinal products to professionals. However, it made no such communications in respect of the risks of hepatitis or AIDS from blood products, though there was ample material which might have caused it to do so or to investigate further. There is one qualification to this. The British Pharmacopoeia did contain references to named commercial factor concentrates risking hepatitis if they were administered. However, in the years that most mattered for the protection of the public (1983-1985) nothing appeared about the risks that AIDS might result from their use.[1444]

Concluding words

So far as regulation of factor concentrates is concerned, the system failed the British public. The failures were in how the system was operated, rather than being inherent in the system itself. They were compounded by a lack of openness in the critical period of interest to the Inquiry in respect of blood products: that was almost certainly because the committees involved understood this to be required by statute. A more open approach would have gone a long way to ensuring that those who were to receive products knew clearly of the risks to their health. The central failure however was not to prioritise safety.

3.13 Self-Sufficiency

This chapter examines the long-standing government policy of self-sufficiency in blood products. It assesses the various factors which contributed to the failure to achieve self-sufficiency for England and Wales earlier, including the lack of forward planning, the delay in redeveloping BPL and the failure to arrange for plasma from England and Wales to be processed in Scotland.


Key dates

1962 “make and mend” extension to BPL.

1965 Ministry of Health and SHHD agree to build PFC, which will fractionate plasma from northern regions in England.

1967 Dr Biggs calls for the “organisation, apparatus and buildings” to enable greater production of domestic concentrates.

January 1973 Dr Maycock urges DHSS to “have constantly in mind the need to develop our own sources in the UK transfusion services.”

20 March 1973 first meeting of Expert Group on Treatment of Haemophilia sets initial aim of 250,000 donations to be used for factor concentrate.

January 1974 MRC working party endorses range of 547,540 to 720,000 donations.

24 December 1974 DHSS writes to regional administrators that achieving self-sufficiency is “of the greatest importance”.

22 January 1975 Dr Owen states the government’s commitment to self-sufficiency and announces allocation of additional finance.

January 1976 Dr Bidwell estimates need for between 970,920 and 1,213,650 donations; similar estimate advanced by Dr Biggs to Expert Group in May 1976.

August 1977 Dr Lane unwilling to enter into long term agreement to have plasma fractionated in Scotland.

December 1977 Report of Working Group on Trends in Demand for Blood Products concludes that major government investment is required.

December 1978 Roland Moyle tells Parliament self-sufficiency not yet achieved.

April 1979 Medicines Inspectorate inspects BPL and produces damning report in July.

December 1980 ministers instruct officials to begin planning for new BPL.

October 1981 90% of the blood products used in Scotland are manufactured at PFC.

11 November 1982 Treasury approves redevelopment of BPL (completed in 1987).


People

Dr Ethel Bidwell director, PFL (1967 - 1981)

Dr Rosemary Biggs director of Oxford Haemophilia Centre (1970 - 1977)

Thomas Dutton joint secretary of the Central Committee for the NBTS

Dr Richard Lane director, BPL (1978 - 1990)

Dr William d’A Maycock consultant advisor on blood transfusion to CMO and director, BPL (until 1978)

Roland Moyle Minister of State for Health (1976 - 1979)

Dr David Owen Minister of State for Health (1974 - 1976)

Dr Gerard Vaughan Minister of State for Health (1979 - 1982)

Dr Sheila Waiter joint secretary of the Central Committee for the NBTS

John Watt director, PFC (1967 - 1983)


Abbreviations

BPL Blood Products Laboratory (Elstree)

PFC Protein Fractionation Centre (Edinburgh)

PFL Plasma Fractionation Laboratory (Oxford)


Overview

The starting point is that, in overview, product made in the UK from UK-donor sources was safer[1445] than product made commercially from plasma purchased from donors. If the UK had become self-sufficient in the production of factor concentrates, NHS treatment using them would have been safer. The likelihood is that significantly lower numbers of infections with HIV would have followed. The likelihood is also that there would have been fewer infections with Hepatitis B and non-A non-B Hepatitis (Hepatitis C as we now know it).[1446] The sad fact is that despite the policy of governments ever since the end of 1974 being to achieve self-sufficiency in the production of blood products to treat haemophilia throughout the UK, this was not achieved in England and Wales until after 1990, although it was substantially achieved in Scotland at a much earlier stage. This chapter examines the causes of this, looking at each of the factors set out below in turn, except for finance which is threaded throughout the narrative.

The story this chapter tells is one of missed opportunities: a fractured and uncoordinated response to problems; a failure to plan in time for anticipated need; a disinclination to accept changes to methods of collecting and using blood; as well as an overenthusiasm amongst treating haemophilia clinicians for a newly developed treatment without significant regard to the risk, coupled with an apparent acceptance that much of that risk was inevitable. It is a story, too, which focuses on manufacturing products on a large scale in facilities which (save in Scotland from 1975/76 onwards, and in England from 1986/87) had never been designed for that purpose. Pervading it is a sense that patient safety took second place to cost, at times, and “efficiency” of production, at other times.

It is difficult to avoid the conclusion that Dr Robert Cumming, who was Dr William d’A Maycock’s Scottish counterpart overseeing the fractionation plant in Edinburgh,[1447] had it right when in 1975 (having retired) he wrote a paper on The Voluntary Blood Donor: “It is the author’s opinion that those inadequacies which exist in modern developed countries are entirely the fault of the organisation of the Service ... One of the important features of a successful voluntary blood donor system is forward planning.”[1448] He seems to have been clear-sighted about demand and how to meet it.[1449]

Domestic and commercial supply

After the National Health Service (“NHS”) began operating on 5 July 1948 the supply of blood was almost entirely from domestic sources, but the supply of blood products to patients can be traced through two separate chains: domestic and commercial.

The domestic supply has origins that can be traced back to the start of the Second World War, and to arrangements (described in the chapter on Blood and Transfusion) which were inherited when the NHS first began. Commercial supply involved blood products (mainly clotting factor concentrates) being imported for use for those who required them.

The two chains remained largely distinct, though they could overlap in some respects. Principal differences between the two were:

  1. Domestic supplies were provided by donors who gave their blood (and latterly plasma) willingly, for no reward save the satisfaction of knowing that by their altruistic act they had benefitted a stranger in need. They may well have helped to save their life. By contrast, commercial supply began not with a donation, nor with the satisfaction of saving the life of others, but with a purchase. Donors sold (not gave) their blood or plasma – more often the latter – and did so in order to benefit financially (or materially) rather than morally.
  2. Blood products made from domestic donations were manufactured without a view to financial profit;[1450] the same was not true of products from commercial manufacturers.
  3. Commercial products were manufactured under the control of others outside the jurisdiction. Though commercial manufacturing units were on occasion inspected by officials from the UK, supply was generally subject instead to a system of licensing before they could lawfully be distributed to centres within the UK for use by patients generally.[1451] For years, it was considered that because domestic blood products were manufactured by agents of the Crown, Crown immunity rendered domestic products free of similar controls.[1452]
  4. Domestic blood products were not marketed actively, with a view to increasing their market penetration, by contrast with commercial products.[1453]
  5. The risks of disease transmissible by infection through the domestic blood supply were dependent on the underlying rates of infection in the domestic population generally, and in particular amongst those in the domestic population who chose to be donors (which so far as both hepatitis and later the viral cause of AIDS were concerned were thought to be lower in the UK than in the US). Blood or plasma used as the raw material from which blood products were manufactured in the UK was thus inherently less likely than were commercial products to carry hepatitis, or the viral cause of AIDS, or transmit any other than “local” infections. This was because blood products manufactured commercially were made from plasma sourced from outside the UK, from populations where these infections were more prevalent. Indeed, at some stages some commercial products manufactured in the US were not even made entirely from plasma sourced within the US, but from other regions where the nature and levels of infections amongst donors were almost entirely unknown to purchasers of the products, and may only sketchily have been appreciated by the commercial producers themselves.[1454] Commercial products manufactured in Vienna were also not necessarily made from Austrian or even European-sourced plasma.[1455]
  6. The greater the number of donations made to a pool of plasma, the greater the chance that one or more would infect the whole pool.[1456] Commercial products were made from pools which were very considerably larger than those customary in the UK until the late 1970s, and then remained generally larger than those used to manufacture domestic products, though much of the advantage of smaller-sized pools was lost as pools in the UK grew ever larger over time. See the chapter on Pool Sizes.

Introduction: leading up to 1973

Until the mid 1960s there was no easily available or easily usable means of remedying a shortage of clotting factor in a person’s bloodstream. Cohn fractionation offered a way of separating blood into its constituent parts, and opened the possibility of using that part which was responsible for helping blood to clot without the far greater volume of blood from which it was separated.[1457]

Early attempts to adapt the methods of Dr Edwin Cohn to producing large quantities of clotting factor concentrates were hampered by technical difficulties, until the mid 1950s, when these were overcome by researchers working independently in England, Sweden, France and Scotland.[1458] A product information sheet for the Scottish achievement dated 27 April 1954 recorded that it had ten times as much Factor 8 activity as fresh frozen plasma (“FFP”), and that “A similar fraction is prepared by ether fractionation (Kekwick).”[1459] This was a reference to Dr Ralph Kekwick and Dr Peter Wolf’s work at the Lister Institute (at the Blood Products Laboratory (“BPL”), in Elstree).[1460] It was freeze-dried. Their product was supplied in small quantities to Lewisham and Hammersmith hospitals in London and to the Radcliffe in Oxford. By 1959, 75 patients had been treated.[1461] The UK had thus become an international leader in the field as early as 1957, and at that stage technologically ahead of the US.

There were three blood product manufacturing units in the UK: in Elstree, which became BPL; in Oxford, the Plasma Fractionation Laboratory (“PFL”); and in central Edinburgh (eventually in Liberton, a suburb), the Protein Fractionation Centre (“PFC”).

BPL was a post-war 1954 building. It had been planned initially as a civil defence project to prepare freeze-dried ultraviolet light irradiated large pool plasma. In the course of planning, this goal was abandoned in order to return to freeze-dried ten-donor small pool plasma.[1462] Fractionation of plasma was not a primary objective. As planned, the building had no space to accommodate this. When he retired as the director of BPL in 1978, Dr Maycock described how it had been extended in a “make and mend operation” in 1962.[1463] It was not until 1965 that planning for a 1972 extension, which then followed, contemplated accommodating the fractionation of plasma on any scale. This was a necessary step in producing clotting factor concentrates, though the main reason for the expansion at the time was to produce more immunoglobulin to prevent rubella in pregnancy, following an outbreak.[1464] There was also a slowly rising demand for albumin.[1465]

Dr Maycock described how even then the production facilities were not designed for the task nor large enough. He thought it generally took at least four to five years from the planning stage for a production facility to come on stream.[1466] Given his view, which he expressed with some force in the final report of BPL (for which he was responsible as its director), it is probable that he would have expressed it to the Chief Medical Officer (“CMO”)[1467] of the day. He is likely to have pressed the Government to provide suitable production facilities in place of the one he ran. There are signs that he did so throughout the early 1970s, especially when – as will be seen below – he allied himself to calls for self-sufficiency, greater provision of plasma to achieve it, increased production, and for collaboration with Scotland to achieve this on a UK-wide basis.

Whilst the production unit in England was “make and mend”[1468] when extended in 1962, the Scottish Home and Health Department (“SHHD”), faced with a growing demand for a range of blood products,[1469] drew up plans to build a completely new unit in Edinburgh processing 1,500, and if necessary up to 3,000, litres[1470] of plasma per week, to produce antihaemophilic globulin (“AHG”)[1471] as well as other fractions required for therapy. It was also to have research and development facilities.[1472] In 1965 the Ministry of Health made a formal agreement with the SHHD that the new unit in Edinburgh would fractionate plasma for the NHS using plasma collected by four English regions – Newcastle, Leeds, Manchester and Liverpool.[1473] This history, and the way in which through Westminster the necessary finance was made available, suggests a forward-looking interest was taken at that time in updating and renewing production facilities so that, between them, they would serve the UK as a whole.[1474] Such interest proved lacking thereafter.

The planning for the Edinburgh unit is all the more striking a provision for the future because it happened before a means of producing cryoprecipitate for therapeutic use was developed.[1475] Estimates of future annual needs for haemophilia therapy were already being made by a Working Party on AHG,[1476] which favoured a central laboratory for preparation of AHG.[1477] This demonstrates that there was developing use of AHG prior to the discovery of a way of using cryoprecipitate therapeutically, reported the following year by Dr Judith Pool. It explains why, when new premises at Liberton were planned for PFC in Scotland, it was decided that it should be capable of producing AHG in quantity.[1478]

Antihaemophilic fraction (“AHF”)[1479] was by 1965 produced as a freeze-dried product, to be reconstituted with water, one unit giving 6-8 hours of Factor 8 activity (at least in Scotland). It was then made from six donations (approximately equivalent to just over one litre of freshly prepared plasma to which an anticoagulant had been added).[1480]

Cryoprecipitate[1481] could by contrast be easily prepared. It did not need a central production unit to make it. It was far richer by volume in Factor 8 than plasma had been, and thus could be administered in smaller, more concentrated doses. It was a game changer. By April 1967 it was being said in The British Medical Journal (based on a paper presented in September 1966) that cryoprecipitate was an “extremely valuable therapeutic material” and that from “many points of view it is the therapeutic material of choice.”[1482] Dr Peter Jones said in a publication in The Lancet on the same day that cryoprecipitate was now “the method of choice in treating bleeding episodes” and that “concentrated human AHF should be reserved for patients in whom haemostasis presents particular difficulty.”[1483]

Blood product usage now focussed upon cryoprecipitate: and thus, so did production. For nearly ten years cryoprecipitate was the major therapeutic material used for “on demand” treatment of bleeds. Centralised production facilities were largely unnecessary to provide it, since it could be made locally[1484] with relative ease. AHF was used in small quantities, in circumstances where clinicians felt the volumes of cryoprecipitate required to ensure a reasonable level of clotting would be too great, or where the use of cryoprecipitate[1485] had provoked an immune reaction.

During 1967 a haemophilia centre was established at the Churchill Hospital in Oxford. This was in succession to a coagulation research laboratory which had been operated by the Medical Research Council (“MRC”). The Centre had three elements to it: a clinical section, a coagulation research laboratory, and a plasma fractionation laboratory.[1486] The Lister Institute, which already administered the Blood Products Laboratory at Elstree, agreed to administer this Plasma Fractionation Laboratory as well, “because of the similarity of the work of the two organizations and the benefits which would accrue from their close association. The work of PFL was to be concerned with the separation and purification of coagulation factors for clinical use.” It thus became a third production facility of importance for people with haemophilia in the UK, alongside BPL and PFC (Liberton). It became operational in mid 1968.[1487] It was always much smaller than Elstree or Liberton. Much of its work involved the production of Factor 9 concentrates. In broad terms, the UK was then self-sufficient in meeting Factor 9 needs, and remained so until products sourced from British donor plasma ceased being used because of the threat of vCJD.[1488]

The AHF as first produced by PFL and BPL was not ideal for home therapy since it was “poorly soluble, of low potency and specific activity.”[1489] Home therapy was thus pursued only to a limited extent at this stage: it was feasible with cryoprecipitate, but not as convenient.[1490]

There was a change in the method of preparation of AHF after 1970, first at PFL, and later at BPL.[1491] Thus far, AHF had been manufactured from liquid plasma. The base material had not been frozen. It thus had usually to be sourced from local donors, and used fresh. A change from the Blombäck to the Newman method of preparation involved a change to the use of frozen plasma, from which cryoprecipitate could be obtained so as to form the basis for the further concentration of the Factor 8 it contained.[1492] This opened up more possibilities of larger-scale production.

It became apparent during 1967 to Dr Rosemary Biggs, who was at the Oxford Haemophilia Centre, that pharmaceutical companies in the US were planning to produce large quantities of freeze-dried concentrated AHF. She regarded concentrates as of prime importance. In a letter of considerable foresight to the CMO, she said: “I have estimated, on the basis of our practice,[1493] that a minimum quantity of these concentrates required at present is the product from about 50,000 donors a year. When all of the patients come for treatment more would be needed. The supply of plasma, as starting material for fractionation would, I think, be no problem since the use of the red cells can be organised.”[1494]

She estimated that product from more than one million donors a year would be processed on a commercial basis in the US, adding:

“When this material comes on to the market we shall be obliged to buy it at a very high cost for our patients unless the English shortage can be remedied.

In this country we have pioneered this treatment, we have the personnel who know how to make the products, we could easily have enough plasma to serve as starting material. It would seem to me a great pity if we cannot make our own material in this country for lack of the organisation, apparatus and buildings in which to work. The purchase of the finished products in the United States will undoubtedly be very costly … a large amount will be made by commerical [sic] enterprise and on sale. On present prices a course of anti-haemophilic treatment for one emergency purchased from the United States, would cost $1,500 to $5,000. Surely it would be less costly to us to do everything to expedite the manufacture of these fractions in England and in particular to accelerate as much as possible the new fractionation buildings at Elstree and in Edinburgh.”[1495]

As to “new fractionation buildings” in Edinburgh, at this point the plant was still at the design stage, and was being purpose built for large-scale fractionation. It could not be expected to be on stream for a while.[1496] By the new fractionation building in Elstree, Dr Biggs presumably had in mind a further extension to the then BPL, which was completed in 1972.[1497] The estimates for Factor 8 concentrates and albumin concentrate on which the plan for the extended facilities was based were, in Dr Maycock’s view, “totally inadequate.”[1498] The limited size of the site imposed constraints; and yet Dr Maycock complained that “reductions in floor space were nevertheless imposed by the Department [of Health].”[1499]

In short, an adequate state-of-the-art facility was planned for Scotland, and a “make do and mend” solution for England, at a time when little space might have been needed to produce AHF – unless, that is, there were going to be rapid improvements in the quality of AHF, its solubility, its ease of production, and the ability to scale it up for production in large quantities. Developments in all those respects were, however, certainly foreseeable. And in the near future.

By 1967, as Dr Biggs’ letter shows, the message that more space for production might well be essential was slowly becoming clear, albeit after the initial plans for both the “extension” at BPL and the new building of PFC had been determined.

The force of her message was demonstrated particularly after 1969 when a pharmaceutical company, Immuno, started to market Kryobulin.[1500] This was first used in the UK in 1970.[1501] Its distribution within the UK was not, as yet, licensed. However, it could be used by clinicians on a named patient basis.[1502] (It was not until December 1972 that an application for a licence to distribute Kryobulin in the UK was made). A rival US product – Hemofil – was also used on a named patient basis. Their efficiency in use, easier solubility, and (plainly) the fact that they were the result of large-scale production, might have indicated that the UK would need to follow suit if those products were not to dominate the market.

Dr Maycock appears to have foreseen a need for more substantial production facilities to be provided. So too did Dr Biggs. At this stage, in the late 1960s to the turn of the 1970s, public money was more freely available than it was later to be. An opportunity was missed.

Nonetheless, as the chapter on Regulation of Commercial Factor Concentrates reports, by 1973, the year in which both products were licensed, 80% of treatment was by cryoprecipitate, with the balance being made up of factor concentrates, largely of NHS manufacture. Between 1974 and 1975, for the first time, the use of commercial concentrate exceeded that of NHS concentrate: but two thirds of the total product used for factor replacement was still cryoprecipitate. The two commercial products licensed in early 1973 thus did not immediately form a major plank of replacement therapies.[1503]

In summary of the story thus far, before Hemofil and Kryobulin were licensed in 1973 there was a mixture of inadequate, out of date, and struggling manufacturing facilities in the UK, aiming to cope with the manufacture of a new product for which they had not been designed. There was bound, inevitably, to be a passage of some time before new, effective facilities could be made available (at PFC Liberton in Scotland), intended to provide for roughly half the needs of mainland UK,[1504] which could match the rapid developments in, and the acceptability in use of, concentrates provided by commercial fractionators in Europe and the US, and which could fulfil the desire of some leading clinicians[1505] to facilitate home treatment for people with severe haemophilia. But there had been a failure to recognise that BPL would either need very considerable extension, for which there seemed to be little if any space, or complete redevelopment, if necessary on a site elsewhere than Elstree.

It was this licensing, and its consequences, that led to further calls for self-sufficiency.

What is self-sufficiency?

A simple definition of self-sufficiency in blood products is producing from a country’s own resources enough factor concentrate to meet clinical need without having to import any.[1506] This is the broad definition adopted for the purposes of this chapter.

It may seem simple. However, even expressed this simply, it is not. Clinical need is not easy to identify objectively. Who is to judge? In the treatment room, one doctor may think that a condition is best treated by a conservative approach, whereas another clinician might think that surgery is required. If the first were asked “does the patient need surgery?” their answer would be “no”. Depending upon the degree of confidence of the second, their answer might be “of course they do”. It might, however, be “I think it is probably safer/better to have surgery, even though it is not essential.” These are three very different responses to what represents the patient’s clinical need.[1507]

If one asks the same question of the hospital administrator seeking to save costs in one area so that the money might be spent in another to make the most of limited resources, they might answer by expressing doubts that the patient needed surgery. At the time of principal concern to the Inquiry, there were regional health authorities. If asked what the patient needed, what would their view be? They might see their task as being to prioritise the “needs” of each patient as best they could. This would involve a balance with other groups of patients who might seem more deserving to that authority than those groups did to other regional authorities. It might have the result that in one region a patient would be seen to have certain “needs” for more extensive treatment, whereas in another region they would not “need” it.

In short, “need” involves questions of perspective. The individual patient’s perspective may well be different from that of their treating clinician, and in turn from that of the administrative bodies. What patients think they need (always of critical importance) may reflect very different considerations again, both individually and as a group.

In the context of organising the supply of blood and blood products the “needs” of the population as a whole had to be determined. By reference to what standard? Here, views differed.[1508] It was seen by some that people with haemophilia needed to be able to live a “normal sedentary lifestyle”.[1509] Others had further ambitions: that having the condition of haemophilia should not inhibit anyone having as active a lifestyle as they wished. The World Health Organization (“WHO”) defines health as not simply the absence of disease, but a state of complete physical, mental and social wellbeing. At the other end of the spectrum, it could be considered that “the need” of patients was relief from life-threatening bleeds.

“Demand” tends to be equated in common parlance with “need”. But it is not the same, though demand for a product may over a period of time[1510] evolve into the product being seen as “necessary”, or (to use another word) “needed”. Dr Terence Snape pointed out in his evidence that prescribing choices in England and Wales (ie the choices that would be reflected in demand) were being formed between 1973 and 1978 “when imported commercial factor VIII was opportunistically filling the gap left by BPL’s failure to supply” and commented that therefore “it may be no surprise that clinicians … would continue to exercise that choice well into 1982/3”.[1511] In short, the convenience in treatment which Factor 8 concentrate offered, which led to its increasing use for home treatment, and the plentiful supply of concentrates from the US to fuel this, led to an increased demand for further supply, and made it more likely that factor concentrates were seen as necessary, not merely as highly desirable. It became, in his view, a case of supply creating demand, which then in turn created a need for increased supply.[1512]

This view, expressed in evidence by Dr Snape, was echoed in closing submissions to the Inquiry, arguing that the fact of licensing importation of foreign concentrate itself led to a demand for it to be available.[1513] The submissions were to the effect that if no licence had been granted there would have been a greater emphasis on producing more easily usable factor concentrate from the UK’s own resources: there would have been a greater urgency in pushing ahead with developing state-of-the-art production facilities.

If the regulator (ie the Licensing Authority) had rejected the applications made to it in 1972 for the licensing of Kryobulin and Hemofil such that their licensed distribution would not have been permitted in 1973, it would have been on the basis that they posed too great a risk of transmitting hepatitis. This may then have led to a greater emphasis by those commercial companies on research into viral inactivation. It seems likely that, whether or not such steps were taken by those who wished to enter the market to compete against an apparently safer domestic product, greater impetus would have been given to developing a modern production facility in the UK.

The facts are that freeze-dried concentrates – whether commercial or domestic in origin – became seen as generally necessary by the end of the 1970s, when they had begun the decade by seeming desirable for the convenient advantages they offered for particular cases. The fact was also that cryoprecipitate prevented the worst of serious bleeds, could be used at home, though with some difficulty, and was responsible in itself for a significant improvement in life expectancy. It met the needs for treatment. The Achilles heel of cryoprecipitate was the difficulty in organising effective home treatment. It was by no means impossible – indeed, the number of centres which used cryoprecipitate for home treatment at one stage was in double figures – but it was not as easy nor as convenient as using concentrates. Home treatment would avoid the need to come to a hospital in pain, and then have to wait for a treatment which in total would take about half a day and which might need to be repeated if it was not effective within the half-life of Factor 8, usually about 12 hours.

The supply of cryoprecipitate was such that none needed to be imported. The need for such therapy, and the demand for it, were capable of being satisfied from the system of voluntary non-remunerated blood donation in the UK. There were sufficient donations to meet the need for all but a very small number of blood transfusions.[1514] So there was always self-sufficiency in blood.[1515] And the near unanimous evidence of the transfusion directors from whom the Inquiry heard is that there would have been no difficulty in making sufficient cryoprecipitate had more been needed. None would need to be imported. “Self-sufficiency” thus relates to the supply of fractionated blood products only – and principally to Factor 8 concentrates, since there was generally a sufficiency of Factor 9 concentrates to meet needs.

The supply of concentrates depended upon a number of factors. These were:

  1. The estimated need/demand for concentrates.
  2. The actual usage of concentrate.
  3. The obtaining of plasma for fractionation in sufficient quantities to make concentrates: this, in turn, depended on:
    1. the quantity of plasma available from donations of whole blood, which, in turn, depended upon: the extent to which clinicians using transfusions in their treatment of patients were prepared to use red blood cells from a donation rather than whole blood, leaving the plasma component for other uses;[1516] whether using a transfusion was necessary at all; if it was, whether a smaller amount of blood than was actually used would have been appropriate; and whether too much blood was wasted by being made available for a transfusion “just in case” it was needed, in which case the blood might not have been returnable to stock;
    2. whether donations consisting just of plasma could be made by a process which returned red blood cells and platelets to the donor (plasmapheresis): such a process would take advantage of the fact that the same donor could give donations of plasma much more often than donations of whole blood, because the body replaces plasma fully after a couple of weeks, but may take a few months to replace red blood cells;
    3. the willingness of regional health authorities to devote some of their resource to separating plasma from whole blood in order to send that plasma off to a third party (BPL)[1517] for processing.
  4. The capacity of BPL and PFC to produce factor concentrates if supplied with sufficient plasma, which depended in turn upon:
    1. the production method adopted (ensuring that as little coagulation factor activity was lost in the process as possible);
    2. the extent to which the facilities were designed to maximise production;
    3. the facilities available to warehouse plasma;
    4. the facilities available to store finished product prior to distribution;
    5. whether the plant operated on a “9 to 5” basis, or on a two or three (24-hour) shift system, which in turn depended on: the number of trained staff being adequate to man a two or three shift system; whether the staff, if available, were willing to do so;[1518] and whether the plant used a production method which enabled this;
    6. the proportion of any batch of product which was dedicated to quality control and to regulatory checks, and thus could not be used for distribution for treatment.
  5. The way in which and extent to which plasma and product production were funded.

All of these, in turn, depended upon the policies adopted by the government and the way and extent to which they were put into effect.

The policies were not in doubt after 1974. It was, and on the face of it remained, the policy of successive governments to achieve self-sufficiency in the near future, and to cease dependence on expensive (and less safe) imports of concentrates. However, that goal was not achieved. Self-sufficiency was not achieved until after 1990. The “near future” as planned in 1974/1975 had become the distant future. Such a failure calls out for explanation.

What was the estimated need for concentrate?

In August 1967, Dr Biggs had emphasised, to the Ministry of Health, the need for the supply of Factors 8 and 9.[1519]

The following year the plans for the new PFC building at Liberton had been changed to allow for more AHG to be produced than originally intended.[1520] Work had begun on erecting the new building for fractionation at Liberton in 1971.

In July 1972 Dr Charles Rizza at the haemophilia centre at Oxford asked the hospital’s director of pharmaceutical services to purchase Immuno Factor 8 concentrate at an estimated cost of about £15,000 a year. The purchase was needed because of Oxford’s pre-eminence as a treatment centre, with about half the patients coming from other regions. The purpose of providing the material was to increase the “safety margin for the treatment of urgent cases” and it would permit a shortening of the waiting list for “non-urgent operations”.[1521]

In October 1972 the haemophilia centre directors from England and Wales met in Oxford. The general sense from the minutes of their meeting is that the contributors felt there was a general undersupply of concentrate, and that supply was also variable across the country.[1522] The supply constraints were not however so severe as to preclude Dr Maycock and Dr Bidwell saying they could supply enough freeze-dried concentrate from NHS sources to facilitate a trial of prophylaxis at Treloar’s school which the centre wished to mount.[1523] The idea of using a copious amount of NHS concentrate for this purpose did however lead to a discussion of whether this would unfairly prejudice supplies of concentrate elsewhere. Home treatment of people with haemophilia was discussed, but it was noted that some centres could not provide it because they did not have sufficient concentrate to do so.[1524] The chair, Professor Edward Blackburn, therefore wrote to the CMO asking for an expert committee to be set up to consider the supply of therapeutic materials to treat haemophilia and allied disorders. He commented that: “The great shortage of materials is limiting the treatment that can be performed, particularly the introduction of Home Treatment.” He emphasised that the directors felt there was “an urgent need to increase supplies of Factor VIII Concentrate”, adding: “Many feel that if a British preparation cannot be made available very shortly, the commercial preparations should be bought.”[1525]

In January 1973 Dr Maycock urged the Department of Health and Social Security (“DHSS”) to “have constantly in mind the need to develop our own sources in the UK transfusion services.”[1526] This was a plea for the Government to expand the capability of the UK to produce its own product. Though implicit in this letter, two weeks later he added his voice to that of Professor Blackburn to say expressly that the UK supply of AHG was inadequate.[1527] The CMO alerted the Permanent Secretary at the DHSS to the likelihood of increasingly heavy expenditure on commercial preparations which were being licensed, and which, “when … available [we] can hardly refrain from using”, resulting in a response suggesting a need to consider how far the home supply could be increased, leading to a lower cost.[1528]

Within a fortnight Hemofil was licensed for use in the UK. There is no indication, however, that the concerns expressed about shortfalls of supply between the haemophilia centre directors, Dr Maycock, and the DHSS had any influence on that decision.

At this stage, apart from a clear view that the supply of NHS-made concentrates was insufficient, it was not clear to what extent an increase was needed. That was to be addressed by an expert group set up in part for the purpose.

Expert Group on the Treatment of Haemophilia

The Expert Group on the Treatment of Haemophilia met for the first time on 20 March 1973. To determine how much factor concentrate should be produced by the NHS to be sufficient to meet all reasonable clinical demand the meeting needed to know how many people there were with haemophilia in the UK; how many required regular treatment; what the average amount required for treatment was; and what was likely to be required in future. None of these parameters was clear. There was no national register of people with haemophilia. Nor was there any accepted unit of measurement: “donor units” was used, although experience with cryoprecipitate showed that the amount of Factors 8 or 9 in the plasma of a donor could vary considerably, and even from time to time for the same donor.[1529]

Dr Biggs produced a paper in an attempt to deal with these problems. She estimated there to be 1,754 to 3,000 people with severe haemophilia in the UK.[1530] At the time, they were already receiving an aggregate total of 300,000 donor units; but needed between 400,000 and 700,000 per year, from cryoprecipitate and concentrate combined. Though over ten years “an attempt should be made to provide all of the necessary material” in the form of factor concentrate, an initial aim should be to supply concentrate made from 250,000 donations for use in home treatment, with the balance being cryoprecipitate.[1531] Having considered her paper, the meeting “generally agreed” with her lower figure – that 400,000 donations per annum were required – but if strenuous efforts were made to clear waiting lists for surgery, or if home treatment or prophylaxis were to take off, more would be needed.[1532]

Importantly, for what follows, the expert group also thought:

  1. it was “essential” that this issue be considered as “a U.K. exercise”;
  2. that self-sufficiency should be attained as soon as possible; and that (with a view to doing this)
  3. there should be consultation with regional transfusion directors about reducing the production of cryoprecipitate while increasing the production of FFP and possibly increasing plasmapheresis, thereby allowing for the increased sending of frozen plasma for fractionation.[1533]

The supply of plasma, whether used for cryoprecipitate or sent to be made into concentrate, was at that stage 300,000 donations in a year.[1534] What was proposed was therefore a significant increase. The production capacity, after the current expansion of BPL had been completed, was said to be 135 litres per week (and thus around 350,000 donations per year).[1535]

The activity of Factor 8 was not then measured in international units: that standard measurement was proposed in 1970 but was not in general use until around 1973.[1536] Accordingly, many of the early estimates mentioned in evidence need to be converted according to an assumed rate. Comparing these standards (donor units, litres, and international units) with the activity to be gained from a single-donor cryoprecipitate creates further difficulty, because that single unit may be rich in Factor 8, or may be poor. In clinical use, a clinician will wish on the whole to ensure that an appropriate amount of activity is transmitted by an infusion, and thus prescription of packs of single-donor cryoprecipitate would be on the “high” side to allow for the difficulty of precisely determining activity, whilst aiming for a sufficient effect.[1537]

Medical Research Council Working Party

In March 1973 it had been thought that some 300,000 donor units were needed for that year’s treatment, and the Expert Group then meeting had estimated 400,000 would be needed for the future, both involving a mix of concentrate and cryoprecipitate. It had been considered that from within the 400,000 some 275,000 donations would be required to make factor concentrate if reliance on the use of imported concentrates were to be avoided.[1538]

Dr Biggs pressed her case for the higher figures she had quoted earlier through the MRC’s Blood Transfusion Research Committee Working Party on the Cryoprecipitate Method of Preparing AHF Concentrates.[1539] She foresaw a future when little or no cryoprecipitate would be used. Her paper noted that by January 1974 the supply of treatment for Factor 8 and Factor 9 replacement derived from approximately 300,000 blood donations per year, most of which was provided in the form of cryoprecipitate. On the basis of calculations made by looking at what was being supplied to pupils at Treloar’s,[1540] the amount of material required to treat all patients with haemophilia in Great Britain adequately worked out at between 547,540 and 750,000 blood donations per year. The conclusion was that that amount needed to be “fractionated annually to produce freeze-dried Factor 8 concentrates.”[1541] The group’s report was later published in the British Journal of Haematology.[1542]

There was a dissentient voice. Dr Christopher Bowley, regional transfusion director in Sheffield, attempted to get his fellow regional transfusion directors to persuade haemophilia directors to change course.[1543] He thought that Dr Biggs was suggesting too great an increase in the amount of donations which would be used for fractionation.[1544] He made the point that she was basing her paper on the practice of the Oxford Haemophilia Centre which used “a very great deal more material (per case) than anybody else”. He said he had spoken with a number of senior haematologists and “they all thought, [that] given an adequate supply of good quality cryo[precipitate] and just a small supply of super concentrate for the major surgery or the patient with inhibitors, they would be very happy.” He added: “Bearing in mind that Factor VIII is inevitably wasted at all stages of preparing and using concentrate, it may well be that money and effort should be channelled towards more and better cryo.”[1545] He had some support from Dr John Wallace, regional transfusion director in Glasgow and West of Scotland. He too thought that “Adequate amounts of a good quality cryoprecipitate would probably cover most clinical indications for factor VIII therapy.”[1546]

Both of their papers were discussed at a meeting of regional transfusion directors and haemophilia centre directors, and the minutes record that: “It was felt that once the new fractionation laboratories in Edinburgh and at the Lister Institute[1547] were in full production they should be able to meet the needs of the country provided sufficient plasma was available.”[1548]

Despite this report, no central funding was provided.[1549] Dr Biggs then went public with her concerns about funding. Having given notice to Dr Sheila Waiter at the DHSS of her intention to write to The Lancet, she wrote that the reason for the shortage of Factor 8 was the expense of Factor 8 concentrate.[1550] At this stage, much of the Hemofil and Kryobulin, which was available through central contract with the DHSS for regional health authorities to purchase at a predetermined rate, had simply not been bought. This seemed to her to be the consequence of the amount of money needed to buy it even at the preferential rates secured by collective purchase.[1551] Dr Biggs concluded by noting “the ridiculous impasse of large available stocks of therapeutic materials locked up in stores because no-one would buy them and, on the other hand, patients in dire need of this same material.”[1552]

This led to a parliamentary question and response by Dr (later Lord) David Owen, who had become Minister of State for Health.[1553] The response did not challenge Dr Biggs’ central points: that concentrates were the optimum treatment; that commercial product was available but, despite that, UK domestic production should increase; and that home treatment was desirable as a goal.[1554]

At this stage, the Expert Group had set a goal of 250,000 donations to be used for Factor 8 concentrate (March 1973); this had been increased to 275,000 by regional transfusion directors following the Joint Steering Committee (June and July 1973);[1555] and the MRC Blood Transfusion Working Party had then endorsed a range from 547,540 to 720,000 donations (including cryoprecipitate) (January 1974). The figures were escalating. But even the goal of 275,000 donations for Factor 8 production had not been met a year after it had been set.[1556]

Dr Maycock had already seen dangers lurking for the voluntary non-remunerated blood donor system upon which the transfusion system in the UK was based, if the need to import commercial products to address what had been seen as a short-term need became a permanent demand. He recognised a need for blood transfusion services to be “self-supporting”;[1557] but he was not alone in seeing this. The SHHD considered that the “present dependence on commercial supplies of anti-haemophilic globulin concentrate and PPF [plasma protein fraction] posed a threat to the unpaid voluntary donor system” and meetings had been held between their representatives and those of the DHSS at which:

“the following principles had been reaffirmed:-

  1. The system of unpaid blood donation must be preserved in UK.
  2. In order to preserve this system the blood transfusion services in UK must be self-supporting.
  3. There should be agreed UK targets for provision of preparations of human blood.”[1558]

Though the facilities at BPL were cramped into too small a space, and its production capabilities were limited, Edinburgh was beginning to produce factor concentrates, and Dr Maycock was still of the view that overall UK production facilities were more than adequate to meet the demand for fractionated product.[1559] The DHSS, and he, therefore understood that it was the supply of plasma for fractionation which needed to increase if the shortfall in meeting anticipated demand was to be met from within the NHS.[1560]

The amount of plasma available to send to BPL for fractionation could be increased by arranging for a greater number of donations of whole blood; but it could also be increased by ensuring that no part of any donation was wasted. Where a patient’s need was for the replacement of red blood cells, traditionally whole blood had been transfused to meet it. Yet more than half of any donation of whole blood consisted of plasma, which such a patient did not need. Red blood cells could be separated from whole blood, to leave plasma – and since it was that plasma which contained Factors 8, 9 and other proteins of therapeutic value, and could be sent for fractionation, to separate red blood cells out in this way was to make use of the whole of a whole blood donation.[1561] Efforts were thus made by Dr Maycock and his colleagues to persuade clinicians to use less whole blood and more packs of concentrated red blood cells. Whereas in Scotland the service had by 1974 managed to use some 30-40% of donations in this way, efforts at persuasion were met with little success in England, which achieved less than 10%.[1562]

Whether the solution to the problem of increasing the supply of plasma lay in more donations, better use of present donations, or a happy combination of the two,[1563] it came at a cost. Under the system of finance then adopted in the NHS, each regional health authority managed its own budget. The DHSS considered (for good reason) that a large contributory factor to the lack of supply was an unwillingness on the part of regional health authorities to spend money on sourcing and providing materials to a body (BPL) over which they had no direct control. Neither BPL nor the National Blood Transfusion Service (“NBTS”) could direct them to do so; the DHSS would not direct them to do so; and the system of funding had thus far not permitted it. The regions had other demands on their finances which they saw as more pressing.

Pressure on the Government to act was exerted by clinicians, by the medical press, by NBTS, and by the advisory bodies which had been set up.[1564] The expense of having to purchase more and more concentrates from commercial enterprises, on a continuing basis, could harm not only the public finances, but possibly public health, and might put at risk the voluntary donor system itself. The Government was aware, too, that sacrificing national control over a valuable therapeutic supply, and rendering the UK vulnerable to a shortage of supply from third-party providers outside the control of the UK, could add to these difficulties. Accordingly, a policy began formulating within the DHSS that self-sufficiency should be achieved.

DHSS planning target

Between October 1974 and the end of the year, the policy evolved internally. On Christmas Eve 1974 a letter to regional administrative officers set it out. The DHSS recognised that there was an immediate need to provide AHG concentrate, equivalent (now) to some 275,000 blood donor units. It was because of this need, the cost of commercial alternatives, and the potential threat to the voluntary donor system if commercial firms considered it worth their while to establish panels of paid donors in the UK to obtain their own supplies, that the DHSS regarded it “as of the greatest importance[1565]… that the NHS should become self-sufficient as soon as practicable in the production of PPF and other blood products.”[1566] Since supply of raw material in the form of plasma depended on the number of blood donations collected, and the extent to which clinicians were prepared to use blood in the form of concentrated red cells – ie not using whole blood, but only the red cell portion of it, leaving the balance (plasma) to be sent for fractionation – they were to be encouraged to do this. The reluctance of regional health authorities to fund a greater supply of plasma to the central production facilities was to be met by an exceptional step: the bill would be paid by the DHSS centrally. However, in deference to the principle of regional control over expenditure, this would not be by expenditure directly from the centre but by making funds available to the regions which would be earmarked for the purpose of increased plasma production.[1567]

On 22 January 1975 the self-sufficiency policy which had been devised during the previous three months was made public by Dr Owen. In answer to a written parliamentary question he said: “I believe it is vitally important that the National Health Service should become self-sufficient as soon as practicable in the production of Factor VIII including AHG concentrate. This will stop us being dependent on imports and make the best-known treatment more readily available to people suffering from haemophilia. I have, therefore, authorised the allocation of special finance to boost our own production with the objective of becoming self-sufficient over the next few years.”[1568]

In his evidence to the Inquiry, Lord Owen made it clear that a major motivating factor, as far as he was concerned, was to ensure the safety of the patient. He had in mind the dangers of hepatitis, and the increased risks of this posed by the pool sizes used by commercial companies.[1569]

On 25 and 26 February 1975 he again pointed in Parliament to the £500,000 of special financing which was “to increase the existing production of Factor VIII”.[1570]

Yet again in Parliament, on 7 July 1975 Dr Owen described the Government’s policy as “to make the NHS self-sufficient in the production of Factor VIII as soon as practicable”.[1571]

The implementation of the policy was overseen by a policy official, Donald Jackson, in the DHSS.[1572] On 11 July 1975 he recorded that he had set targets which would produce plasma (for concentrate production) from 337,000 blood donations. “This is some 20% more than the total of 275,000 recommended by the Expert Group on Haemophilia but that figure must be regarded as the minimum.”[1573]

This led to an interesting insight into the extent to which ministers were kept abreast of knowledge within their departments. The Haemophilia Society met Dr Owen in December 1975, and attempted to point out that the DHSS was pursuing the wrong planning target. It was too low, given that Dr Biggs’ paper adopted by the MRC Working Party in January 1974 as a basis for future planning had suggested over 500,000. Dr Owen did not recognise Dr Biggs’ paper for the working party when the Society referred to it and he asked to see a copy. He told them he would look at the MRC study and would write to the Society giving the basis for the Department’s target.[1574]

The DHSS targets were expected to be met in 1977. Estimates of need shifted upward in the interim. In January 1976 Dr Bidwell, director of the PFL, wrote a confidential paper based upon her understanding of the number of patients with haemophilia and on “internal data from PFL and RTD (75) 26”.[1575] She made assumptions as to yield,[1576] and as to the appropriate figures produced by conversion from international units to plasma volume to weight in kilograms to number of donations. On the basis of these she calculated that the need would be for between 970,920 and 1,213,650 donations to be devoted to plasma supply (or 36-45 million international units).[1577]
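
Dr Bidwell’s workings are not reproduced in this chapter, but the figures quoted above can be cross-checked by simple arithmetic. A minimal illustrative sketch follows, assuming nothing beyond the numbers stated in the preceding paragraph (the variable names are illustrative, not hers); it simply shows the conversion between donations and international units that her quoted range implies.

```python
# Illustrative cross-check only: these are the figures quoted in the paragraph
# above, not Dr Bidwell's own workings (which are not reproduced in this chapter).
donations_range = (970_920, 1_213_650)   # donations she calculated would be needed
iu_range = (36_000_000, 45_000_000)      # the equivalent range in international units

for donations, iu in zip(donations_range, iu_range):
    print(f"{donations:>9,} donations ~ {iu:>10,} IU "
          f"-> roughly {iu / donations:.1f} IU of Factor 8 per donation")

# Both ends of the range imply a conversion of roughly 37 international units
# per donation, which is the assumption embedded in the quoted figures.
```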

If she was right, then the target which had been set, and which was to be met 18 months later, was, as she noted, already an aspiration to produce only around a third of what was needed.[1578]

When the Expert Group on the Treatment of Haemophilia and Allied Conditions next met (in May 1976), Dr Biggs advanced an estimate similar to that of Dr Bidwell. It was that the total requirement would be 40 million units of Factor 8 in all forms (both concentrate and cryoprecipitate) based on a population of people with haemophilia of approximately 3,000. This estimate did, however, depend upon the accuracy of her assumptions about the amount of plasma that would be obtained from each donation.[1579]

The expected supply from the NHS was in the region of 31-34 million international units, provided that the rate of production of cryoprecipitate was maintained even as the production of freeze-dried concentrate increased. This left a shortfall of some 6-9 million from Dr Biggs’ estimate. In the end the Expert Group agreed not to fix a new specific target but to review it again when the original target figure had been attained.[1580]

The discussion in the Expert Group – in particular its belief that “with the extension of home treatment joint surgery etc the current target may represent no more than ⅓ to ½ of the amount of Factor VIII which may be required in 5 years time or less” – was reported to the Central Committee of the National Blood Transfusion Service. It was also reported that “The Department [of Health] is considering the implications of the new advice.”[1581]

At the Central Committee’s meeting on 22 June 1976, the chairman proposed a review of the clinical use of blood and blood products, including an examination of the overall use of blood and blood fractions, and “whether optimal use was being made of the raw material – donated blood”, no doubt being of a mind that if less were to be used to give transfusions, more plasma would be available from the same number of donations, and self-sufficiency could be achieved more easily.[1582]

The review was to be carried out in late 1976 by Thomas Dutton and Dr Waiter, both civil servants in the DHSS, who were joint secretaries of the Central Committee for the NBTS. In the course of preparing it, it became apparent that there were difficulties in obtaining sufficient information, but they referred the Committee to a paper one of them (almost certainly Thomas Dutton) had prepared earlier, as background to the problems the Committee was facing.[1583] This was itself a thoughtful review, covering a wide range of issues. For the purpose of this chapter, it is necessary only to say as follows. It commented that it was difficult to obtain a reliable estimate of the amount of any product which would be required in the foreseeable future, since in many instances the pattern of treatment which had developed might well have resulted from current shortages which might, in time, be overcome. It considered that clinicians were uncertain about their requirements, and this uncertainty created major problems for the blood service and the central blood laboratories, noting that “The clinicians now believe that they will require 3 times the amount of Factor VIII originally forecast” because the greater use of Factor 8 concentrates had opened up new treatment possibilities, such as home treatment and rehabilitative surgery. This fluidity in treatment and demand meant that “Self-sufficiency in blood products is clearly not a static situation which once achieved will require only infrequent modification. In its fullest sense it would mean attempting to keep up with developments in the world industry in blood products which shows few signs of reducing its activities despite WHO resolutions about the undesirability of relying on paid blood donors.” It noted that the health departments had decided to set up a small expert group to consider likely future trends in demand.[1584] The small expert group to which this was referring was the Working Group on Trends in Demand for Blood Products, which was established in January 1977.

Trends Working Group

On 13 January 1977, before the first meeting of the Trends Working Group, Dr Waiter attended a meeting of haemophilia centre directors. Professor Blackburn said that reference centre directors understood that the blood services could supply enough plasma to produce 40 million units of factor concentrate per year, which they regarded as a minimum requirement. However, they considered that there was a “hold-up in the expansion of fractionation in the U.K.”[1585] Dr Waiter responded that the DHSS had understood that the capacity at Liberton, Elstree and Oxford was adequate.[1586] With the stated capacity of those centres, a target of 50 million international units could be met. The maximum capacity of each of the three production facilities was examined – 14-15 million international units for Elstree and Oxford[1587] combined, and (said Dr Iain Macdonald from SHHD) 60 million international units at PFC, though to achieve this would need some £25,000 for new equipment and extra running costs, which would include payment to staff to operate a 24-hour shift system of working.[1588]

The Trends Working Group had its first meeting in February[1589] and reported in December 1977.[1590] It considered not only the production and supply of factor concentrates, but of all forms of therapeutic blood products. It calculated that over a 10-year period the amount of Factor 8 used by the UK would reach a level which equated to 60 million international units per year – but the amount of albumin required by the UK meant that the plasma used for that would enable 74 million international units of Factor 8 per year to be produced.[1591] It envisaged that there would be a complete transfer from cryoprecipitate to fractionated freeze-dried concentrate but that “Considerable further investment in collecting, testing, processing and premises” would be needed to meet the proposed targets: it was thought that the “present UK capability is less than half what we regard as essential. Additional major investment is, therefore, also needed for this.”[1592]

The estimate for the next ten-year period used by the Trends Working Group to assess the needs of the UK population for factor replacements measured in international units usefully equates to 1,000 international units per 1,000 population: such that a population of, say, 56 million as it became in the 1980s would require approximately 56 million units,[1593] one of 60 million would need approximately 60 million units, and so on. Thus, as the population increased, so too would the quantity of fractionated product required. This explains why there were higher estimates for the requirements in ten years’ time than the amount thought optimum in 1977.[1594]
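
The working rule described above is simple proportionality. The sketch below is illustrative only, assuming nothing beyond the rule of thumb and the example populations already stated in this paragraph.

```python
# Rule of thumb reported above: about 1,000 international units of Factor 8
# per 1,000 population per year, i.e. roughly one unit per head per year.
IU_PER_HEAD_PER_YEAR = 1_000 / 1_000  # = 1.0

for population in (56_000_000, 60_000_000):   # illustrative populations from the text
    requirement = population * IU_PER_HEAD_PER_YEAR
    print(f"Population {population:,} -> approximately {requirement:,.0f} IU per year")
```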

The minimum goal which had been set by the DHSS of 340,000 donations was met by mid 1977. Roland Moyle, who had by now succeeded Dr Owen as Minister of State for Health, reported this to Parliament the following year.

He reported that (by then) both BPL and PFL were working to full capacity, though that capacity was being increased.[1595] Between them, they produced approximately 15 million international units of Factor 8 concentrate per annum. About the same amount of cryoprecipitate was produced by NBTS each year. Production in England was thus some 30 million international units. Total usage of Factor 8 was reported to be approximately 45 million international units per annum. That suggested a shortfall of 15 million international units.[1596]
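
The shortfall reported follows directly from the figures just quoted. A minimal sketch of that arithmetic, assuming only the approximate numbers given above (all figures in millions of international units per year), is as follows.

```python
# Approximate figures for England as reported to Parliament (millions of IU per year).
nhs_concentrate = 15    # Factor 8 concentrate from BPL and PFL combined
cryoprecipitate = 15    # roughly the same amount again, produced by the NBTS
total_usage = 45        # total reported usage of Factor 8

domestic_production = nhs_concentrate + cryoprecipitate    # = 30
shortfall = total_usage - domestic_production              # = 15, met by commercial imports

print(f"Domestic production ~{domestic_production}m IU; shortfall ~{shortfall}m IU per year")
```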

How the shortfall would be made good from domestic production (it was plainly being met by the purchasing of commercial concentrate for the time being) was addressed only by saying that the regions were “being asked to provide more fresh frozen plasma to the central processing laboratories where the National Health Service concentrate is produced.”[1597] Quite what it was hoped to achieve if they did so, since both BPL and PFL were said to be working at full capacity, is unclear. In December 1978 Roland Moyle recognised that the commitment to self-sufficiency had “not yet” been met.[1598]

This came after further estimates of demand/need.

First, the estimates which had been reported by the Trends Working Group in late 1977 were largely thought by the Standing Medical Advisory Committee in 1978 to be reliable estimates of future need.[1599]

This was no surprise to the DHSS. In a memo sent to medical and administrative civil servants, as well as Dr Maycock and Dr Richard Lane, Thomas Dutton observed of the report that “essentially it says no more about Factor VIII requirements than some experts have been saying for years and which has now come to be generally accepted”.[1600] It was assumed in his memo that there would be general acceptance of the need to provide blood products on the scale envisaged by the Trends Working Group. He noted that a Blood Components Production Programme for the next ten years needed to be drawn up: the Central Committee for the NBTS and its Sub-Committee on the Central Laboratories was “on the basis of past experience not adequate for this task, even if their roles were changed from advisory to managing committees.” He continued:

“The production programme will require ‘managing’ in every sense (although it may be advisable to avoid the term and refer to ‘coordinating’) by people who are closely involved, and who are responsible for the outcome. They must be able to achieve results in a situation where success depends on the ability to persuade Health Authorities to co-operate, which in some instances may mean giving up activities [which have] hitherto been regarded as part and parcel of their function.”[1601]

His conclusion was that to meet the requirements foreseen by the Trends Working Group:

“economically and without a great deal of frustration, expensive duplication and uncertainty will require a major effort in co-ordination in a highly technical field. The constitution of the NHS and the understandable desire of Regional Authorities for autonomy are natural obstacles to sustained co-operation of the kind that will be required and it is proposed that the 10 year Blood Components Production Programme should be co-ordinated by a managing committee expressly set up for the purpose. In view of the inter-dependence of England & Wales and Scotland in blood products, particularly the need to be able to fall back on other plant if there is a local breakdown or contamination, it is proposed that the Blood Products Production Programme should be drawn up on a UK basis … The essence seems to be that without a measure of co-operation which has hitherto been unnecessary, there is little likelihood of achieving the targets set by the Trends Working Party, far less of achieving them at the minimum cost to the NHS and that for this purpose the setting up of a national machinery for co-ordination is unavoidable.”[1602]

These words echo the themes explored in the paper which he and Dr Waiter as joint secretaries had sent to the Central Committee of the NBTS: the problems caused by the regional structure and funding of the NHS; the consequent reluctance to provide plasma to BPL for fractionation; and the absence of any power to require plasma to be provided.

At the end of 1978, the Minister of State for Health, Roland Moyle, told Parliament “that self-sufficiency has not yet been achieved, and my Department is therefore reviewing production in relation to present demands and resources.”[1603] The review which followed considered various options, which included handing the whole fractionation process to the commercial industry as well as increasing the capacity at UK facilities. The latter was thought more viable.[1604]

Plans for BPL capacity to meet forecast need

The next year there was again an upward shift in estimates of future needs, though slighter on this occasion than it had been in the past: current demand for Factor 8 was probably in the region of 60 million international units, but if clinical freedom continued, it was possible that the eventual requirements might well approach the 100 million international units per annum mark.[1605]

Those figures helped to inform planning of the capacity of a redeveloped BPL (of which more is written later). Although it was recognised that estimates for five years’ time were highly speculative, it was obvious that the new production facility then being contemplated required some attempt at meeting them. Dr Lane considered in 1979 that a redeveloped BPL could be commissioned to increase production capacity to 90 million international units between 1985 and 1990, as an intermediate stage,[1606] and then to scale up production to 120 million units for the years that followed.[1607]

By 1982, planning was eventually under way for a new building to constitute BPL at Elstree. The approval given by the Treasury in November 1982 for the reconstructed BPL was for a plant capable of fractionating 400,000 kilograms per annum.[1608] The overall plasma supply target was however likely to be somewhat higher, to include plasma used to produce cryoprecipitate, which still retained a useful therapeutic function, albeit that it was foreseen to have less of a primary role in treating bleeding disorders.

A plasma supply target of 435,000 kilograms, including this element for cryoprecipitate production, was broadly maintained thereafter, and although estimates thereafter continued to increase, there was a growing consensus that demand for Factor 8 would increase to the region of 100 million international units, perhaps a little over. The increases in estimated needs were now gradual compared to those that came before.[1609]

Commentary

It follows from this chronological account that the therapies thought desirable for the treatment of people with bleeding disorders during the 1970s required larger and larger amounts of factor concentrate. The principal cause of this is most likely to have been the availability of commercial concentrate in substantial quantities from 1973 onwards, the use of which created a greater demand for still more to be supplied. A significant driver was the demand for home therapy, which was most conveniently satisfied by freeze-dried factor concentrate which could be kept in domestic refrigerators and reconstituted as needed; but it was not the only issue. The increased lifespan of people with bleeding disorders following the success of cryoprecipitate treatment since 1966 meant more patients required treatment, and the more active lifestyles encouraged by the availability of treatment may have led to a further need for it. Some clinicians aspired to develop prophylaxis, which used a considerably greater amount of material; but this was not widespread. Operations requiring factor concentrate cover became more common. Sadly, though, the figures used as a basis for Dr Owen’s central expenditure of £500,000 to enable the regions to produce more plasma were outdated even before the policy was announced. This was realised in the DHSS. It was thought by 1977 that there was a need for a blood products production programme to ensure an effective and coordinated approach to achieving self-sufficiency. There never was one. Self-sufficiency remained an important – but elusive – goal.

It is difficult to avoid a conclusion that as a matter of fact, too little was done too late. The need may initially have been focussed on the provision of more plasma to be made available from the regions to a greater extent than on expanded production facilities, but it was always about both.[1610] This was known to the DHSS. The opportunity was missed.

How much concentrate was actually used?

The extent to which Factor 8 replacement therapies (fresh frozen plasma, cryoprecipitate, NHS factor concentrates and commercial concentrates) were consumed in the UK year by year between 1969 and 1990 is best portrayed by a chart, as shown at Figure 1 below. Factor 9 is not shown since for most of the period of central interest the UK was self-sufficient in it.

The graph shows that the most used product was cryoprecipitate until 1977 and then commercial Factor 8. Total product usage increased from 1969 to 1990.

Figure 1. Total UK Consumption (Factor 8)[1611]

Figure 1 is drawn from haemophilia centre directors’ statistics. The Inquiry is satisfied that these represent the best available and most complete source. Nonetheless, the data has limitations. The figures are based upon the information which was given by the centres. That was incomplete. Not every centre made returns; some made them only for some years, and some never did. Many of the returns had the appearance of having been hastily compiled, rather than of care being taken to cross-check them. Further, product was consumed in vials or bottles which usually contained 250 international units. However, some bottles were larger. Not all of their contents might have been used, but the full contents will still count as “usage”. The usage of cryoprecipitate is particularly imprecise because it will have been prepared locally. Different laboratories in different regions will have achieved different levels of yield, making the estimates of the number of international units of Factor 8 provided to a patient imprecise. The consumption frequently appeared as “bottles”, from which a calculation of the number of international units contained was made based on an assumption as to the average number of units each bottle contained. Finally, since international units were only adopted as a recognised standard in 1973/4, the figures which are shown will have been calculated by the UK Haemophilia Centre Directors’ Organisation making a number of assumptions.

The graph shows the most used product was cryoprecipitate until 1977 and that use of NHS and commercial Factor 8 increased that year.

Figure 2. Total UK Consumption (Factor 8)[1612]

Figure 2 shows part only of the same chart as in Figure 1, in order to focus attention on three matters.

First, it was not until mid 1976 that cryoprecipitate ceased to be the main source for Factor 8 replacement: at the start of the year just over half the total consumption came from that source. The chart does not show a headlong rush, across the board, by clinicians to embrace factor concentrates. Given that the use of concentrates was especially prevalent in home treatment, it may be inferred that cryoprecipitate remained the main product used for much hospital treatment until later in the 1970s. It follows that when, in 1983, consideration was being given to whether a ban on the importation of concentrates might lead to there being no product capable of preventing serious bleeding, recent experience should have led to the realisation that cryoprecipitate was an acceptable alternative.[1613]

Second, it shows the dramatic effect of Dr Owen’s initiative in increasing the supply of plasma to BPL, but that instead of the effect of this being to reduce reliance on imported concentrates, the consumption of those also increased in volume at very much the same rate as did that of NHS product. Figure 1 shows that between 1974 and 1988 more commercial concentrate was consumed in the UK than NHS concentrate. Even by 1990 self-sufficiency had not yet been achieved in England and Wales, though it was effectively achieved just after that.

Third, it shows a spike in the intake of Factor 8 from factor concentrates and cryoprecipitate combined in the 1977 figure. It is not clear what caused this: readers should however be aware that it might simply be an artefact of the figures, which otherwise show a steady increase in the uptake from the start of the 1970s until 1990.

The graph shows the 1975 and 1977 DHSS targets for Factor 8 were less than total usage but the MRC Working Party had made a higher estimate.

Figure 3. Targets and estimates of need (DHSS and MRC Working Party) against actual usage.[1614]

On the basis of these figures,[1615] despite the uncertainties of the data, it can be seen that the estimates of the MRC Working Party were generally in excess of what was actually used until the early 1980s, though the initial DHSS planning targets were not. The problem in achieving self-sufficiency was not, therefore, a failure sufficiently to estimate what would be needed: it was a failure to plan and produce it, partly because of lack of sufficient plasma, but mainly because the central production facilities were inadequate.[1616]

Supply of plasma for fractionation

Plasma for fractionation was of two kinds: recovered plasma and time-expired plasma. Recovered plasma was plasma which was separated, either manually or by machine, from a whole blood donation. It was recovered and immediately frozen and is thus identified as “fresh frozen plasma”. Time-expired plasma is plasma which is recovered from a donation which has been kept for so long that the red blood cells have ceased to be effective for treating a patient. Over time, the quality and acceptability of donated red blood cells diminishes. The same is true of some of the proteins contained in plasma. Thus Factor 8 loses activity over a short period of time: in the body, it has a half-life of around 12 hours. Accordingly, if it is to be harvested from blood, the blood has to be fresh. Otherwise the plasma component of the donation will not form a useful base from which to extract Factor 8 or Factor 9. However, albumin and immunoglobulins take longer to deteriorate. Time-expired plasma remains useful for making those products.

This leaves two matters relevant to understanding the supply of plasma. First, blood was collected regionally. If a region were then to recover plasma from whole blood to send to BPL or PFC, it would first have to be separated from the red blood cell component of blood. Facilities to make that separation had to be available, whether the separation was to be manual or by machine. Alternatively, plasmapheresis might be organised. This involves a donor giving only their plasma and not their red blood cells as well. Plasmapheresis was generally achieved by machine. Blood was taken from the donor and passed through a machine designed to separate the plasma from the red blood cells. The red blood cells would then be returned to the donor. Since plasma takes a couple of weeks or so to replenish itself within the body, whereas red blood cells take very much longer, it is possible without significant harm to the donor for a donor to give as many as 26 donations of plasma in a year by plasmapheresis. Donors who provide whole blood, and who do not have their red blood cells returned to them, may safely give blood no more than four times a year (if male) and three times a year (if female).[1617]

The second aspect of importance is to note that to focus upon the amount of plasma “received for fractionation” by the fractionating plants is not the same as asking how much plasma was “received for the production of Factor 8 and Factor 9”. Time-expired plasma, for instance, would be received for fractionation because it remained a valuable source of albumin and immunoglobulins, the quality and utility of which did not diminish quickly over time, but would not be usable for making factor concentrates because little or no clotting factor activity would remain once “time” had expired.

It is for these reasons that the figures for plasma supply have concentrated upon the supply of fresh frozen plasma. Most came as recovered plasma; a little came from plasmapheresis.[1618] None came as time-expired.[1619]

The chart below shows the supply of fresh frozen plasma to BPL each year.

This graph shows that the supply of fresh frozen plasma to BPL increased each year from 1976 to 1984, with slower growth from 1977 to 1980.

Figure 4. Fresh Frozen Plasma Received[1620]

Much of the information available to the Inquiry relates to the position in England and Wales. However, the policy until early 1978 (as to the change that then took place, more is written later) was to see the UK as one unit in terms of production of factor concentrates for use within the UK. That meant aggregating the production capabilities of both the Scottish and English production facilities. If, therefore, fresh frozen plasma was supplied to BPL and PFL in excess of their current capacities to process it, the Edinburgh facility would have been available to process it, providing always that it had not been supplied with a surfeit of fresh frozen plasma from within Scotland itself.

The story, broadly, of the period from 1967 to 1972 is that there was little concentrate in circulation, and its use was limited. Nonetheless, there was a slowly growing demand for it, and a sense that the UK needed to increase its own production if commercial concentrates were not to dominate the market before long. There was a growing gap between the amount of fresh frozen plasma being sent from the regions to BPL for fractionation and the demand for finished product. Usage of commercial concentrate was low until 1974-75, but it had by then become available generally rather than under the named patient exemption. In 1973, as can be seen, most treatment needs for people with haemophilia were met by cryoprecipitate. At least 80% of the usage was by this means. Cryoprecipitate was made regionally. Standards varied. But in general, no shortage of cryoprecipitate for treatment was thought to exist. A push for self-sufficiency began in 1973, when the convenience of factor concentrate for home treatment led to suddenly increasing demand.

There were three essential problems with increasing the supply of plasma to the central production facilities, over and above the question of financing it. The first was the way in which the National Blood Transfusion Service was organised. As the chapter on Organisation of the Blood Services discusses, and as Drs Harold Gunson and Helen Dodsworth were to observe in 1996: “A problem which BPL has had to contend with throughout its history is that it has never been in control of its plasma supply, with the exception of 1975 when the DHSS financed an increase. The only argument which could be used with the RTCs [regional transfusion centres] was persuasion.”[1621]

This became the subject of an editorial in The British Medical Journal in 1974.[1622] It argued that the blood services were ill-equipped to do the job of a modern transfusion service, suggesting that the “shortage” of blood, and therefore of plasma, which allowed the products of pharmaceutical companies to enter the UK, was not a real shortage but a consequence of poor administration, organisation and underfunding, and that there was an urgent need for a national policy for the procurement and distribution of voluntarily donated blood.

Importantly, this was not simply a matter of comment from the medical press. It was recognised at an early stage by those civil servants most closely involved with advising ministers about blood supply. In a review in 1976 of the requirement for blood products and their availability, Thomas Dutton[1623] echoed some of the themes The British Medical Journal had highlighted by identifying that:

“The customary method of financing the NBTS is not conducive to the development of such a partnership and it was probably this more than any other single factor which led to the delay in mounting the AHG (Factor VIII) Concentrate production programme.”[1624]

This document, discussed above, is significant. It spoke not only of the problems of the way in which the blood service was organised, and the problems that regional autonomy had created in organising a national production service which depended upon the regions spending their money for these central purposes at the expense of some of their local priorities, but of the delay this had caused in achieving sufficient production of factor concentrates.[1625]

A second factor, as Thomas Dutton and Dr Waiter pointed out, was the way in which blood supply was financed. This was through the regions, each of which in England was allocated its own finances. Although a clinician could approach a regional or hospital treasurer to obtain funding to purchase factor concentrates, regions were persuaded only with difficulty to spend money on producing plasma, freezing it, and supplying it to a body outside the region and its control (BPL).

Obtaining blood from donors was a matter for each region. It came with some cost. Separating plasma from that blood, in order to transfer it to a body which was effectively a third party, involved staff, equipment and premises, and gave rise to further cost. The regions could not be sure of any return on this expense for the benefit of the patients for whom they were responsible. Accordingly, persuading them to produce more for a benefit which was uncertain was always likely to be difficult. There was no compulsion upon them to do it.

It was not until Dr Lane secured the introduction of a return of factor concentrates to the regions in proportion to the amount of fresh frozen plasma they had supplied to BPL (“the pro rata scheme”) that there was much change.[1626] It was recognised that there were some centres which had a special need for the supply of concentrate, such as was necessary at Treloar’s. Allowance was therefore made in this scheme to accommodate them. After adjustments to the pro rata principle, to deal with special cases such as Treloar’s, the DHSS introduced the scheme with effect from 1 April 1981.[1627] It was successful in helping to ensure a greater supply of plasma from the regions to the central processing units in England and Wales.[1628]

By 1989, however, the increase of plasma supplies and factor concentrate production following the redevelopment of BPL coming into full production meant that some regions which sent large quantities to BPL might receive more Factor 8 than they could use, whilst others that were not “plasma rich” would be left in need. The original rationale for the system, encouraging regions rapidly to increase the quantity and quality of plasma supply to BPL at a time when demand for finished NHS product exceeded its supply, no longer applied. So, in 1989, a revised system of cross-charging was introduced following trials.[1629]

A third problem was the persistence of treating doctors in using whole blood rather than red blood cells.[1630] Not only that, but many clinicians overused transfusions. Doctors had for several years been encouraged to transfuse to patients only that component of blood which they really required – in most cases, this would be the red blood cell component – and to transfuse no more than was strictly necessary.[1631] Ever since the Second World War, doctors involved in facilitating transfusion had been at pains to encourage a parsimonious approach to administering blood.[1632] These pleas initially fell on deaf ears. Clinicians tended to use techniques, including administering transfusions, with which they were familiar, in ways and in amounts with which they were familiar, rather than taking any active steps to change what had become their clinical habits.

It was difficult for those supplying blood to do much to change their practice. This was because in the first 40 years of the history of the NHS a huge respect was paid to the clinical judgement of practitioners treating their patients as they individually saw fit. This approach – “clinical freedom” – dictated that in the great majority of cases the guiding rule was that other doctors and, in particular, administrators would not significantly interfere in the treatment being provided.[1633] Medical treatment was thus individualised not only to the patient, but also by reference to the particular treating doctor and their habits. Unless a doctor wished, therefore, or could be gently persuaded, to take a different approach, it was not felt that anyone else had any right to interfere. The consequence of taking this approach, patient by patient on an individual basis, when it came to transfusion was that the collective interest of other patients was little considered. That collective interest lay in diverting those blood components which were not going to be of any real use to the immediate patient for the benefit of those others, rather than putting them uselessly into the veins of the immediate patient who (almost by definition) did not need them. Persuasion, accompanied by some education, was the only tool, and it was not very successful.[1634]

Other steps to improve the quantity of plasma supplied for fractionation

Other steps were taken to improve the amount of plasma, and its quality, coming for fractionation to BPL.

Plastic bags and single plasma packs

In the very early 1970s, some regions were still using glass bottles with which to convey frozen plasma to BPL. Plastic bags were much more suitable. They were easier to freeze. They were easier to thaw. And they were easier to handle. First, there was a change from glass to plastic; and then a change in the size of bag used. This latter was important, but needs some explanation.

The system that operated was to use five-litre packs of plasma. It became apparent by mid 1977 that the system of using such packs was not compatible with good manufacturing practice, unless the “pooling” of the plasma donations before filling the packs took place in aseptic units.[1635] Such units would be expensive to develop. By contrast, single-donor plastic packs which formed a “closed” system for separating and handling plasma became seen as preferable. A further advantage in the use of single plasma packs was not only that open processing at regional transfusion centres would become minimal, reducing the need for redevelopment and capital expenditure to provide clean and sterile areas, but that such packs could more effectively be tested by RIA[1636] than five-litre packs. They were also suited to rapid thawing, which improved the yield of Factor 8 by as much as 60%.[1637] The change to single plasma packs did necessitate changes to storage facilities and production methods, but they lent themselves readily to the process of splitting open the bag (indeed, bags were designed specifically for the purpose in a wedge shape so that the contents could be more easily pooled in the process of manufacturing concentrate).

Trials using single plasma packs began in late 1980.[1638] The change to single plasma packs proved efficient in saving time and resources at regional transfusion centres.[1639] Although there is no direct evidence that a change from five-litre to single-donor packs of frozen plasma for fractionation increased the quantities of plasma coming to BPL, the combined effects of a pro rata scheme and the widespread use of single plasma packs were probably responsible for an increase in the amount of fresh frozen plasma received at BPL in 1981/82. It was the first time in five years, according to Dr Lane, that the input of plasma for fractionation had increased.[1640] It had increased by over a quarter. Not only, it seems, did the use of single plasma packs improve the quantity of the plasma received, but also its quality. The faster the plasma was frozen, the more of the Factor 8 activity within it was preserved. The yield from plasma collected in this way was thus always likely to be higher, and there was less risk of contamination.

Use of packed red blood cells

In the early 1970s, Scotland was well ahead of England in ensuring that as much blood as possible could be administered to patients without unnecessary plasma. This meant ensuring that what was transfused was “packed” red blood cells (sometimes called red cell concentrates). Until the mid 1960s, clinicians almost invariably used whole blood to replace red cells and to restore blood volume in patients.

A booklet entitled “Notes on Transfusion” was issued by the DHSS, SHHD and the Welsh Office in 1973.[1641] It began by saying that “Transfusion therapy should be undertaken only after careful assessment of the patient’s clinical condition to determine the nature and quantity of fluid to be transfused and the rate of administration. The patient may require whole blood, concentrated red cells or other blood components or one of the special plasma fractions”. It then went on, in bold, to say:

“A transfusion should never be given without a definite indication; not only is this in the patient’s interest, since an element of risk is associated with every transfusion, but supplies of blood are not unlimited and with the ever growing demand for blood it is imperative that it should not be used unnecessarily.

The use of transfusion to correct moderate or slight degrees of anaemia that could be overcome as effectively, if more slowly, by other means, seems unjustifiable unless some cogent reason for speed of recovery exists. In some instances failure to institute simpler and safer but equally effective treatment earlier leads to the quite unnecessary use of blood transfusion.”[1642]

Similar calls were made throughout the 1970s.

Optimal additive solution: “SAG-M”

Some clinicians were reluctant to use concentrated red blood cells because they had greater viscosity,[1643] were slower to transfuse, and often had to be pre-diluted with saline. Some concerns were raised that over-insistence on the use of concentrated red cells would lead to clinicians using plasma protein fraction in addition, to provide a product that flowed more easily, thus limiting the amount of additional plasma that might be released for fractionation.[1644]

In order to prevent red blood cells being too viscous to be administered easily, in turn leading to clinicians not wishing to use that as a component on its own, some of the plasma that might otherwise have been separated from them was kept together with those cells. This also provided the red blood cells with the nutrients they required to survive.

Although some 55% of blood is composed of plasma, the use of some of the plasma in this way to carry and support the red cells meant that generally only 180ml of plasma could be removed from each donation.[1645] In so far as viscosity was the problem, a fluid like saline could be used to provide greater fluidity. However, this would leave the red blood cells deteriorating, since they would be unsupported by the nutrients they might otherwise have obtained from plasma, and would defeat the purpose.

Accordingly, scientists searched for a solution they could add to packed red blood cells which might both counter the viscosity of the concentrated red cells and also provide those red cells with a source of energy. The optimal additive solution was discovered to be a mixture of saline, adenine, glucose and mannitol: “SAG-M” as it more conveniently became known.[1646] Its introduction permitted more of the plasma in a donation of whole blood to be separated out, for that could be replaced with SAG-M. This enabled approximately one and a half times as much plasma to be recovered from a donation as had been previously separated out, whilst doing more to preserve the longevity of the red blood cells at the same time.

SAG-M was probably introduced in September 1982.[1647] At the time it was anticipated that its use combined with the greater use of red cell concentrates instead of whole blood would enable around 75% of the plasma required for self-sufficiency to be obtained from the donations of whole blood then being made.[1648] At the regional transfusion directors’ meeting in January 1983[1649] it was recommended that SAG-M be introduced as soon as possible, and it was to be expected that it would yield a considerable increase in plasma which might be available to BPL for fractionation.[1650]

Increasing donations of plasma (as to which the evidence before the Inquiry was that regional transfusion directors saw little difficulty in ensuring an increase), coupled with less use of whole blood, greater use of concentrated red blood cells and the use of SAG-M, would together secure close to the amount of plasma which BPL on its own could handle. Dr Lane recorded that it was in the last quarter of 1984 that plasma supplies to BPL increased as a result of SAG-M.[1651]

Plasmapheresis

A further way of increasing the amount of plasma available for fractionation was to recruit more donors specifically for plasmapheresis.

Plasmapheresis first began in the United Kingdom in early 1967.[1652] It is a process in which blood is removed from the body, plasma is separated from it, either manually or by machine, and the remaining part of the donation (red blood cells and platelets) returned to the donor on the same occasion. By December 1969 it was being suggested at a meeting of regional transfusion directors that perhaps plasmapheresis should be practised more widely and rather more intensively than was currently being done.[1653] By March 1973 Dr Maycock was suggesting that extended use of plasmapheresis might be the most economical way of obtaining the plasma required for fractionation,[1654] and the Expert Group on the Treatment of Haemophilia at its meeting of 20 March 1973 suggested that increasing collection of plasma by plasmapheresis should be considered.[1655] Plasma obtained in this way was also a safer base material from which to manufacture concentrates, because donations obtained by plasmapheresis were larger in volume than those recovered from donations of whole blood, so that fewer donors were needed to make up a pool of a given size. The amount of plasma from one donation of whole blood was 40% of that from a donor of plasma by plasmapheresis.[1656]

The argument was not all one way in favour of plasmapheresis. Regional transfusion directors showed some resistance.[1657] The Joint Steering Committee thought that the use of plasmapheresis might be difficult to justify as a means to meet targets because it inevitably exposed donors to risk.[1658] By January 1975 the Central Committee for the NBTS expressed the view that increased plasma supplies could be achieved either by increasing concentrated red cell use or by plasmapheresis: however, ultimately it was decided not to ask donors to subject themselves to frequent plasmapheresis.[1659] This was echoed by Scottish National Blood Transfusion Service (“SNBTS”) directors, who in a meeting in June 1975 also decided not to proceed with plasmapheresis: it would be premature to do so.[1660] When the Haemophilia Society told Dr Owen about the achievements of Dr Tom Cleghorn’s plasmapheresis unit at Edgware in December 1975, a civil servant at the meeting recorded that he “explained that there were professional differences of opinion about this process.”[1661]

A fear, which inhibited the use of machine plasmapheresis, was that cardiac arrest might occur. Accordingly, a cardiac arrest team had to be available when this was practised. This made it impractical to use machine plasmapheresis for voluntary donors on a wide scale.[1662]

Over a few years, the landscape changed. Dr Angela Robinson[1663] became an enthusiast for plasmapheresis, and set up a pilot unit in Yorkshire in 1980.[1664] At the same time, Dr Gunson (now consultant adviser to the Chief Medical Officer on blood transfusion) expressed the view that plasmapheresis should be addressed further.[1665]

At this point it was clear that, in the US, pharmaceutical companies acquired their plasma almost entirely from plasmapheresis. A market in plasma had grown up. Plasma brokers obtained their plasma either from third parties who used plasmapheresis or by using plasmapheresis themselves. In the UK there had been reluctance to engage in it, despite the considerable increase in supply to which it might lead, because of fears for the safety of the donor. Machine plasmapheresis was looked upon sceptically. There were, however, mixed feelings. It was recognised how useful plasmapheresis might be in the UK, provided that it could be performed with reasonable safety.

Further movement came. In England, in 1981, Dr Gunson tabled a paper concluding that the number of plasmapheresis facilities should be increased. The Advisory Committee of the NBTS agreed to set up a Working Party, and liaise with Scotland. It was considered vital at least to consider plasmapheresis.[1666]

It was at this stage in the story (in 1982) that the availability of a new machine, the Haemonetics v50, became a turning point in the UK. A pilot study by Dr Robinson’s team showed that its use enabled plasmapheresis to be carried out more efficiently, and more safely.[1667] The plan in Scotland was now to increase the use of plasmapheresis,[1668] and no longer just to consider doing so.

Though there remained concern that the growth of plasmapheresis might deter donors,[1669] and doubt as to whether plasmapheresis could become an “economic proposition”,[1670] plasmapheresis slowly began to gain ground: and manual plasmapheresis to fall out of favour compared to machine plasmapheresis.[1671] Nonetheless, the routine recovery of plasma from whole blood donations, increased in particular by the separation of red blood cells and the use of SAG-M, was still thought to be the method of choice, with plasmapheresis making up the deficit.[1672]

By mid 1984, studies in Scotland comparing machine and manual plasmapheresis and looking at optimal additive solution found that the cohort of plasma donors was highly motivated, there was a unanimously favourable response to machine plasmapheresis, the quality of plasma obtained was good and the costs comparable with manual plasmapheresis.[1673]

Plasmapheresis continued to grow – albeit unevenly – as a source of plasma. Thus, in 1986/87 it gave rise to just over 10% of the plasma sent to BPL.[1674] By 1989, 73% of plasma sent for fractionation came from the use of SAG-M (in plasma wedge packs), 13% from plasmapheresis and 9% from time-expired blood.[1675]

Freeze-dried cryoprecipitate

Finally, the possibility of small pool freeze-dried cryoprecipitate was considered.[1676] It was thought that this might yield more Factor 8 units per litre than concentrate, because of the proportion of units inevitably sacrificed during the production of the latter, and be a safer proposition than concentrate because the pool from which it was made would consist of some 10 donors, rather than a pool contributed to by more than 1,000 donations, as was by now the case where factor concentrates were produced domestically.

Although a clinical trial was conducted in the West of Scotland, John Watt (director of PFC) saw that great practical difficulties might arise if freeze-dried cryoprecipitate were to be produced by PFC alongside Factor 8 concentrate. For these reasons, and additionally because the unit in the Law Hospital making it was closed down, such a product never gained traction in Scotland (or the UK more generally) as it had done in other parts of Europe.

Commentary

Increasing the supply of plasma for fractionation could be achieved by:

  a. increasing the number of donations both of plasma (for instance by plasmapheresis) and of blood from which plasma could be separated (as to which the evidence before the Inquiry was that regional transfusion directors saw little difficulty in ensuring an increase);
  b. not wasting donations through unnecessary transfusions;
  c. reducing the number of units transfused to a patient at any one time;
  d. increasing the use of concentrated red blood cells;
  e. using single plasma packs; and
  f. using SAG-M.

Taken together, these measures would secure close to the amount of plasma which BPL on its own could handle when redeveloped.

It could – and should – also have been achieved by organising the blood services so that instead of being regionalised they had a national directorate which exercised executive control. Since this would have had both a national perspective and the ability to require its regional centres to provide enough plasma to achieve centralised functions such as blood product production, this alone would have probably ensured sufficient supplies of plasma to meet the demands of fractionation. The change of structure was what regional transfusion directors, and Dr Maycock, had been calling for. It was what eventually happened – but this was more than a decade later and far too late.[1677]

Of the measures set out at (a) to (f) above, all save using single plasma packs and SAG-M were practices which could and should have been adopted in the early 1970s at the latest: the better use of blood had been recognised as important for some time before then, but both changing the culture of overuse of blood, and increasing the practice of using the red cell component of blood rather than whole blood, were not emphasised. This was probably because of a misplaced fear that clinicians might see efforts to do this as impinging on their clinical freedoms, whereas the purpose would actually have been to secure better and safer treatment for patients.

The measures of using single plasma packs and SAG-M were taken in good time, after persistence in finding the technical developments that facilitated them.

What did the Government’s funding of half a million pounds achieve?

At the time that Dr David Owen told Parliament that he had authorised the expenditure of £500,000 to achieve self-sufficiency within two years, he understood that the estimates were such that 337,000 blood donations would need to be fractionated to satisfy them. He was given to understand that the production facilities were sufficient to produce what was needed (although this depended upon Scotland making a large contribution to total UK production, of which more later). Accordingly, the problem was seen as producing more plasma for fractionation and the DHSS allocated special finance to regional health authorities (“RHAs”) in 1975 in order to boost production of factor concentrate. Part of it was used centrally to provide additional equipment for BPL, and the rest was distributed to the regions for the specific purpose of increasing their supply of plasma to BPL.[1678]

After detailed discussions between the DHSS and the regional transfusion centres, the additional donations required to meet the self-sufficiency target were calculated. Five of the regions produced no fresh frozen plasma for factor concentrates at all.[1679] By contrast, Oxford and North East Thames were already producing large quantities.[1680]

The revised figures in the light of the discussions allowed for continuing use of cryoprecipitate, produced regionally. However, in some regions the effect of the proposed arrangements would be to halve the amount of cryoprecipitate which was then available.[1681] There was unease about this, not least because the demand for cryoprecipitate continued to grow throughout 1975 and 1976 in some regions.[1682]

As part of the allocation of the available funds, BPL also benefited. Arrangements were made to purchase additional laboratory equipment centrally, and for three Sharples centrifuges to be provided for BPL.[1683]

By 1976/77, the special allocations appear to have become subsumed as a part of the standard regional allocations, revised “in the usual way” to take account of cost increases.[1684] Expenditure dedicated to self-sufficiency cannot thus be tracked as easily. Expenditure for 1976/77 was set at £433,000 before cost adjustments, including revenue allocations.[1685]

There were early signs that the programme was successful: in April 1976 Dr Maycock thought that the target of 340,000 donor units was likely to be reached earlier than set. By October 1976, 90% of the plasma target had been achieved.[1686] It was exceeded in July 1977 according to Roland Moyle in response to a parliamentary question.[1687]

Was production capacity sufficient?

The ability of the UK fractionation centres to convert the plasma available into factor concentrates for therapy was dependent upon the premises and facilities available, the yield, the system of production used, the availability of labour, and the distribution of production between the three production units.[1688]

Production figures

Dr Lane became director designate in 1977, and director of BPL in 1978. He produced documents in a proof of evidence for the HIV litigation. These set out the amounts of concentrate produced and issued for “clinical use”.[1689] Dr Maycock, his predecessor, and Dr Lane were both responsible for compiling annual reports. BPL’s perspective on its own production figures is set out in the chart below.

The graph shows how BPL’s production of Factor 8 increased, with an increase from 1975 to 1978, level to 1980 and further increases in 1981 and 1983.

Figure 5. BPL Production of Factor 8[1690]

The production capacity of BPL was said in the documents discussed above to have been 14 million international units in 1975. The chart shows that little better than that was achieved (on BPL’s own figures) until 1980. The inadequacy of the premises was part of the reason for this. The premises at Elstree were owned by the Lister Institute until 1978. Production of factor concentrates and other blood fractions there was funded by the DHSS. However, the site was constrained. Dr Maycock said so, and strongly.[1691]

Dr Drummond Ellis was in charge of Factor 8 production at Elstree.[1692] In June 1976 he echoed the views which had been expressed by Dr Maycock and others in a minute, which examined whether production could increase to meet a target of 25 million international units per year. He said that: “Expansion to this level of production … would cause serious overcrowding of facilities, unless some additional building work [is] done … It should be noted that the existing AHF facility was not designed for the work being done and that it might be undesirable just to add extensions.”[1693]

Dr Lane was not yet involved in the management of BPL. However, from his later experience he commented that a limited expansion was possible without buildings being extended, but only as a stop-gap measure, and even this increase would require additional freezing and freeze drying capacity. There was very little further scope to manage this significant increase without additional expenditure.[1694]

The Trends Working Group, when it reported in 1977, concluded that considerable further investment in collecting, testing, processing and premises was required if self-sufficiency was to be achieved, and added “additional fractionation capacity is also needed, even allowing for some possible expansion of the Liberton plant’s output. The present UK capability is less than half of what we regard as essential. Additional major investment is, therefore, also needed for this.”[1695]

This built on a message that Dr Ellis had been conveying since January of that year. He had told the haemophilia centre directors that a figure of 14-15 million international units was the maximum capacity for Elstree with the present plant and building, and even that included a proportion made in Oxford. The figures which the Trends Working Group spoke of as needed would be at least double that. He was speaking purely of the English facilities. At that stage, SHHD suggested that Liberton had a capacity to make 60 million units of Factor 8 per year (ie four times the capacity of Elstree as it then was), although to do this would require some new capital equipment and money for extra running costs to include payment for staff to operate a 24 hour shift system of working.[1696]

In 1977, therefore, it was already clear (and had been for a while) that if the UK as a whole were to become self-sufficient then the Government would need to provide some further funds to Edinburgh to enable fuller use of the production capacities there, or to provide substantially improved, enlarged and updated production facilities for BPL, or most probably to provide some combination of both. It might have seemed that the first of these options would be easiest to achieve within a short time-scale – a point to which this Report will later return – though the third was likely to be the most practicable.

The chart above shows that the comment made by Roland Moyle to Parliament in June 1978, that BPL was already at its full capacity,[1697] was correct at the time.

Using some of the funding which had been provided by the half a million pound investment by Dr Owen, BPL’s fractionation facilities were extended slightly. Its report for the year ending July 1977 shows that the facilities had been extended to make it capable of producing 17.5 million international units. On its then site, that seems to have been the maximum achievable without further modification.

Dr Lane, the director in waiting of BPL, focussed on the view that BPL should be rebuilt. A large investment was required in order to adhere “to the Department of Health’s principle that the Health Service shall make all possible attempts to become self-sufficient.”[1698] He called for more centralised management of NBTS to coordinate plasma supply, but ultimately noted that BPL would reach its maximum accommodation by the year end, and described the present state of production as hazardous in any event due to the constraints of the laboratory conditions and workload.[1699]

In 1976 the Government had taken the decision that the premises used to produce factor concentrate in the UK should comply with the provisions of the Medicines Act 1968. They had been exempt from the rigour of these requirements by virtue of “Crown immunity”.[1700] A consequence was that by 1977, the point reached in this account, the Medicines Inspectorate were due soon to inspect the facilities at Elstree. Dr Maycock anticipated that the report would not be favourable.

To sum up thus far: by the mid 1970s, the premises in England, which were designed as laboratories and lacked the space, size, design and equipment to function as production plants, were being pressed into service to the limit of their capacity to produce blood products. This was to an extent which caused a potential danger to the public, or to those working at the facilities, because without significant modification or, ideally, replacement, production could not comply easily with then-current standards of good manufacturing practice. There were potential risks to safety. Given that the Minister of State for Health had identified that the English plants had reached full capacity, that the Trends Working Group had identified the extent of the shortfall, that Dr Lane had expressed forceful views, and given the imminence of what was expected to be a highly critical report from the Medicines Inspectorate, the fact that the inspection did not occur until April 1979 was little to the point: the writing on the wall about a continuation of production facilities in their then present form could hardly have been clearer.

A replacement facility, on a site large enough to contain it, was plainly going to be necessary – it was already needed – and it was difficult to avoid the conclusion that planning should start without delay.

The need to increase production capacity in England

In December 1977, Dr Maycock produced a report in which he said that the

“‘stretched’ capacity of BPL will be reached about the turn of the year. The experience of the past year suggests that thereafter the laboratory will continue to work in an atmosphere of uncertainty about future development … The present method of operation will become more difficult if the scale of fractionation grows. What is needed is a programme in which each region would be responsible for carrying out a planned growth pattern within a centrally coordinated plan for the NBTS in England and Wales. Without this, or something like it, it will be difficult to plan for the future of BPL and PF Lab.

Planning the future of BPL should not wait until the problems of PFC have been resolved … It would at least dispel the feeling of uncertainty at BPL if DHSS were to say whether or not it intends to secure its investment … in PFC at the expense of developing its own fractionation potential in NBTS”[1701]

Dr Lane insisted on Dr Maycock including that last sentence. He was concerned that the belief in the DHSS as to the capacity of Edinburgh, encouraged by what he thought to be over-optimism at PFC, might result in starving BPL of the investment in redevelopment which he considered it needed.[1702]

By the end of 1976, and the start of 1977, it was apparent that the fractionation capacity in England would need significant expansion. First, though, BPL had to face the problems, posed even to its continuing as it was, by the applications for product and manufacturer’s licences which it had had to make in March 1976, following the Minister’s decision that the NHS facilities should be treated in the same way as commercial pharmaceutical companies for the purposes of the Medicines Act.[1703]

To further complicate the picture, the Lister Institute ran BPL. It was struggling financially. In 1975 it sold premises it had occupied for several years in Chelsea. By 1978 it could no longer continue to run the Elstree enterprise, and had to sell the site it occupied there to the DHSS.

This was a reflection of what had become a time of high inflation and economic difficulty. Inflation had exceeded 20% in 1975; and GDP had declined in both 1973 and 1975. Unemployment had peaked in 1976 at 1.5 million.[1704] Public expenditure had increased markedly between 1972 and 1975, but the two succeeding years saw a sharp reversal in that trend, including the need for the Government to approach the International Monetary Fund for a loan. However, in turn, public expenditure reversed again in 1978-79.[1705]

In short, in 1977 money was exceptionally tight, but this did not mean that governmental policies already declared were not to be followed through. It nonetheless imposed constraints on the amounts of capital to be committed to them; however, one of the features of the arguments over self-sufficiency in the 1970s was the confident expectation that much or all of the capital invested in a project would be returned within a short time after its completion. The savings from no longer importing very expensive products from abroad appeared substantial.

Dr Diana Walford expressed her view of the period in this way, from memory: “funding, at the time, was exceptionally constrained … ‘the elephant in the room’ for all discussions, including the redevelopment of BPL and production of additional plasma for national self-sufficiency, was that funding from the Department’s budget for centrally-funded services, such as BPL, was inadequate and capital funding was especially hard to obtain.”[1706] Thus, when it became apparent in October 1977 that any further plasma provided to BPL for fractionation would have to be dealt with in “unsatisfactory accommodation which the Medicines Inspectorate would be likely to condemn” it was decided to have a meeting to consider future production problems generally.[1707]

The DHSS thought it would be a useful occasion to take stock of the difficulties and to crystallise the possibilities for future planning. It was recognised that there was continuing pressure for more Factor 8 concentrate to be used, but BPL had reached the limit of its present production capacity. The implication of the recommendations of the Trends Working Group was that there would need to be a substantial expansion of production. Dr Lane proposed a solution. He suggested building new facilities outside the present building, re-equipping the existing shell for other associated purposes. However, for the reasons just explained, the DHSS felt that it was clear that:

“the current constraints on expenditure and the relationship existing between the Department and NHS field authorities were not conducive to the successful implementation of radical, expensive solutions to blood products production problems … Progress would most probably be achieved by concentrating on what needed to be done at BPL and a phased redevelopment solution, such as that put forward by Dr Lane, seemed to be worthy of further examination. The need to expand blood products production, provided this was done on the basis of low-cost, selective development, was now being accepted by the Department, and the importance of maintaining a separate production unit for England and Wales and of not being wholly reliant on the Scottish PFC at Liberton had recently been affirmed.”[1708]

Cost was now a major factor hindering the achievement of self-sufficiency. Though much has been made in closing submissions[1709] and some evidence has been given[1710] of the constrained financial position of the UK during the 1970s, the material available to the Inquiry suggests that the picture was more nuanced. There were “sharp increases in both public expenditure and borrowing which occurred in the years from 1972 to 1975. Public expenditure had been propelled to its highest-ever level both in absolute (real) terms and as a share of output … The succeeding years 1976/7 and 1977/8 saw a sharp reversal in this trend with very sharp falls in public expenditure, in turn reversed in 1978/9.”[1711]

It might appear from this that, given sufficient will[1712] and forward planning, there were periods during the 1970s when public money was available for capital expenditure without the financial position of the UK precluding it.[1713] It should also be borne in mind that, as was often suggested at the time, commercial concentrates were more expensive to produce than were their domestic equivalents. Successful investment in production infrastructure was thus likely to produce a continuing revenue saving.[1714]

It is difficult to avoid a conclusion that as a matter of fact, too little was done too late. The need may initially have been focussed on the provision of more plasma to be made available from the regions to a greater extent than on expanded production facilities, but it was always about both.[1715] This was known to the DHSS. The opportunity was missed.

Stop-gap

The conclusion of the October 1977 meeting was that BPL should draw up a list of options for future developments keeping in mind the DHSS constraints.[1716] Accordingly, BPL produced a proposal designed to bridge the gap between the inadequate current plant and a new redeveloped BPL, and to double Factor 8 production over a four year period from 1978 to 1982. It was also intended to increase the pool size from 300 to 600 litres on grounds of efficiency.[1717] In his report on the year Dr Lane described the programme in this way:

“As its name implies, ‘Stop-Gap’ is a temporary measure to increase production, in order to go some way to meeting the recommendations of the Trends Working Party, while the future development of BPL and PF Lab is considered. The urgent need to start this deliberate planning cannot be overemphasised. Even if it were assumed that planning and building could be achieved in five years, the present BPL building would be 18 years old (in the sense that its planning and schedules of equipment were virtually completed in 1965, apart from subsequently imposed reductions) and is below the standards now regarded as necessary.”[1718]

The plan envisaged a programme of works reallocating space within the existing laboratory with some additional accommodation, upgrading certain equipment, modifying some fractionation techniques, introducing single donor bags for the supply of plasma to BPL and modest increases in staff and materials.

The plan was received positively at the DHSS. However, the proposals had to be reviewed and amended in the light of the cessation of the Lister Institute’s involvement in the running of BPL on 17 April 1978. Revised plans had to be submitted in December 1978.

When she gave evidence, Dr Walford agreed that it was or should have been known to all those involved with BPL in the second half of the 1970s: (1) that the demand for blood products was rising; (2) that it was or should have been obvious that BPL was outdated and too small, and that significant investment was required in order to be able to meet the rising demand; and (3) that the Government had determined that BPL should meet standards equivalent to those expected of commercial plants, notwithstanding Crown immunity, but it was or should have been obvious that BPL would not meet those standards.[1719]

This had also been obvious to her predecessor, Dr Waiter.[1720] It had therefore been known to the DHSS for some years.

The Scientific and Technical Committee advising BPL made it very clear in March 1979 that there was an urgent need for planning substantial additional capacity: one member who was a biomedical engineer said “he had pointed out to the Department some 12 months ago that it was improvident to expect the major BPL plant to continue to function much longer without major breakdown.”[1721]

David Smart, who had particularly valuable experience in the pharmaceutical industry, was deputed to produce a report looking at options for BPL. He favoured complete redevelopment which should show a “rapid and growing return on the investment with all the capital expenditure paid back in the first 15 months of full-scale operation”.[1722] The view of Dr Walford was that “he was spot on.”[1723]

The Medicines Inspectorate began its long-awaited inspection of BPL in April 1979. It concluded that planning for essential improvements should begin immediately; called for urgent action; and observed that the position was not of recent origin.[1724]

It had not yet concluded its inspection when the Scientific and Technical Committee of the Central Blood Laboratories Authority (“CBLA”) met in June. The Committee was briefed that serious deficiencies had, however, been found in practically all aspects of the laboratory examined so far, for example in documentation, quality control, environmental control, availability of pharmaceutical advice and in the scheme for training staff. “Although it might be some time before the Inspectors report would be available it was apparent that changes were needed at BPL”. Disquiet was again expressed by the Scientific and Technical Committee that nothing had yet been done to put the defects at BPL right.[1725] Urgent action was needed.

It was at this point that the Health Services Division of the DHSS[1726] became concerned that the costs of upgrading under the stop-gap proposals could result in a situation where the government would have to choose between upgrading on the one hand, and going ahead with the stop-gap on the other hand. The DHSS therefore said that no money would be available for either.[1727] This decision not to take a decision caused delay in achieving whichever option was eventually to be chosen.

Within a fortnight, the report was signed off. It was every bit as damning as had been predicted. The inspectors identified that BPL could not be readily adapted to large-scale manufacture; personnel had not had the opportunity to gain experience of modern large-scale production requirements in the pharmaceutical industry; production had outgrown the premises in which it was undertaken; and the laboratory was so short of space for cold storage, quarantine of raw materials, in-process materials and finished products, receipt and dispatch, packaging and warehousing generally that it was neither practical nor safe to increase throughput, even if the necessary production facilities had been available. “If this were a commercial operation we would have no hesitation in recommending that manufacture should cease until the facility was upgraded to a minimum acceptable level.” Notwithstanding that, its conclusion was that “as blood products are essential to the health and well-being of the nation and as alternative sources of supply are severely restricted, production at Elstree may continue provided that certain aspects of standards of production and control are improved immediately and that the planning of other essential improvements in these standards commences immediately with a view to very early implementation.”[1728]

On 7 September the Department’s Inspection Action Group concluded that the Inspectorate’s Report revealed that the production, control and other arrangements at BPL fell so short of the standards required of commercial establishments that had the BPL not enjoyed Crown privilege the Group would have felt bound to recommend immediate suspension of activities. It chose not to recommend this. It did so because it took the view that a) the NHS was heavily dependent on BPL’s blood products; b) existing production was understood to be barely adequate and a break in production was likely to interfere with essential supplies to patients; c) alternative supplies of all materials were not available; d) there was no guarantee that such alternative supplies of some products as were available were safer than those produced by BPL;[1729] and e) it was unaware of any weight of established evidence that BPL’s products had caused harm to patients.[1730]

In September 1979 there was another meeting of the Scientific and Technical Committee for the Central Blood Laboratories. The members did not support commercial development, and recommended that no time should be lost in planning a completely new NHS plant, even if it was felt that other possibilities had to be examined concurrently. David Smart’s estimate from his experience was that a new plant could be achieved in three years.[1731]

John Harley later wrote a memo which recorded his recollection of the Committee meeting. He said that he had tried without success to discourage Professor Patrick Mollison (the chair of the Scientific and Technical Committee) from initiating planning because no decision had yet been taken to build a new fractionation plant at BPL and there was no budget even for the preparation of a development plan.[1732]

In December 1979 a submission was made to ministers on the future of BPL.[1733] They were asked to consider and agree: (a) that there should be a short-term upgrade to BPL, accepting that it would fall short of the full recommendations of the Medicines Inspectorate; (b) a decision in principle to rebuild BPL but without commitment to method or timing; and (c) further exploration of options for rebuilding within the NHS or in collaboration with industry.[1734] In January 1980 Dr Gerard Vaughan, then Minister of State for Health, agreed to the first and third of those proposals.[1735] There was thus no commitment even to a decision in principle to rebuild BPL.

In February 1980 Dr Lane was authorised to proceed with the stop-gap project. The DHSS anticipated that the sum of £750,000 would be available for the capital development of BPL in 1980/1981.[1736] Dr Lane’s view was that even relatively short-term redevelopment projects would cost more, and possibly as much as £2-2.5 million over two to three years.[1737]

In April 1980 ministers asked to see whether there might be scope for further savings in the stop-gap measures. They were concerned that there might be no justification to incur short-term costs on upgrading if BPL was to be rebuilt. The minister’s position was thus understood to be that expenditure should be limited to matters of necessity.[1738] Accordingly, in May 1980 Dr Lane sent a revised programme of works for BPL to John Harley at the DHSS. The proposal was named MARP01 to distinguish it from the previous stop-gap proposals.[1739] He was then asked at a meeting in June to put forward proposals on the basis that only £500,000 was available to spend on short-term developments at BPL.[1740]

Eventually, on 29 July 1980, Dr Vaughan approved a submission that had been put to him proposing expenditure of £1.3 million over two years. He rejected options involving lesser expenditure.[1741] It was not, however, until 2 February 1981 that Dr Lane was given formal authority to proceed to tender with MARP01: this was because the regional health authority which was the client for the purposes of the redevelopment work had insisted on considering the proposals and changes to those proposals before authorising the works.[1742]

Ministers decided to defer the eventual decision on whether to build a new laboratory within the NHS until other possibilities had been investigated.[1743] The investigation showed that “no British firm had the necessary expertise in the manufacture of blood products and only foreign firms had approached the Department, with a view to processing British plasma on the existing basis and, in addition, to processing and re-exporting plasma from overseas.”[1744] According to Dr Walford, by May the Deputy CMO, Dr Harris, was “exasperated”:

“Q: Because he thought ministers were taking the wrong course?

A: There was a clear course of action which was necessary at that time. It was barn-door obvious, and yet somehow or other, it wasn’t possible to make progress, and so what Dr Harris was doing there was actually saying ‘We must get on’; he was encouraging that the final submissions should go to ministers; and that … ministers should be encouraged to understand that at the end of the day they took responsibility if something went wrong.

Q: Is it right to understand we’re still talking about what should be done in the short-term here?

A: Absolutely in terms of keeping things going for the short-term, but there had to be a commitment to rebuild because without that clearly the entire edifice was not going to be fit for purpose.”[1745]

Long-term redevelopment

Long-term upgrading was considered in September 1980 by the Scientific and Technical Committee.[1746] It was reported to the Committee that officials had been instructed by ministers to investigate the possibility of collaborating with private industry. Initial discussions had taken place and a paper was due to be sent to ministers setting out the pros and cons of such an arrangement.[1747]

Worries were expressed about various aspects of this, including the fear that it would be “impossible to prevent the contamination of the UK material with imported hepatitis viruses.”[1748] A collaboration with Beecham was considered.

A submission went to ministers on 14 November 1980. In that document Peter Wormald of the DHSS said that it was important “we should reach a decision in principle now: that is we should either reject the commercial option or decide that we intend to implement it subject to satisfactory negotiations. A decision merely to continue negotiations, without commitment in principle, would prolong uncertainty, encourage continued argument and further damage morale at BPL. And it would not be fair to Beecham.”[1749]

This time, there was a decision. Dr Vaughan announced that there would be “no commercial management of Blood Products Laboratory: modernisation programme already underway.”[1750]

In December 1980, more than eighteen months after the Medicines Inspectorate report, and a number of years after it had become clear that a new fractionation plant was needed (whoever was to run it), ministers instructed officials to begin work on planning and designing a new BPL.[1751] Up to this point, Dr Walford in her evidence agreed with the view contemporaneously expressed by Dr Peter Dunnill.[1752] She said that “it had been a totally chaotic, protracted and difficult process, and needlessly so”. She remembered her frustration about “innumerable and repetitive meetings which generally ended without moving matters forward to any appreciable extent.”[1753]

Although planning and designing what might be a new BPL had been authorised, there had as yet been no decision in principle to actually get on and build it. Indeed, by October, ten months further on, a minister asked the Policy Steering Group for the redevelopment of BPL to give consideration to the use of an existing factory which could be adapted at a relatively low cost to meet requirements.[1754] No such building could be found.[1755]

A lack of ministerial decision, ministerial uncertainty, and the absence of long-term management arrangements now complicated matters further.[1756] By March 1982 approval was sought for planning to proceed on the assumption that BPL would process all plasma for England and Wales since PFC did not have additional capacity without introducing three shifts.[1757]

In July 1982, the Policy Steering Group proposed a new laboratory be built at BPL at a cost of £21.1 million spread over the years 1982/83 to 1985/86.[1758] The Treasury then approved the redevelopment in principle, and agreed that the project should be “fast-tracked.”[1759] Final approval was given for redevelopment in the sum of £22.6 million on 11 November 1982.[1760]

Finally, on 1 December 1982, a special health authority was established to take responsibility for the management of BPL, PFL in Oxford and the Blood Reference Group Laboratory. This provided a management structure which had been thought lacking and was thought necessary for any redeveloped premises.[1761]

The first programme of works had been limited by financial constraints, identified broadly above. One of the aims of these stop-gap works had been to allow for production to be expanded: this was despite the Medicines Inspectorate recommending in their report that no expansion should take place until the concerns identified in that report had been addressed.[1762] Despite the decision to waive Crown immunity, some flexibility was permitted for NHS facilities. Although manufacturing licences were not mandated by statute for NHS manufacturing facilities, the policy was that they should nonetheless comply with the standards which the Medicines Division would require commercial firms to observe.[1763]

In a later report of 4 February 1986, Dr Lane said that the final cost of MARP01 was £2.8 million. He thought that a significant part of this inflated figure was “absorbed by repetitive design and interruptions in implementation” and by “simple procedural problems” such as, for instance, the role of North West Thames regional health authority as the nominal client, despite the project being funded centrally by the Treasury through the BPL budget.[1764]

Redevelopment of BPL

Construction finally began in May 1983. The original completion date was to be July 1985.[1765] In a written parliamentary answer about this, Kenneth Clarke, the Minister of State for Health, stated that “the Government decided in 1982 that self-sufficiency in all blood products … should be achieved, and that the Blood Products Laboratory at Elstree should be rebuilt to provide the capacity to manufacture the blood product needs of the National Health Service in England and Wales.” It was then said to be on time for completion in January 1986.[1766]

Kenneth Clarke repeated this in February 1985: “We decided in 1982 that this country should become self-sufficient in blood products.”[1767] Lord Owen reacted to the opening words, since in his view (correctly) there had never been a time since his announcement of the policy of self-sufficiency when it had not been government policy. He told this Inquiry that it was “extremely odd” and “as if the previous Labour Government programme had never existed.”[1768] This perhaps serves most to emphasise the fact that little seemed to have been achieved towards implementing that policy since 1977, such that it may have seemed to many to have “dropped off the radar”.

It was eventually finished in summer 1987, at a cost of £60 million with an additional £7 million for essential extras. The estimated date for achieving self-sufficiency slipped from 1986 to 1989.[1769]

Documents from 1986 record ministers saying that the project had been “badly handled”[1770] and a “shambles”.[1771] Lord Norman Fowler told the Inquiry that the overspend on the project “reflects poorly on the Department”.[1772] His evidence, and the contemporaneous documents, suggest that what contributed to delays and expense included: underestimation of the initial project during the tender stage;[1773] the complexity of the plant and the build;[1774] the various re-designs which took place during the project, in part due to new technology, and in part due to the fast-track “design and build” approach; poor management by the CBLA; and a lack of close oversight by the DHSS.[1775] Nonetheless, Lord Fowler also pointed to some favourable features: the building was probably completed two to three years earlier than would have been the case with “conventional methods”;[1776] an incoming chief executive of the CBLA, who was not implicated in the project’s history, is reported to have told officials that on the basis of a lifetime’s experience in the pharmaceutical industry the building represented value for money.[1777] He noted that ministers invested money at a very considerable level, especially given the financial pressures on health spending: it was given priority at the expense of other pressing needs.[1778]

Commentary

The principal reason why self-sufficiency was not achieved in England and Wales until after 1990 was that it was only then that BPL in its redeveloped form came fully on stream. It is highly likely that if this had been achieved significantly earlier there would have been less infection, in particular with HIV. Opportunities were missed. As early as 1967 Dr Rosemary Biggs had anticipated a need to ensure that the production of factor concentrates in the UK would be sufficient to avoid a need to purchase expensive commercial concentrates from abroad. She had identified the risk of insufficiency, and she was a leading clinician, at a leading treatment centre. Nor, as set out above, was she in any sense a lone voice. Scotland took note of the same risk, and broadly achieved self-sufficiency as a result. Although there was sufficient product in the rest of the UK (whether it was cryoprecipitate or concentrate, or both) for treatment without the need to import from abroad until around 1973, and arguably for a brief while thereafter, the writing on the wall was clear. A desire to promote home treatment, coupled with the increasing lifespan of people with haemophilia once cryoprecipitate began to be used after 1966, and evidence of a growing demand for concentrate from centres such as the Oxford Haemophilia Centre, showed that BPL was ever likely to be called upon to produce more. It needed to be put in a position to do so, both by sufficient supplies of plasma and adequate, safe, modern production facilities.

A number of factors hindered the achievement of this. There was no overall, nor any coherent, plan to provide factor replacement therapies to treat clotting disorders. The supply of plasma for fractionation was initially inadequate. This was largely because of the structure of the NBTS, which had no power to direct that supply, and because of the way that finance for that supply was regionally determined. However, it was the inability of BPL to process the quantities that would be necessary to meet the 1977 estimates, let alone the increased demands thereafter, which proved the main problem.

The production capabilities of BPL and PFL combined were insufficient largely because the premises were being pressed into service as production units when they were never designed for that purpose. Insufficient thought was given by Government to meeting the future needs of haemophilia therapy in England, Wales and Northern Ireland (although there was such forward planning in relation to Scotland); and nothing was done to meet the urgent need for revamping or replacing BPL which had been apparent throughout most of the 1970s. Though BPL was upgraded in 1972, Dr Maycock’s observations about BPL as it then was speak of its inadequacy. It is plain from what he and others were saying in the first half of the 1970s that far better facilities needed to be provided at BPL, and this could not be done without a substantial redevelopment of some kind.

Civil servants in the DHSS were well aware of these matters. The paper written by Thomas Dutton in 1976 was insightful in identifying the problems of the previous few years. It described devising and effectively managing a balanced programme for the preparation and distribution of blood components including clotting factors as “probably the most urgent task facing the NBTS”. It spoke of a need to fractionate something approaching one million blood donations annually, substantially more than the amount which BPL was then capable of achieving, and for which the necessary plasma was not being made available by regional health authorities. It was plain from his paper that one difficulty was achieving the supply of sufficient plasma to BPL to enable it to fractionate sufficient Factor 8 concentrate; but it is also clear that it was recognised within the DHSS that the production facilities were unlikely to be adequate.[1779]

Dr Owen’s initiative resulted in a greater supply of plasma, and helped to assure that such a supply would be able to continue. However, the improvement in supply rapidly demonstrated the bottleneck in the system in England, Wales and Northern Ireland – the lack of facilities to produce sufficient Factor 8 concentrates domestically: by 1978 both BPL and PFL were recognised by the government to be “working at present full capacity”,[1780] and it must have been realised a year earlier, when the 1975 target of 340,000 donor units was met,[1781] how little headroom was left before hitting this ceiling.

There was no planning in time to meet this anticipated need. Dr Walford was correct when she said that Dr Lane was right to describe the difficulties in meeting the problems of the NBTS as having been “accentuated by the growth in requirement during the 1970s of plasma products, an exercise in production maintained without adequate planning, co-ordination or finance from the outset.”[1782] She agreed that there appeared to be no overarching plan in order to achieve self-sufficiency. Yet self-sufficiency was government policy from the end of 1974.

Regional health authorities were described by Dr Walford as “fiefdoms”;[1783] they decided how best to allocate their resources to the needs of their region as they perceived them. Minority needs, such as those of people with haemophilia, suffered by comparison with many of the other matters on which a health authority might wish to spend its money. The system gave little incentive to a regional health authority to collect plasma from within its region, freeze it quickly, and send it to BPL for fractionation. There was no guarantee of any return upon the expense incurred in doing so.

The decision was taken in 1974/75 that self-sufficiency was important. It was to be achieved. Peter Wormald, Under-Secretary in the DHSS, understood that the reasons for the policy were “that imported products were very expensive and in general were considered to carry much higher risk of cross-infections, particularly hepatitis”.[1784] It was important for these reasons, but also because the WHO had recommended it; in part because it was appreciated as a matter of principle that plasma sourced in a region in which some diseases were endemic would risk transmitting those diseases to the country to which it, and any product made from it, was exported.[1785] It was achieved (far) too late.

On the evidence, there was always likely to be sufficient cryoprecipitate capable of being produced to meet the demand for that form of haemophilia therapy. However, clinicians, and no doubt many patients too, wished for the added advantages which factor concentrates offered, particularly in terms of home therapy.[1786] Belief that factor concentrates were needed, rather than highly desirable, in part fuelled the drive to produce such concentrates domestically. The first main problem, before David Owen’s initiative took effect, was securing a sufficient supply of plasma through a system which was uncoordinated, and which depended on the extent to which plasma could be recovered from donations made of whole blood. The more that blood could be separated into the components in which it was really needed, the more plasma was likely to be available for fractionation. This, however, depended upon clinicians who used transfusion in their practice being prepared to accept concentrated red blood cells rather than whole blood. Blood was collected as whole blood, when there was room for some plasmapheresis; and the provision of plasmapheresis in England was slow to be initiated. Much of the need to import factor concentrates (and the additional disease they brought with them) was the result of an overenthusiasm amongst treating clinicians for the advantages of factor concentrate, without paying enough regard to the downside risks of exposing patients to the viruses that they knew those concentrates were likely to contain (hepatitis) and those they might have supposed to be there (unknown viruses which could do damage and which were more prevalent in the countries from which the product came). Clinicians accepted that the risk was inevitable, rather than seeking to reduce it. Categorising the risk in this way made the use of commercial products more acceptable.

The second major problem was that if plasma was supplied in the quantities needed for a sufficiency of concentrate, the production facilities needed to manufacture that sufficiency from that supply were simply lacking.

It is difficult to differ from the view expressed by Dr Walford that the process of deciding to develop BPL had been “totally chaotic, protracted and difficult” and “needlessly so”.[1787] The delay between April 1979, when the Medicines Inspectorate conducted their inspection and condemned the premises in which BPL operated as unsuitable for a pharmaceutical company, and the middle of May 1983, when construction began, was just over four years. Dr Walford was right to describe the delays as “unconscionable”.[1788] The central problem that led to this delay was ministers being unable to make a decision, in part because of the consequential capital expense. But it was obvious that significant expense would be incurred whether the plant was operated by a British pharmaceutical company or by BPL.[1789] Moreover, to look at April 1979 is to look at the wrong date as a starting point. It should have been no later than the end of 1976, when it was already known that the premises in which BPL operated were unsatisfactory, unsuitable, liable to be condemned by the Medicines Inspectorate, and simply could not produce enough products to meet the government’s, the profession’s and the patients’ requirements. The failure to make full use of the Scottish facility contributed significantly to the problems with supply.[1790]

On the available information, for periods during the 1970s the public finances were significantly stretched. However, planning for a new facility should have begun: it was plainly going to be needed if the policy announced by Dr Owen were to be achieved. It was government policy. It never ceased being government policy. The failure to achieve it is a sorry tale.

The Department of Health and Social Care (“DHSC”) was right to submit that in 1973 the forecasts for the factor concentrates that might be needed had not yet been made.[1791] However, as the facts set out above show, a potential need had already been signalled (both in Scotland, and by Dr Biggs); was put forward by the MRC Working Party in 1974,[1792] and by 1976, as Thomas Dutton’s paper shows, it was clear that redeveloped premises would be required not only because of their own shortcomings, described above, but to meet what was by then to be anticipated as demand grew. By October 1977, if not before, the need for redevelopment had become irrefutable.

The DHSC then points to the withdrawal of the Lister Institute from the running of BPL, the constraints of the site BPL occupied, and the time it took to purchase land from the Lister Institute to enable expansion.[1793] It has asked the Inquiry to consider delays caused by the effects of the Medicines Inspectorate report, the stop-gap improvements (and consideration of how far and at what expense they could be carried out if there was to be a development at public expense), the possibilities that private industry might be involved in a rebuilt plant, and the financial constraints upon the health service and the state more generally. These, it is submitted, explain the delays.

The points made in the evidence of Dr Walford, as set out above; the length of time it took to actually make a decision to proceed (when it was, as Dr Walford put it, “barn-door obvious” what that decision should be), and the length of time the issue had been brewing – BPL as a production facility was condemned by the Medicines Inspectorate in 1979, and the deterioration to a state in which it could simply not go on as it was had been seen to be coming by Dr Maycock in the early 1970s, and must have been visible to observers – compel the conclusion that despite these submissions the findings set out above are right.

Was the potential of PFC unused, and if so why was this?

In wartime planning, two main plants were to process plasma for the UK: one south, one north.[1794] Some idea of the proposed split of production between the central processing plants, as they had by then become, was given in 1965. Dr Cumming was an enthusiastic advocate of forward planning.[1795] The Ministry of Health agreed with the SHHD, informed by Dr Cumming’s views, that PFC would fractionate plasma for the NHS using plasma collected by the Newcastle, Leeds, Manchester and Liverpool regions, producing Antihaemophilic Globulin (“AHG”) from 10,000 bottles of blood.[1796] It was supposed that the Scottish facility would supply at least one third of the UK need for AHG, ensuring that the extra costs of Scotland providing for England would be recoverable from central government.[1797]

The SHHD suggested that there should be a management policy committee covering both BPL and PFC to ensure that a common production policy would apply to both laboratories.[1798] This eventually happened in 1973 when a Joint Steering Committee[1799] was set up to consider common policies on, amongst other things, which of BPL and PFC was to fractionate what products and in what amounts, the provision of plasma to PFC from south of the border, which country was to supply which region, and the standardisation of blood products. It was precipitated by the grant of product licences to two commercial firms “which might entail large sums being spent by NHS authorities on these products.”[1800] The Scottish CMO, Sir John Brotherson, wrote from Scotland offering further support for production capabilities south of the border, saying “It has been represented to me by clinical colleagues that this is an initial step which will lead to the purchasing of human donor blood in this country, with consequent erosion of the voluntary donor principle. Clearly apart from its social unacceptability and the consternation that this would cause in our Blood Transfusion Services, the cost implications for the Health Service could be very substantial.”[1801]

By 9 May 1968 it had been decided that 1,000 litres of fresh or fresh frozen plasma could be processed weekly by PFC, using pools of eight litres. This size of pool could be handled conveniently and quickly and was supported by clinicians using the material.[1802] It was reckoned that if the north of England utilised PFC fully, Edinburgh might have to cope with 2,500-3,000 litres per week.[1803] Importantly, “Mr. Watt considered that, as there was only a small possibility of expansion at Elstree, Edinburgh should be prepared to cope with the requirements of a larger part of England than originally intended so that the total growth of requirement could be handled by the two Blood Products Units. Dr Maycock considered this approach to long-term planning was the only correct procedure.”[1804]

The SHHD’s proposal that the new blood products unit “should operate on the continuous flow principle and should be designed to a workload of 1,500 litres of plasma per week; but it will be capable of adaptation, without substantial structural alterations, to operate at levels up to 3,000 litres per week should this become necessary” was approved in principle in October 1968.[1805]

The following month, it became clear that PFC’s construction would cost more than the sum approved, and that despite a planned 1,500 litre capacity, it would for this financial reason only be equipped initially to operate at 1,000 litres per week. This would reduce the estimated cost.[1806]

It had already been agreed that Scotland could make a charge on the English regions it supplied with AHG once it was operational. In December 1968, presumably in light of the same financial pressures, the SHHD sought a contribution from the DHSS for the capital cost of Liberton in return for not including any element of capital charge in the amounts due when blood products were later to be supplied. Based on the supposed extent of the supply of finished product to English regions, England’s share of the capital cost would come to more than one third. In the event, the DHSS eventually invested £400,000 in PFC towards the capital costs,[1807] which would amount to somewhere between a quarter and a third of the costs as then estimated.

In March 1969, in line with the approach thus far, Dr Maycock indicated that he expected one third of the plasma from England would be processed at Liberton.[1808]

The delays which had already occurred were such that at the end of December 1969, when a final cost estimate was approved, it was known to be unlikely that PFC would begin production before the end of 1973.[1809] Construction began in 1971. It continued until 1974. It then progressively, though slowly, came on stream.[1810]

As noted earlier in this report, the Expert Group on the Treatment of Haemophilia emphasised the importance of taking a UK view of provision for haemophilia therapy: it considered such a unitary view “essential”.[1811]

John Watt was the director of PFC whilst it was in development and for some time after. His view was that PFC was designed to handle a minimum of 1,500 litres of plasma per week working on a 46-week year but with capacity to increase to at least 3,000 litres per week.[1812] At the minimum level of working, it was expected that 1,000 litres of plasma would come from Scotland each week, and the remaining 500 litres would come from England. John Watt observed: “How this will work in practice is difficult to define at the present time[1813] since there is no plasma available in England to send to Scotland. Elstree is, for the present, able to absorb all available plasma from the English Blood Transfusion Service. This is a matter for some concern since it affects the economic viability of this Centre.”[1814]

In his evidence, Dr Peter Foster was able to explain that, in the light of John Watt’s experience around the world as a consultant, a fractionation centre should support a population of at least 15 million to be economically viable. “So the population of Scotland was too small, in his opinion, for this to be economically viable, and that is why he saw England as being essential to the future survival of PFC, as well as obviously to the benefit of England.”[1815]

As PFC came more fully on stream, in 1976, the SHHD asked the DHSS when PFC would be provided with English plasma for fractionation.[1816] The reply was, essentially, “Not yet.” The programme to secure plasma for the production of Factor 8 blood products in England and Wales was underway in order to meet Dr Owen’s pledge to achieve self-sufficiency, but an increase in plasma supply was not anticipated “for some months yet.” BPL was said to be able to process all supplies in England and Wales for at least another year though the situation would be kept under review.[1817]

In 1976, nonetheless, ministers continued to stress the need for collaboration between Scotland and England in manufacturing blood products.[1818]

As mentioned above – repeated here to give a coherent account of what happened in respect of PFC – a haemophilia centre directors’ meeting was told in January 1977 that the maximum capacity of BPL at Elstree with its existing facilities was 14-15 million international units of Factor 8. Yet it appeared that 40-50 million units would be required for self-sufficiency. Dr George McDonald, from Glasgow, told the meeting that PFC had the capacity to make 60 million units a year but to do this would require finance for additional equipment and running costs including staff payments to enable a 24-hour shift work system. The minutes record that agreement in principle had been reached between the DHSS and SHHD and “plans had been made to divert plasma from south of the border to Liberton when Mr Watt was ready to receive it. It was planned that the factor VIII from this plasma would return to Centres south of the Border.”[1819]

There is evidence that PFC then received some plasma from England in 1977. By July 1977, the amount received totalled 20,000 litres. But that is where it stopped. In July at a meeting of the directors of SNBTS, PFC and SHHD, it was agreed that “a system acceptable both to BTS England and Wales and to SNBTS would have to be evolved” before further supplies were arranged, and the need for this “should be borne in mind by those presently negotiating the supply of plasma from England to PFC.”[1820]

By January 1978 no further plasma had yet been supplied from England. It was again agreed that a detailed plan was needed before large scale processing of plasma from England and Wales should begin in Scotland.[1821] The meeting agreed that Scotland should provide for its own supply of fractions before undertaking work for the NBTS in England.[1822]

Thus, by the start of 1978, the position could be summarised in this way: during the Second World War and immediately afterwards it had been seen as necessary that there should be two large plants – one south and one north – between them covering the demands of the UK as a whole for blood products.[1823] After AHG was first developed in the late 1950s, it was foreseen that a demand for it as a concentrated form of therapy was likely to grow. Accordingly, planning for new premises in Scotland included provision for making it. Scotland was to share UK production of this with the existing facilities at Elstree. Those premises were in need of some modernisation, and their capacity was limited. Planning proceeded on a UK-wide basis: the northern unit, being in Edinburgh, was intended to serve not only Scotland but also the north of England; and Elstree might not be capable of producing enough. It was on that basis that the DHSS paid a substantial proportion of the capital cost of PFC development.

Throughout the early 1970s it had been said, repeatedly, at meetings of different committees, by the DHSS, and by the minister concerned, that PFC and BPL should together supply the UK with the necessary blood products. The commercial viability of the newly built PFC was questionable if it were only to serve the population of Scotland. Matters even went as far as some plasma being supplied from England to Scotland for fractionation. But what never happened was a final agreement on detail: as to the amount of plasma which PFC would be asked to fractionate; how much of that would be fresh frozen plasma from which factor concentrates could be made, and how much would be time-expired plasma; the arrangement under which northern regions of England would, out of their own budgets, finance the supply of plasma to a plant outside the scope of the NBTS for return as concentrate;[1824] and the level at which Scotland would be reimbursed for the expense of making provision otherwise free of charge to the English regions.[1825]

It is also plain from the earlier part of this chapter that BPL did not have the capacity by 1977 to produce the amounts of Factor 8 concentrate demanded by clinicians in England. Those demands could not be satisfied in the immediate future without reliance either on importing products from America, or making use of PFC in Liberton.

Despite this history, and what might seem its logical endpoint (namely the use of PFC), a U-turn was signalled on 22 August 1977.

On that date a joint meeting between the DHSS and SHHD was held to discuss mutual problems. Dr Maycock had written to the DHSS to the effect that 25,000 litres of plasma per year (500 litres per week) would be available for fractionation by PFC, and it was thought that supply of this could begin in the autumn.[1826] The product required would be plasma protein fraction.[1827] The minutes record that:

“Dr Lane, who was to succeed Mr Maycock in about 12 months time said that it was his intention to concentrate on the production of Factor VIII at the BPL. The latter and the laboratory at Oxford, were both funded by DHSS and it would be wrong, in his view, to send plasma from Regional Transfusion Centres in England to the PFC, if this had the effect of leaving spare capacity at Elstree and meant service charges having to be paid. In his view this would have the effect of duplicating costs. He envisaged that only time expired plasma would be sent to the PFC and was unwilling to enter into any long term agreement to have regular quantities of plasma fractionated in Edinburgh.”[1828]

It was pointed out that any fundamental departure at this stage from what had already been agreed about the fractionation by PFC of plasma from England could seriously jeopardise the working arrangements at PFC, and in particular could raise questions about the need to introduce shift working. While PFC could function with or without plasma from England, a sustained commitment to processing English plasma required agreement on regular quantities of plasma, providing continuity of production over a period of some years.[1829]

John Watt saw the implications of what had been said at the meeting, in particular by Dr Lane. He recognised it as a reversal of the pre-1967 agreement that PFC would receive plasma from the five northern regions of the NBTS. He realised that Dr Lane considered Scotland should not produce any factor concentrates for England. Dr Lane saw PFC as being useful to England only for recovering albumin from time-expired plasma, and saw even this only as a short-term arrangement which would be reviewed on an annual basis.[1830]

It had apparently been said in the meeting that there had never been any intention for NBTS northern regions to be delegated to supply plasma to Scotland. In light of the history set out above, much of which John Watt repeated in his indignant letter, he regarded this as simply incorrect. The documents produced to the Inquiry show that he was right on this.

He went on to note that

“it is my own belief, supported by the Working Group on Trends in Demand for Blood Products, that the UK requires to process plasma at a rate of about 10 000 litres per week. That is about 2.75 million donations of plasma per year. There is capacity potentially available for this purpose in the country as a whole and, adopting the spirit of the first joint meeting, this does not require any major rebuilding in Scotland or in England.”

He commented that the views expressed were “surprising since they followed on the decisions of the first meeting of the group that it was the intention of the group to consider, plan and act on problems for the processing of plasma on a UK basis and that the two main centres would proceed on an equal status and not on the basis of a master/servant relationship.”[1831]

The problem of securing an agreement acceptable to the workforce if asked to run a 24-hour shift system (which would be necessary if plasma from England was to be processed by PFC to any considerable extent) led to a letter from the SHHD, in which a possible solution was proposed.[1832] The letter appears to have provoked an internal memo within the DHSS:

“even if all the difficulties in shift working at Liberton could be overcome tomorrow, it would not be regarded as sensible policy to put all our eggs in the Scottish basket as the planners appear to have originally intended. We must concentrate much more of our attention on building up the capacity of Elstree so that in normal times it would provide us in England and Wales with our full requirements of the key products identified by the Trends Working Party. (Albumins and Factor VIII) without the necessity of dependence on Liberton … I am bound to admit that it is difficult to see how the Scots could function effectively at Liberton on anything like full capacity without guaranteed supplies of plasma from England and Wales. And if by some means this problem of input could be resolved, there would still remain the difficulty of possible over-production of some at least, of the blood products. What may be needed is a complete re-appraisal of their sources of supply and outlets for production, including a possible export market. But this is for the Scots, not ourselves, to institute. We are attempting our own review exercise for England and Wales, as I have said, based on the realisation that a fully integrated ‘UK’ approach to the fractionation of blood plasma is not a practical proposition, and this is causing us headaches enough. However, it is not going to be easy to disentangle ourselves from the implied moral (and actual financial) commitment that our predecessors undertook in connection with the building of Liberton.”[1833]

This memo cannot pass without comment. The underlying assumption in the memo was that development at Elstree, rather than utilisation of the existing plant at PFC, was what was needed. The suggestion that “all our eggs” would be put in the Scottish basket is a gross overstatement: the proposal had actually been for an equitable division between the two units, one south and one north, for the whole of the UK. No reasons are given for “the realisation” that a fully integrated UK approach was not practical. Yet the author recognises that if his proposal were accepted it would make life exceptionally difficult for PFC, and that he is proposing an action which reneges on a commitment. Since Roland Moyle would say the following summer in Parliament that BPL was working at full capacity,[1834] it might appear that the policy goal of self-sufficiency was being sacrificed on the altar of “England for the English” and “Scotland for the Scottish”.

Since, about this time, it had become apparent that Elstree was in need of complete overhaul or replacement, and might face the condemnation of the Medicines Inspectorate, it might be that the author was motivated by a desire to see a bigger and better BPL built, and feared that might not happen if it were only going to receive plasma from the southern two thirds of England. That ultimately should have been a decision for ministers. A surprising aspect of the change from a joint UK approach in which Scottish participation was essential is that there appears to have been no ministerial involvement. This is despite what might be seen as a U-turn on policy, the abandonment of a “UK approach” which had been endorsed by ministers previously, and the pursuit of a policy which would make much of an existing plant in Scotland redundant despite the investment which the UK as a whole had made in its construction. Views of this type, and those of Dr Lane, in effect sounded the death knell for a contribution from the Edinburgh plant towards a goal of self-sufficiency. Though on its own PFC may not have been able to achieve it, a very substantial contribution could have been made.

This approach was misguided. It proved harmful. It was determined without direct ministerial involvement. It prevented the UK-wide achievement of self-sufficiency.

Relations between Dr Lane and John Watt were part of the history that led to this. They were never good. Dr Lane belittled the claims made by John Watt for the production capabilities of PFC and for its yields.[1835] John Watt (and SNBTS generally) thought that those in England had underestimated the likely demand. They would have known that Dr Lane’s approach was Anglo-centric, and focussed upon achieving a bigger and better BPL, which he was to lead as director. They must, also, have considered that the change of director from Dr Maycock to Dr Lane caused a change in policy toward the contribution Scotland could make to the UK.

Matters might have rested there. There was nonetheless further consideration of the possibility that PFC might make a contribution to the supply of factor concentrates in the UK.[1836] On 20 June 1979, John Harley and Thomas Dutton from the DHSS visited SHHD and Liberton, meeting a number of people from both. John Harley reported to the Under-Secretary, Peter Wormald, that Liberton had spare capacity, and recommended that the question of England’s use of that capacity should be re-opened. He recorded that there would be difficulties and suggested how they should be tackled, beginning by asking the Scientific and Technical Committee to investigate problems relating to the quality of materials and products and to determine standards acceptable to both England and Scotland. Peter Wormald asked him to proceed. He recalled:

“There appear to have been some follow-up discussions, but I was not copied into the papers and I do not know what action, if any, resulted. It is possible that the discussions took a back seat in September when the Medicines Inspectors’ report on BPL was presented. However, I remained keen to get a return on our investment in Liberton. Greater point and urgency was given to consideration of this (a) by the Medicines Inspectors’ report on BPL, and because of the need to plan the long term capacity of BPL or whatever was to succeed it; and (b) because of the desirability of reducing purchases of commercial blood products.”[1837]

He noted that “although no decision had been yet on the redevelopment of BPL, there was a need to determine its future capacity; which in turn raised the issue of what part Liberton would play in meeting capacity.”[1838]

In September 1980 Dr Walford visited PFC, with colleagues, and understood from her visit that “Liberton had a substantial capacity for expansion” notwithstanding “substantial” staffing difficulties.[1839]

In 1981 a trial of shift working was conducted at Edinburgh.[1840]

By October 1981, some 90% of the blood products used in Scotland were manufactured in Edinburgh, making Scotland virtually self-sufficient.[1841] John Watt is recorded as having nonetheless stressed that the plant was considerably underutilised and could process blood to serve a population of around 25 million.[1842] However, this would require some capital investment. New warehouse and storage facilities would be required. So, too, would a shift system if there were to be continuous working in Edinburgh. The Edinburgh plant had been designed to accommodate continuous flow production. This was John Watt’s design. Dr Foster told the Inquiry how he had improved it further.[1843]

This theoretical potential depended in practice on how the factory coped with a 24-hour continuous process. It already operated on a process known as “continuous small volume mixing” (“CSVM”) but it was not until October 1981 that there was a test study to show those designing the BPL plant how CSVM worked in practice. This had not happened before, it seems, because of a shortage of plasma for continuous CSVM, and a lack of agreement as to the terms upon which those working at the plant would be employed. As it happened, instead of examining three-shift working (three eight-hour shifts),[1844] the trial involved two shifts, each of 12 hours.[1845]

The plant was observed in operation for a week by representatives of BPL. David Wesley[1846] (from BPL) produced a balanced, detailed report on this feasibility exercise. Critically, it concluded that the CSVM system was capable of continuous operation for periods of at least 120 hours.[1847] With the exception of quarantine storage, through to inspection and dispatch, all support sections appeared to have the capacity to handle the product from 1,000 litres of time-expired plasma per day. However:

“Storage space throughout the whole of PFC was at a premium and increased production as achieved during week 2 and 3[1848] will probably place an even greater strain on what is already an overloaded system. The production of Stable Plasma Solution, from plasma storage through to the release of the finished material, relies heavily on each link in the chain functioning correctly. Because of limited in-process storage capacity what may appear at first to be a minor problem, overcome without undue difficulty under normal conditions, may develop because of ‘knock-on effects’ into a much more serious situation when dealing with the products of continuous CSVM operation. The capability of PFC to withstand this type of problem was not challenged during the Feasibility Exercise.”[1849]

A Policy Steering Group meeting followed on 18 December. Mr Hibbert reported on shift working, having attended as an observer during the experiment. Though he did not expect the findings of the exercise to prove conclusively that continuous working would overcome the short-comings of the existing system, the experiment had shown that the equipment could function on such a basis. However, his general impressions were that PFC was capable of improvement: its layout was not ideal, and its output might be increased if the present system were changed. There were doubts that it was more cost effective than BPL. However, he noted that PFC hoped eventually to serve the northern English regions. The group thought it essential to obtain a firm commitment from the SHHD as to the amount of plasma from England which PFC could fractionate.[1850]

There is a telling comparison of the views expressed in 1990 as to what the shift working experiment actually showed. Professor John Cash suggested it provided evidence that Scotland had “very substantial spare capacity” to assist with fractionation of plasma from England and Wales.[1851] He considered that it was a grave error of judgement not to use it.[1852]

Dr Lane, for his part, considered that the results were inconclusive: the trial had been short, and Factor 8 had not been produced by it since the trial involved time-expired plasma which would have no Factor 8 activity.[1853] He thought it unrepresentative of what would occur in practice, and that it was not sustainable for PFC to operate on a 24 hour basis without further investment in facilities, plant and equipment.[1854]

John Harley of the DHSS and Angus MacPherson of the SHHD had however communicated contemporaneously about the results of the trial. It had concluded satisfactorily but (said Angus MacPherson, agreeing in this respect with Dr Lane though without the same dismissive tone) “PFC, Liberton, could process substantial quantities of English plasma only if further ancillary facilities can be provided, and … more land will be needed for the building required.” He gave a figure of around £6-7 million being necessary with an appropriate proportion of the capital cost for the additional facilities to be funded by the DHSS. The necessary building works could be completed in approximately two and a half years, and agreement through the Whitley Council would be needed before staff at PFC could be expected to work in shifts regularly.[1855]

On 1 March 1982, the Policy Steering Group convened once more and agreed that as PFC would not be able to fractionate any substantial quantity of English plasma without the introduction of a three shift working system, plans for the redevelopment of BPL should not proceed on the assumption that it, PFC, would process plasma for England and Wales. Instead, PFC should process plasma for Scotland and Northern Ireland.[1856]

It appears that it was not appreciated that of the £6-7 million identified by Angus MacPherson, a substantial proportion would be needed to rectify the shortcomings which the Medicines Inspectorate had identified in PFC’s processes.[1857] Their report on PFC was by no means as critical as it had been in respect of BPL. Nonetheless there were faults which undoubtedly required to be put right, and these would involve the expenditure of considerable funds.

It was the estimate of £6-7 million that put a stop to any further discussions with Scotland.[1858] It may be that a paper prepared in January 1982[1859] is the best source now available for a decision which appears to have been confirmed in September 1982.[1860] It said that “Given that the present BPL has to be redeveloped … it would be more expensive to build a smaller BPL (£18.6 million) and invest £4 million in PFC than to build a BPL capable of achieving self-sufficiency (£21.03 million). In any case, in the view of DHSS officials, it remains highly doubtful whether a shift-working agreement can be negotiated with staff at PFC without serious repercussions on pay of other groups in the NHS and the Industrial Civil Service.”[1861]

In oral evidence to the Penrose Inquiry, Dr Foster later commented that “the decision ultimately was to build the large plant for the whole of England and Wales and not to send plasma to Scotland, and that was justified on some costings that I think, looking at now, could be seen to be quite wrong.”[1862] He amplified this in his written statement to this Inquiry.[1863] If PFC processed plasma to produce Factor 8 for England, then less would need to be spent on BPL, since BPL would then process smaller quantities of plasma, and since smaller quantities required less manufacturing space and equipment, it would cost less to build.[1864] He pointed out that Dr Cash’s figure for the additional costs of utilising PFC to process plasma from England for Factor 8 production was £1.2 million[1865] and that there would be a saving of more than that sum under the provisional estimates for the building of BPL.[1866]
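The arithmetic underlying these competing positions can be sketched as follows. The figures are only those quoted above (the January 1982 paper’s costings and Dr Cash’s estimate); the comparison is illustrative and is not a finding as to what the true costs would have been.

\[
\begin{aligned}
\text{Split option (January 1982 paper): } & \text{£18.6m (smaller BPL)} + \text{£4m (PFC)} = \text{£22.6m} \\
\text{Single large BPL: } & \text{£21.03m} \\
\text{Apparent extra cost of the split option: } & \text{£22.6m} - \text{£21.03m} = \text{£1.57m}
\end{aligned}
\]

On Dr Foster’s account, however, the additional cost of using PFC was nearer Dr Cash’s £1.2 million than the £4 million assumed in the paper, and the corresponding saving on a smaller BPL exceeded that sum; on those figures the split option would have been the cheaper of the two.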

Commentary

In summary, after 1977 planning was not conducted on a UK-wide basis. Instead, the two principal production units, at Elstree and Liberton, were each seen as primarily serving the interests of the country in which they were located, rather than the joint interests of both. Plans which had been developed, and funded, in the mid 1960s envisaging two equal units of production were no longer pursued. Dr Lane saw PFC as a rival, bidding for funds that were needed to renew BPL,[1867] rather than as a partner in an enterprise to ensure a UK-wide sufficiency of product. The costs of renewing BPL were considerable. There was no alternative but to incur them, given the contents of the Medicines Inspectorate’s report and the policy of achieving self-sufficiency. This in turn led to the Government considering whether the production unit should be run by Beechams rather than by the NHS. The case for developing a larger unit at BPL[1868] would not have been helped if, at relatively modest expense, the CSVM system operated in Edinburgh could be pressed into service to meet a third or more of the demand for products in England and Wales. Whether or not this was part of the reason for his doing so, Dr Lane expressed scepticism over the ability of PFC to produce product in the quantities it said it could. As almost the other side of the coin, it is to be inferred that John Watt was frustrated[1869] that, as a result of delays in reaching agreement as to the supply of fresh frozen plasma from England in sufficient amount to utilise PFC as had been intended, its operation was rendered less efficient than it should have been. John Watt and Dr Cash thus wished to emphasise the advantages they saw PFC as having over BPL, particularly those of the scale of production and the length of time it would take to provide the same levels of production from a new build in England.

A contributory cause of the failure to utilise Scotland was the failure in England and Wales to obtain sufficient fresh plasma before Dr Owen’s initiatives in 1975 to 1977. Since all the available plasma from the north of England was being recruited to production at BPL, none was left available for PFC. The way in which the regions were funded to provide for blood products may thus be seen both as a cause of the continued importation of commercial concentrate into the UK[1870] and as a contributory cause of the failure to utilise PFC to bridge the gap between supply and demand. In challenging economic conditions, the increasing use of commercial product itself led to a lack of incentive to plan immediately for a replacement for BPL, because of the extent of capital commitment that would involve.[1871] The state of BPL was then officially recognised as so poor as to require urgent expenditure: but rather than this being seen as a need which must urgently be met, it in turn led to the Government delaying any long-term plan in order to consider the extent to which pharmaceutical companies might be able to play a part in future factor concentrate provision for the UK. Delays which were described as “unconscionable” and compounded by indecision then simply made the problem worse.

A further, sad, feature of this story is that the focus was largely on making more product available. It was not, as it should have been, on making more of a safer product available.

If self-sufficiency had been achieved, the views generally expressed to the Inquiry, with which I agree, were that a considerable number of cases of AIDS would have been avoided, and, it follows, a significant number of deaths.[1872]

Self-sufficiency was achieved with relative ease in Scotland. The only exception to this was when it came to producing hepatitis-free heat-treated product. BPL got to that goal first – though Scotland was ahead when it came to heat-treating all NHS product to be supplied. These themes, and the sad events which remind any reader that self-sufficiency did not by itself provide total protection against the risks of AIDS and other blood-borne diseases, are picked up in greater detail in the next chapter on Viral Inactivation.

Self-sufficiency: Northern Ireland

Between 1972 and 1998, Northern Ireland was governed by direct rule from Westminster. This included responsibility for blood collection and other health matters.

In 1973, following a major restructuring, health care in Northern Ireland was administered by four health and social services boards representing the east, north, west and south, covering a population of approximately 1.5 million. The Northern Ireland Blood Transfusion Centre, which was part of the Northern Ireland Blood Transfusion Service, was based in Belfast. Its director from 1968 to 1980 was Colonel T E Field. He was replaced in June 1980 by Dr Morris McClelland.

There was one haemophilia centre. It was based at the Royal Victoria Hospital, Belfast. It was designated a reference centre in September 1981 by the Northern Ireland Office. The director of the Centre between 1978 and 1999 was Dr Elizabeth Mayne. The hospital itself was part of the Eastern Health and Social Services Board.

Northern Ireland was too small to produce factor concentrate itself, though it was able to make cryoprecipitate from local blood donors.[1873]

Accordingly, haemophilia patients in Northern Ireland who were treated with concentrate received NHS products manufactured elsewhere in the UK, or imported commercial products. It received what were described by Dr Mayne as limited quantities of concentrate from Elstree and Oxford.[1874] Because its population was too small to support a fractionation centre manufacturing concentrate of its own, Northern Ireland was never itself self-sufficient in blood products. Rather, it has to be viewed as part of the greater entity which is the UK.

From 1980, thought began to be given to obtaining factor concentrate from Edinburgh. This reflected the view of Dr Morris McClelland that there were capacity issues at BPL, but apparent spare capacity at PFC.[1875] In December 1980, the health departments serving the four nations of the UK discussed the role that Edinburgh might play in supplying Northern Ireland with blood products. This had impetus because under the pro-rata distribution system introduced from 1 April 1981, Northern Ireland would not be entitled to any factor concentrate: none of the plasma which it collected was sent to BPL.

This graph shows that, from 1976 to 1983, the majority of Factor 8 used in Northern Ireland was commercial, and that NHS product usage then increased.

Figure 6. Total Northern Ireland Consumption of Factor 8 (NHS & Commercial)[1876]

Following a pattern similar to that in England and Wales, cryoprecipitate usage dropped off after 1976 as factor concentrate use increased. However, the amount of NHS concentrate used in Northern Ireland was minimal compared to the amount of commercial concentrate until around 1984, when, for a period of time, the use of NHS concentrate began to, and then did, overtake the use of commercial concentrate. The amount of factor concentrates made from voluntary donors is reflected in the chart from 1982/3 onward.

BPL ceased sending its factor concentrates to Northern Ireland with effect from 1 April 1981. PFC did not begin to fractionate Northern Ireland’s plasma until some eighteen months later. There were a number of reasons for this delay. The main one was that the plasma in Northern Ireland was not tested for the presence of Hepatitis B by using an RIA test, whereas plasma in Scotland was. Accordingly, an unacceptable risk of transmission of Hepatitis B would have been run if plasma from Northern Ireland which had been inadequately tested for the presence of hepatitis were to be mixed, during processing at Edinburgh, with plasma from Scotland which had been more reliably tested. Next, plasma supply from Northern Ireland needed to be increased. Existing refrigeration facilities for plasma were not adequate for freezing the amounts of plasma which would need to be transported (in a refrigerated truck, and quickly, from Northern Ireland to Edinburgh), and there were some other logistical and administrative difficulties to be overcome. Though the exact date is not clear, it appears that it was not until the autumn of 1982 that supplies began from PFC.[1877]

The consequence of Northern Irish plasma being sent to PFC, and Northern Ireland in return receiving Factor 8 concentrate manufactured at Liberton, is that the quantity of NHS factor concentrate used in Northern Ireland increased dramatically.[1878]

Final observations

Seeking to ensure that the UK did not need to rely on blood products from abroad in order to treat haemophilia was the aim of many of those involved – clinicians who raised the problem early, transfusion directors who arranged for supplies for the UK effort at local expense, fractionators who prided themselves on their professionalism, a minister and some civil servants who were far-sighted, the Haemophilia Society which raised the gap between policy objective and actions – and yet ...

The account in this chapter is haunting as to what might have been. If the DHSS had been more curious about Scottish forward planning in the 1960s, and had echoed it in making provision for England, Wales and Northern Ireland; if they had heeded the full range estimated by the Medical Research Council Working Party in 1974 as well as the increased estimates later coming from the Trends Working Party; if they had in 1976 faced up to the reality that the Medicines Inspectorate were almost certain to condemn BPL as it was, and had grasped the implications of their minister declaring by mid 1978 that BPL and PFL were both working to capacity; if they had briefed Dr Owen in 1974 that what he was being asked to agree to – an increase in the flow of plasma – would not be enough, so that the Haemophilia Society would not have had to try to tell him a year later … then matters might (indeed should) have turned out differently.

If, too, the delays in taking a decision to rebuild BPL had not been “unconscionable” – to take the years from 1978 until 1982 to do so, all the while offering commercial products riskier than those which should have been supplied to patients in need of treatment, was a disgrace – then much disease might have been avoided.

Each and all of these opportunities were missed. The consequence was that commercial concentrates carrying HIV, as well as non-A non-B Hepatitis, were supplied in greater quantities from the US to patients in England, Wales and Northern Ireland[1879] than would otherwise have been the case, not just until 1985 but, so far as Hepatitis C was concerned, until the late 1980s, for, with the exception of Scotland, the UK was still not self-sufficient.

It is not that some in the DHSS did not see the problem and do what they could to draw the attention of others to it, but, overall, the DHSS bears a greater part of responsibility for the failures to achieve self-sufficiency than do others. Ministerial indecision contributed to delay in the late 1970s and early 1980s, but (so far as can now be determined, given that many of the submissions to ministers have disappeared) ministers may have been inadequately briefed; and the documents that survive suggest that Roland Moyle was not consulted about the decision to abandon the UK-wide approach that had hitherto been adopted. However, ministers bear some responsibility too for the systems that contributed to what occurred. These include regional financing, leading to a lack of central control over what was a National Blood Transfusion Service in name only; the licensing of commercial concentrates, which led to demands for more and more concentrate which were difficult to resist; and a lack of central guidance on the importance of giving blood transfusions only when necessary. Those responsible for such systems might wish to reflect that they have the luxury of drawing them up at some leisure in conference rooms and Cabinets, but their consequences are often felt in conditions of urgency and emergency at the patient’s bedside, in wards and hospitals where there is little time to pause and reflect.

What happened is now a long time ago (although its consequences still resonate today in the lives of people who were infected and of their families). Some of the voices from those times still ring down the ages, and set the right note on which to end this chapter.

First, Dr Cumming (writing in 1975): “those inadequacies which exist in modern developed countries are entirely the fault of the organisation of the Service … One of the important features of a successful voluntary blood donor system is forward planning.”[1880]

Dr Smith (writing in 2020, about 1975):

“Although national self-sufficiency in blood products was strongly endorsed by the WHO [PRSE0003476], no-one could claim that the principle, and its consequent responsibilities, were embraced as energetically in England as in Scotland … Coming from Edinburgh to Oxford in 1975, I was shocked by this lack of appetite for self-sufficiency at a national level … We were constantly being reminded that it was not DH practice to spend large sums on ‘speculations’.”[1881]

Dr Snape’s final words to the Inquiry sum up what happened. He said he had tried to focus on what BPL/PFL did from 1970 onwards, what was achieved and “what stopped us doing more?” Then his final two paragraphs:

“Now I know that my evidence is the only oral testimony from BPL/PFL. If I had to summarise what we achieved in just a few words, it would be too little, too late.

I know that BPL/PFL staff worked tirelessly, achieved a great deal, but that influences external to BPL/PFL stopped us doing more and doing it sooner. And for that I’m profoundly sorry.”[1882]

Lord Owen (giving oral evidence):

“We all wanted, for everything. You have to choose, and of course it is difficult. But we did it … The Department made their observations, I made the decision, they loyally followed it, they chased, they guarded, they put it in things, and that limited amount of money achieved a substantial way towards self-sufficiency. It didn’t achieve it but, against a rising trend of demand, it overachieved what we expected. So I don’t think there is any reason for anybody who was involved in the Department during that period to hang their head in shame about this at all.”[1883]

Many reading this chapter will think that, contrary to his views, he was less well served than he should have been, since from such documents as remain available it appears that his civil servants did not inform him of the MRC Working Party estimate based on Dr Biggs’ paper,[1884] nor make him aware of the creaking nature of the production facilities even if a sufficient supply of plasma were achieved. Had he been so informed, at a time early in his (short) period of office, the outcome might have been different.

3.14 Viral Inactivation

This chapter looks at whether heat-treated products could and should have come earlier in the UK. It