How Blood Transfusions Played a Role in Hep C
Last updated: October 2021
Blood transfusions: They’re a life-saving option in modern healthcare. However, the journey to that point has spanned hundreds of years, and it has had its challenges. One of these challenges is the possible transfer of viruses, bacteria, and parasites from one person (the donor) to another (the recipient). Bloodborne diseases (such as those passed on through blood transfusions in the past) can also be transmitted through drug use and sometimes through sex, giving rise to stigmas about these diseases.
If you live with hep C, you’ve likely also faced these stigmas. My hope is understanding the history of blood transfusions will give you insight, and maybe even a talking point, against the negativity.
The history of blood transfusions
We sometimes take what we know in healthcare for granted. It’s strange to think that at one point we didn’t even realize our body circulated, or moved, our blood around. But that discovery, plus the development of tools such as intravenous (IV) syringes, was the beginning of our ability to perform blood transfusions.1 Even with this knowledge and these tools, many questions remained after the first blood transfusions:
"Will blood transfusions work on all sick people? How much blood is too much to take, or too much to give? Can people be given blood transfusions from animals, or does it have to be person to person? Will the transfusion be direct, or will the blood need to be stored (and how)?" And so on...
Some of the first documented blood transfusions were in the 1600s. During this experimental time, transfusions were done animal to animal or animal to person. Some transfusions worked, but many did not (imagine someone’s body trying to use the blood of a lamb!). Some of these first transfusions were so unsuccessful that they created controversy and were eventually banned in some places.1,2
It wouldn’t be until nearly 200 years later, in the early 1800s, that the first successful human-to-human blood transfusion was documented. It saved the life of a young mother who was losing blood after giving birth.1
Why blood type matters
That success, and others that followed, came down to pure luck. The reason? Blood types weren’t discovered until the early 1900s, and the Rh factor even later (Rh is the “positive” or “negative” label that goes with your blood type, as in “O positive” or “A negative”). Matching blood types and Rh factors during transfusions is critical to preventing severe reactions and death. Since those early days, discoveries have continued to be made, and the process of cleaning and storing blood has developed.
Screening blood for disease
Yet it wasn’t until the 1980s and 1990s that screening donors and their blood became more common. Tests were developed to determine whether certain viruses, such as HIV, hep C, and others, were present in a donor’s blood.1,3
Before these safeguards were put in place, the risk of bloodborne transmission through a transfusion was higher. This is one possible reason Baby Boomers are affected by hep C more than other generations (with an estimated 5x higher risk).4 Fortunately, today the risk of getting hep C through a blood transfusion in the US is fairly low.
Regardless of how you developed hep C, the history of blood transfusions is part of the hep C story. It’s part of learning more about the condition, its transmission, and preventing infections in the future.