One Explanation for the Delay in Treatment of Nonwhite Patients with COVID-19 (And What Educators Can Learn from It)

Estimated reading time: 5 minutes, 2 seconds

“A retrospective analysis of over 7,000 patients with COVID-19 found that pulse oximeter devices — tools that measure oxygen levels in the blood and that are used in virtually every U.S. hospital — overestimated blood oxygen levels in non-White patients. The inaccuracy made these patients appear healthier than they were and delayed recognition of their eligibility for specific COVID-19 medications recommended by the Centers for Disease Control and Prevention.” (Source).

My husband forwarded me the article from Hopkins that I am quoting above. It reminds me of many of the biases we know exist in technology. We know from Joy Buolamwini’s research, for example, that facial recognition is less accurate on darker faces.

I don’t have the details, honestly, about exactly why this happened or how it could have been prevented. But it seems like a trend, doesn’t it? Some tool is deemed useful and necessary for some function, and it turns out that it doesn’t perform that function equally well for some groups of people. Often minorities. Which implies that the tool was never adequately tested on different groups before the decision to use it was made. This has happened in different contexts in medicine. And it’s also true in different educational situations.

Think about a simple example: any type of exam and the estimate of how much time someone would need to finish it. There is often no differentiation between a native and a non-native speaker in how long it would take them to read the questions, let alone respond! Accommodations are often made for people with conditions like ADHD, but these are not always nuanced. And there are other, more difficult-to-document reasons why an exam would not be an equitable way to assess learning. Of course.

Today, in a keynote to an audience of academic learning tutors, I presented a theory of oppression, and someone asked how students who are required to present opposing views in an essay could find opposing views to it. I remember I was stumped for a minute, and at first I compared theories of social justice to meritocratic theories. Then it hit me. Who said that the only or best way to write an essay is to look for “opposing” views, to “debate”, when we could look at “alternative perspectives in dialogue” instead? We could easily show how different perspectives on social justice exist without being in opposition to each other. Who decided that the essay balancing argument and counterargument is the be-all and end-all of early university education?

I know the analogy with the pulse oximeters is not perfect. There’s no comparison in terms of people’s health and actual life or death situations with the examples I give here.

But it is an example of how a particular “norm” came about and then turned out not to be a norm at all, and how certain groups were harmed by it.

I recently heard a podcast about a book that argues that averages can be harmful. The book is called The End of Average: How We Succeed in a World That Values Sameness by Todd Rose. The podcast was so good!

The tag line is a bit odd. I didn’t read the book, but I feel like the tag line should be “How to Survive in a World that Mistakenly Assumes Sameness When There is Difference”, you know?

With this pulse oximeter device, I assume the mistake was that the device produced inaccurate readings for Asian, Hispanic and Black patients. Whether this related to skin color or something else, I bet there was a point in time where, on average, the device worked, and nobody checked whether it worked differently for different ethnic groups or genders or such. I can’t know for sure how they tested it, but this latest development implies some “sameness” was assumed when it was not in fact accurate to assume so.

Interestingly, the research is not saying that Asian, Hispanic or Black people have different oxygen rates, but that the DEVICE measures their oxygen rates inaccurately. This is important, because it does not mean that our internal bodies function differently because of our race or ethnicity, but rather that the devices humans made discriminate against us by skin color (or such), or that the people who approve these devices are not careful about checking their efficacy or accuracy on people with different skin. I don’t know how the device works, or if this is a reasonable question for medical engineers, but it sounds like it might be?

You know, I went and found the full study, and it cited 3 sources (one of them from 1991!) that mention inaccuracies of pulse oximetry for measuring oxygen levels for some races, so I don’t understand why medical practitioners continued to rely on it, especially during COVID, when there were warning signs. This has cost lives. The study says “While pulse oximetry has become a fundamental tool in diagnosis, triage, and management decisions in the acute care setting, the device’s lack of accuracy in certain populations has not been adequately investigated or addressed, although it has been recognized for several decades”. And that is just not acceptable. The study shows that “the race and ethnicity–based discrepancy of pulse oximetry exposes a fundamental flaw in the acquisition rather than interpretation of data”. So basically, it is a machine error that humans decided to ignore, not a human (interpretation) error. It isn’t the only reason outcomes were worse for minority patients, but it is one that could have been avoided, given that data already existed on its inaccuracy, and there are alternative, more accurate ways of measuring arterial blood gases! Or, at the very least, practitioners could adjust how they interpret readings and assume that a borderline reading for patients of certain races may reflect a worse underlying value.
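To make the “averages hide differences” point concrete, here is a minimal, purely illustrative sketch in Python. The group names, the +2-point overestimate, and the 88%/92% cutoffs are all numbers I made up for illustration; they are not from the study. The idea is just that a device can look accurate when you pool everyone’s errors together, while one subgroup’s dangerously low readings get systematically missed.

```python
import random

# Illustrative sketch only: made-up numbers, not data from the study.
# Point: a device can look fine "on average" while systematically
# overestimating readings for one subgroup, hiding patients who need care.

random.seed(0)

def simulate(n, bias):
    """Simulate n patients with true oxygen saturation between 86% and 96%,
    measured by a device that adds `bias` plus small random noise."""
    missed = 0     # truly below the treatment cutoff, but the reading looks fine
    errors = []
    for _ in range(n):
        true_spo2 = random.uniform(86, 96)
        reading = true_spo2 + bias + random.gauss(0, 1)
        errors.append(reading - true_spo2)
        if true_spo2 < 88 and reading >= 92:   # hypothetical eligibility cutoff
            missed += 1
    return sum(errors) / n, missed

# Hypothetical: no bias for group A, a +2-point overestimate for group B.
avg_err_a, missed_a = simulate(5000, bias=0.0)
avg_err_b, missed_b = simulate(5000, bias=2.0)

pooled = (avg_err_a + avg_err_b) / 2
print(f"Pooled average error: {pooled:+.2f} points (looks small)")
print(f"Group A: avg error {avg_err_a:+.2f}, low readings missed: {missed_a}")
print(f"Group B: avg error {avg_err_b:+.2f}, low readings missed: {missed_b}")
```

If you run it, the pooled error looks modest, but essentially all the missed low readings fall in the biased group, which is roughly the shape of the problem the study describes.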

OK, so maybe I just wanted to write about the medical article. The better edu analogies will come later!

Image of a dark-skinned hand using a pulse oximeter, from Pixabay.
