Alarming Study Tries to Automate Labeling of Patients with Opioid Use Disorder
By Crystal Lindell
An alarming new study attempts to automate the process of labeling chronic pain patients with Opioid Use Disorder (OUD) by using a computer to scan doctors’ clinical notes.
A team of researchers analyzed medical records for over 8,000 patients with chronic pain, and then used an automated process to scan clinical notes from doctors, patient demographics, and diagnostic codes. The automated process was then compared to whether patients had already been given a diagnostic code for OUD.
The researchers claim that their automated approach outperformed diagnostic codes when it came to finding patients with OUD. The codes are a key part of healthcare and are used by doctors not only to make a diagnosis, but to get reimbursed by insurers for treating patients.
The authors say the diagnostic codes are “unreliable and underused,” and claim that their automated approach will do a better job predicting which patients are at risk for OUD and which ones already have it.
“This automated data extraction technique may facilitate earlier identification of people at risk for and who are experiencing problematic opioid use, and create new opportunities for studying long-term sequelae of opioid pain management,” wrote lead author Alvin Jeffery, PhD, an Assistant Professor in the Department of Biomedical Informatics at Vanderbilt University Medical Center.
Jeffery and his colleagues say chronic pain patients treated with opioids are “at high risk of developing an opioid use disorder,” and cite a single study estimating the risk is as high as 18%. Most research puts the probability much lower, at about 1%.
There are a number of other alarming things about this research.
First and foremost, using any sort of automated process to label patients with OUD is incredibly dangerous, especially if that process flags more patients than doctors already do.
Also, the researchers used the “Addictions Behaviors Checklist” to determine if patients have OUD. Unfortunately, that checklist is known for lumping in a lot of patients who simply have untreated or under-treated pain.
For example, one of the items on the checklist is “patient running out of medications early” – which means anyone who isn’t being prescribed enough pain medication could qualify as having OUD.
Another criterion on the list is “patient expressing concern about future availability of narcotics” – a normal thing to be worried about when opioid shortages are widespread and opioid-phobia is rampant in the medical community.
Other red flag terms they search for in doctors’ notes are “hoard,” “stash,” “left over” and “storing.” This overlooks the fact that prescription opioids can be difficult to get, which leads many patients to keep leftover medication just in case they need it in the future. A recent PNN survey found that 32% of patients hoarded unused opioids.
Once a patient gets labeled with OUD, it can quite literally ruin their life by making it more difficult for them to get their pain adequately treated. That doesn’t just apply to patients with chronic pain. If a surgical patient experiencing post-op pain (as nearly all do) has “possible OUD” in their chart, doctors are much less likely to prescribe opioid pain medication.
I genuinely worry that we are reaching a point where computers and artificial intelligence will be used en masse to label patients with damaging mental health diagnoses like OUD. And if that takes root, there won’t be any way for patients to counter the diagnosis.
In fact, one of the biggest problems in healthcare is that there is no right to due process. Once you are given a medical verdict, so to speak, you are often stuck with it.
We should all be worried that these types of automated diagnostic tools will also be expanded beyond opioid users to label other patients with stigmatizing mental health conditions that impact the quality of the medical care they receive going forward.
At this point, I’m not sure what patients can even do to stop this from happening, but my hope is that bringing more awareness to the issue will at least slow its progression.
I personally would judge any doctor who relies on an automated process to give a patient a dangerous label like OUD, even if such a process is mandated by the hospital they work for, or even the government.
I hope that doctors will have the moral fortitude to push back on these types of things. Although if the medical community’s recent history with opioids is any indication, I’m not convinced most of them will stand up for their patients.