Physicians aim to benefit from advances in data and artificial intelligence
Imagine how helpful it would be to a physician if a nonverbal patient could somehow communicate their pain level. Or if that 200-page electronic health record could be quickly mined for just the lab results.
Both of these things are already happening, and more innovations like them are on the way, according to the panelists at a presentation at the HIMSS23 Conference in Chicago.
Don Woodlock, head of Healthcare Solutions at InterSystems, led a discussion on how technology is changing care delivery.
Philip Daffas, CEO and managing director of PainChek, talked about how AI is helping caregivers understand the pain levels of nonverbal patients. “The AI tool analyzes facial expressions, the care person observes them, and then creates a score that is documented into the EHR,” says Daffas. “It allows the caregiver to assess their pain, administer controls, and then reassess how it is working.”
Daffas pointed out that the AI component is not about taking away jobs or trying to outperform a doctor, but about providing information to the physician or nurse in ways not always humanly possible. For example, the AI tool analyzes facial expressions in a three-second scan.
“Can a human do it? Yes, if they are really well trained,” says Daffas. “The challenge is to pick up the minor expressions not visible to the naked eye, up to nine different ones at a time. Once the AI is trained properly, you know you have a reliable data set that can help take a pain assessment.”
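The workflow Daffas describes, per-expression intensities combined into a single documented pain score, can be sketched roughly as follows. PainChek's actual model, expression set, and scoring formula are proprietary; every name, weight, and threshold below is invented purely for illustration.

```python
# Hypothetical sketch of turning detected facial-expression intensities
# into a 0-10 pain score that gets documented into the EHR.
# All expression names and the averaging formula are invented.

def pain_score(action_units: dict[str, float]) -> int:
    """Map per-expression intensities (0.0-1.0) to a 0-10 pain score."""
    # Up to nine micro-expressions may be detected at once; here we
    # simply average whatever was observed and scale to 0-10.
    if not action_units:
        return 0
    return min(10, round(sum(action_units.values()) / len(action_units) * 10))

# Example observation from a three-second facial analysis
observed = {"brow_lower": 0.8, "eye_tighten": 0.6, "lip_raise": 0.4}
record = {"assessment": "pain", "score": pain_score(observed)}
print(record)  # → {'assessment': 'pain', 'score': 6}
```

The caregiver would then act on the documented score and rerun the assessment after administering pain controls, matching the assess-treat-reassess loop Daffas describes.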
Initially developed for nonverbal dementia patients, the technology is being expanded to pre-verbal infants, Daffas says, helping doctors assess the pain levels of children who can’t yet communicate.
This AI application can assist physicians in accurately assessing pain levels, reduce reliance on pain-killing drugs, and improve patient comfort.
Jay Nakashima, executive director of eHealth Exchange, spoke about how data sharing and interoperability can help physicians understand the patient in front of them. In a care setting, when a patient is asked what medications they take, the response might be “a red one, a blue one, and a green one,” an answer not particularly helpful to a doctor. But when a doctor can see not only what meds were prescribed, but also which prescriptions were actually picked up, the physician gets a fuller picture of the patient’s total health.
As data exchange becomes more standardized, more data will follow the patient regardless of where they are, and that data will be more useful and accessible to the doctor. For example, Nakashima points out that instead of a doctor having to sift through 200 pages of a medical record, the lab results or a medication list can be quickly pulled out.
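Standards like HL7 FHIR are what make that kind of targeted retrieval possible: because lab results are tagged as `Observation` resources with a `laboratory` category, software can filter them out of a large record bundle directly. The toy bundle below is heavily simplified, and real FHIR resources carry far more detail, but it shows the idea.

```python
# Minimal sketch of pulling just the lab results out of a FHIR-style
# record bundle instead of reading 200 pages. Toy data; real FHIR
# Observation resources are much richer than this.

def lab_results(bundle: dict) -> list[dict]:
    """Return only Observation entries categorized as laboratory."""
    results = []
    for entry in bundle.get("entry", []):
        res = entry.get("resource", {})
        if res.get("resourceType") != "Observation":
            continue  # skip notes, prescriptions, imaging, etc.
        codes = {c.get("coding", [{}])[0].get("code")
                 for c in res.get("category", [])}
        if "laboratory" in codes:
            results.append(res)
    return results

bundle = {
    "entry": [
        {"resource": {"resourceType": "Observation",
                      "category": [{"coding": [{"code": "laboratory"}]}],
                      "code": {"text": "Hemoglobin"},
                      "valueQuantity": {"value": 13.2, "unit": "g/dL"}}},
        {"resource": {"resourceType": "DocumentReference"}},  # a clinical note
    ]
}
print([r["code"]["text"] for r in lab_results(bundle)])  # → ['Hemoglobin']
```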
In addition, AI, when combined with these standards, has the potential to offer diagnosis help. The AI can scan through pages of notes and data, and not just do a keyword search, but also look at the intent, and form a hypothesis about what the patient may be suffering from. This hypothesis would then be presented to the doctor for review, Nakashima says. This isn’t meant to replace the doctor, but supplement the doctor’s decision-making process.
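The gap between a keyword search and the intent-aware search Nakashima describes can be illustrated with a toy example. A real system would use NLP or embedding models to capture meaning; this sketch merely expands a clinical concept to related phrases before ranking notes, which is enough to show why intent matching surfaces notes a literal keyword match would miss. The synonym table and function names are invented for illustration.

```python
# Toy contrast between keyword search and intent-style matching over
# clinical notes. The synonym expansion stands in for what a real NLP
# model would do; all names and data here are illustrative only.

SYNONYMS = {"chest pain": ["chest pain", "angina", "substernal pressure"]}

def rank_notes(notes: list[str], concept: str) -> list[str]:
    """Return notes mentioning the concept or a related phrase, best first."""
    phrases = SYNONYMS.get(concept, [concept])
    scored = []
    for note in notes:
        hits = sum(p in note.lower() for p in phrases)
        if hits:
            scored.append((hits, note))
    return [note for _, note in sorted(scored, reverse=True)]

notes = [
    "Patient reports substernal pressure on exertion.",  # no literal keyword
    "Routine follow-up, no complaints.",
]
# A plain search for "chest pain" finds nothing; intent matching does.
print(rank_notes(notes, "chest pain"))
```

In a full system, the ranked evidence would feed a hypothesis presented to the doctor for review, as Nakashima describes, rather than a final diagnosis.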
As more data is on the move, Nakashima says it is critical to maintain patient and provider trust, and every organization should have clearly defined governance covering how the data is used, how it is protected, and whether it will be sold.