The Wrong Way to Use AI in Healthcare

Lawsuits are beginning to pile up against insurance companies participating in the Medicare Advantage program. The complaint? That these insurers are using AI in the wrong way: relying on faulty algorithms to approve or deny claims. While AI can be extremely helpful in streamlining administrative tasks, such as comparing physician notes with Home Health assessments and nursing notes or reading hospital discharge documents, it appears to be no good at deciding whether to approve or deny care.

The Wrong Way to Use AI in Healthcare Example 1

The Minnesota case, November 2023, UnitedHealth Group:

  • An elderly couple’s doctor deemed extended care medically necessary
  • UnitedHealth’s MA arm denied that care
  • Following their deaths, the couple’s family sued UnitedHealth, alleging:
    • Straight Medicare would have approved the extended care
    • UnitedHealth uses an AI model called nH Predict, developed by NaviHealth, to make coverage decisions
    • UnitedHealth Group acquired NaviHealth in 2020 and assigned it to its Optum division
    • nH Predict is known to be so inaccurate that 90% of its denials are overturned when appealed to the Administrative Law Judge (ALJ) level
    • UnitedHealth Group announced in October 2023 that the Optum division that deploys nH Predict will no longer use the NaviHealth brand name and will instead be referred to as “Home & Community Care”

The family’s complaint stated, “The elderly are prematurely kicked out of care facilities nationwide or forced to deplete family savings to continue receiving necessary medical care, all because [UnitedHealth’s] AI model ‘disagrees’ with their real live doctors’ determinations.”

The Wrong Way to Use AI in Healthcare Example 2

The Class-Action case, December 2023, Humana:

  • A lawsuit was filed on December 12, 2023 in the U.S. District Court for the Western District of Kentucky
  • It was filed by Clarkson, the same Los Angeles law firm that filed the Minnesota case the previous month
  • The suit notes that Louisville-based Humana also uses nH Predict from NaviHealth
    • The plaintiffs claim, “Humana knows that the nH Predict AI Model predictions are highly inaccurate and are not based on patients’ medical needs but continues to use this system to deny patients’ coverage.”
    • The suit says Medicare Advantage patients who are hospitalized for three days usually are eligible to spend as many as 100 days getting follow-up care in a nursing home, but that Humana customers are rarely allowed to stay as long as 14 days.
    • A Humana representative said the company’s own employed physicians see the AI recommendations but make the final coverage decisions.

What Makes This Possible

According to experts we spoke with, there are many ways to use data analytics. The insurance companies named in the lawsuits use predictive decision making. This way of analyzing data compares a patient to millions of others and deduces what treatment plan might be suitable for that patient, based on what was effective for most previous patients. Opponents of this method have called it “data supported guessing.”…
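To make the idea concrete, here is a minimal, hypothetical sketch of this kind of predictive decision making. It is not nH Predict or any insurer’s actual model: the data, field names, and the predict_length_of_stay function are invented for illustration, and the nearest-neighbors averaging shown is just one simple way such a prediction could be made.

```python
# Illustrative sketch only; not any insurer's actual model.
# Idea: estimate one patient's post-acute length of stay from the outcomes
# of the most similar patients in a historical pool (k-nearest neighbors).
# All features, values, and outcomes below are hypothetical.

import numpy as np

# Hypothetical historical records: [age, days hospitalized, mobility score]
# paired with the nursing-home days those past patients actually used.
past_features = np.array([
    [82, 5, 2],
    [79, 3, 4],
    [88, 7, 1],
    [75, 4, 5],
    [91, 6, 1],
], dtype=float)
past_los_days = np.array([40, 14, 60, 10, 75], dtype=float)

def predict_length_of_stay(patient, k=3):
    """Average the outcomes of the k most similar past patients."""
    distances = np.linalg.norm(past_features - patient, axis=1)
    nearest = np.argsort(distances)[:k]
    return past_los_days[nearest].mean()

# The new patient is compared against the pool, not assessed individually.
new_patient = np.array([84, 6, 2], dtype=float)
print(predict_length_of_stay(new_patient))  # ~38 days with this toy data
```

The sketch also shows why critics use the “data supported guessing” label: the output reflects what was typical for the most similar past patients, not what the individual patient’s own doctor has determined that patient needs.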