According to the suit, this is a violation of a California state law requiring insurers to conduct a "thorough, fair and objective" investigation into every claim. Instead of giving each case the individual attention it deserves, the company relies on its PxDx algorithm, which reduces labor costs by cutting the time doctors spend reviewing each claim, and saves the company significant money by denying so many claims.
The suit states: "Relying on the PXDX system, Cigna's doctors instantly reject claims on medical grounds without ever opening patient files, leaving thousands of patients effectively without coverage and with unexpected bills."
In one particularly concerning case, a California woman received an ultrasound on her doctor's orders because of suspected ovarian cancer. Although a cyst was found on her left ovary, Cigna denied the claims for both the ultrasound and a follow-up procedure on the grounds that they were not medically necessary. As a result, she had to cover the cost of these procedures herself.
Her situation is far from an outlier. Numerous other patients had a similar experience. According to the suit, "The scope of this problem is massive. For example, over a period of two months in 2022, Cigna doctors denied over 300,000 requests for payments using this method, spending an average of just 1.2 seconds 'reviewing' each request."
In many cases, patients appealed; roughly 80 percent of the appeals resulted in the initial decisions being overturned.
The suit comes not long after a ProPublica investigation this spring revealed how Cigna's algorithm approves and denies claims in batches of hundreds or thousands at once, flagging discrepancies between a diagnosis and the procedures and tests the company has deemed acceptable for the underlying ailment.
In May, House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-Washington) wrote to the company expressing her concerns that policyholders were paying for medical procedures out of pocket that their health insurance should have covered.
Although medical doctors must sign insurance denials, the system used by Cigna does not even require them to open the patient's records for review.
This is just one of the many ways that patients can suffer when healthcare organizations trust artificial intelligence with duties that should be carried out by trained professionals. While some people believe that AI could be helpful in narrowing down certain diagnoses or helping healthcare professionals contend with paperwork, there is significant room for error, not to mention issues related to patient privacy.
Cigna is not the only company that has been overhauling its processes to incorporate AI. Google's cloud division started offering new tools to help process healthcare claims with AI this spring, boasting that it can help to streamline decision-making and keep data organized. Bupa and Blue Shield of California are already using the new tool.
The law firm representing the plaintiffs in this case, Clarkson Law of Malibu, is no stranger to AI-related lawsuits. The same firm filed suits against OpenAI, the company behind ChatGPT, and against Google over its generative chatbot Bard, alleging that they took data from millions of people, including writers and artists whose work is copyrighted, in order to train their AI products.