Attorney General Rob Bonta (official photo)
OAKLAND — California Attorney General Rob Bonta today issued a letter to the Biden Administration supporting a new federal rule that would help ensure technology used by healthcare providers is safe, effective, and deployed without reinforcing unjust racial bias when addressing health and disease. The rule, titled “Health Data, Technology, and Interoperability,” establishes guidelines on the development and use of predictive algorithms in healthcare. These fast-growing tools are used by many providers to make life-changing decisions, such as which patients to refer to specialists, whom to offer charity care, which diseases to screen a patient for, or whether a reaction to an infection might be deadly. With the dawn of more sophisticated AI-powered tools, the influence and pervasiveness of technology in healthcare decision-making is expected to grow, despite concerns among experts about its potential for bias, inaccuracy, and lack of accountability. That is why, in his letter today, Attorney General Bonta urged the federal government to move quickly in finalizing the proposed rule to help prevent racial and other disparities from being built into tools that make decisions about the health of patients every day.
“Most of us think of technological tools as impartial and objective, but the truth is that all too often, they reflect the same biases and prejudices that plague our human world,” said Attorney General Bonta. “Data, algorithms, and artificial intelligence are becoming more and more crucial to how our healthcare system operates. Work must be done to shape them into powerful tools — to fight injustice instead of perpetuating it, to protect our most vulnerable communities instead of marginalizing them. That’s why I launched an inquiry into bias in healthcare algorithms last year, and it's why I’m glad the Biden Administration is paying attention to this important issue. Fighting for fair, equitable, and affordable access to healthcare for all Californians remains one of the top priorities of my office.”
In August 2022, the California Department of Justice (DOJ) launched a groundbreaking investigation into whether commercial healthcare algorithms – types of software used by healthcare providers to make decisions that affect access to healthcare for California patients – have discriminatory impacts based on race and ethnicity. This gives DOJ a unique vantage point on the Biden Administration’s proposed rule.
The rule requires healthcare software developers to be more transparent about the data they use to model their algorithms and establishes certification guidelines for these programs. This is important because if algorithms are trained on a narrow or limited dataset, they can inadvertently learn and perpetuate biases present in that data. For example, a 2019 study found that a widely used algorithm that helps hospitals identify high-risk patients was racially biased.
In his letter today, Attorney General Bonta wrote that the proposed rule would implement important transparency standards for rapidly evolving health technology that could have life-changing impacts on people. Users would be able to review the sources of an algorithm’s modeling data and provide feedback, which could promote trustworthiness and incentivize the development and wider use of fair, appropriate, effective, and safe health decision-making software that utilizes artificial intelligence.
A copy of the Attorney General’s letter can be found here.