HARTFORD, Conn. (AP) — Connecticut needs safeguards on state government’s use of artificial intelligence, including algorithms at child welfare and other agencies, to prevent discrimination and increase transparency, an advisory panel to the U.S. Commission on Civil Rights said Thursday.
The Connecticut Advisory Committee to the federal commission called on state lawmakers to pass laws regulating such systems, which have sparked concerns in other parts of the country.
The problem, critics say, is that algorithms can use flawed data that disproportionately flags minorities, low-income families, disabled people and other groups when agencies make decisions on removing children from homes; approving health, housing and other benefits; concentrating law enforcement; and assigning children to schools, among other uses.
“The state of Connecticut makes thousands of decisions that impact the lives and civil rights of residents every day,” said David McGuire, chair of the Connecticut Advisory Committee. “When the state uses an algorithm, residents should know which agency is using the algorithm, the reason it is being used, and assurances that the algorithm is fair.”
The committee did not identify any specific instances of discrimination or bias in Connecticut’s use of algorithms, but said it would release a more comprehensive report within the next few months. The panel also pointed to a study that found some Connecticut agencies did not release full information on their use of algorithms when asked under public records laws.
Concerns about such use of artificial intelligence, or AI, led the Biden administration in October to issue its Blueprint for an AI Bill of Rights urging government action to safeguard digital and civil rights. Several states have passed their own AI laws.
An investigation by The Associated Press last year revealed bias and transparency problems in the increasing use of algorithms within the country’s child welfare system.
McGuire said the Connecticut panel’s review of the issue is the first by the U.S. Commission on Civil Rights or any of its 56 advisory committees. The commission was established by the Civil Rights Act of 1957 as an independent, bipartisan federal fact-finding agency.
Supporters of using algorithms say they make government systems more thorough and efficient through the use of data.
The Connecticut advisory committee is urging state lawmakers to pass laws that would require independent audits of algorithms, including assessments of potential biases, and mandate under state records laws that information about agencies’ use of algorithms be publicly available.
Democratic Gov. Ned Lamont’s office did not immediately respond to a request for comment. Democrats control both chambers of the General Assembly.
There will be a data privacy bill in this year’s legislative session addressing government use of algorithms, said state Sen. James Maroney, a Milford Democrat and co-chair of the General Law Committee.
“Algorithms can negatively impact all of us,” he said. “There are different instances of unintended consequences, whether it’s discrimination sometimes in hiring. It can discriminate against age. We’ve seen other examples where it’s discriminated against people based on basically being poor. ... And then also there are unfortunately racial disparities in some of the decisions made when using automated decision-making processes.”
Senate Republican Leader Kevin Kelly and House Republican Leader Vincent Candelora welcomed a review of how the state uses algorithms.
“People might be surprised to realize that it’s not human beings behind a desk that are making some of these decisions, but it could be computer generated,” Candelora said. “We need to know what goes into those programs that are making those decisions, because I believe it impacts policy.”
The Connecticut advisory panel pointed to a Yale Law School report released last year that said certain Connecticut agencies did not release full information about their use of algorithms in response to its requests under the state’s Freedom of Information Act.
“Responses to Freedom of Information (FOI) requests confirmed both that existing disclosure requirements are insufficient to allow meaningful public oversight of the use of algorithms, and that agencies do not adequately assess the effectiveness and reliability of algorithms,” the report said.
“The FOIA responses generally revealed that agencies are insufficiently aware of the potential problems posed by their algorithms and unconcerned about the lack of transparency,” it said.
The law school said it requested information on algorithms from the state departments of Children and Families, Education and Administrative Services.
The Department of Children and Families provided the only complete FOIA response to the law school on the use of algorithms to identify at-risk children, the law school said. The agency disclosed basic information about the algorithm but not its source code, which it said it did not possess and asserted was protected as a trade secret, the law school said.
The Education Department produced only partial information about an algorithm it uses to assign students to schools, while the Department of Administrative Services provided no information on an algorithm used to hire state employees and contractors, according to the law school.
A spokesperson for the Department of Children and Families said the agency was asked in 2021 about an algorithm it stopped using in 2019 because of a lack of staffing. The algorithm was not intended to protect children or prevent improper agency intervention, according to DCF.
The Department of Administrative Services said it was working on a response, and officials at the Department of Education did not immediately return a message.