
Algorithms promised efficiency. But they’ve worsened inequality

Teachers at Philip's west London school predicted he would earn two As and a B in his exams, grades that would have comfortably secured his place to study law at the University of Exeter.

On August 13, the student sat at home trying to access the website that would confirm whether he had secured his university place.

“I was upstairs trying to get [the website] to load and my mom was downstairs doing the same,” he told CNN. “She opened it and screamed. And they had rejected me.”

“I didn’t feel too good,” added Philip. “Yeah, I was pretty upset about it. But everyone I was with was in a similar situation.”

The algorithm gave Philip a B and two Cs instead. The teenager was not alone: nearly 40% of grades in England were downgraded from teachers’ predictions, with students at state-funded schools hit harder than their peers at private schools. Many subsequently lost their university places.

An outcry ensued, with teenagers protesting outside the UK’s Department for Education. Videos of the protests were shared widely online, including clips of students chanting, “F**k the algorithm!”

After several days of negative headlines, Education Secretary Gavin Williamson announced that students would be awarded the grades predicted by their teachers instead of those assigned by the model.

The algorithm was meant to ensure fairness by keeping the grade distribution for the 2020 cohort in line with previous years, with a similar share of high and low grades. It relied on teacher-predicted grades and teachers’ rankings of students to determine results. Crucially, it also took into account the historical performance of schools, which benefited students from more affluent backgrounds.

Private schools in England, which charge fees to parents, usually have smaller classes, whose grades could not be reliably standardized by the model. The algorithm therefore gave more weight to teacher-predicted grades for those cohorts, which tend to be wealthier and whiter than their downgraded peers at state schools.
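To make the mechanism concrete, here is a rough Python sketch of the kind of rank-based standardization described above. It is an illustration under simplified, assumed inputs; the grade list, the 15-student cutoff and every name below are hypothetical, not Ofqual’s published model.

```python
# Illustrative sketch only, not Ofqual's published model. The grade list,
# the 15-student cutoff and all names below are hypothetical assumptions.

GRADES = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def standardize(historical_share, teacher_predictions, ranking, small_cohort=15):
    """Assign grades to one school's ranked cohort for one subject.

    historical_share: dict mapping grade -> fraction of past results at that grade
    teacher_predictions: dict mapping student -> teacher-predicted grade
    ranking: list of students, best first (the teacher-supplied ranking)
    small_cohort: below this size, fall back to teacher predictions
    """
    n = len(ranking)
    if n < small_cohort:
        # Small classes (more common at fee-paying schools) keep teacher predictions.
        return dict(teacher_predictions)

    # Larger cohorts are forced to match the school's historical distribution:
    # a student's grade depends on their rank position, not their own work.
    results = {}
    for rank, student in enumerate(ranking):
        percentile = (rank + 0.5) / n          # position within this year's cohort
        cumulative = 0.0
        for grade in GRADES:
            cumulative += historical_share.get(grade, 0.0)
            if percentile <= cumulative:
                results[student] = grade
                break
        else:
            results[student] = GRADES[-1]      # guard against rounding gaps
    return results
```

Under a rule like this, a strong student at a school with few historical top grades cannot receive one, however good their own work, which is precisely the pattern students and teachers objected to.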

“One of the complexities we have is that there are many ways an algorithm can be fair,” said Helena Webb, senior researcher in the Department of Computer Science at the University of Oxford.

“You can see an argument where [the government] said [it] wanted to achieve results similar to last year’s. At a national level, that could be considered fair. But what was fair for individuals is completely missing.”

“Obviously, this algorithm reflects what has happened in previous years,” she added. “So it doesn’t [reflect] the fact that schools can [improve]. And of course, that has worse effects on state schools than on the well-known private schools that have consistently higher grades.”

“What made me angry was the way [they] treated state schools,” said Josh Wicks, 18, a pupil from Chippenham in Wiltshire, in the west of England. His grades were downgraded from two A*s and an A to three As.

“The algorithm assumed that if the school hadn’t achieved [high grades] before, [pupils] couldn’t get them now,” he told CNN. “I just think it’s condescending.”

The political storm left ministers in Boris Johnson’s government scrambling for an explanation, on top of strong criticism over its handling of the coronavirus pandemic. Covid-19 has killed more than 41,000 people in the UK, making it the worst-affected country in Europe.

Why are some algorithms accused of bias?

Algorithms are used across society today, from social media and visa application systems to facial recognition technology and exam grading.

The technology can be a boon for cash-strapped governments and for companies seeking innovation. But experts have long warned of algorithmic bias, and as automated processes become more widespread, accusations of discrimination are spreading too.

“The A-levels stuff is the tip of the iceberg,” said Cori Crider, co-founder of Foxglove, an organization that challenges alleged abuses of digital technology. Crider told CNN the algorithms replicated the biases found in the raw data used.

Students hold placards as they demonstrate outside the Department for Education in central London on August 14.

But Crider cautioned against the impulse to simply blame policy issues on technology.

“Anyone who tells you it’s a technical problem is [lying],” she said.

“What happened [with the exams] is that a policy choice was made to minimize grade inflation. That is a political choice, not a technological one.”

Foxglove and the Joint Council for the Welfare of Immigrants recently challenged the UK Home Office over its use of an algorithm designed to stream visa applications. The groups alleged that the algorithm was biased against applicants from certain countries, making it automatically more likely that those applicants would be denied visas.

Foxglove alleged that the streaming system suffered from a feedback loop, “where past bias and discrimination, fed into a computer program, reinforce future bias and discrimination.”
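As a rough illustration of the feedback loop Foxglove describes, the toy Python example below retrains a hypothetical “risk score” on its own past refusals. The data, group names and the 0.5 threshold are invented for this sketch and have nothing to do with the Home Office’s actual tool.

```python
# Toy illustration of a feedback loop, unrelated to the Home Office's actual
# tool. All data, names and the 0.5 threshold are invented for this sketch.

from collections import Counter

def retrain(refusal_history):
    """Return a 'risk score' per nationality: its historical refusal rate."""
    totals, refusals = Counter(), Counter()
    for nationality, refused in refusal_history:
        totals[nationality] += 1
        refusals[nationality] += refused
    return {n: refusals[n] / totals[n] for n in totals}

def decide(nationality, risk_scores, threshold=0.5):
    """Refuse automatically when the nationality's risk score is high."""
    return risk_scores.get(nationality, 0.0) >= threshold

# Seed data already containing bias: group "B" was refused more often.
history = [("A", 0), ("A", 0), ("A", 1), ("B", 1), ("B", 1), ("B", 0)]

for round_number in range(3):
    scores = retrain(history)
    # New applicants are decided by the model, and those decisions are then
    # appended to the training data for the next round.
    decisions = [(n, int(decide(n, scores))) for n in ("A", "B", "A", "B")]
    history.extend(decisions)
    print(round_number, scores)
```

Each pass through the loop widens the gap between the two groups, because the model’s own refusals become the evidence for the next round of refusals.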

“We have reviewed how the visa application streaming tool works and we will be rethinking our processes to make them even more streamlined and secure,” a UK Home Office spokesperson told CNN.

“But we do not accept the allegations of the Joint Council for the Welfare of Immigrants made in their application for judicial review and while the litigation is still ongoing it would not be appropriate for the department to comment further.”

Crider said the problems Foxglove identified, where past data produces biased algorithms, were evident elsewhere, pointing to the debate over predictive policing programs in the United States.

In June, the Californian city of Santa Cruz banned predictive policing on the grounds that the analytics software officers used in their work discriminated against people of color.

“We have technology that could target people of color in our community – it is technology that we don’t need,” Mayor Justin Cummings told Reuters news agency in June.

“Part of the problem is that the data gets fed in,” Crider said.

“Historical data is fed [into algorithms] and they reproduce the [existing] bias.”

Webb agreed. “A lot of [the issue] is about the data the algorithm learns from,” she said. “For example, a lot of facial recognition technologies have come out … the problem is that a lot of [those] systems were trained on a large number of white male faces.

“So when the software is used, it’s very good at recognizing white males, but not very good at recognizing women and people of color. And that comes from the data and how the data was fed into the algorithm.”
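One way the gap Webb describes becomes visible is by evaluating accuracy per demographic group rather than in aggregate. The short Python sketch below illustrates that kind of disaggregated check; the group labels and results are invented for illustration.

```python
# A minimal sketch of disaggregated evaluation: measuring accuracy per group
# rather than overall. The group labels and results below are invented.

from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted_label, true_label) tuples."""
    correct, total = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        correct[group] += int(predicted == actual)
    return {group: correct[group] / total[group] for group in total}

# Hypothetical outputs from a face-matching system:
results = [
    ("white man", "match", "match"), ("white man", "match", "match"),
    ("woman of color", "no match", "match"), ("woman of color", "match", "match"),
]
print(accuracy_by_group(results))
# {'white man': 1.0, 'woman of color': 0.5}  <- an overall score would hide the gap
```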

Webb added that she believed the problems could be partly mitigated by paying “more attention to inclusiveness in datasets” and by pushing for a greater “multiplicity of voices” around algorithm development.

Increased regulation?

Activists and experts told CNN they hoped the recent debates over algorithms would lead to more oversight of the technology.

“There is a lack of regulatory oversight over how these systems are used,” Webb said, adding that companies could also choose to self-regulate.

Some companies are increasingly vocal on the issue.

“Some technologies risk repeating the patterns developed by our biased societies,” Adam Mosseri, the head of Instagram, wrote in June in a statement about the company’s diversity efforts. “While we do a lot of work to help prevent subconscious bias in our products, we need to take a closer look at the underlying systems we’ve built and the areas where we need to do more to avoid bias in these decisions.”

Facebook, owner of Instagram, then created new teams to examine biases in the company’s systems.

“I would like to see democratic pushback on [the use of algorithms],” said Crider. “Are there areas of public life where it is not acceptable at all to have these systems?”

As the debate continues in boardrooms and universities, these automated systems continue to determine people’s lives in many subtle ways.

For Philip, the UK government’s abandonment of the exams algorithm has left him in limbo.

“We sent an email to Exeter [University] and they phoned and they’re in kind of a mess,” he said, adding that he hoped to get his place back. “I think I’ll just defer now anyway.”

He said he was grateful to receive his predicted grades, but said the experience had “gone rather badly”.

“[The government] had months to deal with this problem,” he said. “I understand that there is a lot going on with healthcare, but […] that’s a pretty bad performance.”



