Fri. Apr 19th, 2024

Crime prediction software ‘adopted by 14 UK police forces’


At least 14 UK police forces have made use of crime-prediction software or plan to do so, according to Liberty.

The human rights group said it had sent 90 Freedom of Information requests last year to discover which forces used the technology.

It believes the programs involved can lead to biased policing strategies that unfairly focus on ethnic minorities and lower-income communities.

And it said there had been a “severe lack of transparency” about the matter.

Defenders of the technology say it can provide new insights into gun and knife crime, sex trafficking and other potentially life-threatening offences at a time when police budgets are under pressure.

One of the named forces – Avon and Somerset Police – said it had invited several members of the press in to see the Qlik system it used in action, to raise public awareness.

“We make every effort to prevent bias in data models,” said a spokeswoman.

“For this reason the data… does not include ethnicity, gender, address location or demographics.”

But Liberty said the technologies lacked proper oversight, and that there was no clear evidence they had led to safer communities.

“These opaque computer programs use algorithms to analyse hordes of biased police data, identifying patterns and embedding an approach to policing which relies on discriminatory profiling,” its report said.

“[They] entrench pre-existing inequalities while being disguised as cost-effective innovations.”

Predictive software

Liberty’s report focuses on two types of software, which are sometimes used side by side.

The first is “predictive mapping”, in which crime “hotspots” are mapped out, leading to more patrols in the area.
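At its simplest, predictive mapping amounts to bucketing past incident locations into grid cells and flagging the densest ones. The sketch below is a toy illustration of that idea only, not any force's actual system; the function name, grid scheme and coordinates are all invented.

```python
from collections import Counter

def hotspots(incidents, cell_size=0.01, top_n=3):
    """Bucket (lat, lon) incident reports into grid cells and
    return the top_n busiest cells with their counts."""
    counts = Counter(
        (int(lat / cell_size), int(lon / cell_size))
        for lat, lon in incidents
    )
    return counts.most_common(top_n)

# Three clustered reports and one outlier (made-up coordinates).
incidents = [
    (51.4545, -2.5879),
    (51.4546, -2.5871),
    (51.4545, -2.5875),
    (51.3000, -2.4000),
]
print(hotspots(incidents, top_n=1))  # the clustered cell, with a count of 3
```

Note that a real deployment would feed in recorded crime data, which is exactly where Liberty's bias concern arises: more patrols in a flagged cell generate more recorded incidents there, reinforcing the "hotspot".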

The second is called “individual risk assessment”, which attempts to predict how likely an individual is to commit an offence or be a victim of a crime.
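Individual risk assessment tools typically combine weighted factors into a probability-like score. The following is a minimal, hypothetical sketch of that kind of scoring; the feature names and weights are invented for illustration and do not reflect any real system.

```python
import math

def risk_score(features, weights, bias=0.0):
    """Logistic score in (0, 1) from weighted feature values.
    Purely illustrative -- not any real risk-assessment model."""
    z = bias + sum(weights[k] * features.get(k, 0.0) for k in weights)
    return 1.0 / (1.0 + math.exp(-z))

# Invented example features and weights.
weights = {"prior_incidents": 0.8, "years_since_last": -0.5}
person = {"prior_incidents": 2, "years_since_last": 1}
print(round(risk_score(person, weights), 3))  # prints 0.75
```

Even in this toy form, the critique in the report is visible: the score is only as fair as the features and the historical data behind the weights.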

Read the full story here: (BBC News)
