Academics at Sheffield Hallam University have devised a new set of guidelines for police forces using computer software to tackle crime.

Computer programs are used by many police forces in the UK to plan where to send patrols, determine which suspects are likely to reoffend, and even decide whether a crime is worth investigating.

A scoring system was also used by the Metropolitan Police to create the controversial gangs matrix – a database of suspected gang members in London – and similar tools are used by forces to identify criminals using facial recognition technology.

But ‘algorithm-led’ policing has drawbacks: it can, for example, mean that some groups of people, such as young black men, are disproportionately targeted by police.

Now a team of researchers has devised a new framework designed to help police use algorithms more fairly.

Jamie Grace, a senior lecturer in law at Sheffield Hallam University, worked with Durham Constabulary and colleagues at the University of Winchester and Cambridge University, to create and pilot ALGO-CARE.

The framework questionnaire, which consists of eight groups of prompts spelling out ALGO-CARE, is being used by police data scientists to carry out a series of risk assessments to make sure their software won’t lead to innocent people being targeted.

The framework has been endorsed by the National Police Chiefs’ Council and will hopefully be used in the design of algorithmic tools for police forces across the UK. Such tools have been adopted partly as a result of austerity and the need to target police resources more efficiently and in smarter ways.

The guidelines aim to help police officers and lawyers assess the risks of using tools that calculate an offender ‘risk score’, and also to safeguard the data rights of suspects and victims.

Mr Grace said: “The police have decided that they want to make this kind of technology more ethical. ALGO-CARE sets out a checklist of ethical and legal considerations forces need to comply with in order to ensure that they are using their software fairly and effectively.

“The aim is to solve algorithmic injustice and impropriety in high-stakes, public sector decision-making in order to make software-led policing more robust and ethical and ultimately ensure defendants have a fair trial.”