The Spanish government this week announced an overhaul of a program that allows police to use algorithms to identify potential repeat victims of domestic violence, after authorities raised questions about the system's effectiveness.
The program, VioGén, requires police officers to ask victims a series of questions. The answers are entered into a software program that generates a score ranging from no risk to extreme risk, with the aim of flagging the women most likely to suffer repeat abuse. The score helps determine the police protection and other services a woman receives.
A New York Times investigation last year found that police rely heavily on the technology and, in most cases, accept the decisions made by the VioGén software. According to the Times, some women whom the algorithm classified as facing no or low risk of further harm later suffered abuse again, including dozens who were killed.
Spanish officials said the changes announced this week were part of a long-planned update to the system, which was introduced in 2007. They said the software has helped police departments with limited resources protect vulnerable women and reduce the number of repeat attacks.
Under the latest version, VioGén 2, the software will no longer be able to classify women as facing no risk. Police will also be required to enter more information about each victim, which officials said will lead to more accurate predictions.
Other changes are aimed at improving coordination between government agencies involved in incidents of violence against women, including making it easier to share information. In some cases, victims will receive an individualized protection plan.
At a press conference on Wednesday, Equality Minister Ana Redondo said that “machismo is knocking on our door and doing it with a violence we have never seen before.” “It's not time to take a step back,” she added. “It's time to take a leap forward.”
Spain's use of an algorithm to guide the treatment of gender violence is a far-reaching example of how governments are turning to algorithms to make important social decisions, a trend that is expected to grow with the use of artificial intelligence. The system has been studied as a potential model for governments in other countries seeking to combat violence against women.
VioGén was founded on the belief that an algorithm built on mathematical models can serve as an unbiased tool to help police find and protect women who might otherwise be missed. The yes-or-no questions include: Was a weapon used? Were there financial problems? Did the aggressor show dominant behavior?
Victims classified as high risk received more protection, including regular home patrols, access to shelters, and police monitoring of their abusers' movements. Those with lower scores received less assistance.
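The workflow described above — fixed yes-or-no indicators summed into a score that is then mapped to a protection tier — can be sketched in a few lines of code. This is purely illustrative: the actual VioGén indicators, weights, and cut points are not public, and every question, weight, and threshold below is invented for the sake of the example.

```python
# Illustrative sketch of a questionnaire-based risk triage.
# NOT the actual VioGén model; all weights and thresholds are invented.

RISK_TIERS = ["no risk", "low", "medium", "high", "extreme"]

# Hypothetical yes/no indicators with made-up integer weights.
QUESTIONS = {
    "weapon_used": 3,
    "financial_problems": 1,
    "dominant_behavior": 2,
}

def risk_tier(answers: dict) -> str:
    """Sum the weights of 'yes' answers, then map the total to a tier."""
    score = sum(weight for q, weight in QUESTIONS.items() if answers.get(q))
    thresholds = [0, 1, 3, 5]  # invented cut points between tiers
    tier = sum(1 for t in thresholds if score > t)
    return RISK_TIERS[min(tier, len(RISK_TIERS) - 1)]
```

With these invented weights, a victim answering no to everything lands in the lowest tier, while yes answers push the score across successive thresholds into higher tiers; in the real system, the resulting tier determines the level of protection, from none to regular patrols and monitoring of the abuser.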
As of November, Spain had more than 100,000 active cases of women who had been evaluated by VioGén, with about 85 percent of victims classified as facing little risk of being hurt again by their abuser. Police officers in Spain are trained to override the scores when the evidence warrants it, but The Times found that the risk scores were accepted about 95 percent of the time.
Victoria Rosell, a Spanish judge and former government delegate specializing in gender violence issues, said the government needed a period of “self-criticism” to improve VioGén. She said the system could become more accurate if it pulled information from additional government databases, including those of the health and education systems.
Natalia Morlas, president of victims' rights group Somos Mas, welcomed the changes and said she hoped they would lead to better risk assessments by police.
“Adjusting the risk to victims is critical to saving lives,” Morlas said. She added that it was important for the system to remain under close human supervision, because victims “must be treated by humans, not machines.”