#predictivepolicing


First the police, now Big Tech wants to put 'crime-predicting' tech in UK probation services.

A lack of transparency and reliance on flawed data means that institutional racism will be hardwired into the justice system.

All at the expense of dignity and rights.

theguardian.com/society/2025/j

The Guardian · Tech firms suggested placing trackers under offenders’ skin at meeting with justice secretary · By Robert Booth

"Predictive policing technologies infringe human rights “at their heart” and should be prohibited in the UK, argues Green MP Siân Berry, after tabling an amendment to the government’s forthcoming Crime and Policing Bill.

Speaking in the House of Commons during the report stage of the bill, Berry highlighted the dangers of using predictive policing technologies to assess the likelihood of individuals or groups committing criminal offences in the future.

“Such technologies, however cleverly sold, will always need to be built on existing, flawed police data, or data from other flawed and biased public and private sources,” she said. “That means that communities that have historically been over-policed will be more likely to be identified as being ‘at risk’ of future criminal behaviour.”"

computerweekly.com/news/366626

ComputerWeekly.com · MPs propose ban on predictive policing · By Sebastian Klovig Skelton
Continued thread

#KI at the #Polizei: #Vera, take over! A murder could be about to happen here!

The Bavarian #Polizei wants to use software from the US company #Palantir to prevent crimes before they are committed. Is everyone now a suspect?

'"Verfahrensübergreifende Recherche- und Analyseplattform" (cross-case research and analysis platform) is the name of the software at issue here: VeRA, an abbreviation that brings tears of rage to the eyes of data protection advocates. A tool the Stasi could only have dreamed of, they say. Bavaria has been using it for around nine months.

Detectives want to use the software for "predictive policing", that is, prediction-based police work.'

zeit.de/2025/26/ki-polizei-bay

DIE ZEIT · AI at the police: A murder could be about to happen here · The Bavarian police want to use software from the US company Palantir to prevent crimes before they are committed. Is everyone now a suspect?

"At their heart, these technologies infringe human rights."

Last week @sianberry tabled an amendment to the UK Crime and Policing Bill that would prohibit the use and deployment of dangerous 'crime-predicting' police tech.

These systems will subject overpoliced communities to more surveillance. More discrimination. More injustice.

Sign the petition to BAN it ➡️ you.38degrees.org.uk/petitions

Oops! AI did it again... you're not that innocent.

Nectar, a 'crime-predicting' system developed with #Palantir, could be rolled out nationally after a pilot with Bedfordshire police (UK).

Data such as race, sex life, trade union membership, philosophical beliefs and health are used to 'predict' criminality so people can be targeted for #surveillance.

inews.co.uk/news/police-use-co

The i Paper · Police use controversial AI tool that looks at people’s sex lives and beliefsSenior MPs and privacy campaigners have expressed alarm at the deployment of Palantir’s AI-powered crime-fighting software with access to sensitive personal information
Replied in thread

@heidilifeldman

#USpol #TheBrownSpiderWeb

(2/n)

👉2025 is set to become #1933 and "#1984" at the same time.👈
With the real #KingMaker's (#PeterThiel) #spyware and #surveillance products (#Palantir), 2026 is set to add a next-generation ingredient to #Fascism: #PredictivePolicing, a brand-new real-life way to persecute #ThoughtCrime.

@heidilifeldman

That said, I agree 100% with the excellent @guardian article:

theguardian.com/commentisfree/

At #DOGE "...#AI is...

Replied in thread

@tg9541 @mattotcha

#UKpol #UKpolitics
#Precrime #ThoughtCrime #FreeSpeech #PeacefulProtest
#CivilRights #Legal

👉A friendly warning to the #Starmer Government👈

(3/n)

... advent of #PredictivePolicing and the continuing crackdown on the right to #PeacefulProtest in the #UK, it seems the #Starmer government is heading down that road.

👉The despicable use of anti-terror force by 30 #policemen in a place of #worship in #London against six young women👈 discussing...

‘Predictive’ policing tools in France are flawed, opaque, and dangerous.

A new report from @LaQuadrature, now available in English as part of a Statewatch-coordinated project, lays out the risks in detail.

The report finds that these systems reinforce discrimination, evade accountability, and threaten fundamental rights. La Quadrature is calling for a full ban—and we support them.

📄 Read more and access the full report: statewatch.org/news/2025/may/f

Continued thread

How algorithms in #Deutschland are supposed to “foresee” crimes #PredictivePolicing

"In dem Bericht „Automating Injustice“ werden ausgewählte Systeme untersucht, die in Deutschland von der Polizei, Strafverfolgungsbehörden und Gefängnissen entwickelt oder eingesetzt werden. Außerdem werden öffentlich zugängliche Informationen über solche Praktiken analysiert, um zu erklären, wie die Systeme funktionieren, welche Daten sie verwenden, weshalb sie zu einer stärkeren Diskriminierung führen können und generell eine Gefahr für die Grundrechte sind......."

algorithmwatch.org/de/predicti via @algorithmwatch

AlgorithmWatch · Automated police work: How algorithms in Germany are supposed to “foresee” crimes · The police, law enforcement agencies, and prisons in Germany are increasingly trying to digitally “predict” and “prevent” crimes. The report “Automating Injustice” provides an overview of such algorithmic systems developed and deployed in Germany.

Very proud to have a chapter in this handbook! Many thanks to Nathalie Smuha for the invitation!

One of the pertinent questions I ask in the conclusion is: Should the money that is invested in predictive policing applications not be invested instead in tackling causes of crime and in problem-oriented responses, such as mentor programs, youth sports programs, and community policing, as they can be a more effective way to prevent crime?

#AI #Policing #Predictivepolicing
cambridge.org/core/books/cambr

Cambridge Core · Legal, Ethical, and Social Issues of AI and Law Enforcement in Europe (Chapter 18) · The Cambridge Handbook of the Law, Ethics and Policy of Artificial Intelligence, February 2025


Perils of predictive policing

Amnesty publishes a report warning of the perils of predictive policing

February 2025

Many TV detective series have technology at their core as our heroes vigorously pursue the wrongdoers. CCTV footage is scrutinised for the criminals' movements, DNA evidence is gathered and, of course, fingerprints are taken; forensic evidence is a key component of police detection in countless storylines. These series are reassuring, showing law enforcement officers using every scientific and technological technique to keep us all safe and lock up the bad guys. Surely, then, using science and algorithms to enable police forces to predict crime must be a good idea?

It is not. The Amnesty report, and other research, explain the problems and risks in great detail. One of the persistent biases in the justice system is racism; on this it is worth reading The Science of Racism by Keon West (Picador, 2025). The author takes the reader through copious peer-reviewed research conducted over many years in different countries, explaining the extent of racism. Examples include many CV studies (US: résumé) in which identical CVs bearing different names that indicate candidates' ethnicity produce markedly different results. There are similar examples from the worlds of medicine and academia. Racism is endemic and persists. As Keon West acknowledges, a similar book could be written about how women are treated differently.

The Amnesty report notes that Black people are twice as likely to be arrested, three times as likely to be subjected to force, and four times as likely to be stopped and searched as white people. With such bias in place, the risk is that predictive policing simply perpetuates existing prejudice. The concern centres partly on skin colour, where people live, and socio-economic background all being used as predictive inputs.

People have a deep faith in technology. On a recent Any Answers? programme on BBC Radio 4, during a debate about the death penalty and the problem of mistakes, several callers showed a touching faith in DNA in particular, implying that mistakes cannot happen. People are mesmerised by the white-suited forensic officers on television, who give a sense of science and certainty. Technology, however, is only as good as the human systems that use it. There have been many wrongful arrests and prison sentences of innocent people despite DNA, fingerprints, CCTV and all the rest. Mistakes are made. The worry is that predictive policing could amplify discrimination.

People who are profiled have no way of knowing that they have been. Details of which systems the police and others are using need to be published, yet the report notes that the police are reluctant to do this. What is the legal basis for effectively labelling people on account of their skin colour, where they live and their socio-economic status?

The police are keen on the idea, and around 45 forces use such tools. Yet the evidence for their effectiveness is doubtful, and the risks are considerable.


Amnesty's new report shows that the police are supercharging racism through predictive policing.

At least 33 UK police forces have used prediction or profiling tools.

“These systems are developed and operated using data from policing and the criminal legal system. That data reflects the structural and institutional racism and discrimination in policing and the criminal legal system.”

#policing #police #precrime #predictivepolicing #codedbias #ukpolitics #ukpol

theguardian.com/uk-news/2025/f

The Guardian · UK use of predictive policing is racist and should be banned, says Amnesty · By Vikram Dodd