On its website, the council says the Think Family database, which the app draws on, includes information from about 50,000 families across the city collected from agencies including social care, police and the Department for Work and Pensions. It says it highlights “vulnerabilities or needs” and uses “targeted analytics” to help identify children at risk of sexual or criminal exploitation.
Critics say the reality is that this risks children from minority ethnic or poorer backgrounds being profiled as being involved in gangs or county lines operations.
Because it's overwhelmingly the case that that's the demographic that makes up such operations. So of course the tech will highlight this. What else could it do?
The computer is obviously racist and needs to be sent on a course, there can be no other explanation.
"The computer is obviously racist and needs to be sent on a course..."
😂
It is all them "ones" and "zeros". The curse of a "binary" system.
The poor "zeros" are condemned to being nothing. If only they could self identify as being a "one" or a "one plus" or even an "x" - a free entity, not limited by constraints concocted by a racist ubermenschen. Then the computers would be free to give any answer not constrained by callous logic or hard facts.
If the computer gives a "Wrong" answer, just re-run the program again and again until it gives an acceptable answer.
Then shout "Yeah, maaan. The computer sez that ah is right."