[Image: police using facial scanning technology]

Regarding institutional racism in the UK police, it's not really disputed, as Andrew C concedes. There's a mountain of evidence.

True, but I’d counter that it’s not prevalent in all areas of policing and policing interaction with communities. I can say that from personal and professional experience. This is why I don’t like sweeping generalisations...
 
Police didn't build the tech. The tech people didn't understand themselves, or society, so they built racist tech. Not sure how a definition could have stopped them from doing this.

Is it not possible that at least some of the 'tech people' were non-Caucasian non-racists? Or would that preclude them from the job?
 
True, but I’d counter that it’s not prevalent in all areas of policing and policing interaction with communities. I can say that from personal and professional experience. This is why I don’t like sweeping generalisations...
OK, but this technology is being tested by the Met.
 
I'm not saying that all people in tech are racist, just that a lack of understanding of how racism works, in terms of unconscious and institutional bias, can result in those biases finding their way into the tech itself. Systemic racism in the tech industry is, as Sue says, a widely acknowledged problem: here's what I got when I Googled "racism tech industry":

[Screenshot of Google search results for "racism tech industry"]

Regarding racism in facial recognition software, again, the problem is widely understood to go beyond the open source software mentioned by Buolamwini in the article I linked to: her own research addresses three commercial systems (also see here, on both open source and commercial software). Maybe the Neoface people supplying the UK police have sorted all this out; maybe I shouldn't make assumptions. On the other hand, given that the software hasn't undergone demographic testing for this known problem, why shouldn't I?
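To make the "demographic testing" point concrete, here's a minimal sketch (in Python, with entirely made-up records and group labels, not data from any real benchmark) of the kind of disaggregated accuracy audit Buolamwini's research performs: score the system per demographic subgroup and look at the gap between the best- and worst-served groups.

```python
# Hypothetical sketch of a disaggregated accuracy audit for a face
# recognition system. The records and group labels below are invented
# for illustration; a real audit would use a labelled benchmark.
from collections import defaultdict

# Each record: (predicted_identity, true_identity, demographic_group)
results = [
    ("id_01", "id_01", "darker-skinned female"),
    ("id_02", "id_07", "darker-skinned female"),
    ("id_03", "id_03", "lighter-skinned male"),
    ("id_04", "id_04", "lighter-skinned male"),
    # ... many more labelled trials ...
]

correct = defaultdict(int)
total = defaultdict(int)
for predicted, actual, group in results:
    total[group] += 1
    correct[group] += (predicted == actual)

accuracy = {g: correct[g] / total[g] for g in total}
for group, acc in sorted(accuracy.items(), key=lambda kv: kv[1]):
    print(f"{group}: {acc:.1%}")

# The "delta" discussed later in this thread is just the gap between
# the best- and worst-served groups:
print(f"delta: {max(accuracy.values()) - min(accuracy.values()):.1%}")
```

The numbers don't matter; the point is that without this per-group breakdown, a single headline accuracy figure can hide exactly the failures the article describes.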

Regarding institutional racism in the UK police, it's not really disputed, as Andrew C concedes. There's a mountain of evidence.

That study you linked to shows that the algorithms produce lower accuracy results for women, blacks and 18-30 year olds.

Are you suggesting that the police and tech industry are also sexist and ageist?
 
Help me understand, folks: the Met are using FR tech to pick out (identify) known people on an existing watchlist. Surely it wouldn’t make one iota of difference whether the technology had inbuilt racism; it’s looking for those on the watchlist, isn’t it?

Arguments about racism and the watchlist itself are secondary to my question.
 
Help me understand, folks: the Met are using FR tech to pick out (identify) known people on an existing watchlist. Surely it wouldn’t make one iota of difference whether the technology had inbuilt racism; it’s looking for those on the watchlist, isn’t it?

Arguments about racism and the watchlist itself are secondary to my question.
To address specifically the effects of the (possible!) in-built bias of the tech, if it's not good at identifying black faces, and keeps coming up with false positives, then a lot more black people than white people are going to get stopped for no reason, and probably put through the ringer, and possibly wrongly prosecuted.
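The arithmetic behind that claim is worth spelling out. A minimal sketch, with invented false-positive rates (not measurements of any real system): if the tech scans similar numbers of faces from two groups but wrongly flags one group's faces more often, wrongful stops scale directly with that error rate.

```python
# Illustrative only: the false-positive rates below are invented,
# not measurements of NeoFace or any other real system.
scanned_per_group = 10_000          # faces scanned from each group

false_positive_rate = {
    "white": 0.001,                 # 0.1% of scans wrongly flagged
    "black": 0.005,                 # 0.5% -- five times higher
}

for group, fpr in false_positive_rate.items():
    expected_wrongful_flags = scanned_per_group * fpr
    print(f"{group}: ~{expected_wrongful_flags:.0f} people wrongly "
          f"flagged per {scanned_per_group:,} scans")

# Same behaviour from everyone scanned, but one group gets stopped
# five times as often, purely because of the error-rate gap.
```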
 
I think you misunderstand the criminal justice system if you think people may end up being wrongly prosecuted. If the system suggests a match but is wrong, yes, I can see someone being stopped unnecessarily. Maybe it is asking too much for them to identify themselves and assist the police. That’s another question though, isn’t it?

And I thought something was said to be put through the ‘wringer’, like an old device my mum had ...
 
I think you misunderstand the criminal justice system if you think people may end up being wrongly prosecuted. If the system suggests a match but is wrong, yes, I can see someone being stopped unnecessarily. Maybe it is asking too much for them to identify themselves and assist the police. That’s another question though, isn’t it?

o_O
 
Agreed, hence my point about being specific. There are 40 or so other forces...

Just a personal gripe. Take or leave as you wish.
Fair enough.
Although this is fascinating:

Summary
This data shows that:

  • in 2017/18, the biggest difference in the arrest rates between Black people and White people was in Dorset (where Black people were almost 12 times as likely to be arrested as White people), followed by Devon and Cornwall (where Black people were 10 times as likely to be arrested as White people)
  • Black people had the highest arrest rates per 1,000 people in every police force area for which there was data
  • the arrest rate for Asian people was 3 times higher than the rate for White people in West Mercia
  • the arrest rate for people with Mixed ethnicity was 3 times the rate for White people in Essex, Gloucestershire and Staffordshire
  • in London, there were 19 arrests for every 1,000 ethnic minority people compared with 12 arrests for every 1,000 White people
https://www.ethnicity-facts-figures...and-the-law/policing/number-of-arrests/latest
 
Those cops are sexist as well: arresting more men than women. They need to arrest more women to balance those figures, right?

What if more of the suspects (as described by the victims) are men though? How should they get round that awkward little fact?
 
To address specifically the effects of the (possible!) in-built bias of the tech, if it's not good at identifying black faces, and keeps coming up with false positives, then a lot more black people than white people are going to get stopped for no reason, and probably put through the ringer, and possibly wrongly prosecuted.
Not sure I agree. The software, I think, puts up a list of possible ‘hits’ for a human operator to check. So, face in crowd, software says ‘it could be this guy off your watchlist, or maybe this guy’. Operator checks mugshots of the possible hits and compares to the real time image, making a decision as to whether a stop is justified or not. If it’s a poor match, nobody gets a tug; a better match, somebody gets asked to ID themselves.
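That's a fair description of how these trials are reported to work, but the operator only ever sees what the software surfaces. A minimal sketch (hypothetical scores, threshold, and watchlist names, none of them from the real deployment) of why a human check doesn't remove the bias: if the model systematically over-scores non-matching faces from one group, more innocent members of that group clear the threshold and reach the operator in the first place.

```python
# Hypothetical sketch of the candidate-list stage. Scores, threshold
# and watchlist names are invented; the point is structural.
THRESHOLD = 0.80

def candidates_for_review(similarity_scores, threshold=THRESHOLD):
    """Watchlist entries the software puts in front of the operator."""
    return [(person, score) for person, score in similarity_scores
            if score >= threshold]

# Innocent passer-by A (model well calibrated for this group's faces):
scores_a = [("watchlist_17", 0.41), ("watchlist_52", 0.33)]
# Innocent passer-by B (model over-scores this group's faces):
scores_b = [("watchlist_17", 0.86), ("watchlist_52", 0.79)]

print(candidates_for_review(scores_a))  # [] -> never reaches the operator
print(candidates_for_review(scores_b))  # [('watchlist_17', 0.86)] -> likely stop
```

And when a borderline candidate does reach them, the operator is comparing a mugshot to a live image of exactly the faces the model found hardest, so the human check inherits the same difficult cases.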
 
To address specifically the effects of the (possible!) in-built bias of the tech, if it's not good at identifying black faces, and keeps coming up with false positives, then a lot more black people than white people are going to get stopped for no reason, and probably put through the ringer, and possibly wrongly prosecuted.

You argue bias and then ignore your own evidence that shows greater "bias" against women.

The worst delta value (in your evidence) for black faces is 7.7%, but the worst delta for women is 19.2%.

[Screenshot (2019-05-20 at 21.29.54) by The Biglebowski, on Flickr]
 
Those cops are sexist as well: arresting more men than women. They need to arrest more women to balance those figures, right?

What if more of the suspects (as described by the victims) are men though? How should they get round that awkward little fact?
What we need is some way to compare the number of arrests to the number of convictions for different ethnicities - that might give us some indication of whether a disproportionate number of black people were being arrested arbitrarily. Then we could do something similar with men and women, to see if something similar were going on there, as you suggest.
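A rough sketch of that comparison, with invented figures (the real ones are in the charts and the link below): on this logic, a group whose arrests convert to convictions at a much lower rate is being arrested more arbitrarily.

```python
# Invented illustrative figures; the real numbers would come from the
# arrests and prosecutions/convictions pages linked in this thread.
stats = {
    # group: (arrests per 1,000 people, convictions per 1,000 people)
    "white": (12, 6),
    "black": (19, 7),
}

for group, (arrests, convictions) in stats.items():
    # A low conviction-to-arrest ratio suggests more arrests that never
    # led anywhere -- one possible marker of arbitrary arrests.
    print(f"{group}: {convictions / arrests:.0%} of arrests led to conviction")
```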

[chart image]


That's interesting!

[chart image]


Hmmm...

https://www.ethnicity-facts-figures...tribunals/prosecutions-and-convictions/latest
 
Am I reading that correctly? A racial bias against white people because they’ve got the highest conviction ratio?
 
Not sure I agree. The software, I think, puts up a list of possible ‘hits’ for a human operator to check. So, face in crowd, software says ‘it could be this guy off your watchlist, or maybe this guy’. Operator checks mugshots of the possible hits and compares to the real time image, making a decision as to whether a stop is justified or not. If it’s a poor match, nobody gets a tug; a better match, somebody gets asked to ID themselves.
I dunno. If the software is saying "That might be a guy off your watchlist!" about more random black punters than white...
You argue bias and then ignore your own evidence that shows greater "bias" against women.

The worst delta value (in your evidence) for black faces is 7.7%, but the worst delta for women is 19.2%.

[Screenshot (2019-05-20 at 21.29.54) by The Biglebowski, on Flickr]
Not sure what your point is - that gender bias is also built into the software? That's ... a relief. What?
 
Am I reading that correctly? A racial bias against white people because they’ve got the highest conviction ratio?
I don't think you are, no. What we're looking for is arbitrary, unnecessary arrests, remember, perhaps indicating racial bias.
 

