Racism (28): Shooter Bias

When called to the scene of an ongoing crime, police officers often have to make split-second decisions about whether or not to shoot. There's chaos, darkness, running, shouting, perhaps shooting, and no time to determine who's who and who's likely to do what. Training can help, but in most cases officers simply rely on instinct. In other words, these are ideal conditions for personal biases to reveal themselves.

Because of the nature of those situations, officers sometimes make mistakes and shoot innocent people or unarmed suspects. Now, somewhat unsurprisingly, there's research telling us that white shooters are more likely to shoot unarmed black suspects than unarmed white ones. This is called shooter bias, and it's not the monopoly of police officers, as lab tests with ordinary citizens have confirmed. (More here).

It seems that many people have internalized the stereotype of the dangerous black man, even those who would not think of themselves as having done so.

More posts in this series are here.

Gender Discrimination (32): Gender Stereotyping of Robots

Our prejudices must be very deeply ingrained if we even stereotype robots. From an interesting new paper:

Previous research on gender effects in robots has largely ignored the role of facial cues. We fill this gap in the literature by experimentally investigating the effects of facial gender cues on stereotypical trait and application ascriptions to robots. As predicted, the short-haired male robot was perceived as more agentic than was the long-haired female robot, whereas the female robot was perceived as more communal than was the male counterpart. Analogously, stereotypically male tasks were perceived more suitable for the male robot, relative to the female robot, and vice versa. Taken together, our findings demonstrate that gender stereotypes, which typically bias social perceptions of humans, are even applied to robots. (source, source)

If we can’t manage to treat inanimate robots without sexism and prejudice, then what hope is there for our fellow human beings of the other gender?

Interestingly, the complaint seems to go both ways. Robots, in the broad sense of the word, have been known to exhibit sexism: Siri and Google, for example, are said to favor "male terms" and solutions when autocorrecting or suggesting phrases. Obviously, prejudice in robots and in software, to the extent that it exists, only reflects the prejudice of their makers, as the sketch below illustrates.
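
As a minimal sketch of that mechanism, here is a toy frequency-based autocomplete in Python. The corpus, the `suggest` helper, and all names are invented for illustration; this is not how Siri or Google actually work, but it shows how a system that merely echoes its input data will reproduce whatever skew that data contains.

```python
from collections import Counter

# Hypothetical toy corpus, deliberately skewed toward male pronouns.
# This is an illustrative assumption, not data from any real system.
sentences = [
    "the doctor said he would call",
    "the doctor said he was busy",
    "the doctor said she would call",
]

# Build trigram counts: which word tends to follow each two-word prefix.
counts = {}
for s in sentences:
    words = s.split()
    for i in range(len(words) - 2):
        prefix = (words[i], words[i + 1])
        counts.setdefault(prefix, Counter())[words[i + 2]] += 1

def suggest(w1: str, w2: str) -> str:
    """Suggest the most frequent word following the prefix (w1, w2)."""
    return counts[(w1, w2)].most_common(1)[0][0]

# The corpus mentions a male doctor twice and a female doctor once,
# so the "autocomplete" suggests the male pronoun.
print(suggest("doctor", "said"))  # -> he
```

Nothing in the code "decides" to be sexist; the skew comes entirely from the corpus its makers fed it, which is exactly the point made above.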

More posts in this series are here.