A team of scientists at the University of Virginia is said to have developed an artificial intelligence that can detect and quantify the physiological signs associated with racial bias. In simple terms, this wearable device will be able to detect when someone is having racist thoughts.
If the above paragraph sounds ridiculous, that’s because, taken at face value, it is. In actuality, the system will not auto-magically determine that a person is racist merely by looking at them or measuring their pulse, nor will it be able to read people’s thoughts. But it might still turn out to be a breakthrough, and here is how.
As of now, the standard way of identifying racial bias is a method called the Implicit Association Test (IAT), in which participants are shown a series of images and words and asked to associate them with “light skin”, “dark skin”, “good”, and “bad”. You can see how this works for yourself via the Harvard website here.
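For a sense of what the IAT actually measures, a common way to score it is a so-called D-score: the difference in average response times between the two pairing conditions, scaled by the spread of all responses. The sketch below is a simplified illustration of that idea in Python; the numbers and function name are made up for the example and are not taken from the UVA paper.

```python
import statistics

def iat_d_score(congruent_rts, incongruent_rts):
    """Simplified IAT D-score: difference in mean response times (ms)
    between the incongruent and congruent trial blocks, divided by the
    pooled standard deviation of all trials. Larger positive values
    suggest a stronger implicit association in the congruent direction."""
    pooled_sd = statistics.stdev(congruent_rts + incongruent_rts)
    return (statistics.mean(incongruent_rts) - statistics.mean(congruent_rts)) / pooled_sd

# Hypothetical reaction times in milliseconds from the two pairing blocks.
congruent = [612, 580, 655, 701, 590, 634]      # e.g. "light skin" paired with "good"
incongruent = [748, 802, 690, 765, 820, 710]    # e.g. "dark skin" paired with "good"
print(f"D-score: {iat_d_score(congruent, incongruent):.2f}")
```

Slower responses in the incongruent block push the score up, which is how the test turns reaction-time differences into a measure of implicit association.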
There’s also research indicating that learned threat responses to outsiders can often be measured physiologically. In other words, some people have a physical response to people who look different from them, and that response can be measured when it happens.
The UVA team mentioned earlier joined these two ideas by recruiting 76 volunteer students, having them take the IAT, and measuring their physiological responses with a wearable device as they did so.
You might be wondering how solid any of this is, and whether the results are anything more than fiction. According to the team’s research paper, “Our machine learning and statistical analysis show that implicit bias can be predicted from physiological signals with 76.1% accuracy.”
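To make that claim concrete, here is a minimal sketch of what “predicting implicit bias from physiological signals” could look like: a standard classifier trained on per-participant wearable features against an IAT-derived label, with accuracy estimated by cross-validation. The feature set, model choice, and random placeholder data are all assumptions for illustration; the paper’s actual pipeline may differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical dataset: one row per participant, with summary features
# from a wearable sensor, and a binary label derived from the IAT score.
rng = np.random.default_rng(0)
n_participants = 76
X = np.column_stack([
    rng.normal(72, 8, n_participants),     # mean heart rate (bpm)
    rng.normal(4.5, 1.2, n_participants),  # mean skin conductance (microsiemens)
    rng.normal(33.0, 0.6, n_participants), # skin temperature (deg C)
])
y = (rng.random(n_participants) > 0.5).astype(int)  # 1 = high IAT bias score (placeholder)

# Cross-validated accuracy of a standard classifier. With real recordings
# in place of the random placeholders, this is the kind of estimate the
# quoted 76.1% figure refers to.
model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.1%}")
```

Nothing in a model like this reads minds; it only learns a statistical association between sensor readings and test scores across a group of participants.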
That number isn’t especially high for a machine learning endeavor, and the fact that colored cartoon faces were used to elicit the physiological differences means the result shouldn’t be over-interpreted either.
Quick take: Any ideas the general public might have over some kind of wand-style gadget for detecting racists should be dismissed outright. The UVA team’s important work has nothing to do with developing a wearable that pings you every time you or someone around you experiences their own implicit biases. It’s more about understanding the link between mental associations of dark skin color to badness and the accompanying physiological manifestations.
In that respect, this novel research has the potential to help illuminate the subconscious thought processes behind, for example, radicalization and paranoia. It also has the potential to finally demonstrate how racism can be the result of unintended implicit bias from people who may even believe themselves to be allies.
You don’t have to feel like you’re being racist to actually be racist, and this system could help researchers better understand and explain these concepts.
So to conclude, I’d just say this system doesn’t automatically detect racial bias in people; it predicts it. It sheds some light on the physiological effects associated with implicit bias, much as a diagnostician might initially interpret a cough and a fever as signs of a certain disease even though further examination is needed to confirm the diagnosis. This AI doesn’t label racism or bias, it just points to some of the associated side effects.
The full publication can be found here on arXiv.