
Can we program robots to not become sexist and racist?



By: Catherine Tan


While humans are biased, robots don’t assume stereotypes. They don’t think that Black men are more dangerous or that women belong in the home – or so we thought. In a recent experiment, programmed robots repeatedly identified blocks with a Black man’s face as “criminal” and chose blocks showing women and people of color when asked for representations of a “homemaker” or a “janitor.”


This is not the first case of biased artificial intelligence algorithms. In past years, crime prediction algorithms have unfairly labeled innocent Black and Latino people as criminals, and facial recognition systems have struggled to accurately identify people of color.


The most common way to train virtual robots is with the CLIP model. This model learns to visually classify objects by scraping billions of images and their text captions from the internet. The biggest challenge is finding data sets that aren’t biased – a feat that Abeba Birhane, a senior fellow at the Mozilla Foundation, acknowledges is nearly impossible.
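
To make the idea concrete, here is a minimal sketch (not from the article) of how a CLIP-style model ranks an image against text labels, using Hugging Face’s transformers library. The model name, image file, and label list are illustrative assumptions; the point is that whatever stereotypes the web-scraped training data carried can surface in these rankings.

```python
# Illustrative sketch of CLIP-style zero-shot classification.
# The image file and labels below are hypothetical examples.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("person.jpg")  # hypothetical input photo
labels = ["a doctor", "a homemaker", "a janitor", "a criminal"]

# CLIP scores how well the image matches each caption; softmax turns
# those scores into probabilities across the candidate labels.
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
probs = outputs.logits_per_image.softmax(dim=1)

for label, p in zip(labels, probs[0].tolist()):
    print(f"{label}: {p:.2f}")
```

Because the model was trained on uncurated internet captions, nothing in this matching step stops it from associating certain faces with labels like “criminal” or “homemaker.”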


So far, robots have largely been perceived as neutral because of the activities they are typically tasked with. Robots moving objects around warehouses or sorting objective numerical data have little opportunity to show bias. However, amid the pandemic and labor shortage, companies are looking to develop more robots to take over human tasks such as caring for patients or delivering goods. These tasks would involve far more human-robot interaction and could reveal stereotypical biases in a robot’s programming that previously went undetected. This shift will challenge researchers to fix AI datasets before adoption of the technology spreads too far.


For now, scientists want robots to suspend judgement until they are given enough information, rather than assuming Black men fit the category of “criminal” (a rough sketch of that idea follows below). Researchers also encourage companies to audit the AI algorithms they use to train their robots, diagnose flawed behavior, and fix any persisting issues. Who will emerge victorious in the race between correcting robot biases and the corporate gold rush to adopt cheaper robot labor?
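
As a purely illustrative sketch (the article does not describe any specific implementation), “suspending judgement” could look like a simple confidence check: the robot acts on a label only when the model is sufficiently sure, and otherwise declines to decide. The threshold value and function below are assumptions for the example.

```python
# Hypothetical example of a robot abstaining from low-confidence judgements.
CONFIDENCE_THRESHOLD = 0.90  # illustrative value, not from the article

def classify_or_abstain(probabilities: dict[str, float]) -> str:
    """Return the top label only if its probability clears the threshold."""
    label, score = max(probabilities.items(), key=lambda item: item[1])
    if score < CONFIDENCE_THRESHOLD:
        return "no judgement: not enough information"
    return label

# A weak, ambiguous match is rejected rather than acted on.
print(classify_or_abstain({"criminal": 0.40, "doctor": 0.35, "janitor": 0.25}))
```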
