EWC Community

A Problem Seeded in Robots Now May Eventually Affect the Future



By: Amy Li


People can be sexist, believing one gender is better than another, or racist, believing one race is superior to the rest. Both happen when someone stereotypes a gender or a race, and both are unfortunately common among humans. But robots trained with artificial intelligence have somehow developed the same kind of discrimination against certain people.


Andrew Hundt, one of the researchers on the study, was interviewed by The Washington Post. In the article, Hundt described some of the experiments' results. For example, when asked to bring a child a beautiful doll, a robot would return with a white one. When picking products, the robots usually chose products with white men on the packaging. Lower-paying, lower-status jobs like "homemaker" and "janitor" were most often assigned to Black and Latina women. The system also concluded that women were less likely than men to be doctors. And when deciding who was a "criminal," the robots chose Black men about nine percent more often than white men.


The problem may come from the system itself. The robots were trained with a model called CLIP, which is built by pairing billions of images with their captions. CLIP is popular because using it takes less effort and money than building software from scratch. Researchers trained virtual robots with CLIP and then gave them 62 commands. The results were as described above: the system consistently favored white men. The researchers noted, however, that the robots should not have responded at all, because they were never given enough information to make those judgments.
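The matching step a CLIP-style model performs can be illustrated with a toy sketch. This is not the real CLIP: the vectors and object names below are invented for illustration, and real CLIP embeddings come from neural networks trained on the billions of image-caption pairs described above. The point is only to show how a robot driven by such a model picks whichever object best matches a text command, so any bias baked into the embeddings flows straight into the robot's choice.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Pretend embeddings, made up for this sketch. In a CLIP-style model,
# an image encoder and a text encoder map pictures and captions into
# the same vector space, so similar things land close together.
command_embedding = [0.9, 0.1, 0.3]          # e.g. "pack the doll in the box"
object_embeddings = {
    "block": [0.1, 0.8, 0.2],
    "doll":  [0.8, 0.2, 0.4],
    "ball":  [0.3, 0.5, 0.1],
}

# The robot acts on whichever object's embedding best matches the command.
# If the training captions linked a word mostly to images of one group of
# people, that association shows up in these scores too.
best = max(
    object_embeddings,
    key=lambda name: cosine_similarity(command_embedding, object_embeddings[name]),
)
print(best)  # → doll
```

Because the model only ranks matches, it never says "I don't have enough information"; it always returns its top-scoring option, which is one reason the robots answered commands they should have refused.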


"With coding, you usually just build new software on top of the old," said Zac Stewart Rogers, a supply chain management professor at Colorado State University. When a program is built on flawed foundations and people keep asking it to "do more," he explained, problems follow. Robots are still perceived as neutral because their use has been limited to simple tasks, like moving cargo and goods or sweeping the floor. But as people program robots to take on more advanced tasks, the bias against people of color and women becomes obvious. If people don't want 1865, around the time segregation began, to repeat itself, robots need new programming before taking on more important jobs.
