Some people are unhappy about what Alexandria Ocasio-Cortez is saying here. People don't like to imagine that software can have politics, intentionally or otherwise.
Socialist Rep. Alexandria Ocasio-Cortez (D-NY) claims that algorithms, which are driven by math, are racist pic.twitter.com/X2veVvAU1H
— Ryan Saavedra (@RealSaavedra) January 22, 2019
While I was earning my master's degree, I took a number of courses on the ethics of technology and the history of technology in general. In one of my classes we learned that bridges, yes, bridges, can have politics. There was a planner, Robert Moses, who was hired by New York City to design and build bridges. Given that much of NYC sits on islands and there's a lot of water there, building bridges is a pretty big deal. Robert Moses was a racist. He also hated poor people. So, if you're hired to build a bridge from one part of the city to another part with beautiful parks and outdoor spaces, and you want whites and rich people to use those spaces but not poor people, how would you do that?
If you build a bridge that uses traditional arches underneath with no top support, any vehicle can cross. However, if you build a bridge with overhead structure that imposes a maximum height, then you can limit the types of vehicles that can cross it. If you build the bridge low enough, you can prevent buses from crossing at all. Buses that would be carrying poor people of color.
It's just a bridge, how can it be racist? A bridge is just a thing. Something built by people. But those people have biases and intentions, and those get built into the technology. While a bridge in general may not be racist, this one IS, because of the racism that went into building it.
If a bridge can have biases intentionally built into it, there is no doubt that software will have biases built into it too. We've seen time and again beauty-contest algorithms where the AI didn't like dark-skinned women. In those cases, the people building the training set of images had biases: they didn't include a significant number of dark-skinned women in the training set.
Soap dispensers that can't detect dark-skinned hands exist because the engineers working on them didn't think to test the sensor on someone with dark skin.
If you have ever had a problem grasping the importance of diversity in tech and its impact on society, watch this video pic.twitter.com/ZJ1Je1C4NW
— Chukwuemeka Afigbo (@nke_ise) August 16, 2017
These aren't intentional biases. It'd be difficult to imagine a group of engineers all sitting around a room saying, "Wouldn't it be great if we prevented dark-skinned people from properly washing their hands? Mwahahahaha." No, that's not what happened. What happened is that the QA team was made up of people who look more like me. The dispenser worked perfectly for them, so QA passed! This isn't an intentional bias, but it's a bias nonetheless. It's called availability bias: if the only people available to you look a certain way, you don't think about the people who aren't immediately available.
Everyone does it. More people need to be aware of the fact that there are people different from them, and for white people this is critical. It's similar to when a white person writes an article in a major newspaper about how racism has significantly declined.
It is time for organizations to recognize this and create teams to ensure that ethics and biases are considered when developing and selling novel technologies – or, in the case of bridges, old technologies repurposed for modern uses.