Ethics in Technology Matters: Alexandria Ocasio-Cortez Is Right, We Instill Our Biases in Technology

Some people are unhappy about what Alexandria Ocasio-Cortez is saying here. People prefer to imagine that software can’t have politics, intentionally or otherwise.

[Embedded tweet]

While I was earning my master’s degree, I took a number of courses on the ethics of technology and the history of technology in general. In one of my classes we learned that bridges, yes, bridges, can have politics. Robert Moses was an urban planner who was hired by New York City to design and build bridges. Given that NYC is an island and there’s a lot of water there, building bridges is a pretty big deal. Robert Moses was a racist. He also hated poor people. So, if you’re hired to build a bridge from one part of the city to an area with beautiful parks and outdoor spaces, and you want white and wealthy people to use those spaces but not poor people, how would you do that?

If you build a bridge with traditional arches underneath and no top support, any vehicle can cross. However, if you build a bridge with a maximum clearance height, you can limit the types of vehicles that can cross it. Build it low enough and you can prevent buses from crossing at all, buses that would be carrying poor people of color.

It’s just a bridge, how can it be racist? A bridge is just a thing, something built by people. But those people have biases and intentions, and those get built into the technology. While a bridge in general may not be racist, this one is, because of the racism that went into building it.

If a bridge can have biases intentionally built into it, there is no doubt that software will have biases built into it too. We’ve seen it time and again, like the beauty-contest algorithms whose AI didn’t rate dark-skinned women as beautiful. In those cases the people building the training set of images had biases: the engineers didn’t include a significant number of dark-skinned women in the training set.
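
One place this kind of bias is easy to catch is in the raw counts of the training data itself. Here is a minimal sketch of that kind of check; the labels and numbers are made up for illustration, not taken from any real beauty-contest dataset:

```python
from collections import Counter

# Hypothetical skin-tone labels for a training set of face images; in a real
# project these would come from the dataset's own metadata.
training_labels = (
    ["light"] * 9000 +
    ["medium"] * 800 +
    ["dark"] * 200
)

counts = Counter(training_labels)
total = len(training_labels)

for group, count in counts.most_common():
    share = count / total
    print(f"{group:>6}: {count:5d} images ({share:.1%})")
    if share < 0.10:  # arbitrary threshold for "underrepresented"
        print(f"        warning: '{group}' is underrepresented in the training data")
```

A check this simple wouldn’t fix the bias, but it would at least force the conversation before the model ships.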

Automatic soap dispensers fail to detect dark-skinned hands because the engineers working on them didn’t think to test the sensor on someone with dark skin.

[Embedded tweet]

These aren’t intentional biases. It’d be difficult to imagine a group of engineers sitting around a room saying, “wouldn’t it be great if we prevented dark-skinned people from properly washing their hands? Mwahahaha.” No, that’s not what happened. What happened is that the QA team was made up of people who look more like me. The dispenser worked perfectly for them, so QA passed! This isn’t an intentional bias, but it’s a bias nonetheless. It’s called availability bias: if the only people available look a certain way, you don’t think about the people who aren’t immediately available.

Everyone does it, but more people need to be aware of the fact that there are people different from them. For white people this is especially critical. It’s similar to when a white person writes an article in a major newspaper about how racism has significantly declined.

It is time that organizations recognize this and create teams to ensure that ethics and biases are considered when developing and selling novel technologies, or, in the case of bridges, old technologies repurposed for modern uses.

Black Mirror: Nosedive, Authenticity, and Lost Connections

I just finished watching the Black Mirror episode Nosedive, an interesting episode about the impact of continually rating people for every social interaction. It explores what happens when someone who was previously very highly rated has a very bad day. Throughout the episode it’s implied that everyone is just a series of unfortunate events away from dropping out of their current social stratum into a lower one where they’d be unable to function in society. Ratings determine which jobs a person can or can’t have; dropping too low means you’re not worthy of that job. In many cases network effects and game-theory-style logic come into play, where you have to judge whether a low-ranked person, or a person who’s currently out of favor, would negatively impact your image and drop you from a person of respect to a person of disrepute.

This episode was uncomfortable to watch because in a lot of ways it hits close to home: it deals with a major reason why I don’t like social media. I don’t like the constant need for validation through pictures, likes, and comments. I’ve generally tried to avoid Facebook lately because it feels inauthentic and creepy: Facebook itself tracks what you do online and partners with companies to track your shopping habits offline. Combine that with the pressure to display only the best of your life on platforms like Instagram, and it can lead to depression.

Many articles attribute this to the fact that you’re comparing your messy everyday life to what people are willing to post, which typically represents only the best parts of their lives: their happy dogs, walking in a vineyard, going surfing, or some new thing they bought. Even knowing you’re doing this doesn’t really help. However, I think there are a few reasons beyond that. For one, it forces you to live an inauthentic life, which is one of the major themes of Nosedive. The main character, Lacie, knows she’s putting on a show and clearly has serious anxiety about behaving that way. Her brother, who lives a more authentic life, doesn’t care as much about his social media score and directly asks Lacie to return to her authentic self (“remember when we had real conversations?”).

Being an inauthentic version of yourself is a type of acting, and it means pushing down the values you actually believe in. This is something Lost Connections identifies as a root cause of depression: when our intrinsic values don’t align with society’s values and we must adopt society’s values over our own, we become depressed. In the episode, Lacie only becomes aware that it’s possible to reject those norms when she’s picked up by a trucker rated 1.4 out of 5. This woman allows her to reflect on her experience as her own rating declines and bottoms out.

However, it isn’t until she’s been rejected by society and put into a prison of sorts that she’s able to find a truly authentic interaction. It starts out rage-filled, but eventually becomes filled with joy as the two people in the prison are able to be authentic versions of themselves.

While our society doesn’t have the intensity around social media portrayed in the episode, it’s possible we could move in that direction over time. For us to really have authentic interactions, we need to find people who support us being our authentic selves, even when there are people in our lives who might not fully support our decisions, or who make it more difficult to be authentic.

Tech and Art

Last night I asked for a writing prompt, not for my blog, but for my planned creative writing stream on Twitch.tv. Instead of a fictional writing prompt, I got one requesting that I write about the intersection of technology and art. This is a pretty interesting space, to be honest, as there are folks building crazy things for Burning Man, Soak in Oregon, and just for fun.

The laser reflecting off the windmill is pretty interesting; I haven’t seen anything quite like it before. When I used to drive between Austin and Santa Fe on a regular basis, the windmills in east Texas always got me excited, even when it was just the flashing light on top. The elegance of the blades juxtaposed with the barren landscape was a great sight to behold.

The gif also brought to mind another Dutch technologist/artist. This creator uses a form of machine evolution to create fascinating “animals” that walk around on beaches, evolving to move more efficiently without wandering into the ocean.

A book I read a number of years ago, “Design Driven Innovation,” talks about how combining art with an understanding of how people use objects enables a great deal of innovation in our products. What might seem useless today, such as a laser on a windmill, may actually help pave the way for new energy-transmission methods, or perhaps another way to increase the amount of energy a windmill actually generates.

I’ll close with my thoughts about an event in Eindhoven, The Netherlands, that I really loved: the Glow festival, which makes sense for a city largely built on the success of Philips. It’s a festival where the entire city center is turned into a series of light-art exhibits, combining the aesthetics of the old city with modern lights. I really enjoyed it, and if you’re living in Europe I strongly suggest you check it out!

Capitalism vs. Robots: Which Is More Terrifying?

In an article that recently resurfaced on Reddit, famed astrophysicist Stephen Hawking argues that we should fear capitalism more than robots. The timing is somewhat interesting, given that it’s an election cycle and the two populist candidates are opposites in many regards, especially in terms of democratic socialism vs. crony capitalism (Sanders vs. Trump). This matters in the broader context of emerging technology as well: many technology leaders, such as Elon Musk, have expressed fear of AI, while other leaders are running full steam ahead toward more and more automation.

Hawking isn’t the only person thinking about the economy and technology, though. Warren Buffett just released Berkshire Hathaway’s annual report, with some pretty stark warnings about the future of capitalism in action at the corporate level, indicating that innovation does have a dark side. While he’s speaking as a manager, there are economists looking into this too, and in the book The Second Machine Age the authors argue that the best is still to come, because man and machine work best together, not separately.

Unfortunately, this will only raise the bar on the skills required for jobs rather than expanding opportunities. A perfect example of this will be Uber. Being an Uber driver isn’t a difficult job because of skill requirements; it’s difficult because it’s a boring job that is relatively tiring. Uber has been pushing its prices down over a months- and years-long process, which will continue with the introduction of “autonomous” cars, or robot cars. At that point a large number of low-skilled workers will find themselves out of a job, including people I know and probably people you know. This has been Uber’s plan for a long time, as they understand that people are the company’s biggest cost and risk, especially in light of the mass shooting in Michigan.

Uber isn’t the only major company looking to replace workers like this. In fact, it’s likely that a lot of white-collar jobs are going to go this route as well, including in industries that notoriously relied on people who then made unethical decisions, such as the financial industry. We’ve heard of high-frequency trading (HFT), which is basically a set of algorithms that make decisions about buying and selling stocks based on micro-trends. However, this is going to continue expanding into newer areas. It’s been well remarked that most brokers are no better than a coin flip (The Black Swan, The Drunkard’s Walk, and Thinking, Fast and Slow all reference this), so it’s highly likely that algorithms will do better than people at picking winners and losers on the stock market. It’s also likely that those algorithms will have access to more data, faster, than any person could ever analyze and act upon.
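
To make the idea of “algorithms trading on micro-trends” concrete, here is a minimal sketch of about the simplest rule-based signal there is, a moving-average crossover. The function names, window sizes, and prices are mine for illustration; real high-frequency systems act on far richer data at far higher speeds:

```python
def moving_average(prices, window):
    """Average of the last `window` prices."""
    return sum(prices[-window:]) / window

def signal(prices, short_window=5, long_window=20):
    """Return 'buy', 'sell', or 'hold' from a naive crossover rule."""
    if len(prices) < long_window:
        return "hold"  # not enough history yet
    short_avg = moving_average(prices, short_window)
    long_avg = moving_average(prices, long_window)
    if short_avg > long_avg:
        return "buy"   # recent prices trending above the longer baseline
    if short_avg < long_avg:
        return "sell"  # recent prices trending below the longer baseline
    return "hold"

# Example with made-up prices drifting gently upward:
prices = [100 + 0.1 * i for i in range(25)]
print(signal(prices))  # -> buy
```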

This interaction between capitalism and automation creates huge risks for the economy. A few years ago there was a “flash crash,” which was basically caused by the high-frequency trading algorithms I mentioned above. As more and more of the financial industry comes under the purview of robo-traders, these sorts of events will become more likely. Meanwhile, these institutions have pushed most of the risk onto the public while retaining the bulk of the profits from these robots.

As these trends continue across industries, companies’ local optimization toward automation and more robots is going to push people out of jobs faster than new categories of jobs can open up. I think it’s likely we’ll see more companies go the route of Uber: using tools like Amazon’s Mechanical Turk to get a process started before they invest effort and energy into automating it. Once a process is shown to be successful, the effort to remove the human element will continually increase until those workers are out of a job. What we will eventually see is white-collar migratory workers going from one type of tech job to another, only to be replaced by automation in the long run.

The impact on the economy in the long run, and on the human condition in the short term, will be catastrophic, as our current institutions are not designed to handle this sort of change in the nature of labor. The incentives for this behavior have been in place for decades and have been pushing bad actors to be worse, such as Turing Pharmaceuticals’ CEO price-gouging dying patients because the market could support it.

Government Policy and Technology Innovation

In a way that mirrors yesterday’s court ruling, the FCC announced they are going to investigate and likely force serious changes in the world of set-top boxes. The FCC at one point forced and supported the cable industry in controlling the types of set-top boxes available to consumers (set-top boxes are cable boxes; Roku and Apple TV are cable-less competitors). Since then, we’ve suffered with mediocre and extremely expensive boxes, boxes that cost $16/month, so that over time you end up paying for the box ten times over. The gist of the issue is whether or not to allow companies to make “soft” cable cards. Right now, if you want to decode any video coming over a coax cable from Comcast, you must have a physical card to do the decoding. There’s nothing preventing this from being accomplished entirely in software once you get the signal into the box, and that’s what this proposal is trying to encourage.

Granted, it has taken a while for the FCC to wake up, look at the competitive landscape, and see that this isn’t in the public interest. Defining exactly what is in the public interest is difficult because everyone sees it in a different light. However, it’s pretty obvious that something you end up paying $1,920 for over the span of ten years isn’t in the public interest. The competition, Roku and Apple TV, each cost between $100 and $200 one time, and you can use them until they die, which will probably take something like ten years. I’ve had my Roku HD for five years now and it still works great. It would make perfect sense for me to buy a version, assuming I had cable at all, that would allow me to watch cable through it: everything in one place.
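
For what it’s worth, the arithmetic behind that comparison is easy to sanity-check; the ten-year horizon and the $150 purchase price below are assumptions for illustration:

```python
# Rough comparison of renting a cable box vs. buying a streaming box outright.
monthly_rental = 16      # dollars per month for a typical cable box
years = 10
rental_total = monthly_rental * 12 * years   # 16 * 120 = 1,920

streaming_box = 150      # assumed one-time price of a Roku or Apple TV

print(f"Cable box rental over {years} years: ${rental_total:,}")
print(f"Streaming box, one-time purchase:    ${streaming_box}")
print(f"Difference:                          ${rental_total - streaming_box:,}")
```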

This is the type of regulation that government should be celebrated for encouraging. Granted, they screwed it up to begin with and are only righting a wrong now, but they’re on the right path. Regulation like net neutrality is a similar decision that can spur innovation. Looking at T-Mobile’s Binge On plan, you can see why we need it: if I’m a small streaming company, or, ya know, YouTube, I look at that platform and see how it’s slanted against me and how it limits what I’m capable of delivering on T-Mobile’s network.

Unlike in the case of the FBI forcing technology companies to change their technology and reduce security, it’s nice to see an organization that’s willing to at least consider improving opportunities for innovators. Sure, it may look like picking winners and losers, but when most policy is driven by the current winners, picking them to lose sure looks more like balancing the playing field to me.