SXSW talk given by Clarissa Peterson. Peterson is a UX Designer and educator.

My notes:

Peterson’s talk was informative and, while a little scattered at times, had a clear overall flow.

Peterson opened by talking about how design is meant to fix things: user problems and business problems alike. It is also something you should not cheat your way around. By cutting corners you aren’t fixing anything; you could be creating even worse problems.

Having ethics is our moral duty. It says a lot about who you are as a person and shapes how others view you. Acting ethically means you aren’t harming others, so why not do it?

Websites have a huge impact on people. Even if there aren’t any rules governing them, you still have to live with the consequences. Peterson illustrated this with the 2018 Hawaii missile alert, when the state accidentally sent out an emergency message warning of an incoming missile. Because of the protocol required to send a follow-up clarifying the mistake, it took 38 minutes to retract the alert, and in that time people had already started to panic. Traffic laws went out the window as people sought shelter anywhere they could find it. The incident also strained the telephone system: networks became jammed, so people couldn’t even check whether the alarm was real. Because of the protocol the government had to follow, the state was in total panic for more than half an hour. The design of this system was flawed, and even if no one realized it at the time, the consequences were dramatic.

The government in this case had so many protocols in place, yet still wasn’t prepared for every scenario. You can’t always predict everything that will go wrong, but you should try. There are also problems that don’t look problematic at first, so people don’t believe they should have to fix them, especially if doing so means more work. Peterson noted that people don’t always do what you suggest, even when they should.

Peterson also points out that computers can’t do everything on their own. Computers are programmed by people, and those people are biased and have opinions. Most people don’t think about this, especially when talking about AI, self-driving cars, and other technology in that vein. The algorithms behind these systems can determine what happens in a worst-case scenario, and the programmers’ biases (about age, gender, race, or disability) can affect that. If a self-driving car gets into an accident and has to choose between killing the passenger or a pedestrian, who does it pick? The programmers essentially get to make that decision, whether they base it on protecting the customer who bought the car or on their own biases.

Peterson also used Facebook as an example of an ethics problem in programming. Facebook attributes over 50,000 interests to each user and uses them to categorize people in several different areas. Not all the categories were ethical; some included words like Jew, Nazi, and anti-Semitic. Because the company didn’t have the ethics to eliminate these categories, or didn’t think they would be a problem, it became a huge problem when people found out, and it ended up hurting their business.

Another ethical question in programming is art and nudity. Many sites screen out photos and videos containing nudity, as they should, but in the process they tend to also screen out art that involves nudity. Much classical art features nudity, from Greek and Roman paintings and statues to Michelangelo’s David. With these filtering algorithms, websites were eliminating classical works of art, which upset a lot of people. The real question is how to scan for one but not the other, especially on larger sites that process hundreds of posts and ads every day.

Algorithms sometimes get it wrong, and there isn’t much we can do about it. When we create tools, we are responsible for how people use them. Whether or not a product is being used for its intended purpose, the way customers actually use it is your responsibility.

Whatever your level in the company, you should be looking ethically at what is happening with your product. If you don’t agree with your company’s way of doing things and can’t change it, the best way to keep your morals is to leave. That can be a difficult decision, but if it preserves your principles, it will be worth it.

When designing for the majority of users, you can’t leave out those who are marginalized. While this goes back to accessibility, Peterson mentioned only people with disabilities and those with situational limitations (e.g. limited broadband access). She did say that companies can exclude some people in order to make their products better, but you can’t exclude people based on race, gender, or age.

Diversity on your team can help you catch some of these problems. Peterson then discussed how people use slang to create space for themselves: they exclude certain groups in order to carve out a community of their own. You need to account for this in your product, using it to your benefit without crossing ethical boundaries.

Most of the data that products are based on comes from research with test subjects, and most of those tests are designed by white men. By adding diversity to test design, you can improve your research and create better products. You also need to know more about the existing data you are using; by simply assuming it is good, you could run into problems.

One way to combat this is to interview at least one minority candidate and one woman for each open position. Hiring managers who don’t realize they tend to overlook these groups miss out on some great candidates and a great opportunity to add diversity to their teams.

Peterson ended her talk by saying, “Don’t build evil, and don’t design evil systems,” which I think was a good way to end her speech and is something we should all try to do.

Written by

UX/UI Designer in Austin, TX.
