Not a week goes by without some huge ethical issue erupting from the brotopia known as Silicon Valley. Without injecting my own opinion into the mix, let’s just take a sampling of recent headlines:
- “Twitter Has Started Researching Whether White Supremacists Belong on Twitter”
- “Facebook lawyer says users ‘have no expectation of privacy’”
- “On YouTube’s Digital Playground, an Open Gate for Pedophiles”
I could go further, but nothing is going to top that last one. You get the point. Everything is on fire. Should you want to dig deeper, I would suggest you pick up Mike Monteiro's latest book, Ruined by Design, which is best described as 200 pages of anger lovingly wrapped in humor.
Many of the issues addressed in Ruined are of global significance because the products involved all have a global reach. If there is any risk with a book like “Ruined,” it’s that the issue of design and ethics begins to look very much like a problem that only applies to Bay Area companies.
The problem is much bigger than Silicon Valley.
When ethics takes a backseat to business
Design constraints are often defined by unreasonable business requirements. This is a classic dilemma for anyone to work through and successfully navigating the issue takes both patience and negotiation.
Business requirements are almost always written to ask for more than what is absolutely needed. The resulting struggle comes down to how much a company would like to make versus how much it needs to survive.
To be clear, you should want your employers to be successful. Making money is a good thing — but not if rushing to meet unreasonable deadlines is going to get your customer killed.
Generally, when we think of bad things that can happen to us on airplanes, our thoughts gravitate toward terrorists or freak accidents caused by bad weather or just terrible luck. Decades ago, people might have worried about the trustworthiness of the aircraft they were on, but those concerns have been mainly replaced by the comfort of good airports, good pilots, and good planes.
That all changed recently when a flaw in the Boeing 737 Max caused two of the relatively new planes to crash.
Some 389 of Boeing's 737 Max jets had been delivered before the second crash grounded the global fleet. With 5,000 more planes already on order, you can be assured that the Chicago-based Boeing will do everything in its power to save face. But how do you save face when 346 people are dead?
As with most disasters, the fatal flaw in the Boeing story can be traced back to an avoidable issue.
“As Boeing rushed to get the aircraft done, many of the employees described a compartmentalized approach, each focusing on a small part of the plane. The process left them without a complete view of a critical and ultimately dangerous system,” The New York Times reported.
In a rush to get business done, everyone put their heads down to meet the deadline. Nobody worried about the bigger picture because that was someone else’s job.
At its core, this sounds like a project management problem, but calling it that would be passing the buck. Sure, a project manager or a team of project managers failed, but blaming them alone is just falling back on the "not my job" excuse.
With the benefit of time, transparency becomes possible because the teams involved have a chance to see what the other groups are doing. Such interrogation of the system shouldn’t be a luxury, it should be a requirement. Regardless of whether you are working on a website CMS or a system intended to hurl people into the air and then gracefully land them later in different time zones, you need time to do both the job AND the testing required to ensure the job was done correctly.
Far too often, egos get in the way.
That's because it feels far better to be a team player and pitch in to finish the big project. It's not all that hard to cut a corner if it means you might hit the deadline, and if you manage to pull it off... perhaps you're on the way to a promotion!
The other side of that coin involves pulling the proverbial Andon cord, and bracing for the tough conversations that follow. Maybe they’ll see it your way, and you’ll be thanked. Perhaps you’ll be expelled from the pack. It’s always hard to tell which way those conversations will go.
The peer pressure against anyone who considers being a squeaky wheel is immense. You can easily picture the lone voice of reason trying to step forward, saying "I don't understand how this impacts X," only to be immediately shut down by a co-worker or supervisor whose retort is undoubtedly some form of "that's not our job" or "we're gonna miss the deadline."
When you hear dissent or concern about the system you're potentially impacting, that's your cue to pay attention. Even if you don't share the fears being voiced, perhaps that's because you lack the perspective needed to see the problem correctly. As Monteiro has expressed consistently throughout his many writing projects, "I don't know" should be the most confident phrase you ever utter, and you should say it often. Admitting you don't know allows you to begin investigating, and that investigation is what was missing at Boeing.
The sad thing about the situation at Boeing is that, given a couple of weeks to investigate, the company was able to find the problem. It wasn't an unsolvable problem. It was an unethical deadline.
Sometimes the most ethical thing you can do is demand more time.
There were many points along the way where someone at Boeing could have demanded more time, and if that time wasn't granted, they could have flung their bodies on the gears of the machine. But no one did, so the planes literally flung themselves into the ground instead.
Death by a thousand cuts
Thankfully, not every ethical issue will involve hundreds of people dying tragically minutes after takeoff. Plane travel remains among the safest activities, and I suspect it will stay that way because the problems surrounding the 737 Max are exceptions, not rules.
Frankly, more ominous issues lurk just under the radar: small changes that happen over time and gradually shift the perception of reality. Things that might avoid detection entirely until you wake up one day and a morally bankrupt reality TV personality has suddenly acquired the nuclear missile codes.
No, I’m not going to spend time discussing how the media industry’s usage of clickbait headlines, autoplay video, and questionable advertising guidelines set the stage for the rise of fake news. I spent 15 years in journalism, and those stories are still too painful for me to reflect upon.
Rather than focusing on the fall of media or the toxic realm of social media, let's discuss a site that you've probably used at least a few times in the past month.
That, of course, is the much beloved Wikipedia.
The undisputed encyclopedia of this era, the non-profit website is the fifth-most popular site on the internet. Wikipedia is supported mainly via donations and an army of volunteers, which allows the site to deliver its 40 million articles of content entirely for free without any commercial advertisements on the site.
To understand how all these parts work together, let’s look at the system that powers Wikipedia.
As outlined in the previous chapter on systems design, every system has stocks, flows, and feedback loops. In the case of Wikipedia, I've chosen to display the site as having two distinct systems: one for submitting new articles for publication and another for updating existing articles. The distinction is important because the peer review process for an article is different from that for an edit. Articles (as illustrated in the system diagram above) are not published until the editorial volunteers have had a chance to review the work. Updates (as illustrated below) are posted right away, and those reviews are handled by bots and volunteers, as time allows.
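For readers who prefer code to diagrams, the two flows described above can be sketched as a toy model. Everything here is illustrative: the class and method names are my own invention for this sketch, not anything from Wikipedia's actual MediaWiki software.

```python
# A toy model of Wikipedia's two publication flows, for illustration only.
# Class and method names are hypothetical; this is not Wikipedia's software.

from dataclasses import dataclass


@dataclass
class Article:
    title: str
    text: str
    published: bool = False


class Wiki:
    def __init__(self):
        self.articles = {}          # stock: the encyclopedia itself
        self.submission_queue = []  # new articles held for pre-publication review
        self.review_queue = []      # live edits awaiting after-the-fact review

    def submit_article(self, title, text):
        # Flow 1: new articles stay unpublished until a volunteer approves them.
        self.articles[title] = Article(title, text, published=False)
        self.submission_queue.append(title)

    def approve_article(self, title):
        self.articles[title].published = True
        self.submission_queue.remove(title)

    def edit_article(self, title, new_text):
        # Flow 2: edits go live immediately; review happens later, as time allows.
        self.articles[title].text = new_text
        self.review_queue.append(title)  # bots and volunteers drain this queue


wiki = Wiki()
wiki.submit_article("Example Topic", "Draft text")
print(wiki.articles["Example Topic"].published)  # False: held for review
wiki.approve_article("Example Topic")
wiki.edit_article("Example Topic", "Updated text")
print(wiki.articles["Example Topic"].text)       # live immediately
```

The key design difference is which queue gates publication: new articles wait in a pre-publication queue, while edits bypass that gate entirely and land in an after-the-fact review queue, which is exactly the opening a vandal (or an advertising agency) can exploit.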
This approach to updates is particularly important because it allows Wikipedia to move at the speed of life. As events happen around the planet, someone is taking time to update the Wikipedia page associated with the event. Politicians are elected, championships are won, and all of it is documented in Wikipedia.
In fact, it has become commonplace for people to vandalize Wikipedia for laughs. Whether renaming an opponent's stadium after a defeat or declaring actor Charlie Sheen to be "half human/half cocaine," these changes are eventually scrubbed from Wikipedia. Sometimes a bot catches the edit; if not, a volunteer typically comes along and finds the issue.
Luckily, someone or something associated with Wikipedia is paying attention because the non-profit is quickly growing into a tool to fight disinformation.
Platforms like Facebook and YouTube are looking to Wikipedia as a way to offload the burden of verification. Nothing is more 2019 than two of the most valuable internet properties in the world abusing the resources of a non-profit and its group of unpaid volunteers.
Which raises a significant question: what happens if there is no laughter in the background to trigger a quick review? What happens if a malicious update goes through undetected? What if that update is a form of disinformation?
That's what happened last week when the Chicago-based advertising firm Leo Burnett handed anyone with a desire to spread false facts a blueprint for how to do it.
What would drive one of America's most prestigious advertising companies to do this? A desire to tilt image search results in favor of the client, The North Face.
To Leo Burnett’s credit, the campaign was successful. Replacing images on Wikipedia with those of your client will push your product photos to the top of image search results for a given location.
Undoubtedly, there were cheers in the Leo Burnett offices, with a good number of people seeing this outcome as a victory. The team that produced the campaign broke Wikipedia's terms and conditions, which means all their work was eventually taken down by the volunteers, but they still got all the PR buzz for their client.
Nothing they did was illegal, but that doesn't mean it was ethical.
At best, this stunt wasted the time of Wikipedia’s volunteers and forced them to neglect other tasks to correct the mess Leo Burnett made. At worst, it will inspire more malicious actors to begin exploiting the same weaknesses in the Wikipedia review system. What impact could that have on the large platforms that rely on the site for verification? Can a volunteer workforce of editors possibly keep up if the volume of updates increases?
Social media is grappling with an onslaught of fake news. What if Wikipedia suddenly suffered from an attack of fake facts?
It isn't hard to see how this might escalate. High-profile figures and issues are already targeted, so those items are under continual monitoring by the Wikipedia team. The more significant problem lies just below that level. Malicious edits targeting candidates for state legislatures, or questionable updates to local county histories, could happen at such a high volume that the system soon becomes overwhelmed. These might seem like minor problems considered alone, but in aggregate, this is where the real danger resides. Ten thousand small lies are harder to detect and remove than one big one.
Before social platforms were using Wikipedia as a source of verification, this abuse would have been an unfortunate nuisance. That aspect changes when fake news has an opportunity to be verified by fake facts that have slipped through the system.
Did anyone at Leo Burnett think about this aspect of the system they were about to abuse? It’s an advertising giant. Designers were at the table and nothing happened.
Sometimes the most ethical thing you can do is point out that what your team is doing is wrong.
When we don’t understand the systems that we alter, we risk burning down democracy in a quest to sell a few more backpacks.
Excellent job, Leo Burnett.
Less obvious evils
It’s unknown how many Wikipedia-like situations you’ll come into contact with throughout your career. It’s highly likely that you’ll be under deadline pressure like the team at Boeing faced, but it’s unlikely that hundreds of lives will be lost because of it.
No, the demons you'll most likely face in your day-to-day battles won't be so obvious. Like a slow death from a thousand cuts, the smaller infractions seem harmless but end up being just as corrosive to our ethical well-being as any larger battle we could be fighting.
Here you enter the realm of gamification and dark patterns. Such innovations have been heralded as being ahead of their time, when, in fact, they always leverage known models associated with addiction and fear to push their campaigns forward.
Take the concept of scarcity, as shown in the image above, where two different uses of the same idea appear. The notice on the first hotel is helpful to the user: it informs them of limited availability. Only two rooms remain, so scarcity is actually present. The notice on the second hotel is not useful to the user; its only purpose is to instill fear of missing out on the first property. The inclusion of that second hotel is a type of dark pattern that should be avoided, and some sites are beginning to address this poor behavior.
We'll go further into dark patterns in upcoming chapters on information and conversational design, but setting your design moral compass to avoid abusing users by playing on their weaknesses is a great place to start.
You’re gonna need backup
The acts and transgressions documented here are but a handful of late-stage capitalism examples, but that doesn't mean we have to go along with them.
While it is easy to fall back on the "someone would have eventually done it" defense, you're better than that, and frankly, we NEED you to want these challenges.
The goal isn't to wait for something to go wrong before building a coalition of support. When we wait for something to happen, far too often we're too timid to act when we are needed. It's logical to be scared that you'll be on an island by yourself and, by speaking up, risk everything you've worked hard to attain.
To have the strength to stand, the power to question — you need community. You need to know that you’re not alone. Otherwise, you’ll be in no better situation than the teams working in silos at Boeing or the team coming up with the clever hack at Leo Burnett.
A design union isn’t going to hatch itself by the time the next ethical crisis strikes, so my advice to you is to begin your coalition building now where you currently stand. Begin discussing the topic of ethics today with your colleagues. Whether it is a group of students or managers, the quicker you can start driving the conversation, the better suited you will be to step up when the moment arrives.
In this chapter, we examined how unethical situations play out. One situation was a solvable problem but required someone to push back for more time. Another involved the exploitation of a flaw in the design of Wikipedia’s submission process. While there will always be flaws that can be exploited, those are short-term wins that do more long-term damage than good. Our goal is to rise above such opportunities and find an ethical way forward that benefits our client, our company, and the society we want to preserve.
It's time to focus our attention on possible ethical solutions for the client's goal. In the announcement video, The North Face stated that they wanted images of their brand to appear at the top of Google search listings for several exotic locations around the world.
This isn't a lousy client goal, but the Leo Burnett solution was terrible: it was unethical, and it failed to create any lasting impact (beyond ill will toward the brand).
A seat at the table
The client goal remains the same. Make it to the top of Google search — but how? Your job is to come up with three other possible solutions. If you were sitting in the conference room at Leo Burnett, what would your suggestions be?
We know what the wrong solution is (vandalize Wikipedia), but what would the right solution look like?
This exercise is pass/fail. If you come up with options for discussion, you pass. In scenarios like this, a perfect answer may not exist, but every option resides on a scale where the options can be compared against one another for consideration.
Upon completion, update your Program Journal with links to any assets produced in this exercise. Post your journal in the Feedback-Loop channel for review.
Up next: Working in Product Design