Netflix’s documentary “The Great Hack”, I hope, is an eye-opener to the inflection point the tech industry is at. How far should a business move-fast-and-break-things when breaking things means inciting genocide and enabling illegal tampering in sovereign nations, empowered by a systemic lack of transparency and a rampant lack of accountability?
I could not sympathize with Kaiser (Cambridge Analytica’s BD Director), even though I felt the documentary’s narrative was subtly attempting to make me. What’s frustrating, time and time again, is that the unethical misuse of technology against “others” always seems to be justified with no thought to how it can come back and harm the perpetrator… a combination of egomania, aloofness and entitlement.
Coincidentally, I had just finished reading three books that tackle the unethical practices of design and product development in today’s business world (technology and all).
P.S. I did not realize these books would converge in such a manner before reading them, but they ended up being quite complementary, each reinforcing the others’ narratives.
Ruined by Design
by Mike Monteiro
This book is a must-read for anyone responsible for making, developing, researching, designing (in the strictest sense), or in any way bringing to life a product that could impact the lives of individuals beyond their comprehension. Mike makes a passionate and compelling plea for product makers around the world to be held more accountable for the products they unleash into the wild. Maybe, just maybe, we’ll have more people thinking before breaking the “internet”.
Below are some thoughts and snippets from the book:
When we look at the social media influenced world around us it’s worth noting that: “Either by action or inaction, through fault or ignorance, we have designed the world to behave exactly as it’s behaving right now. These are our chickens coming home to roost.”
We have reached here because “Companies ask designers to move fast and break things. How has become more important than why.”
“We [now more than ever] have to be ready for any tool we build to have a global impact. But even if it only impacts the area around you, chances are it is going, hopefully, to reach people who are different from you…Don’t you want all of those people using your tool? Don’t you want them to be able to participate in what you’re making? …And, for the capitalists …don’t you want their money? If we intend to build successful tools, we need to expand our definition of we.”
Some may argue that ethical considerations make their work less innovative and more constrained. Yet innovation has long flourished “not in spite of constraints, but because of them.”
Technically Wrong
by Sara Wachter-Boettcher
This book is among the wave of female voices calling out the “frat” culture flourishing in tech companies for its lack of inclusion and intolerance of diversity. It sounds an alarm about the data-powered “intelligence” these companies monopolize and are given free rein to use in making life-altering decisions, from facial identification to bank loans! Sara makes a timely argument: “People need to understand that data is not truth. It is not going to magically solve hard societal problems for us”, and it’s our responsibility, not the data’s, to build better solutions.
“Meritocracy” can sometimes mean training algorithms on existing data, which is far from ideal: while some consider it “pattern recognition”, it’s also “profiling or stereotyping”.
And this: “Default settings can be helpful or deceptive, thoughtful or frustrating. But they’re never neutral. They’re designed…[by] designers and developers who’ve been told that they’re rock stars, gurus and geniuses, and that the world is made for people like them.” Who also happen to be generally from similar cultures, upbringing, maybe race, ethnicity, political affiliation, belief and education.
Mismatch
by Kat Holmes
I happened to read this book during a trip to Tokyo last spring, which made me appreciate the city in a way I wouldn’t have otherwise; Tokyo is an awe-inspiringly accessible city (besides the language barrier).
Kat’s tone is less pessimistic, but still critical of the limited vision of products. Mismatch highlights a reality we rarely address in our blanket of assumptions about “accessibility”: “Many of us are temporarily able-bodied and will face new kinds of exclusion as we age. When we design for inclusion we are designing for our future selves. Not just for the changes in our bodies but for our ability to contribute to society. It is designing how the next generation will treat and care for us. It’s making solutions to uphold the human connections that are most important to our lives. Our dignity, health, safety, and sense of being at home.”
“If inclusion isn’t explicitly part of that leadership, exclusion will be default.”
So when I watched the documentary over the weekend, I was neither surprised nor sympathetic. Cambridge Analytica was a commercial product designed to be destructive, with intentions stemming straight out of military intelligence. Facebook was designed to bear no responsibility for user-generated content, plain old trolls and systematic war machines alike, and grew to a disproportionate size by simply getting away with it. A match made in heaven.
Twitter, Google, Uber, Amazon, Apple, etc. are making ripples across supply chains, financial systems, news outlets and media, climate, social dynamics, politics, and probably facets of life we are not even aware of. Being ethical and inclusive in how they impact our world should no longer be a stock-price-driven decision.
I understand that making choices about whom you work with and how may seem like a privilege to many, but I echo Mike’s call for designers and developers to take oaths of good conduct. You wouldn’t want a doctor who has not sworn their oath designing a medical treatment for you, nor an uncertified lawyer arguing your case in court. Why, then, would you want people with no accountability designing every other facet of our increasingly connected lives?