Disruptive Design: Harmful Patterns and Bad Practice
July 1, 2019
I'm watching Disruptive Design: Harmful Patterns and Bad Practice by Laura Kalbag.
Tech folks like to use jargon and euphemisms like "disruption" and "engagement" - they let us get away with building products that make the world worse.
They say "admitting the problem is the first step in fixing it". Laura says: "Dear tech industry: we have problems."
Laura wrote a book on accessibility (Accessibility for Everyone), which she defines as "the degree to which technology can be used by as many people as possible, especially disabled people."
What's the difference between accessibility and inclusivity? Accessibility is often a bolted-on solution - like adding a ramp to the side of a shop after you've already built stairs. Inclusive design means thinking about access for all from the beginning and making it part of the design. We need inclusivity to make better technology; bolted-on accessibility won't cut it.
# Patterns
# 1. Low contrast text
Text whose color is too close to the background color to give sufficient contrast. The WebAIM Million survey found that low-contrast text was the #1 accessibility problem on the most popular sites on the web, appearing on 85.3% of the home pages tested. We often see low-contrast text in footers, legal disclaimers, and the like. We use contrast to show visual hierarchy, but that means some people can't read the text at all.
- We can still use contrast and font weight to show visual hierarchy; we just need to stay within the accessible contrast range (WCAG AA asks for at least 4.5:1 for normal-size text). A quick way to check is sketched below.
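As a worked example, the WCAG 2.1 contrast ratio is computed from the relative luminance of the two colours. The formulas below follow the WCAG definitions; the hex-parsing helper, the example colours, and the thresholds mentioned in the comments are just illustrative choices for this sketch.

```ts
// Minimal sketch of checking text contrast against the WCAG 2.1 thresholds.

/** Convert an 8-bit sRGB channel to its linear-light value (WCAG 2.1 formula). */
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

/** Relative luminance of a "#rrggbb" colour. */
function luminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => parseInt(hex.slice(i, i + 2), 16));
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

/** WCAG contrast ratio between two colours, always >= 1. */
function contrastRatio(fg: string, bg: string): number {
  const [lighter, darker] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: light grey footer text on white comes out around 2.85:1,
// well below the 4.5:1 AA threshold for normal-size text (3:1 for large text).
console.log(contrastRatio("#999999", "#ffffff").toFixed(2));
```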
# 2. CAPTCHA
"Completely automated public Turing test to tell computers and humans apart". 4.25% of sites use CAPTCHA, but screenreader users report that CAPTCHA is the most difficult part of using the web. "People with assistive technologies report that they get classified as a bot" because it relies on specific visual, spatial, linguistic, and even mathematical knowledge as well as specific gesture speeds.
- reCAPTCHA v3 is out now, aiming to tell whether your users are humans or bots without any user interaction. According to Google, though, reCAPTCHA works best when you embed it on every page of your site, not just on forms (see the sketch after this list).
- Surveillance capitalism is when a corporation surveils our behavior and monetizes it. That's exactly what's going on when you use something like reCAPTCHA v3.
- CAPTCHA is bad for accessibility and inclusivity, ethically misjudged, and often hard to fix. There are some promising alternatives that protect user privacy and are more accessible, but we're not there yet.
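To make the "embed it everywhere" point concrete, here's a rough sketch of what a client-side reCAPTCHA v3 call looks like, based on Google's public docs. The site key, action name, and /verify-captcha endpoint are placeholders I've made up; the point is that the api.js script observes behaviour on every page it's loaded on and hands back a score, rather than asking the user to do anything.

```ts
// Rough reCAPTCHA v3 sketch. Assumes the page has already loaded
// https://www.google.com/recaptcha/api.js?render=SITE_KEY, which defines the
// global `grecaptcha` object declared here.
declare const grecaptcha: {
  ready(callback: () => void): void;
  execute(siteKey: string, options: { action: string }): Promise<string>;
};

const SITE_KEY = "YOUR_SITE_KEY"; // placeholder

grecaptcha.ready(async () => {
  // Each call ships behavioural signals from this page to Google and returns a token.
  const token = await grecaptcha.execute(SITE_KEY, { action: "submit_form" });

  // A backend would forward the token to Google's siteverify endpoint
  // (https://www.google.com/recaptcha/api/siteverify) and get back a 0.0-1.0 score.
  await fetch("/verify-captcha", { method: "POST", body: token });
});
```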
# 3. Profiling
The process of constructing and applying user profiles generated by computerised data analysis.
- Facebook only lets you sign up as male or female (even though you can select a nonbinary gender later) because it makes it easier to decide what type of ads to target you with.
- Advertising profiles perpetuate stereotypes and prejudices. Julia Angwin & Terry Parris Jr. found that Facebook allowed advertisers to exclude groups with "ethnic affinities" from ad targeting. Ads assume that if you're a woman you're interested in fashion and getting pregnant.
“In these experiments, Facebook delivered ads for jobs in the lumber industry to an audience that was approximately 70 percent white and 90 percent men, and supermarket-cashier positions to an audience of approximately 85 percent women. Home-sale ads, meanwhile, were delivered to approximately 75 percent white users, while ads for rentals were shown to a more racially balanced group.”–Aaron Rieke and Corinne Yu, ‘Discrimination’s Digital Frontier’
# Privacy
"Privacy is not a low cost for convenience. Privacy is not about hiding, it’s about having power and agency over how your information is used. One person’s relevant advertising is another person’s grounds for discrimination." - Laura.
We must remember the people whose data is used against them. This isn't easy to fix. Big tech doesn't think it's doing anything wrong - it just thinks it has a PR problem.
# We need new models.
Not just new role models, but new funding models. New ways to fund technology where access to technology and civil rights are not pitted against each other.
People who try to reform big tech from the inside are demoted, passed over for promotion, and retaliated against. See: the Google Walkout organizers.
# Better isn't always good enough.
We can make sites more usable, less hostile to the people using them. We can make sites inclusive and more accessible. But if we’re doing all that with a bad business model, all we’re doing is leading more people into exploitation.
# How You Can Help
- Be different - create alternatives.
- Be the advisor and make recommendations based on your research for ethical, inclusive alternatives and designs.
- Be the advocate for others and for underrepresented folks.
- Be the questioner. Question norms and why things are built the way they are.
- Be the gatekeeper. "When the advocacy isn’t getting you far enough, use your expertise to prevent unethical design from happening on your watch."
- Be difficult. "Be the person who is known for always bringing up the issue. Embrace the awkwardness that comes with your power. Call out questionable behaviour."
- Be unprofessional. "Don’t let anybody tell you that standing up for the needs of yourself, and others, is unprofessional. Don’t let people tell you to be quiet. Or that you’ll get things done if you’re a bit nicer."
- Be the supporter. If you can't speak up, at least support those who do. Silence is complicity.
Nothing is inevitable. Go forth and disrupt the disrupters.