Tags: ethics

Sunday, December 3rd, 2017

What Do We Do with the Art of Monstrous Men?

There are many qualities one must possess to be a working writer or artist. Talent, brains, tenacity. Wealthy parents are good. You should definitely try to have those. But first among equals, when it comes to necessary ingredients, is selfishness. A book is made out of small selfishnesses. The selfishness of shutting the door against your family. The selfishness of ignoring the pram in the hall. The selfishness of forgetting the real world to create a new one. The selfishness of stealing stories from real people. The selfishness of saving the best of yourself for that blank-faced anonymous paramour, the reader. The selfishness that comes from simply saying what you have to say.

Monday, November 20th, 2017

Trolleys, veils and prisoners: the case for accessibility from philosophical ethics

The transcript of a presentation on the intersection of ethics and accessibility.

Saturday, November 18th, 2017

Hooked and booked

At Booking.com, they do a lot of A/B testing.

At Booking.com, they’ve got a lot of dark patterns.

I think there might be a connection.

A/B testing is a great way of finding out what happens when you introduce a change. But it can’t tell you why.

The problem is that, in a data-driven environment, decisions ultimately come down to whether something works or not. But just because something works doesn’t mean it’s a good thing.

If I were trying to convince you to buy a product, or use a service, one way I could accomplish that would be to literally put a gun to your head. It would work. Except it’s not exactly a good solution, is it? But if we were to judge by the numbers (100% of people threatened with a gun did what we wanted), it would appear to be the right solution.
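
To make the point about judging by the numbers concrete, here is a minimal sketch (in Python, with entirely made-up figures, and nothing to do with how Booking.com actually evaluates its experiments) of what an A/B test result boils down to: a difference in conversion rates and a significance score. The arithmetic can tell you that variant B “works”; it has nothing to say about why, or about whether the tactic behind it was any good.

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """How confidently can we say variant B out-converts variant A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    standard_error = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / standard_error

# Made-up numbers: variant B adds a pushy "only one room left!" message.
z = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}")  # roughly 2.55, comfortably past the usual 1.96 cut-off

# The test answers "what happened?" (more bookings) but says nothing about
# "why?", and nothing about whether pressuring people was a good thing to do.
```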

When speaking about A/B testing at Booking.com, Stuart Frisby emphasised why it’s so central to their way of working:

One of the core principles of our organisation is that we want to be very customer-focused. And A/B testing is really a way for us to institutionalise that customer focus.

I’m not so sure. I think A/B testing is a way to institutionalise a focus on business goals—increasing sales, growth, conversion, and all of that. Now, ideally, those goals would align completely with the customer’s goals; happy customers should mean more sales …but more sales doesn’t necessarily mean happy customers. Using business metrics (sales, growth, conversion) as a proxy for customer satisfaction might work some of the time, but it clearly isn’t working on many of these kinds of sites. Whatever the company values might say, a company’s true focus is on whatever they’re measuring as success criteria. If that’s customer satisfaction, then the company is indeed customer-focused. But if the measurements are entirely about what works for sales and conversions, then that’s the real focus of the company.

I’m not saying A/B testing is bad—far from it! (although it can sometimes be taken to the extreme). I feel it’s best wielded in combination with usability testing with real users—seeing their faces, feeling their frustration, sharing their joy.

In short, I think that A/B testing needs to be counterbalanced. There should be some kind of mechanism for getting the answer to “why?” whenever A/B testing provides the answer to “what?” In-person testing could be one way of providing that balance. Or it could be somebody’s job to always ask “why?” and determine if a solution is qualitatively—and not just quantitatively—good. (And if you look around at your company and don’t see anyone doing that, maybe that’s a role for you.)

If there really is a connection between having a data-driven culture of A/B testing, and a product that’s filled with dark patterns, then the disturbing conclusion is that dark patterns work …at least in the short term.

Saturday, November 11th, 2017

The Medium - daverupert.com

I have to keep reminding myself that I do have some control. I can build The Medium I want. I can cling to what’s good.

Sunday, November 5th, 2017

Against an Increasingly User-Hostile Web - Neustadt.fr

With echoes of Anil Dash’s The Web We Lost, this essay is a timely reminder—with practical advice—for us designers and developers who are making the web …and betraying its users.

You see, the web wasn’t meant to be a gated community. It’s actually pretty simple.

A web server, a public address and an HTML file are all that you need to share your thoughts (or indeed, art, sound or software) with anyone in the world. No authority from which to seek approval, no editorial board, no publisher. No content policy, no dependence on a third party startup that might fold in three years to begin a new adventure.

That’s what the web makes possible. It’s friendship over hyperlink, knowledge over the network, romance over HTTP.
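
As a purely illustrative aside (my example, not the essay’s), that claim can be made literal with nothing but Python’s standard library: write one HTML file, start one web server, and, assuming your machine has a publicly reachable address, anyone in the world can read it. No approval sought, no editorial board, no third-party platform.

```python
# A deliberately tiny illustration: one HTML file plus one web server.
from http.server import HTTPServer, SimpleHTTPRequestHandler
from pathlib import Path

# The HTML file: no content policy, no publisher, no startup that might fold.
Path("index.html").write_text(
    "<!DOCTYPE html>\n"
    '<html lang="en">\n'
    '<head><meta charset="utf-8"><title>My corner of the web</title></head>\n'
    "<body><h1>Hello, world</h1><p>Thoughts, art, sound or software go here.</p></body>\n"
    "</html>\n"
)

# The web server: Python's built-in one, serving the current directory on port 8000.
HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler).serve_forever()
```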

Wednesday, October 4th, 2017

This Future Looks Familiar: Watching Blade Runner in 2017 | Tor.com

If you subtract the flying cars and the jets of flame shooting out of the top of Los Angeles buildings, it’s not a far-off place. It’s fortunes earned off the backs of slaves, and deciding who gets to count as human. It’s impossible tests with impossible questions and impossible answers. It’s having empathy for the right things if you know what’s good for you. It’s death for those who seek freedom.

A thought-provoking first watch of Blade Runner …with an equally provocative interpretation in the comments:

The tragedy is not that they’re just like people and they’re being hunted down; that’s way too simplistic a reading. The tragedy is that they have been deliberately built to not be just like people, and they want to be and don’t know how.

That’s what really struck me about Kazuo Ishiguro’s Never Let Me Go: the tragedy is that these people can’t take action. “Run! Leave! Go!” you want to scream at them, but you might as well tell someone “Fly! Why don’t you just fly?”

Friday, September 1st, 2017

John Lanchester reviews ‘The Attention Merchants’ by Tim Wu, ‘Chaos Monkeys’ by Antonio García Martínez and ‘Move Fast and Break Things’ by Jonathan Taplin · LRB 17 August 2017

Triple the hand-wringing in this combined review of three books:

  • The Attention Merchants: From the Daily Newspaper to Social Media, How Our Time and Attention Is Harvested and Sold by Tim Wu,
  • Chaos Monkeys: Inside the Silicon Valley Money Machine by Antonio García Martínez, and
  • Move Fast and Break Things: How Facebook, Google and Amazon have Cornered Culture and What It Means for All of Us by Jonathan Taplin.

What this means is that even more than it is in the advertising business, Facebook is in the surveillance business. Facebook, in fact, is the biggest surveillance-based enterprise in the history of mankind. It knows far, far more about you than the most intrusive government has ever known about its citizens. It’s amazing that people haven’t really understood this about the company. I’ve spent time thinking about Facebook, and the thing I keep coming back to is that its users don’t realise what it is the company does. What Facebook does is watch you, and then use what it knows about you and your behaviour to sell ads. I’m not sure there has ever been a more complete disconnect between what a company says it does – ‘connect’, ‘build communities’ – and the commercial reality.

Sunday, August 20th, 2017

Unacceptable usage

Fortune magazine published a list of all the companies who say hate groups can’t use their services anymore:

  • GoDaddy,
  • Google,
  • Apple,
  • Cloudflare,
  • Airbnb,
  • PayPal,
  • Discover Financial Services,
  • Visa,
  • Spotify,
  • Discord, and
  • GoFundMe.

Digital Ocean aren’t listed in the article but they’ve also cut off the oxygen to hate groups that were using their platform.

There’s another company that I wish were on that list: Shopify. They provide Breitbart with its online store. That’s despite clause three of their Acceptable Usage Policy:

Hateful Content: You may not offer goods or services, or post or upload Materials, that condone or promote violence against people based on race, ethnicity, color, national origin, religion, age, gender, sexual orientation, disability, medical condition or veteran status.

The flimsy free speech defence looks even more spineless in light of the actions of other companies.

I’m incredibly disappointed in Shopify. I’m starting to have misgivings about appearing at events or on podcasts sponsored by Shopify—being two degrees of separation away from the hatefulness of Breitfart doesn’t sit well with me.

I sincerely hope that Shopify will change their stance, enforce their own terms of service, and dropify hate speech.

Wednesday, August 9th, 2017

Patterns Day 2017: Paul Lloyd on Vimeo

Paul pulls no punches in this rousing talk from Patterns Day.

The transcript is on his site.

Sunday, August 6th, 2017

Another Lens - News Deeply x Airbnb.Design

A series of questions to ask on any design project:

  • What are my lenses?
  • Am I just confirming my assumptions, or am I challenging them?
  • What details here are unfair? Unverified? Unused?
  • Am I holding onto something that I need to let go of?
  • What’s here that I designed for me? What’s here that I designed for other people?
  • What would the world look like if my assumptions were wrong?
  • Who might disagree with what I’m designing?
  • Who might be impacted by what I’m designing?
  • What do I believe?
  • Who’s someone I’m nervous to talk to about this?
  • Is my audience open to change?
  • What am I challenging as I create this?
  • How can I reframe a mistake in a way that helps me learn?
  • How does my approach to this problem today compare to how I might have approached this one year ago?
  • If I could learn one thing to help me on this project, what would that one thing be?
  • Do I need to slow down?

Monday, July 3rd, 2017

Fantasies of the Future / Paul Robert Lloyd

Paul has published the slides and transcript of his knock-out talk at Patterns Day. This is a must-read: superb stuff!

Design systems are an attempt to add a layer of logic and reasoning over a series of decisions made by complex, irrational, emotional human beings. As such, they are subject to individual perspectives, biases, and aspirations.

How does the culture in which they are made affect the resulting design?

Wednesday, May 3rd, 2017

Build a Better Monster: Morality, Machine Learning, and Mass Surveillance

So what happens when these tools for maximizing clicks and engagement creep into the political sphere?

This is a delicate question! If you concede that they work just as well for politics as for commerce, you’re inviting government oversight. If you claim they don’t work well at all, you’re telling advertisers they’re wasting their money.

Facebook and Google have tied themselves into pretzels over this.

Thursday, April 13th, 2017

Digital Assistants, Facebook Quizzes, And Fake News! You Won’t Believe What Happens Next | Laura Kalbag

A great presentation from Laura on how tracking scripts are killing the web. We can point our fingers at advertising companies, but it’s still developers like us who put those scripts onto websites.

We need to ask ourselves these questions about what we build. Because we are the gatekeepers of what we create. We don’t have to add tracking to everything, it’s already gotten out of our control.

Sunday, March 12th, 2017

Design Ethics in Practice – The Interconnected

Excellent and practical advice for before, during, and after research sessions and usability tests.

Wednesday, March 8th, 2017

Let’s Make the World We Want To Live In | Big Medium

Josh gives a thorough roundup of the Interaction ‘17 event he co-chaired.

“I think I’ve distilled what this conference is all about,” Jeremy Keith quipped to me during one of the breaks. “It’s about how we’ll save the world through some nightmarish combination of virtual reality, chatbots, and self-driving cars.”

Thursday, January 19th, 2017

Trump: A Resister’s Guide | Harper’s Magazine - Part 11

You, the software engineers and leaders of technology companies, face an enormous responsibility. You know better than anyone how best to protect the millions who have entrusted you with their data, and your knowledge gives you real power as civic actors. If you want to transform the world for the better, here is your moment. Inquire about how a platform will be used. Encrypt as much as you can. Oppose the type of data analysis that predicts people’s orientation, religion, and political preferences if they did not willingly offer that information.

Thursday, January 5th, 2017

Going rogue

As soon as tickets were available for the Brighton premiere of Rogue One, I grabbed some—two front-row seats for one minute past midnight on December 15th. No problem. That was the night after the Clearleft end-of-year party on December 14th.

Then I realised how dates work. One minute past midnight on December 15th is the same night as December 14th. I had double-booked myself.

It’s a nice dilemma to have; party or Star Wars? I decided to absolve myself of the decision by buying additional tickets for an evening showing on December 15th. That way, I wouldn’t feel like I had to run out of the Clearleft party before midnight, like some geek Cinderella.

In the end though, I did end up running out of the Clearleft party. I had danced and quaffed my fill, things were starting to get messy, and frankly, I was itching to immerse myself in the newest Star Wars film ever since Graham strapped a VR headset on me earlier in the day and let me fly a virtual X-wing.

Getting in the mood for Rogue One with the Star Wars Battlefront X-Wing VR mission—invigorating! (and only slightly queasy-making)

So, somewhat tired and slightly inebriated, I strapped in for the midnight screening of Rogue One: A Star Wars Story.

I thought it was okay. Some of the fan service scenes really stuck out, and not in a good way. On the whole, I just wasn’t that gripped by the story. Ah, well.

Still, the next evening, I had those extra tickets I had bought as psychological insurance. “Why not?” I thought, and popped along to see it again.

This time, I loved it. It wasn’t just me either. Jessica was equally indifferent the first time ‘round, and she also enjoyed it way more the second time.

I can’t recall having such a dramatic swing in my appraisal of a film from one viewing to the next. I’m not quite sure why it didn’t resonate the first time. Maybe I was just too tired. Maybe I was overthinking it too much, unable to let myself get caught up in the story because I was over-analysing it as a new Star Wars film. Anyway, I’m glad that I like it now.

Much has been made of its similarity to classic World War Two films, which I thought worked really well. But the aspect of the film that I found most thought-provoking was the story of Galen Erso. It’s the classic tale of an apparently good person reluctantly working in service to evil ends.

This reminded me of Mother Night, perhaps my favourite Kurt Vonnegut book (although, let’s face it, many of his books are interchangeable—you could put one down halfway through, and pick another one up, and just keep reading). Mother Night gives the backstory of Howard W. Campbell, who appears as a character in Slaughterhouse Five. In the introduction, Vonnegut states that it’s the one story of his with a moral:

We are what we pretend to be, so we must be careful about what we pretend to be.

If Galen Erso is pretending to work for the Empire, is there any difference to actually working for the Empire? In this case, there’s a get-out clause for this moral dilemma: by sabotaging the work (albeit very, very subtly) Galen’s soul appears to be absolved of sin. That’s the conclusion of the excellent post on the Sci-fi Policy blog, Rogue One: an ‘Engineering Ethics’ Story:

What Galen Erso does is not simply watch a system be built and then whistleblow; he actively shaped the design from its earliest stages considering its ultimate societal impacts. These early design decisions are proactive rather than reactive, which is part of the broader engineering ethics lesson of Rogue One.

I know I’m Godwinning myself with the WWII comparisons, but there are some obvious historical precedents for Erso’s dilemma. The New York Review of Books has an in-depth look at Werner Heisenberg and his “did he/didn’t he?” legacy with Germany’s stalled atom bomb project. One generous reading of his actions is that he kept the project going in order to keep scientists from being sent to the front, but made sure that the project was never ambitious enough to actually achieve destructive ends:

What the letters reveal are glimpses of Heisenberg’s inner life, like the depth of his relief after the meeting with Speer, reassured that things could safely tick along as they were; his deep unhappiness over his failure to explain to Bohr how the German scientists were trying to keep young physicists out of the army while still limiting uranium research work to a reactor, while not pursuing a fission bomb; his care in deciding who among friends and acquaintances could be trusted.

Speaking of Albert Speer, are his hands clean or dirty? And either way, is it because of moral judgement or sheer ignorance? The New Atlantis dives deep into this question in Roger Forsgren’s article The Architecture of Evil:

Speer indeed asserted that his real crime was ambition — that he did what almost any other architect would have done in his place. He also admitted some responsibility, noting, for example, that he had opposed the use of forced labor only when it seemed tactically unsound, and that “it added to my culpability that I had raised no humane and ethical considerations in these cases.” His contrition helped to distance himself from the crude and unrepentant Nazis standing trial with him, and this along with his contrasting personal charm permitted him to be known as the “good Nazi” in the Western press. While many other Nazi officials were hanged for their crimes, the court favorably viewed Speer’s initiative to prevent Hitler’s scorched-earth policy and sentenced him to twenty years’ imprisonment.

I wish that these kinds of questions only applied to the past, but they are all too relevant today.

Software engineers in the United States are signing a pledge not to participate in the building of a Muslim registry:

We refuse to participate in the creation of databases of identifying information for the United States government to target individuals based on race, religion, or national origin.

That’s all well and good, but it might be that a dedicated registry won’t be necessary if those same engineers are happily contributing their talents to organisations whose business models are based on the ability to track and target people.

But now we’re into slippery slopes and glass houses. One person might draw the line at creating a Muslim registry. Someone else might draw the line at including any kind of invasive tracking script on a website. Someone else again might decide that the line is crossed by including Google Analytics. It’s moral relativism all the way down. But that doesn’t mean we shouldn’t draw lines. Of course it’s hard to live in an ideal state of ethical purity—from the clothes we wear to the food we eat to the electricity we use—but a muddy battleground is still capable of having a line drawn through it.

The question facing the fictional characters Galen Erso and Howard W. Campbell (and the historical figures of Werner Heisenberg and Albert Speer) is this: can I accomplish less evil by working within a morally repugnant system than being outside of it? I’m sure it’s the same question that talented designers ask themselves before taking a job at Facebook.

At one point in Rogue One, Galen Erso explicitly invokes the justification that they’d find someone else to do this work anyway. It sounds a lot like Tim Cook’s memo to Apple staff justifying his presence at a roundtable gathering that legitimised the election of a misogynist bigot to the highest office in the land. I’m sure that Tim Cook, Elon Musk, Jeff Bezos, and Sheryl Sandberg all think they are playing the part of Galen Erso but I wonder if they’ll soon find themselves indistinguishable from Orson Krennic.

Wednesday, December 28th, 2016

Radical Technologies: The Design of Everyday Life, now available for pre-order | Adam Greenfield’s Speedbird

Adam Greenfield’s new book is almost here at last, and it sounds like it has pivoted into quite an interesting beast.

Wednesday, December 21st, 2016

Rogue One: an ‘Engineering Ethics’ Story — SciFi Policy

This article examines what I thought was the most interesting aspect of Rogue One—the ethical implications for technologists.

Don’t dismiss this essay just because it’s about a Hollywood blockbuster. Given the current political situation, this is deeply relevant.

Friday, October 14th, 2016

Addicted to Your iPhone? You’re Not Alone - The Atlantic

The dreadful headline makes this sound like another pearl-clutching moral panic, but there’s some good stuff in this somewhat hagiographic profile.

Harris is developing a code of conduct—the Hippocratic oath for software designers—and a playbook of best practices that can guide start-ups and corporations toward products that “treat people with respect.” Having companies rethink the metrics by which they measure success would be a start.