Tags: ethics

Thursday, July 12th, 2018

Unchained: A story of love, loss, and blockchain - MIT Technology Review

A near-future sci-fi short by Hannu Rajaniemi that’s right on the zeitgeisty money.

The app in her AR glasses showed the car icon crawling along the winding forest road. In a few minutes, it would reach the sharp right turn where the road met the lake. The turn was marked by a road sign she had carefully defaced the previous day, with tiny dabs of white paint. Nearly invisible to a human, they nevertheless fooled image recognition nets into classifying the sign as a tree.

Tuesday, June 26th, 2018

Untold AI: The Untold | Sci-fi interfaces

Prompted by his time at Clearleft’s AI gathering in Juvet, Chris has been delving deep into the stories we tell about artificial intelligence …and what stories are missing.

And here we are at the eponymous answer to the question that I first asked at Juvet around 7 months ago: What stories aren’t we telling ourselves about AI?

Tuesday, June 19th, 2018

A Book Apart, We’re donating 25% profits to RAICES

What’s happening right now at the US border is heartbreaking and inexcusable. We’re donating 25% of all profits today and tomorrow (June 19 & 20) to RAICES, to help reunite detained immigrant parents and children.

[Essay] Known Unknowns | New Dark Age by James Bridle | Harper’s Magazine

A terrific cautionary look at the history of machine learning and artificial intelligence from the new laugh-a-minute book by James.

Saturday, June 16th, 2018

Artificial Intelligence for more human interfaces | Christian Heilmann

An even-handed assessment of the benefits and dangers of machine learning.

Tuesday, May 29th, 2018

The Amish understand a life-changing truth about technology the rest of us don’t — Quartz

The headline is terrible but this interview is an insightful look at evaluating technology.

I remember Kevin Kelly referring to the Amish as “slow geeks”, and remarking that we could all become a little more Amish-ish.

It’s not that the Amish view technology as inherently evil. No rules prohibit them from using new inventions. But they carefully consider how each one will change their culture before embracing it. And the best clue as to what will happen comes from watching their neighbors.

Superfan! — Sacha Judd

The transcript of a talk that is fantastic in every sense.

Fans are organised, motivated, creative, technical, and frankly flat-out awe-inspiring.

Monday, May 21st, 2018

Google Duplex and the canny rise: a UX pattern – UX Collective

Chris weighs up the ethical implications of Google Duplex:

The social hacking that could be accomplished is mind-boggling. For this reason, I expect that having human-sounding narrow AI will be illegal someday. The Duplex demo is a moment of cultural clarity, where it first dawned on us that we can do it, but with only a few exceptions, we shouldn’t.

But he also offers alternatives for designing systems like this:

  1. Provide disclosure, and
  2. Design a hot signal:

…design the interface so that it is unmistakeable that it is synthetic. This way, even if the listener missed or misunderstood the disclosure, there is an ongoing signal that reinforces the idea. As designer Ben Sauer puts it, make it “Humane, not human.”

Pi-hole®: A black hole for Internet advertisements

This looks like a terrific use of a Raspberry Pi—blocking adtech surveillance at the network level.

Wouldn’t it be great if the clichéd going-home-for-Christmas/Thanksgiving to fix the printer/wifi included setting up one of these?

There’s an article about Pi-hole in Business Week where the creators offer some advice for those who equate any kind of online advertising with ubiquitous surveillance:

For publishers struggling to survive even with maximum ad surveillance, the Pi-hole team recommends a renewed focus on subscriptions, affiliate links, and curated endorsements for products and services that might truly interest users, similar to the way podcast hosts may talk about how much they personally enjoy a sponsor’s products. There’s nothing wrong with pitching people stuff they might enjoy, the team says. It’s just the constant, ever-intensifying surveillance that needs to stop.

Thursday, May 10th, 2018

Kumiho. — Ethan Marcotte

Ethan shares my reaction to Google Duplex:

Frankly, this technology was designed to deceive humans.

And he points out that the team’s priorities are very revealing:

I’ll say this: it’s telling that matters of transparency, disclosure, and trust weren’t considered important for the initial release.

Wednesday, May 9th, 2018

Google Duplicitous

I can’t recall the last time I was so creeped out by a technology as I am by Google Duplex—the AI that can make reservations over the phone by pretending to be a human.

I’m not sure what’s disturbing me more: the technology itself, or the excited reaction of tech bros who can’t wait to try it.

Thing is …when these people talk about being excited to try it, I’m pretty sure they are only thinking of trying it as a caller, not a callee. They aren’t imagining that they could possibly be one of the people on the other end of one of those calls.

The visionaries of technology—Douglas Engelbart, J.C.R. Licklider—have always recognised the potential for computers to augment humanity, to be bicycles for the mind. I think they would be horrified to see the increasing trend of using humans to augment computers.

Thursday, May 3rd, 2018

Why Silicon Valley can’t fix itself

Backlash backlash:

The nature of human nature is that it changes. It cannot, therefore, serve as a stable basis for evaluating the impact of technology. Yet the assumption that it doesn’t change serves a useful purpose. Treating human nature as something static, pure and essential elevates the speaker into a position of power. Claiming to tell us who we are, they tell us how we should be.

Saturday, April 28th, 2018

An Apology for the Internet — From the People Who Built It

A hand-wringing, finger-pointing litany of hindsight, published with 11 tracking scripts attached.

  1. Start With Hippie Good Intentions …
  2. … Then mix in capitalism on steroids.
  3. The arrival of Wall Streeters didn’t help …
  4. … And we paid a high price for keeping it free.
  5. Everything was designed to be really, really addictive.
  6. At first, it worked — almost too well.
  7. No one from Silicon Valley was held accountable …
  8. … Even as social networks became dangerous and toxic.
  9. … And even as they invaded our privacy.
  10. Then came 2016.
  11. Employees are starting to revolt.
  12. To fix it, we’ll need a new business model …
  13. … And some tough regulation.
  14. Maybe nothing will change.
  15. … Unless, at the very least, some new people are in charge.

Monday, April 23rd, 2018

Spinning jenny. — Ethan Marcotte

During the Industrial Revolution, as new machines were invented to increase output, business owners often dreamed of an entirely automated workforce—of a factory without workers. I assume their workers had different dreams.

Ethan thinks through the ethical implications of increasing automation and efficiency über alles:

I can’t stop thinking about how much automation has changed our industry already. And I know the rate of automation is only going to accelerate from here.

At the very least, maybe it’s worth asking ourselves what might happen next.

Tuesday, April 10th, 2018

Fantasies of the Future: Design in a World Being Eaten by Software / Paul Robert Lloyd

The transcript of a terrific talk by Paul, calling for a more thoughtful, questioning approach to digital design. It covers the issues I’ve raised about Booking.com’s dark patterns and a post I linked to a while back about the shifting priorities of designers working at scale.

Drawing inspiration from architectural practice, its successes and failures, I question the role of design in a world being eaten by software. When the prevailing technocratic culture permits the creation of products that undermine and exploit users, who will protect citizens within the digital spaces they now inhabit?

Saturday, April 7th, 2018

Future Ethics

Cennydd is writing (and self-publishing) a book on ethics and digital design. It will be released in September.

Technology is never neutral: it has inevitable social, political, and moral impact. The coming era of connected smart technologies, such as AI, autonomous vehicles, and the Internet of Things, demands trust: trust the tech industry has yet to fully earn.

Friday, April 6th, 2018

Using Ethics In Web Design — Smashing Magazine

A remarkably practical in-depth guide to making ethical design decisions, with enjoyable diversions into the history of philosophy throughout.

Sunday, March 25th, 2018

Paul Ford: Facebook Is Why We Need a Digital Protection Agency - Bloomberg

The word “leak” is right. Our sense of control over our own destinies is being challenged by these leaks. Giant internet platforms are poisoning the commons. They’ve automated it.

Wednesday, March 21st, 2018

Facebook and the end of the world

I’d love to see some change, and some introspection. A culture of first, do no harm. A recognition that there are huge dangers if you just do what’s possible, or build a macho “fail fast” culture that promotes endangerment. It’s about building teams that know they’ll make mistakes but also recognize the difference between great business opportunities and gigantic, universe-sized fuck ups.

Saturday, March 10th, 2018

Technologist Hippocratic Oath | An optional oath for building ethically considered experiences

Everyone draws their lines in different ways and perhaps there is a spectrum of what is reasonable when implementing influential products. That’s exactly why technologists must seek to educate themselves on the patterns they are implementing in order to understand their psychological influence and other outcomes where intended use is not always the same as the reality of the user experience. Not only that, but we should feel empowered to speak up to authority when something crosses a line.