Can you believe we used to willingly tell Google about every single visitor to basecamp.com by way of Google Analytics? Letting them collect every last byte of information possible through the spying eye of their tracking pixel. Ugh.
In this new world, it feels like an obligation to make sure we’re not aiding and abetting those who seek to exploit our data. Those who hoard every little clue in order to piece together a puzzle that’ll ultimately reveal all our weakest points and moments, then sell that picture to the highest bidder.
Surveillance giants: How the business model of Google and Facebook threatens human rights | Amnesty International
Amnesty International have released a PDF report on the out-of-control surveillance perpetrated by Google and Facebook:
Google and Facebook’s platforms come at a systemic cost. The companies’ surveillance-based business model forces people to make a Faustian bargain, whereby they are only able to enjoy their human rights online by submitting to a system predicated on human rights abuse. Firstly, an assault on the right to privacy on an unprecedented scale, and then a series of knock-on effects that pose a serious risk to a range of other rights, from freedom of expression and opinion, to freedom of thought and the right to non-discrimination.
This page on the Amnesty International website has six tracking scripts. Also, consent to accept tracking cookies is assumed (check dev tools). It looks like you can reject marketing cookies, but I tried that without any success.
That PDF is a stone thrown from a very badly-performing glass house.
Let us not overlook the fact that a semantic HTML web site is inherently accessible by default. When we bend the web to our will, we break that. So we have a responsibility to correct it. Sure, the new technologies are neat, but the end result is usually garbage. It takes some next-level narcissism to believe that our goals and priorities as developers are far more important than those of the audience we’re theoretically building software to serve.
The slides from Laura’s excellent talk at FF Conf on Friday.
Mike follows up on the changes made by email startup Superhuman after his initial post:
I will say this: if you were skeptical of Superhuman’s commitment to privacy and safety after reading the last article, you should probably be even more skeptical after these changes. The company’s efforts demonstrate a desire to tamp down liability and damage to their brand, but they do not show an understanding of the core problem: you should not build software that surreptitiously collects data on people in a way that would surprise and frighten them.
A really excellent analysis by Mike of a dark pattern in the Superhuman email app.
That’s right. A running log of every single time you have opened my email, including your location when you opened it. Before we continue, ask yourself if you expect this information to be collected on you and relayed back to your parent, your child, your spouse, your co-worker, a salesperson, an ex, a random stranger, or a stalker every time you read an email.
Exactly! This violates the principle of least surprise. Also, it’s just plain wrong.
Amazingly, though, Mike has been getting pushback from guys on Twitter (and it’s always guys) who don’t think this is a big deal.
Anyway, read the whole thing—it’s fair, balanced, and really well written.
If you’re using Apple’s VoiceOver, both your phone and your computer will broadcast your assumed disability to the entire internet, unless and until you specifically tell it to stop.
I also discussed this accessibility events feature with my friend who is a screen reader user herself. She said it feels like it’s a first step towards a well-meant digital apartheid.
When you stop to consider all the implications of poor performance, it’s hard not to come to the conclusion that poor performance is an ethical issue.
Of all the buzzwords in tech, perhaps none has been deployed with as much philosophical conviction as “frictionless.” Over the past decade or so, eliminating “friction” — the name given to any quality that makes a product more difficult or time-consuming to use — has become an obsession of the tech industry, accepted as gospel by many of the world’s largest companies.
Ben Terrett on balancing the needs of an individual user with the needs of everyone else.
Of course we care about our clients: we want them to be successful and profitable. But we think there’s a balance to be struck between what success means for just the user or customer, and what success means for society.
The terrific Hugo-winning short story about inequality, urban planning, and automation, written by Hao Jingfang and translated by Ken Liu (who translated The Three-Body Problem series).
Hao Jingfang also wrote this essay about the story:
I’ve been troubled by inequality for a long time. When I majored in physics as an undergraduate, I once stared at the distribution curve for American household income that showed profound inequality, and tried to fit the data against black-body distribution or Maxwell–Boltzmann distribution. I wanted to know how such a curve came about, and whether it implied some kind of universality: something as natural as particle energy distribution functions, so natural it led to despair.
This instance of collective action from inside a tech company is important, not just for the specifics of Google, but as an example to workers in other companies.
And of all the demands, this is the one that could have the biggest effect in the US tech world:
An end to Forced Arbitration.
This looks like a really interesting two-day event here in Brighton in November. Like Indie Web Camp, it features one day of talks followed by one day of making.
After a day of tech talks from project teams using their skills for social good, you’ll have the chance to take part in workshops and hackathons to use your own talents for a worthy cause.
And you get to go up the i360.
This is excellent news from Mozilla. Firefox is going to make it easier to block vampiric privacy-leeching and performance-draining third-party scripts and trackers.
In the physical world, users wouldn’t expect hundreds of vendors to follow them from store to store, spying on the products they look at or purchase. Users have the same expectations of privacy on the web, and yet in reality, they are tracked wherever they go.
A near-future sci-fi short by Hannu Rajaniemi that’s right on the zeitgeist money.
The app in her AR glasses showed the car icon crawling along the winding forest road. In a few minutes, it would reach the sharp right turn where the road met the lake. The turn was marked by a road sign she had carefully defaced the previous day, with tiny dabs of white paint. Nearly invisible to a human, they nevertheless fooled image recognition nets into classifying the sign as a tree.
Prompted by his time at Clearleft’s AI gathering in Juvet, Chris has been delving deep into the stories we tell about artificial intelligence …and what stories are missing.
And here we are at the eponymous answer to the question that I first asked at Juvet around 7 months ago: What stories aren’t we telling ourselves about AI?