Tags: security

Sunday, November 11th, 2018

Push without notifications

On the first day of Indie Web Camp Berlin, I led a session on going offline with service workers. This covered all the usual use-cases: pre-caching; custom offline pages; saving pages for offline reading.

But on the second day, Sebastiaan spent a fair bit of time investigating a more complex use of service workers with the Push API.

The Push API is what makes push notifications possible on the web. There are a lot of moving parts—browser, server, service worker—and, frankly, it’s way over my head. But I’m familiar with the general gist of how it works. Here’s a typical flow:

  1. A website prompts the user for permission to send push notifications.
  2. The user grants permission.
  3. A whole lot of complicated stuff happens behind the scenes.
  4. Next time the website publishes something relevant, it fires a push message containing the details of the new URL.
  5. The user’s service worker receives the push message (even if the site isn’t open).
  6. The service worker creates a notification linking to the URL, interrupting the user, and generally adding to the weight of information overload.
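In service worker code, that final step usually looks something like this (a minimal sketch, assuming the push payload is JSON with title and url fields):

```javascript
// In the service worker: receive the push message and interrupt the user with a notification.
self.addEventListener('push', event => {
  const data = event.data.json(); // assumes a payload like {"title": "…", "url": "…"}
  event.waitUntil(
    self.registration.showNotification(data.title, {
      body: 'Something new has been published',
      data: { url: data.url }
    })
  );
});

// When the notification is tapped, open the URL it points to.
self.addEventListener('notificationclick', event => {
  event.notification.close();
  event.waitUntil(clients.openWindow(event.notification.data.url));
});
```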

Here’s what Sebastiaan wanted to investigate: what if that last step weren’t so intrusive? Here’s the alternate flow he wanted to test:

  1. A website prompts the user for permission to send push notifications.
  2. The user grants permission.
  3. A whole lot of complicated stuff happens behind the scenes.
  4. Next time the website publishes something relevant, it fires a push message containing the details of the new URL.
  5. The user’s service worker receives the push message (even if the site isn’t open).
  6. The service worker fetches the contents of the URL provided in the push message and caches the page. Silently.

It worked.
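For the curious, here’s roughly what that silent version of the final step looks like in the service worker (again a sketch, assuming a JSON payload with a url field; the cache name is my own):

```javascript
// In the service worker: receive the push message and quietly cache the new page.
// No notification, no interruption.
self.addEventListener('push', event => {
  const data = event.data.json();
  event.waitUntil(
    caches.open('pushed-pages').then(cache => cache.add(data.url))
  );
});
```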

I think this could be a real game-changer. I don’t know about you, but I’m very, very wary of granting websites the ability to send me push notifications. In fact, I don’t think I’ve ever given a website permission to interrupt me with push notifications.

You’ve seen the annoying permission dialogues, right?

In Firefox, it looks like this:

Will you allow name-of-website to send notifications?

[Not Now] [Allow Notifications]

In Chrome, it’s:

name-of-website wants to

Show notifications

[Block] [Allow]

But in actual fact, these dialogues are asking for permission to do two things:

  1. Receive messages pushed from the server.
  2. Display notifications based on those messages.

There’s no way to ask for permission just to do the first part. That’s a shame. While I’m very unwilling to grant permission to be interrupted by intrusive notifications, I’d be more than willing to grant permission to allow a website to silently cache timely content in the background. It would be a more calm technology.
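That coupling is baked right into the subscription step. Here’s a sketch of how a page subscribes to push (the vapidPublicKey value is assumed to exist elsewhere); Chrome, for example, will refuse the subscription unless userVisibleOnly is true, so there’s literally no way to ask for the quieter half on its own:

```javascript
// In the page: subscribe to push messages.
async function subscribeToPush(vapidPublicKey) {
  const registration = await navigator.serviceWorker.ready;
  const subscription = await registration.pushManager.subscribe({
    userVisibleOnly: true,               // "I promise every push will show a notification"
    applicationServerKey: vapidPublicKey // the server's public key for authenticated pushes
  });
  // The subscription object is then sent to the server, which uses it
  // to push messages to this particular browser.
  return subscription;
}
```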

Think of the use cases:

  • I grant push permission to a magazine. When the magazine publishes a new article, it’s cached on my device.
  • I grant push permission to a podcast. Whenever a new episode is published, it’s cached on my device.
  • I grant push permission to a blog. When there’s a new blog post, it’s cached on my device.

Then when I’m on a plane, or in the subway, or in any other situation without a network connection, I could still visit these websites and get content that’s fresh to me. It’s kind of like background sync in reverse.

There’s plenty of opportunity for abuse—the cache could get filled up with unwanted content. But websites can already do that, and they don’t need to be granted any permission to do so: just by visiting a website, you allow it to add multiple files to a cache.

So it seems that the reason for the permissions dialogue is all about displaying notifications …not so much about receiving push messages from the server.

I wish there were a way to implement this background-caching pattern without requiring the user to grant permission to a dialogue that contains the word “notification.”

I wonder if the act of adding a site to the home screen could implicitly grant permission to allow use of the Push API without notifications?

In the meantime, the proposal for periodic synchronisation (using background sync) could achieve similar results, but in a less elegant way: periodically polling for new content instead of receiving a push message when new content is published. Also, it requires permission. But at least in this case, the permission dialogue should be more specific, and wouldn’t include the word “notification” anywhere.
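For comparison, the periodic sync approach looks roughly like this (a sketch of the proposed API shape, which may well change; the tag, interval, and URL are placeholders of my own):

```javascript
// In the page: register a periodic sync, behind its own (notification-free) permission.
async function registerPeriodicFetch() {
  const registration = await navigator.serviceWorker.ready;
  await registration.periodicSync.register('get-latest-posts', {
    minInterval: 24 * 60 * 60 * 1000 // at most roughly once a day
  });
}

// In the service worker: poll for new content whenever the browser fires the event.
self.addEventListener('periodicsync', event => {
  if (event.tag === 'get-latest-posts') {
    event.waitUntil(
      caches.open('latest-posts').then(cache => cache.add('/latest'))
    );
  }
});
```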

Monday, October 22nd, 2018

So We Got Tracked Anyway

Even using a strict cookie policy won’t help when Facebook and Google are using TLS to fingerprint users. Time to get more paranoid:

HTTPS session identifiers can be disabled in Mozilla products manually by setting ‘security.ssl.disable_session_identifiers’ in about:config.

Thursday, August 16th, 2018

On HTTPS and Hard Questions - TimKadlec.com

A great post by Tim following on from the post by Eric I linked to last week.

Is a secure site you can’t access better than an insecure one you can?

He rightly points out that security without performance is exclusionary.

…we’ve made a move to increase the security of the web by doing everything we can to get everything running over HTTPS. It’s undeniably a vital move to make. However, this combination—poor performance but good security—now ends up making the web inaccessible to many.

Security. Performance. Accessibility. All three matter.

Wednesday, August 15th, 2018

Google AMP - A 70% drop in our conversion rate. - Rockstar Coders

Google hijacking and hosting your AMP pages (in order to pre-render them) is pretty terrible for user experience and security:

I’m trying to establish my company as a legitimate business that can be trusted by a stranger to build software for them. Having google.com in the URL reeks of a phishing scam or a fly-by-night operation that couldn’t afford their own domain.

Tuesday, August 7th, 2018

Securing Web Sites Made Them Less Accessible – Eric’s Archived Thoughts

This is a heartbreaking observation by Eric. He’s not anti-HTTPS by any stretch, but he is pointing out that caching servers become a thing of the past on a more secure web.

Can we do anything? For users of up-to-date browsers, yes: service workers create a “good” man in the middle that sidesteps the HTTPS problem, so far as I understand. So if you’re serving content over HTTPS, creating a service worker should be one of your top priorities right now, even if it’s just to do straightforward local caching and nothing fancier.
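If you’re wondering what that kind of straightforward local caching looks like, here’s a minimal sketch (the cache name and cache-first strategy are my own choices):

```javascript
// A bare-bones caching service worker: answer from the cache when we can,
// otherwise go to the network and keep a copy for next time.
self.addEventListener('fetch', event => {
  if (event.request.method !== 'GET') return;
  event.respondWith(
    caches.match(event.request).then(cached => {
      return cached || fetch(event.request).then(response => {
        const copy = response.clone();
        caches.open('local-cache').then(cache => cache.put(event.request, copy));
        return response;
      });
    })
  );
});
```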

Tuesday, July 10th, 2018

When 7 KB Equals 7 MB - Cloud Four

I remember Jason telling me about this weird service worker caching behaviour a little while back. This piece is a great bit of sleuthing in tracking down the root causes of this strange issue, followed up with a sensible solution.

Sunday, July 8th, 2018

BBC News on HTTPS – BBC Design + Engineering – Medium

BBC News has switched to HTTPS—hurrah!

Here, one of the engineers writes on Ev’s blog about the challenges involved. Personally, I think this is far more valuable and inspiring to read than the unempathetic posts claiming that switching to HTTPS is easy.

Update: Paul found the original URL for this …weird that they don’t link to it from the syndicated version.

Saturday, June 23rd, 2018

I discovered a browser bug - JakeArchibald.com

Jake’s blow-by-blow account of uncovering a serious browser vulnerability is fascinating. But if you don’t care for the technical details, skip ahead to how different browser makers handled the issue—it’s very enlightening. (And if you do care for the technical details, make sure you click on the link to the PDF version of this post.)

Monday, June 18th, 2018

The crooked timber of humanity | 1843

Tom Standage—author of the brilliant book The Victorian Internet—relates a tale of how the Chappe optical telegraph was hacked in 19th century France, thereby making it one of the earliest recorded instances of a cyber attack.

Saturday, June 16th, 2018

Artificial Intelligence for more human interfaces | Christian Heilmann

An even-handed assessment of the benefits and dangers of machine learning.

Tuesday, June 12th, 2018

Password Tips From a Pen Tester: Common Patterns Exposed

I’ve been wondering about this for quite a while: surely demanding specific patterns in a password (e.g. can’t be all lowercase, must include at least one number, etc.) makes it easier to crack them, right? I mean, you’re basically providing a ruleset for brute-forcing.

Turns out, yes. That’s exactly right.

When employees are faced with this requirement, they tend to:

  • Choose a dictionary word or a name
  • Make the first character uppercase
  • Add a number at the end, and/or an exclamation point

If we know that is a common pattern, then we know where to start…
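To see why that pattern is such a gift, here’s a toy sketch (mine, not the article’s) of how an attacker might enumerate it from a wordlist; real cracking tools like hashcat express the same idea as mangling rules:

```javascript
// Toy illustration: expand a wordlist into the "Capitalised word + digit (+ !)" pattern.
function* candidates(words) {
  for (const word of words) {
    const capitalised = word[0].toUpperCase() + word.slice(1);
    for (let digit = 0; digit <= 9; digit++) {
      yield `${capitalised}${digit}`;
      yield `${capitalised}${digit}!`;
    }
  }
}

// Two dictionary words produce just forty candidates: a tiny search space.
console.log([...candidates(['password', 'dragon'])].length); // 40
```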

Tuesday, June 5th, 2018

CORS

A thorough explanation of the history and inner workings of Cross-Origin Resource Sharing.

Like tales of a mythical sea beast, every developer has a story to tell about the day CORS seized upon one of their web requests, dragging it down into the inexorable depths, never to be seen again.

Friday, June 1st, 2018

A cartoon intro to DNS over HTTPS – Mozilla Hacks – the Web developer blog

This is a great illustrated explanation of how DNS resolution works.

CSS Is So Overpowered It Can Deanonymize Facebook Users

First of all, don’t panic—this browser vulnerability has been fixed, so the headline is completely out of proportion to the reality. But my goodness, this was a clever technique!

The technique relies on luring users to a malicious site where the attacker embeds iframes to other sites. In their example, the two researchers embedded iframes for one of Facebook’s social widgets, but other sites are also susceptible to this issue.

The attack consists of overlaying a huge stack of DIV layers with different blend modes on top of the iframe. These layers are all 1x1 pixel-sized, meaning they cover just one pixel of the iframe.

Habalov and Weißer say that depending on the time needed to render the entire stack of DIVs, an attacker can determine the color of that pixel shown on the user’s screen.

The researchers say that by gradually moving this DIV “scan” stack across the iframe, “it is possible to determine the iframe’s content.”
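Purely as an illustration of the structure described above (the underlying bug has been fixed, and this is not a working exploit), the scan stack and the timing side channel look roughly like this:

```javascript
// Build a 1×1 stack of blend-mode layers; in the real attack this sits over
// a single pixel of the target iframe.
const scan = document.createElement('div');
scan.style.cssText = 'position:absolute; width:1px; height:1px;';
const modes = ['multiply', 'difference', 'saturation', 'color-dodge'];
for (let i = 0; i < 500; i++) {
  const layer = document.createElement('div');
  layer.style.cssText =
    `position:absolute; inset:0; mix-blend-mode:${modes[i % modes.length]}; ` +
    'background:rgba(120, 120, 120, 0.9);';
  scan.appendChild(layer);
}
document.body.appendChild(scan);

// The gap between consecutive frames grows when the stack is expensive to composite;
// in the vulnerable browsers, that cost depended on the colour of the pixel underneath.
requestAnimationFrame(t1 => {
  requestAnimationFrame(t2 => console.log('frame gap:', t2 - t1, 'ms'));
});
```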

Thursday, March 8th, 2018

Google and HTTP

I share many of these concerns.

The web is huge. Even bigger than Google. I love that the web preserves all the work. I don’t think anyone has the right to change the web so they no longer work.

Thursday, March 1st, 2018

Third party CSS is not safe - JakeArchibald.com

We all know that adding a third-party script to your site is just asking for trouble. But Jake points out that adding a third-party anything to your site is a bad idea.

Trust no one.

Tuesday, February 27th, 2018

Let’s talk about usernames

This post goes into specifics on Django, but the broader points apply no matter what your tech stack. I’m relieved to find out that The Session is using the tripartite identity pattern (although Huffduffer, alas, isn’t):

What we really want in terms of identifying users is some combination of:

  1. System-level identifier, suitable for use as a target of foreign keys in our database
  2. Login identifier, suitable for use in performing a credential check
  3. Public identity, suitable for displaying to other users

Many systems ask the username to fulfill all three of these roles, which is probably wrong.
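A minimal sketch of that separation (the field names are mine, not the post’s):

```javascript
// Three identifiers, three separate jobs. Asking one "username" to do all of them
// is exactly the trap the post warns about.
const user = {
  id: 42,                       // 1. system-level: foreign keys point here and it never changes
  email: 'reader@example.com',  // 2. login: used only for the credential check
  displayName: 'norabarnacle'   // 3. public: shown to other users, free to change
};
```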

Monday, January 22nd, 2018

We need more phishing sites on HTTPS!

All the books, Montag.

If we want a 100% encrypted web then we need to encrypt all sites, despite whether or not you agree with what they do/say/sell/etc… 100% is 100% and it includes the ‘bad guys’ too.

Bad Month for the Main Thread - daverupert.com

JavaScript is CPU intensive and the CPU is the bottleneck for performance.

I’m on Team Dave.

But darn it all, I just want to build modular websites using HTML and a little bit of JavaScript.

Sunday, January 7th, 2018

I’m harvesting credit card numbers and passwords from your site. Here’s how.

This is a “what if?” scenario, but it’s all too plausible.

For site owners, the (partial) solution is to have a strong Content Security Policy.

For users, the solution is to disable JavaScript.

(In the wake of Spectre and Meltdown, this is now a perfectly legitimate action for security-conscious web users to take; I hope your site can support that.)
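Coming back to the site-owner side, a strong Content Security Policy for this kind of threat means locking down where scripts can load from and where data can be sent. A rough sketch, here as Express middleware (the framework and the exact sources are illustrative, not a complete recommendation):

```javascript
// Illustrative only: serve every response with a restrictive Content-Security-Policy.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  res.setHeader(
    'Content-Security-Policy',
    // Scripts may only come from our own origin; fetch/XHR and form submissions
    // may only go back to our own origin, which blocks the obvious exfiltration routes.
    "default-src 'self'; script-src 'self'; connect-src 'self'; form-action 'self'"
  );
  next();
});

app.listen(3000);
```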