Tags: indie

Sunday, June 18th, 2017

IndieWebifying my website: part 1, the why & how – AltPlatform

Richard MacManus begins to document the process of making his website part of the indie web.

Wednesday, June 14th, 2017

The IndieWeb Movement Will Help People Control Their Own Web Presence?

A pretty good summary of some key indie web ideas.

Sunday, June 4th, 2017

Month maps

One of the topics I enjoy discussing at Indie Web Camps is how we can use design to display activity over time on personal websites. That’s how I ended up with sparklines on my site—it was a direct result of a discussion at Indie Web Camp Nuremberg a year ago:

During the discussion at Indie Web Camp, we started looking at how silos design their profile pages to see what we could learn from them. Looking at my Twitter profile, my Instagram profile, my Untappd profile, or just about any other profile, it’s a mixture of bio and stream, with the addition of stats showing activity on the site—signs of life.

Perhaps the most interesting visual example of my activity over time is on my Github profile. Halfway down the page there’s a calendar heatmap that uses colour to indicate the amount of activity. What I find interesting is that it’s using two axes of time over a year: days of the month across the X axis and days of the week down the Y axis.

I wanted to try something similar, but showing activity by time of day down the Y axis. A month of activity feels like the right range to display, so I set about adding a calendar heatmap to monthly archives. I already had the data I needed—timestamps of posts. That’s what I was already using to display sparklines. I wrote some code to loop over those timestamps and organise them by day and by hour. Then I spat out a table with days for the columns and clumps of hours for the rows.
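The gist of that loop is straightforward. Here’s a rough sketch in JavaScript rather than my actual code; the function name and the four-hour clumps are just for illustration:

function bucketByDayAndHour(timestamps, hoursPerClump) {
  // timestamps: an array of Unix timestamps (in seconds) for one month's posts.
  // Returns a lookup of "day-clump" keys to post counts, ready to be turned
  // into table cells: days across, clumps of hours down.
  const buckets = {};
  timestamps.forEach( timestamp => {
    const date = new Date(timestamp * 1000);
    const day = date.getDate();                                // 1–31: one column per day
    const clump = Math.floor(date.getHours() / hoursPerClump); // one row per clump of hours
    const key = `${day}-${clump}`;
    buckets[key] = (buckets[key] || 0) + 1;
  });
  return buckets;
}

// e.g. bucketByDayAndHour(timestamps, 4)["18-2"] would be the number of
// posts published on the 18th between 08:00 and 11:59.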

Calendar heatmap on Dribbble

I’m using colour (well, different shades of grey) to indicate the relative amounts of activity, but I decided to use size as well. So it’s also a bubble chart.

It doesn’t work very elegantly on small screens: the table is clipped horizontally and can be swiped left and right. Ideally the visualisation itself would change to accommodate smaller screens.

Still, I kind of like the end result. Here’s last month’s activity on my site. Here’s the same time period ten years ago. I’ve also added month heatmaps to the monthly archives for my journal, links, and notes. They’re kind of like an expanded view of the sparklines that are shown with each month.

From one year ago, here’s the daily distribution of my journal posts, my links, and my notes.

And then here’s the daily distribution of everything in that month all together.

I realise that the data being displayed is probably only of interest to me, but then, that’s one of the perks of having your own website—you can do whatever you feel like.

Tuesday, May 30th, 2017

Micropub is a W3C Recommendation • Aaron Parecki

I was just talking about micropub …and now it’s officially a W3C spec. Great work!

If you’re building a blogging platform, you can allow your users to choose from a wide variety of posting clients by implementing the Micropub spec.

If you’re building a posting client and want it to work with many different server backends instead of hard-coding it to Twitter or other proprietary APIs, implement the Micropub spec and you’ll quickly have people eager to start using the app!

Checking in at Indie Web Camp Nuremberg

Once I finished my workshop on evaluating technology I stayed in Nuremberg for that weekend’s Indie Web Camp.

IndieWebCamp Nuremberg

Just as with Indie Web Camp Düsseldorf the weekend before, it was a fun two days—one day of discussions, followed by one day of making.

IndieWebCamp Nuremberg

I spent most of the second day playing around with a new service that Aaron created called OwnYourSwarm. It’s very similar to his other service, OwnYourGram. Whereas OwnYourGram is all about posting pictures from Instagram to your own site, OwnYourSwarm is all about posting Swarm check-ins to your own site.

Usually I prefer to publish on my own site and then push copies out to other services like Twitter, Flickr, etc. (POSSE—Publish on Own Site, Syndicate Elsewhere). In the case of Instagram, that’s impossible because of their ludicrously restrictive API, so I have to go the other way around (PESOS—Publish Elsewhere, Syndicate to Own Site). When it comes to check-ins, I could do it from my own site, but I’d have to create my own databases of places to check into. I don’t fancy that much (yet) so I’m using OwnYourSwarm to PESOS check-ins.

The great thing about OwnYourSwarm is that I didn’t have to do anything. I already had the building blocks in place.

First of all, I needed some way to authenticate as my website. IndieAuth takes care of all that. All I needed was rel="me" attributes pointing from my website to my profiles on Twitter, Flickr, Github, or any other services that provide OAuth. Then I can piggyback on their authentication flow (this is also how you sign in to the Indie Web wiki).
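If you want to check which profiles your own site is advertising, a quick bit of JavaScript in the browser console will list them (this is just a diagnostic, not part of the IndieAuth flow itself):

// List every rel="me" link on the current page.
document.querySelectorAll('a[rel~="me"], link[rel~="me"]').forEach( element => {
  console.log(element.href);
});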

The other step is more involved. My site needs to provide an API endpoint so that services like OwnYourGram and OwnYourSwarm can post to it. That’s where micropub comes in. You can see the code for my minimal micropub endpoint if you like. If you want to test your own micropub endpoint, check out micropub.rocks—the companion to webmention.rocks.
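To give a flavour of what an endpoint has to handle, here’s a very stripped-down sketch. It isn’t my actual code, and isValidToken and createNote are hypothetical stand-ins for however you verify tokens and store posts:

const http = require('http');
const querystring = require('querystring');

http.createServer( (request, response) => {
  let body = '';
  request.on('data', chunk => body += chunk);
  request.on('end', () => {
    // The access token arrives in the Authorization header
    // (or as a form-encoded access_token value).
    const token = (request.headers['authorization'] || '').replace('Bearer ', '');
    if (!isValidToken(token)) { // hypothetical: check the token with your token endpoint
      response.writeHead(401);
      return response.end();
    }
    // A basic create request is form-encoded: h=entry&content=Hello+world
    const post = querystring.parse(body);
    if (post.h === 'entry' && post.content) {
      const url = createNote(post); // hypothetical: store the note, return its URL
      response.writeHead(201, {'Location': url});
      return response.end();
    }
    response.writeHead(400);
    response.end();
  });
}).listen(8080);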

Anyway, I already had IndieAuth and micropub set up on my site, so all I had to do was log in to OwnYourSwarm and I immediately started to get check-ins posted to my own site. They show up the same as any other note, so I decided to spend my time at Indie Web Camp Nuremberg making them look a bit different. I used Mapbox’s static map API to show an image of the location of the check-in. What’s really nice is that if I post a photo on Swarm, that gets posted to my own site too. I had fun playing around with the display of photo+map on my home page stream. I’ve made a page for keeping track of check-ins too.
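The static map is just an image whose src has the check-in’s coordinates baked into a Mapbox URL. Something like this sketch, where the style, marker colour, and dimensions are placeholder values and you’d need your own access token:

function staticMapURL(latitude, longitude, token) {
  // A marker at the check-in location, centred on the same spot.
  const marker = `pin-s+cc0000(${longitude},${latitude})`;
  return `https://api.mapbox.com/styles/v1/mapbox/streets-v11/static/` +
    `${marker}/${longitude},${latitude},15/600x300?access_token=${token}`;
}

// Then it's just a case of outputting an img element:
// <img src="…" alt="Map of the check-in location">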

All in all, a fun way to spend Indie Web Camp Nuremberg. But when it came time to demo, the one that really impressed me was Amber’s. She worked flat out on her site, getting to the second level on IndieWebify.me …including sending a webmention to my site!

IndieWebCamp Nuremberg

Tuesday, May 23rd, 2017

Going offline at Indie Web Camp Düsseldorf

I’ve just come back from a ten-day trip to Germany. The trip kicked off with Indie Web Camp Düsseldorf over the course of a weekend.

IndieWebCamp Düsseldorf 2017

Once again the wonderful people at Sipgate hosted us in their beautiful building, and once again myself and Aaron helped facilitate the two days.

IndieWebCamp Düsseldorf 2017

Saturday was the BarCamp-like discussion day. Plenty of interesting topics were covered. I led a session on service workers, and that’s also what I decided to work on for the second day—that’s when the talking is done and we get down to making.

IndieWebCamp Düsseldorf 2017

I like what Ethan is doing on his offline page. He shows a list of pages that have been cached, but instead of just listing URLs, he shows a title and description for each page.

I’ve already got a separate cache for pages that gets added to as the user browses around my site. I needed to figure out a way to store the metadata for those pages so that I could then display it on the offline page. I came up with a workable solution, and interestingly, it involved no changes to the service worker script at all.

When you visit any blog post, I put metadata about the page into localStorage (after first checking that there’s an active service worker):

if (navigator.serviceWorker && navigator.serviceWorker.controller) {
  window.addEventListener('load', function() {
    var data = {
      "title": "A minority report on artificial intelligence",
      "description": "Revisiting Spielberg’s films after a decade and a half.",
      "published": "May 7th, 2017",
      "timestamp": "1494171049"
    };
    localStorage.setItem(
      window.location.href,
      JSON.stringify(data)
    );
  });
}

In my case, I’m outputting the metadata from the server, but you could just as easily grab some from the DOM like this:

var data = {
  "title": document.querySelector("title").innerText,
  "description": document.querySelector("meta[name='description']").getAttribute("contents")
}

Meanwhile in my service worker, when you visit that same page, it gets added to a cache called “pages”. Both localStorage and the cache API are using URLs as keys. I take advantage of that on my offline page.

The nice thing about writing JavaScript on my offline page is that I know the page will only be seen by modern browsers that support service workers, so I can use all sorts of fancy features from ES6, or whatever we’re calling it now.

I start by looping through the keys of the “pages” cache (that’s right—the cache API isn’t just for service workers; you can access it from any script). Then I check to see if there is a corresponding localStorage key with the same string (a URL). If there is, I pull the metadata out of local storage and add it to an array called browsingHistory:

const browsingHistory = [];
caches.open('pages')
.then( cache => {
  cache.keys()
  .then(keys => {
    keys.forEach( request => {
      let data = JSON.parse(localStorage.getItem(request.url));
      if (data) {
        data['url'] = request.url;
        browsingHistory.push(data);
      }
    });

Then I sort the list of pages in reverse chronological order:

browsingHistory.sort( (a,b) => {
  return b.timestamp - a.timestamp;
});

Now I loop through each page in the browsing history list and construct a link to each URL, complete with title and description:

let markup = '';
browsingHistory.forEach( data => {
  markup += `
<h2><a href="${ data.url }">${ data.title }</a></h2>
<p>${ data.description }</p>
<p class="meta">${ data.published }</p>
`;
});

Finally I dump the constructed markup into a waiting div in the page with an ID of “history”:

let container = document.getElementById('history');
container.insertAdjacentHTML('beforeend', markup);

All those steps need to be wrapped inside the then clause attached to caches.open("pages") because the cache API is asynchronous.
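Putting those pieces together, the whole thing ends up looking like this:

const browsingHistory = [];
caches.open('pages')
.then( cache => {
  cache.keys()
  .then( keys => {
    keys.forEach( request => {
      let data = JSON.parse(localStorage.getItem(request.url));
      if (data) {
        data['url'] = request.url;
        browsingHistory.push(data);
      }
    });
    browsingHistory.sort( (a,b) => {
      return b.timestamp - a.timestamp;
    });
    let markup = '';
    browsingHistory.forEach( data => {
      markup += `
<h2><a href="${ data.url }">${ data.title }</a></h2>
<p>${ data.description }</p>
<p class="meta">${ data.published }</p>
`;
    });
    let container = document.getElementById('history');
    container.insertAdjacentHTML('beforeend', markup);
  });
});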

There you have it. Now if you’re browsing adactio.com and your network connection drops (or my server goes offline), you can choose from a list of pages you’ve previously visited.

The current situation isn’t ideal though. I’ve got a clean-up operation in my service worker to limit the number of items stored in my “pages” cache. The cache never gets bigger than 35 items. But there’s no corresponding clean-up of metadata stored in localStorage. So there could be a lot more bits of metadata in local storage than there are pages in the cache. It’s not harmful, but it’s a bit wasteful.

I can’t do a clean-up of localStorage from my service worker because service workers can’t access localStorage. There’s a very good reason for that: the localStorage API is synchronous, and everything that happens in a service worker needs to be asynchronous.
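The pruning could happen from a regular page script instead, seeing as the cache API is available there too. Something like this sketch would do it, although I haven’t actually added this yet:

// Remove any stored metadata whose key is a URL that's no longer in the "pages" cache.
caches.open('pages')
.then( cache => cache.keys() )
.then( keys => {
  const cachedURLs = keys.map( request => request.url );
  Object.keys(localStorage).forEach( key => {
    // Only touch the keys that look like page URLs on this site.
    if (key.startsWith('https://adactio.com/') && !cachedURLs.includes(key)) {
      localStorage.removeItem(key);
    }
  });
});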

Service workers can access indexedDB: it’s asynchronous. I could use indexedDB instead of localStorage, but I’m not a masochist. My best bet would be to use the localForage library, which wraps indexedDB in the simple syntax of localStorage.

Maybe I’ll do that at the next Homebrew Website Club here in Brighton.

Amber Wilson: IndieWebCamp

Amber’s report from Indie Web Camp Nuremberg last week. I was blown away by how much she got done in one day.

Saturday, May 6th, 2017

ongoing by Tim Bray · Still Blogging in 2017

This really resonates with me. Tim Bray duly notes that people are writing on Medium, and being shunted towards native apps, and that content is getting centralised at Facebook and other hubs, and then he declares:

But I don’t care.

Same.

Anyhow, I’m not going away.

Same.

Tuesday, April 18th, 2017

Back to the Cave – Frank Chimero

Frank has published the (beautifully designed) text of his closing XOXO keynote.

Sunday, April 16th, 2017

@Mentions from Twitter to My Website

Chris gives a step-by-step walkthrough of enabling webmentions on a Wordpress site.

Tuesday, April 4th, 2017

A bit more on container queries. — Ethan Marcotte

Ethan wrote about container queries on his website. Paul wrote his counter-argument on his website. Now Ethan responds. It’s fun to watch two gentlemen engage in civilised discourse.

Blogs, man. They’re gonna be big, I tells ya.

Wednesday, March 22nd, 2017

Writing on the web

Some people have been putting Paul’s crazy idea into practice.

  • Mike revived his site a while back and he’s been posting gold dust ever since. I enjoy his no-holds-barred perspective on his time in San Francisco.
  • Garrett’s writing goes all the way back to 2005. The cumulative result is two fascinating interweaving narratives—one about his health, another about his business.
  • Charlotte has been documenting her move from Brighton to Sydney. Much as I love her articles about front-end development, I’m liking the slice-of-life updates on life down under even more.
  • Amber has a great way with words. As well as regularly writing on her blog, she’s two-thirds of the way through writing 100 words every day for 100 days.
  • Ethan has been writing about responsive design—of course—but it’s his more personal posts that make me really grateful for his site.
  • Jeffrey and Eric never stopped writing on their own sites. Sure, there’s good stuff on their sites about web design and development, but it’s the writing about their non-web lives that’s so powerful.

There are more people I could mention …but, to be honest, not that many more. Seems like most people are happy to only publish on Ev’s blog or not at all.

I know not everybody wants to write on the web, and that’s fine. But it makes me sad when people choose not to publish their thoughts because they think no-one will be interested, or that it’s all been said before. I understand where those worries come from, but I believe—no, I know—that they are unfounded.

It’s a world wide web out there. There’s plenty of room for everyone. And I, for one, love reading the words of others.

Monday, February 20th, 2017

This Week in the IndieWeb Podcast by Marty McGuire

What an excellent idea! A weekly round-up in audio form of indie web and homebrew website news. Nice and short.

Chris is huffduffing it too.

Thursday, February 2nd, 2017

GarrettDimon.com

It strikes me that Garrett’s site has become a valuable record of the human condition with its mix of two personal stories—one relating to his business and the other relating to his health—both of them communicated clearly through great writing.

Have a read back through the archive and I think you’ll share my admiration.

Friday, January 6th, 2017

It’s more than just the words

I can relate to what Rachel describes here—I really like using my own website as a playground to try out new technologies. That’s half the fun of the indie web.

I had already decided to bring my content back home in 2017, but I’d also like to think about this idea of using my own site to better demonstrate and play with the new technologies I write about.

Wednesday, January 4th, 2017

Day 14: Posting to my Website from Alexa #100DaysOfIndieWeb • Aaron Parecki

Aaron documents how he posts to his website through his Amazon Echo. No interface left behind.

Tuesday, January 3rd, 2017

Indie Microblogging: owning your short-form writing by Manton Reece — Kickstarter

Here’s an interesting Kickstarter project: a book about owning your notes (and syndicating them to Twitter) to complement the forthcoming micro.blog service.

Monday, December 19th, 2016

Deep linking with fragmentions

When I was marking up Resilient Web Design I wanted to make sure that people could link to individual sections within a chapter. So I added IDs to all the headings. There’s no UI to expose that though—like the hover pattern that some sites use to show that something is linkable—so unless you know the IDs are there, there’s no way of getting at them other than “view source.”

But if you’re reading a passage in Resilient Web Design and you highlight some text, you’ll notice that the URL updates to include that text after a hash symbol. If that updated URL gets shared, then anyone following it should be sent straight to that string of text within the page. That’s fragmentions in action:

Fragmentions find the first matching word or phrase in a document and focus its closest surrounding element. The match is determined by the case-sensitive string following the # (or ## double-hash)

It’s a similar idea to Eric and Simon’s proposal to use CSS selectors as fragment identifiers, but using plain text instead. You can find out more about the genesis of fragmentions from Kevin. I’m using Jonathan Neal’s script with some handy updates from Matthew.
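If you’re curious how the basic idea works, here’s a very reduced sketch; the actual script is more thorough, matching individual text nodes and so on:

function findFragmention() {
  // The fragmention is whatever follows the # (or ##) in the URL.
  const text = decodeURIComponent(window.location.hash.replace(/^#{1,2}/, ''));
  // Leave empty hashes and ordinary fragment identifiers alone.
  if (!text || document.getElementById(text)) return;
  // Find the first element whose text contains that string and jump to it.
  const candidates = document.querySelectorAll('p, li, h1, h2, h3, h4, h5, h6, blockquote');
  for (const element of candidates) {
    if (element.textContent.includes(text)) {
      element.scrollIntoView();
      return;
    }
  }
}
window.addEventListener('hashchange', findFragmention);
findFragmention();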

I’m using the fragmention support to power the index of the book. It relies on JavaScript to work though, so Matthew has come to the rescue again and created a version of the site with IDs for each item linked from the index (I must get around to merging that).

The fragmention functionality is ticking along nicely, apart from one problem…

I’ve tweaked the typography of Resilient Web Design to within an inch of its life, including a crude but effective technique to avoid widowed words at the end of a paragraph. The last two words of every paragraph are separated by a UTF-8 no-break space character instead of a regular space.
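The technique boils down to swapping the last space in each paragraph for a no-break space. Here’s the idea as a JavaScript sketch (not necessarily exactly how my code does it):

// Replace the final space in a piece of text with a no-break space
// (U+00A0) so the last two words always wrap together.
function unwidow(text) {
  return text.replace(/ ([^ ]+)$/, '\u00A0$1');
}

unwidow('for people of the web');
// → "for people of the\u00A0web"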

That solves the widowed words problem, but it confuses the fragmention script: the no-break space in the document isn’t the same character as the regular space in the selected text, so any selected text that includes the last two words of a paragraph fails to match. I’ve tried tweaking the script, but I’m stumped. If you fancy having a go, please have at it.

Update: And fixed! Thanks to Lee.

Friday, December 16th, 2016

Thread. — Ethan Marcotte

Ethan redesigned. It’s lovely.

And now that the new site’s live, I realize I’d like to keep working on it. I’m not just feeling excited to see where it goes from here: as modest as it is, I’ve made something I’m proud of.

Saturday, November 26th, 2016

kottke.org memberships

I have so much admiration for Jason Kottke’s dedication (or sheer bloodymindedness)—he’s been diligently writing and sharing weird and wonderful stuff on his own website for so long. I’m more than happy to support him in that.