Alan Kay’s initial description of a “Dynabook” written at Xerox PARC in 1972.
Anecdotes about the development of Apple’s original Macintosh, and the people who made it.
Like a real-life Halt And Catch Fire.
Occasionally, people e-mail me to say something along the lines of “I’ve come up with something to replace HTML!”.
Five years ago, Hixie outlined the five metrics on which a competitor to the web would have to score well:
- Be completely devoid of any licensing requirements.
- Be vendor-neutral.
- Be device-neutral and media-neutral.
- Be content-neutral.
- Be radically better than the existing Web.
You come at the king, you best not miss.
This book—released today—looks right up my alley.
After World War I, Smith used her talents to catch gangsters and smugglers during Prohibition, then accepted a covert mission to discover and expose Nazi spy rings that were spreading like wildfire across South America, advancing ever closer to the United States. As World War II raged, Elizebeth fought a highly classified battle of wits against Hitler’s Reich, cracking multiple versions of the Enigma machine used by German spies.
- the early era: ~1996 – 2004,
- the jQuery era: ~2004 – 2010,
- the Single Page App era: ~2010 – 2014, and
- the modern era: ~2014 – present.
A great bit of web history spelunking in search of the first websites that allowed users to interact with data on a server. Applications, if you will. It’s well written, but I take issue with this:
The world wide web wasn’t supposed to be this fun. Berners-Lee imagined the internet as a place to collaborate around text, somewhere to share research data and thesis papers.
This often gets trotted out (“the web was intended for scientists sharing documents”), but it’s simply not true that Tim Berners-Lee was only thinking of his immediate use-case; he deliberately made the WWW project broad enough to allow all sorts of hitherto unforeseen uses. If he hadn’t …well, the web wouldn’t have been able to accommodate all those later developments. It’s not an accident that the web was later used for all sorts of unexpected things—that was the whole idea.
Anyway, apart from that misstep, the rest of the article is a fun piece, well worth reading.
Here’s the closing keynote I gave at Frontend Conference in Zurich a couple of weeks back.
Perhaps the most permanent action that any human being has accomplished in the history of our species is when one of our ancestors placed this cave bear skull on a rock, where still it sits, tens of thousands of years later.
An astonishing dose of perspective delivered via a lovely bit of hypertext by Matt.
Most technologies are overestimated in the short term. They are the shiny new thing. Artificial Intelligence has the distinction of having been the shiny new thing and being overestimated again and again, in the 1960’s, in the 1980’s, and I believe again now.
Rodney Brooks is not bullish on the current “marketing” of Artificial Intelligence. Riffing on Arthur C. Clarke’s third law, he points out that AI—as currently described—is indistinguishable from magic in all the wrong ways.
This is a problem we all have with imagined future technology. If it is far enough away from the technology we have and understand today, then we do not know its limitations. It becomes indistinguishable from magic.
Watch out for arguments about future technology which is magical. It can never be refuted. It is a faith-based argument, not a scientific argument.
I am convinced that it is not the girls that must change, but rather society’s view of “computing” and the whole culture of the computing industry.
With the advent of artificial intelligence, this is about to get really serious. There are worrying signs that the world of big data and machine learning is even more dominated by men than computing in general. This means that the people writing the algorithms for software that will control many automated aspects of our daily lives in the future are mainly young, white men.
Time-shifted photographs of my hometown in Ireland.
I love John’s long-zoom look at web development. Step back far enough and you can start to see the cycles repeating.
Underneath all of these patterns and practices and frameworks and libraries are core technologies. And underlying principles.
These are foundations – technological, and of practice – that we ignore, overlook, or flaunt at our peril.
Of course, information existed before Shannon, just as objects had inertia before Newton. But before Shannon, there was precious little sense of information as an idea, a measurable quantity, an object fitted out for hard science. Before Shannon, information was a telegram, a photograph, a paragraph, a song. After Shannon, information was entirely abstracted into bits. The sender no longer mattered, the intent no longer mattered, the medium no longer mattered, not even the meaning mattered: A phone conversation, a snatch of Morse telegraphy, a page from a detective novel were all brought under a common code. Just as geometers subjected a circle in the sand and the disk of the sun to the same laws, and as physicists subjected the sway of a pendulum and the orbits of the planets to the same laws, Claude Shannon made our world possible by getting at the essence of information.
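Shannon’s move of abstracting information into bits made it a measurable quantity. As a small illustration (my own sketch, not from the article), his entropy formula H = −Σ p·log₂(p) assigns a number of bits to any source of messages, regardless of what those messages mean:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    `probs` is a probability distribution over possible messages;
    zero-probability outcomes contribute nothing to the sum.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly 1 bit of information;
# a biased coin carries less, because it is more predictable.
fair = entropy_bits([0.5, 0.5])    # 1.0 bit
biased = entropy_bits([0.9, 0.1])  # ≈ 0.469 bits
```

The sender, the intent, and the medium all drop out of the calculation, which is exactly the abstraction the quote describes.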
Toilet paper, barbed wire, shipping containers, and replicants.
Silicon Valley’s weapon of choice against women: shoddy science, by Angela Saini in The Guardian.
Those who want to use science to support their views – especially if they seek to undermine equality efforts in the workplace – must make an effort to fully inform themselves about the science of human nature. They may be disappointed to learn that it’s not as simple as they think.
For more, read Angela Saini’s book Inferior: How Science Got Women Wrong and the New Research That’s Rewriting the Story.
An excellent rebuttal of that vile manifestbro, and an informative history lesson to boot.
You can’t cherry-pick a couple of scientific studies you like and use them to justify your arguments against diversity programs, while carefully ignoring the mountains of other scientific studies that show both how and why diversity programs are good, beneficial to all, and worth investing in.
I wish I could be this calm in refuting pseudoscientific bollocks, but I get so worked up by it that I’d probably undermine my own message. I’m glad that Faruk took the time to write this down.
Web developers aren’t going to shed many tears for Flash, but as Bruce rightly points out, it led the way for many standards that followed. Flash was the kick up the arse that the web needed.
He also brings up this very important question:
I’m also nervous; one of the central tenets of HTML is to be backwards-compatible and not to break the web. It would be a huge loss if millions of Flash movies become unplayable. How can we preserve this part of our digital heritage?
This is true of the extinction of any format. Perhaps this is an opportunity for us to tackle this problem head on.
I wrote this song while my colleague Tim Berners-Lee was inventing something called “The World Wide Web” a few offices away. The song was published in 1993, when fewer than 100 websites existed.
The first image ever published on the web was of this band, Les Horribles Cernettes …LHC.
How the IETF redefined the process of creating standards.
To some visionary pioneers, such as Ted Nelson, who had been developing a purist hypertext paradigm called Xanadu for decades, the browser represented an undesirably messy direction for the evolution of the Internet. To pragmatists, the browser represented important software evolving as it should: in a pluralistic way, embodying many contending ideas, through what the Internet Engineering Task Force (IETF) calls “rough consensus and running code.”
Beyond Curie is a design project that highlights badass women in science, technology, engineering + mathematics.