Time and again, organizations have sought to contain software’s most troublesome tendencies—its habit of sprawling beyond timelines and measurable goals—by introducing new management styles. And for a time, it looked as though companies had found in Agile the solution to keeping developers happily on task while also working at a feverish pace. Recently, though, some signs are emerging that Agile’s power may be fading. A new moment of reckoning is in the making, one that may end up knocking Agile off its perch.
A fascinating look at what it might take to create a truly sustainable long-term computer.
A wonderful bit of spelunking into the annals of software interfaces by Elise Blanchard.
What about a scarf or collar so the back of your neck prickles when somebody is talking about you on Twitter?
Or a ghost detector for homes, restaurants, etc. that glows when someone is “visiting” via Google Maps/Facebook Pages or looking through a webcam? Maybe it would be better to control the air conditioning to produce a chill, or play barely audible infrasound: indications that there is a haunting in progress and the veil here is thin.
Men specialized in hardware while software development was seen as an exciting alternative to secretarial work. In 1967, Cosmopolitan published an article titled “The Computer Girls”, encouraging young women to pursue careers in computer science. So the curve went up, and continued to do so until 1984. That’s when personal computers appeared.
When the Apple Macintosh 128K and the Commodore 64 were introduced to the market, they were presented as toys. And, as toys were gendered, they were targeted at boys. We can look at advertisements from that time and quickly find a pattern: fathers and sons, young men, even one where a man is being undressed by two women with the motto “Two bytes are better than one”. It’s more evident with the ads for computer games; if women appear, they do so sexualized and half-naked. Not that appealing for young girls, one could imagine.
Baldur Bjarnason writes an immense treatise on the current sad state of software, grounded in the historical perspective of the past sad state of software.
The ZX Spectrum in a time of revolution:
Gaming the Iron Curtain offers the first book-length social history of gaming and game design in 1980s Czechoslovakia, or anywhere in the Soviet bloc. It describes how Czechoslovak hobbyists imported their computers, built DIY peripherals, and discovered games as a medium, using them not only for entertainment but also as a means of self-expression.
Languages, platforms, and systems that break from the norms of computing.
I don’t think I agree with Don Knuth’s argument here from a 2014 lecture, but I do like how he sets out his table:
Why do I, as a scientist, get so much out of reading the history of science? Let me count the ways:
- To understand the process of discovery—not so much what was discovered, but how it was discovered.
- To understand the process of failure.
- To celebrate the contributions of many cultures.
- Telling historical stories is the best way to teach.
- To learn how to cope with life.
- To become more familiar with the world, and to know how science fits into the overall history of mankind.
SETI—the Search for Extra Terrestrial Information processing:
What we get is a computational device surrounding the Asymptotic Giant Branch star that is roughly the size of our Solar System.
The intent is for this website to be used by self-forming small groups that want to create a “watching club” (like a book club) and discuss aspects of technology history that are featured in this series.
I’m about ready to rewatch Halt And Catch Fire. Anybody want to form a watching club with me?
In 1990, the science fiction writer Douglas Adams produced a “fantasy documentary” for the BBC called Hyperland. It’s a magnificent paleo-futuristic artifact, rich in sideways predictions about the technologies of tomorrow.
I remember coming across a repeating loop of this documentary playing in a dusty corner of a Smithsonian museum in Washington DC. Douglas Adams wasn’t credited but I recognised his voice.
Hyperland aired on the BBC a full year before the World Wide Web. It is a prophecy waylaid in time: the technology it predicts is not the Web. It’s what William Gibson might call a “stub,” evidence of a dead node in the timeline, a three-point turn where history took a pause and backed out before heading elsewhere.
Here, Claire L. Evans uses Adams’s documentary as an opening to dive into the history of hypertext starting with Bush’s Memex, Nelson’s Xanadu and Engelbart’s oNLine System. But then she describes some lesser-known hypertext systems…
In 1985, the students at Brown who encountered Intermedia had never seen anything like it before in their lives. The system laid a world of information at their fingertips, saved them hours at the library, and helped them work through tangles of thought.
Claire L. Evans on computational slime molds and other forms of unconventional computing that look beyond silicon:
In moments of technological frustration, it helps to remember that a computer is basically a rock. That is its fundamental witchcraft, or ours: for all its processing power, the device that runs your life is just a complex arrangement of minerals animated by electricity and language. Smart rocks.
The characteristica universalis and the calculus ratiocinator of Leibniz.
Portrait of the genius as a young man.
It is fortifying to remember that the very idea of artificial intelligence was conceived by one of the more unquantifiably original minds of the twentieth century. It is hard to imagine a computer being able to do what Alan Turing did.
From Xerox PARC to the World Wide Web:
The internet did not use a visual spatial metaphor. Despite being accessed through and often encompassed by the desktop environment, the internet felt well and truly placeless (or perhaps everywhere). Hyperlinks were wormholes through the spatial metaphor, allowing a user to skip laterally across directories stored on disparate servers, as well as vertically, deep into a file system without having to access the intermediate steps. Multiple windows could be open to the same website at once, shattering the illusion of a “single file” that functioned as a piece of paper that only one person could hold. The icons that a user could arrange on the desktop didn’t have a parallel in online space at all.
I am not a believer in the AI singularity — the rapture of the nerds — that is, in the possibility of building a brain-in-a-box that will self-improve its own capabilities until it outstrips our ability to keep up. What CS professor and fellow SF author Vernor Vinge described as “the last invention humans will ever need to make”. But I do think we’re going to keep building more and more complicated systems that are opaque rather than transparent, and that launder our unspoken prejudices and encode them in our social environment. As our widely-deployed neural processors get more powerful, the decisions they take will become harder and harder to question or oppose. And that’s the real threat of AI — not killer robots, but “computer says no” without recourse to appeal.