Link tags: complexity

The Resiliency of the Internet | Jim Nielsen’s Weblog

An ode to the network architecture of the internet:

I believe the DNA of resiliency built into the network manifests itself in the building blocks of what’s transmitted over the network. The next time somebody calls HTML or CSS dumb, think about that line again:

That simplicity, almost an intentional brainlessness…is a key to its adaptability.

It’s not a bug. It’s a feature.

Yes! I wish more web developers would take cues from the very medium they’re building atop.

Lateral Thinking With Withered Technology · Matthias Ott – User Experience Designer

What web development can learn from the Nintendo Game & Watch.

The Web now consists of an ever-growing number of different frameworks, methodologies, screen sizes, devices, browsers, and connection speeds. “Lateral thinking with withered technology” – progressively enhanced – might actually be an ideal philosophy for building accessible, performant, resilient, and original experiences for a wide audience of users on the Web.

Make me think! – Ralph Ammer

This is about seamful design.

We need to know things better if we want to be better.

It’s also about progressive enhancement.

Highly sophisticated systems work flawlessly, as long as things go as expected.

When a problem occurs which hasn’t been anticipated by the designers, those systems are prone to fail. The more complex the systems are, the higher are the chances that things go wrong. They are less resilient.

Progressive · Matthias Ott – User Experience Designer

Progressive enhancement is not yet another technology or passing fad. It is a lasting strategy, a principle, to deal with complexity because it lets you build inclusive, resilient experiences that work across different contexts and that will continue to work, once the next fancy JavaScript framework enters the scene – and vanishes again.

But why don’t more people practice progressive enhancement? Is it only because they don’t know better? This might, in fact, be the primary reason. On top of that, especially many JavaScript developers seem to believe that it is not possible or necessary to build modern websites and applications that way.

A heartfelt look at progressive enhancement:

Some look at progressive enhancement like a thing from the past of which the old guard just can’t let go. But to me, progressive enhancement is the future of the Web. It is the basis for building resilient, performant, interoperable, secure, usable, accessible, and thus inclusive experiences. Not only for the Web of today but for the ever-growing complexity of an ever-changing and ever-evolving Web.
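To make the strategy concrete, here’s a minimal sketch (the /subscribe endpoint and the copy are hypothetical, not from Matthias’s post): the form is fully functional as plain HTML, and JavaScript, where available, only layers an enhancement on top.

    <form action="/subscribe" method="post">
      <label for="email">Email</label>
      <input type="email" id="email" name="email" required>
      <button type="submit">Subscribe</button>
    </form>
    <script>
      // Enhancement layer: take over the submission only if fetch is
      // supported. If this script never runs, the form still works.
      const form = document.querySelector('form');
      if (form && 'fetch' in window) {
        form.addEventListener('submit', async (event) => {
          event.preventDefault();
          await fetch(form.action, { method: 'POST', body: new FormData(form) });
          form.replaceWith(Object.assign(document.createElement('p'), {
            textContent: 'Thanks for subscribing!'
          }));
        });
      }
    </script>

If the script fails to load or throws, users get the baseline experience instead of a broken page. That’s the resilience the strategy buys you.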

Your blog doesn’t need a JavaScript framework /// Iain Bean

If the browser needs to parse 296kb of JavaScript to show a list of blog posts, that’s not Progressive Enhancement, it’s using the wrong tool for the job.

A good explanation of the hydration problem in tools like Gatsby.

JavaScript is a powerful language that can do some incredible things, but it’s incredibly easy to jump to using it too early in development, when you could be using HTML and CSS instead.
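As one illustration of reaching for HTML first (my sketch, not from Iain’s post): a disclosure widget that would routinely be built as a scripted component needs no JavaScript at all.

    <details>
      <summary>Shipping information</summary>
      <p>Orders ship within two business days.</p>
    </details>

Zero kilobytes of script to parse, with keyboard support and open/closed state handling for free.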

On dependency | RobWeychert.com V7

I’m very selective about how I depend on other people’s work in my personal projects. Here are the factors I consider when evaluating dependencies.

  • Complexity: How complex is it, who absorbs the cost of that complexity, and is that acceptable?
  • Comprehensibility: Do I understand how it works, and if not, does that matter?
  • Reliability: How consistently and for how long can I expect it to work?

I really like Rob’s approach to choosing a particular kind of dependency when working on the web:

When I’m making things, that’s how I prefer to depend on others and have them depend on me: by sharing strong, simple ideas as a collective, and recombining them in novel ways with rigorous specificity as individuals.

Today’s Javascript, from an outsider’s perspective | Lea Verou

This is a damning and all-too-typical example of what it’s like for someone trying to get to grips with the current state of the JavaScript ecosystem:

Note that John is a computer scientist that knows a fair bit about the Web: He had Node & npm installed, he knew what MIME types are, he could start a localhost when needed. What hope do actual novices have?

I think it’s even worse than that. Not only are potential new devs being put off from ever getting started, I know plenty of experienced devs who have been pushed out by the overwhelming and needless complexity of the modern web’s toolchain. It’s a kind of constant gaslighting, where any expression of unease is summarily dismissed as the whining of “the old guard” who just won’t get with the programme.

John gives up. Concludes never to touch Node, npm, or ES6 modules with a barge pole.

The End.

(Just watch as Lea’s post gets written off as an edge case.)

CSS Architecture for Modern JavaScript Applications - MadeByMike

Mike sees the church of JS-first ignoring the lessons to be learned from the years of experience accumulated by CSS practitioners.

As the responsibilities of front-end developers have become more broad, some might consider the conventions outlined here to be not worth following. I’ve seen teams spend weeks planning the right combination of framework, build tools, workflows and patterns only to give zero consideration to the way they architect UI components. It’s often considered the last step in the process and not worthy of the same level of consideration.

It’s important! I’ve seen well-planned projects fail or go well over budget because the UI architecture was poorly planned and became unmaintainable as the project grew.

Complexity Explained

Emergence and complex systems, explained with interactive diagrams.

This Page is Designed to Last | CSS-Tricks

I feel there is something beyond the technological that is the real trick to a site that lasts: you need to have some stake in the game. You don’t let your URLs die because you don’t want them to. They matter to you. You’ll tend to them if you have to. They benefit you in some way, so you’re incentivized to keep them around. That’s what makes a page last.

Software disenchantment @ tonsky.me

I want to deliver working, stable things. To do that, we need to understand what we are building, in and out, and that’s impossible to do in bloated, over-engineered systems.

This pairs nicely with Craig’s post on fast software.

Everyone is busy building stuff for right now, today, rarely for tomorrow. But it would be nice to also have stuff that lasts a little longer than that.

I just got a new laptop and I decided to go with fresh installs rather than a migration. This really resonates:

It just seems that nobody is interested in building quality, fast, efficient, lasting, foundational stuff anymore. Even when efficient solutions have been known for ages, we still struggle with the same problems: package management, build systems, compilers, language design, IDEs.

Frank Chimero · Redesign: On This Design

Most experienced designers want concision—clear, robust, consistent, elegant systems that avoid redundancy. Concise designs are smoother to implement, faster to render, quicker to understand, and easier to hand-off and maintain. Achieving a simplicity with clarity means that you’re engaging with the fundamentals of the problem (and of your craft) at the correct fidelity. You’ve cut through complexity with insight, understanding, and committed decision-making. That third one is critical. A lot of complexity comes from an unwillingness to commit to the things that insight and understanding surface.

This Page is Designed to Last: A Manifesto for Preserving Content on the Web

Geocities, LiveJournal, what.cd, now Yahoo Groups. One day, Medium, Twitter, and even hosting services like GitHub Pages will be plundered then discarded when they can no longer grow or cannot find a working business model.

Considering the needs of someone who wants to make and maintain a website, without the ridiculous complexity of “modern” web tooling:

How do we make web content that can last and be maintained for at least 10 years? As someone studying human-computer interaction, I naturally think of the stakeholders we aren’t supporting. Right now putting up web content is optimized for either the professional web developer (who use the latest frameworks and workflows) or the non-tech savvy user (who use a platform).
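In that spirit, a sketch of what the manifesto is pointing at (the content is placeholder): one self-contained file, no build step, no dependencies, needing nothing but a browser to render for the next decade.

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <title>A page designed to last</title>
      <style>
        /* Everything inlined: no external assets to rot away. */
        body { max-width: 40em; margin: 0 auto; padding: 1em; font-family: system-ui, sans-serif; }
      </style>
    </head>
    <body>
      <h1>A page designed to last</h1>
      <p>Nothing here depends on a framework, a CDN, or a build pipeline.</p>
    </body>
    </html>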

Music and Web Design | Brad Frost

I feel my trajectory as a musician maps to the trajectory of the web industry. The web is still young. We’re all still figuring stuff out and we’re all eager to get better. In our eagerness to get better, we’re reaching for more complexity. More complex abstractions, build processes, and tools. Because who wants to be bored playing in 4/4 when you can be playing in 7/16?

I hope we in the web field will arrive at the same realization that I did as a musician: complexity is not synonymous with quality.

Can I get an “Amen!”?

Everything is Amazing, But Nothing is Ours – alexdanco.com

Worlds of scarcity are made out of things. Worlds of abundance are made out of dependencies. That’s the software playbook: find a system made of costly, redundant objects; and rearrange it into a fast, frictionless system made of logical dependencies. The delta in performance is irresistible, and dependencies are a compelling building block: they seem like just a piece of logic, with no cost and no friction. But they absolutely have a cost: the cost is complexity, outsourced agency, and brittleness. The cost of ownership is up front and visible; the cost of access is back-dated and hidden.

You really don’t need all that JavaScript, I promise

The transcript of a fantastic talk by Stuart. The latter half is a demo of Portals, but in the early part of the talk, he absolutely nails the rise in popularity of complex front-end frameworks:

I think the reason people started inventing client-side frameworks is this: that you lose control when you load another page. You click on a link, you say to the browser: navigate to here. And that’s it; it’s now out of your hands and in the browser’s hands. And then the browser gives you back control when the new page loads.
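For anyone who hasn’t peeked inside one, this is roughly the trick a client-side router performs (my own illustrative sketch, assuming every page has a <main> element): intercept the click, fetch the next page yourself, and swap the content so the browser never takes over.

    <script>
      document.addEventListener('click', async (event) => {
        const link = event.target.closest('a');
        // Only intercept ordinary same-origin links.
        if (!link || link.origin !== location.origin) return;
        event.preventDefault();
        const html = await (await fetch(link.href)).text();
        const next = new DOMParser().parseFromString(html, 'text/html');
        // Swap the page content in place and keep the URL in sync.
        document.querySelector('main').replaceWith(next.querySelector('main'));
        history.pushState({}, '', link.href);
      });
    </script>

Everything after that, loading indicators, scroll restoration, focus management, is the framework reimplementing what the browser gave you for free.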

Keeping it simple with CSS that scales - Andy Bell

The transcript of Andy’s talk from this year’s State Of The Browser conference.

I don’t think using scale as an excuse for over-engineering stuff—especially CSS—is acceptable, even for huge teams that work on huge products.
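As a taste of what keeping it simple can look like in practice (my sketch, not code from Andy’s talk): shared values live in a handful of custom properties, and tiny composable rules do the rest, with no tooling required.

    <style>
      :root {
        --space: 1rem;
        --ink: #222;
      }
      /* One rule handles vertical rhythm for any container. */
      .stack > * + * { margin-block-start: var(--space); }
      .card { color: var(--ink); padding: var(--space); }
    </style>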

The web without the web - DEV Community 👩‍💻👨‍💻

I love React. I love how server side rendering React apps is trivial because it all compiles down to vanilla HTML rather than web components, effectively turning it into a kickass template engine that can come alive. I love the way you can very effectively still do progressive enhancement by using completely semantic markup and then letting hydration do more to it.

I also hate React. I hate React because these behaviours are not defaults. React is not gonna warn you if you make a form using divs and unlabelled textboxes and send the whole thing to a server. I hate React because CSS-in-JS approaches by default encourage you to write completely self contained one off components rather than trying to build a website UI up as a whole. I hate the way server side rendering and progressive enhancement are not defaults, but rather things you have to go out of your way to do.

An absolutely brilliant post by Laura on how the priorities baked into JavaScript tools like React are really out of whack. They’ll make sure your behind-the-scenes code is super clean, but not give a rat’s ass about the quality of the output that users have to interact with.

And if you want to adjust the front-end code, you’ve got to set up all this tooling just to change a div to a button. That’s quite a barrier to entry.
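The frustration is easy to demonstrate. The two lines below do the “same” thing, but the first needs focus handling, a role, and keyboard events bolted on before everyone can use it, while the second gets all of that from the browser (save() is a placeholder):

    <div class="btn" onclick="save()">Save</div>
    <!-- vs. -->
    <button type="button" onclick="save()">Save</button>

And when the first version is buried in a component tree, fixing it means touching the whole toolchain.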

In elevating frontend to the land of Serious Code we have not just made things incredibly over-engineered but we have also set fire to all the ladders that we used to get up here in the first place.

AMEN!

I love React because it lets me do my best work faster and more easily. I hate React because the culture around it, more than the library itself, actively prevents other people from doing their best work.

The Real Dark Web

Charlie’s thoughts on dev perception:

People speak about “the old guard” and “stupid backwards techniques”, forgetting that it’s real humans, with real constraints who are working on these solutions. Most of us are working in a “stupid backwards way” because that “backwardsness” WORKS. It is something that is proven and is clearly documented. We can implement it confident that it will not disappear from fashion within a couple of years.