The Cost of Convenience
The pros and cons of dependencies in your toolchain.
- Don’t wrap too much of your identity in a tool.
- Every tool will eventually fade.
- Flexibility is a valuable skill.
- Changing tools does not mean starting over.
I agree with pretty much every word of this article.
Perhaps most problematic of all is the effect that contemporary developer experience has on educational programs (be they traditional classes, bootcamps, workshops, or anything in between). Such a rapidly expanding and ever changing technological ecosystem necessarily means that curricula struggle to keep up, and that the fundamentals of web development (e.g. HTML, CSS, HTTP, browser APIs…) are often glossed over in favor of getting students into the technologies more likely to land them jobs (like React and its many pals). This leads to an outpouring of early career developers who may speak confidently about things like React hooks or Redux state reducers, but who also lack any concept of HTML semantics or the most basic accessibility considerations. To be clear, I’m not throwing shade at those developers — they have been failed by an industry obsessed with the new and shiny at the expense of foundational practices and end user experiences.
And so, I ask: what exactly are we buying when we are sold ‘developer experience’ today? Who is benefiting from it? And if it is indeed something many of us aren’t too excited about (to put it kindly), how can we change it for the better?
I agree with pretty much every word of this article.
We were told writing apps with an HTML-first, SSR-first, progressively enhanced mindset, using our preferred language/tech stack of choice, was outdated and bad for users.
That was a lie.
We were told writing apps completely using frontend-y JavaScript would make our lives easier.
That also was a lie.
I agree with pretty much every word of this article.
If your top priority is paid employment, right now, React is a great choice for that.
True. But…
If your priority is long-term resilience and maintainability, vanilla JS (probably with a light build process on top of it) is the ideal choice.
It will never become obsolete, or suffer from a breaking version change. It’s fast and performant, results in less code sent over the wire, and generally has a smaller footprint of things to break.
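To make that concrete, here’s roughly what the zero-build baseline looks like (my own sketch, not something from the article). Save it as a file, open it in a browser, and it runs; it will still run, unchanged, in a decade:

```html
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>No build step</title>
  <style>
    body.dark { background: #111; color: #eee; }
  </style>
</head>
<body>
  <button id="theme-toggle">Toggle dark mode</button>
  <!-- Native ES modules: no bundler, no transpiler, no node_modules -->
  <script type="module">
    const button = document.querySelector('#theme-toggle');
    button.addEventListener('click', () => {
      document.body.classList.toggle('dark');
    });
  </script>
</body>
</html>
```

That’s the whole feedback cycle: edit the file, refresh the page.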
For me, a complicated Javascript build system just doesn’t seem worth it for small 500-line projects – it means giving up being able to easily update the project in the future in exchange for some pretty marginal benefits.
This! Also, this:
I’m writing this because most of the writing I see about JS assumes that you’re using a build system, and it can be hard to navigate for folks like me who write very simple small Javascript projects that don’t require a build system.
A search engine for images and audio that are either under a Creative Commons license or in the public domain.
I think some tools are a good idea. But as few as possible, and the easier they are to stop using, the better.
Now Michelle asks:
Suppose we want to stop using Tailwind one day?
Turns out it’s a bit of a roach motel, much like most JavaScript frameworks: you can get in but you can’t easily get out.
So whenever possible, the safest, and most future-proof bet is to use the native features of the web platform.
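For instance (a sketch of my own, not Michelle’s), the design tokens that Tailwind keeps in its JavaScript config already have a native, future-proof home in CSS custom properties:

```css
/* Design tokens as native custom properties: no config file,
   no build step, and nothing to migrate away from later. */
:root {
  --color-brand: #1a73e8;
  --space-s: 0.5rem;
  --space-m: 1rem;
}

.button {
  background-color: var(--color-brand);
  padding: var(--space-s) var(--space-m);
}
```

Renaming or removing a token is a find-and-replace in your own stylesheet, not a migration away from someone else’s naming scheme.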
Wise words:
Myself, I think some tools are a good idea. But as few as possible, and the easier they are to stop using, the better.
To paraphrase Michael Pollan:
Write CSS, not too much, mostly vanilla.
When Dan wrote this a week ago, I thought it sounded very far-fetched. Now it sounds almost inevitable.
How do you keep knowledge alive over centuries? Stuff that seems big enough for a group of people to worry about at the time, but not so big it makes world news. Not the information that gets in all the textbooks, but just the stuff that makes the world gently tick over.
I can very much relate to what Craig is saying here.
Always refreshing to see some long-term thinking applied to the web.
Growing—that’s a word I want to employ when talking about my personal sites online. Like a garden, I’m constantly puttering around in them. Sometimes I plow and sow a whole new feature for a site. Sometimes I just pick weeds.
I like this analogy. It reminds me of the cooking analogy that others have made.
Most of my favorite websites out there are grown—homegrown in fact. They are corners of the web where some unique human has been nurturing, curating, and growing stuff for years. Their blog posts, their links, their thoughts, their aesthetic, their markup, their style, everything about their site—and themselves—shows growth and evolution and change through the years. It’s a beautiful thing, a kind of artifact that could never be replicated or manufactured on a deadline.
This part of the web, this organic part, stands in stark contrast to the industrial web where websites are made and resources extracted.
A graveyard for good domains you let expire.
This is a great talk by Nadia Eghbal on software, open source, maintenance, and of course, long-term thinking.
More on battling entropy:
Ever needed to change “just a small thing” on an old page you built years ago? I recently had the pleasure, and the simple task of changing some colors in CSS led to a whole day of me wrangling with old deprecated Grunt tasks and trying to get the build task running.
The solution:
That’s why starting with HTML, CSS and JavaScript without the need to ever compile anything on your local machine is a good idea. Changing some colors on such a page would indeed only take minutes and not a whole day.
I like this mindset:
Be boring by default and enhance on the way.
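Here’s a rough illustration of that principle (my example, not one from the post): the boring HTML form works everywhere on its own, and the script is pure enhancement layered on top. The endpoint and markup are illustrative, not taken from the article:

```html
<!-- Boring by default: a form that works with plain HTML and a
     server round-trip. The script below is optional enhancement. -->
<form action="/search" method="get">
  <label for="q">Search</label>
  <input type="search" id="q" name="q">
  <button type="submit">Go</button>
</form>
<div id="results"></div>

<script type="module">
  // Enhance on the way: if JS is available, skip the full page reload.
  const form = document.querySelector('form');
  form.addEventListener('submit', async (event) => {
    event.preventDefault();
    const params = new URLSearchParams(new FormData(form));
    const response = await fetch(`${form.action}?${params}`);
    document.querySelector('#results').innerHTML = await response.text();
  });
</script>
```

If the script never loads, the form still submits the old-fashioned way; nothing below the baseline breaks.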
This post really highlights one of the biggest issues with the convoluted build tools used for “modern” web development. If you return to a project after any length of time, this is what awaits:
I find entropy staring me back in the face: library updates, breaking API changes, refactored mental models, and possible downright obsolescence. An incredible amount of effort will be required to make a simple change, test it, and get it live.
Take a moment and think about this super power: if you write vanilla HTML, CSS, and JS, all you have to do is put that code in a web browser and it runs. Edit a file, refresh the page, you’ve got a feedback cycle. As soon as you introduce tooling, as soon as you introduce an abstraction not native to the browser, you may have to invent the universe for a feedback cycle.
Maintainability matters—if not for you, then for future you.
The more I author code as it will be run by the browser, the easier it will be to maintain that code over time, despite its perceived inferior developer ergonomics (remember, developer experience encompasses both the present and the future, i.e. “how simple are the ergonomics to build this now and maintain it into the future?”). I don’t mind typing some extra characters now if it means I don’t have to learn/relearn, set up, configure, integrate, update, maintain, and inevitably troubleshoot a build tool or framework later.
Don’t blame it on the COBOL:
It’s a common fiction that computing technologies tend to become obsolete in a matter of years or even months, because this sells more units of consumer electronics. But this has never been true when it comes to large-scale computing infrastructure. This misapprehension, and the language’s history of being disdained by an increasingly toxic programming culture, made COBOL an easy scapegoat. But the narrative that COBOL was to blame for recent failures undoes itself: scapegoating COBOL can’t get far when the code is in fact meant to be easy to read and maintain.
It strikes me that the resilience of programmes written in COBOL is like the opposite of today’s modern web stack, where the tangled weeds of nested dependencies ensure that projects get harder and harder to maintain over time.
In a field that has elevated boy geniuses and rockstar coders, obscure hacks and complex black-boxed algorithms, it’s perhaps no wonder that a committee-designed language meant to be easier to learn and use—and which was created by a team that included multiple women in positions of authority—would be held in low esteem. But modern computing has started to become undone, and to undo other parts of our societies, through the field’s high opinion of itself, and through the way that it concentrates power into the hands of programmers who mistake social, political, and economic problems for technical ones, often with disastrous results.
I probably need to upgrade the Huffduffer server but Maciej nails why that’s an intimidating prospect:
Doing this on a live system is like performing kidney transplants on a playing mariachi band. The best case is that no one notices a change in the music; you chloroform the players one at a time and try to keep a steady hand while the band plays on. The worst case scenario is that the music stops and there is no way to unfix what you broke, just an angry mob. It is very scary.