LLMs and the Trap of Frictionless Design
A lot of friends who teach have commented on a trend that's accelerated since the pandemic: students don't know what a filesystem is anymore. This is, as you might imagine, a fairly difficult stumbling block when you're trying to teach people to Do Computer. The students entering college over the past few years are, not coincidentally, the first generation to grow up with smartphones. Many of them can't remember a time before their smartphone was part of their daily life.
I got my first iPhone when I was about 17. For better or worse, as a child of the 90s, I grew up using computers, not touch devices, and I first encountered the web when it was still in its infancy. I am trying to avoid entering "back in my day" territory here: the internet was terrible then and it's terrible now. But I do think something valuable was lost as smartphones became most people's primary form of interaction with computation and the internet. The filesystem is a prime example of this.
A filesystem is an abstract representation of how data is stored. I neither want nor expect most people to learn to use the terminal extensively or learn how the operating system works, but the filesystem is still "closer to the metal" than whatever it is we have today. By representing data as files and folders, the operating system gives people a stronger and more intuitive understanding of what the hell their computer is doing. There's been a lot of nostalgia for 90s and early 2000s-style interfaces in recent years, and my working hypothesis is that a lot of that is because those interfaces gave users more power over their software. They had more menus, more buttons, more ways to customize the experience and grow into a power user. And, of course, they ran on a big, rickety tower PC instead of a smooth, buttonless hand computer.
What I call "frictionless design" here does the opposite. It tries to make the input-output loop of a given piece of software as friction-free as possible. You get rid of an interface to the filesystem. You take away customization options in your editor so your interface is clutter-free. You, as much as possible, have people interact with a single ecosystem, a single design philosophy, and avoid pushing them towards custom solutions or advanced use.
What the larger move towards frictionless design does, through everything from touch devices to the SaaS industry, is remove friction and confusion by obfuscating the weird inner workings of your computer. You trade the long-term value that dealing with friction creates for short-term convenience.
This does not, on its face, seem to be a bad thing. Making things easier for users is an admirable enough goal. And, to be fair, frictionless design has made computing accessible to a lot more people. I won't comment on whether that's a net good or not, although I suspect that we might be better off if boomers still didn't know how to use the internet. But by abstracting away what the computer is doing, it makes people more dependent on the service being provided. So you end up with computer science students who don't know how to use their computer, and people whose entire livelihoods are dependent on a single device, operating system, or software. A world where people can lose everything because a server goes down or a company goes kaput does not, on net, strike me as an ideal situation.
It may be annoying to have to futz around in your filesystem, learn some basic terminal commands, or, I dunno, run your own server, but you sacrifice a lot when you outsource both your software and your knowledge of how that software works to someone else. Friction forces users to learn. And this is the problem with frictionless design. The less you make your users learn about how to use your software, the less you empower them to understand it, modify it, and take it in new directions. You make them wholly dependent on you.
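For the record, the friction I'm asking you to embrace really is this small. Here's a minimal tour of the filesystem from the terminal — every path and filename below is just an illustrative example, not a prescription:

```shell
# Start somewhere safe to experiment: a fresh scratch directory
cd "$(mktemp -d)"

# Create nested folders in one go, then put a file in them
mkdir -p notes/2024
echo "hello" > notes/2024/draft.txt

# Move the file up a level — you, not an app, decide where it lives
mv notes/2024/draft.txt notes/

# Ask the filesystem where your files actually are,
# instead of trusting a search bar to find them for you
find . -name "*.txt"    # → ./notes/draft.txt
```

Five commands, and you already know more about where your data lives than most phone users ever will.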
This situation is largely by design. The more Apple's or Google's customers are dependent on them and their ecosystems, the better it is for Apple and Google. And we end up with a world where people have less understanding of the software and hardware they work with, less autonomy to build their own solutions, and fewer of the interesting, small, custom tools that such autonomy produces. This loss of novelty and autonomy concerns me the most. And the design pattern that created the situation we're in now is repeating itself in a way that might be even more concerning.
"Big AI" (OpenAI, Anthropic, and deeper-pocketed FAANG giants) is making no bones about adopting the same frictionless model that worked so well at making gens Z and Alpha computer-illiterate. And they're targeting the people who, historically, have existed to fill in the gap between end users and black-box software: developers. Junior devs, increasingly, rely entirely on ChatGPT or Copilot to get the job done, and to learn what to do on the job. Every day I see a new Reddit post or Hacker News article about how someone built their dream app with no code thanks to their Claude subscription.
But when these tools can only get you 70% of the way there, juniors are often totally lost. This should be extremely concerning to anyone with a vested interest in software continuing to run (read: everyone). By giving devs an easy way to make something "good enough" without actually learning anything, they both make devs wholly dependent on their software and make the product worse in the long run. And programmers, of all people, shouldn't be dependent on a black box to get their job done. We're supposed to be the ones who understand how the computer works. We are the last bastion against the mystery and obfuscation of contemporary computing! This whole damn field is about constantly encountering and overcoming the inevitable friction of software!
I'm not a total doomer. I'm sure that, thanks to the lower barrier to entry that AI offers, lots of things will be made that couldn't have been made otherwise. I've taken advantage of these tools in situations where I couldn't be fucked to spend hours messing with an unfamiliar language just to write a one-off script. But I think an increasingly deskilled, dependent workforce - not just of programmers, but of anyone who can use LLMs to automate away their work - is a scary thing to consider. Best case scenario, we end up with a lot of technical debt in a decade or two and a shortage of people who have the slightest idea of how to deal with it. Worst case, I dunno, New Guinea gets wiped off the map because someone let Claude into the DARPA servers and forgot to set dry_run=true.
The other issue I've been thinking about here is the ongoing trend of increased productivity without shared gains. Some software engineers love to brag about how they're automating away their jobs so they can be overemployed, and good for them, but for the rest of us, what seems likely to happen is what always happens, at least in America, when a new productivity-boosting technology comes along. We're going to keep working just as much, our wages will continue to stagnate, and whatever benefits those magical increases in productivity create will land at the top of the food chain.
The last few years have, I think, started to help programmers realize that we are not, in fact, the big fish in the pond of American capitalism. While there was a developer shortage, we had a pretty cushy situation, but tech companies are doing everything in their power to cut wages, kill bargaining power, and reduce the previously spoiled tech workers to what we always have been: workers. So enjoy getting Claude to write your React components for you while it lasts: you may be singing a different tune when you're trying to fix that last 30% that Claude couldn't figure out, up against an impossible deadline, getting paid pennies on the dollar because "anyone can write software now". At the very least, it'll be nice to see companies investing so heavily in AI right now bite the dust when there's no one left to fix the problems that weren't in the training set.