For the better part of a decade we’ve been warned about the truck drivers who will soon be set adrift by autonomous semis. Suddenly that looks wrong. You can find self-driving projects in the “losses” section of various companies’ financial statements and in a handful of sunbelt cities. But that’s about it. Meanwhile, ChatGPT’s serviceable prose is everywhere! What does this mean for the white collar worker? A representative riff came from Kevin Drum this week:
> [M]y guess is that GPT v5.0 or v6.0 (we’re currently at v3.5) will be able to take over the business of writing briefs and so forth with only minimal supervision. After that, it only takes one firm to figure out that all the partners can get even richer if they turn over most of the work of associates to a computer. Soon everyone will follow. Then the price of legal advice will plummet, too, at all but the very highest levels.
I agree that language models are going to have important effects on knowledge workers. But Drum reasons about this by comparing the quality of human- and machine-authored documents. I don’t think that tells the whole story. A document’s function and value depend not only on its content but also on its context, and inhuman authors aren’t going to be able to satisfy our contextual needs.
Consider these questions:
- Why does the pace of production for things like books, TV shows, and pop music continue to increase when the catalog of excellent older works is already too large to ever be consumed?
- Why do business executives spend their enormously expensive time writing planning documents that will only be read by a small set of C-suite peers when cheaper and better prose could be purchased from a professional writer?
- Why do you need a lawyer to draft a will, a trust, or other common legal documents?
It wasn’t until I watched some close friends start a successful news site that I really started to think about these questions. It was the 2010s, and not only was I interested in my friends’ success, but the cultural moment suddenly cast journalism in a stark new light. The internet made global distribution the default. Digital metrics made it easy to see what parts of the news bundle were generating value. The bundle was quickly pulled apart, and an era of pitiless optimization began.
The adaptations that succeeded in this tumult were shocking. Headlines became confrontational. Content began to focus on moral questions that either flattered or impugned their audience, often based on the reader’s membership in groups they couldn’t easily change. Old theories about why people sought out news–“to be informed”; “for entertainment”–started to look pretty suspect. These stories did not have much value for guiding behavior in daily life–at best, they helped solidify some existing social norms. And a lot of them seemed to make people feel mad, guilty, or smug. If this was entertainment, it was a pretty strange kind.
A different model fit the facts better: news consumption (and subsequent sharing) was about identity. Readers were building, transmitting, and asserting their identity by deciding what to read and how they felt about it. It was a kind of self-expression via consumption. In doing so they sorted themselves within a moral landscape defined by authors and other readers. Group membership was important, but metagroup membership–how you judged the correctness of the sub-hierarchy–was maybe more so. From there the logic of factionalism in a zero-sum system took over and every dimension of opinion and preference got collapsed into the overdetermined mush of the dominant coalitions. Before you knew it truck ownership had a moral valence.
Aligning ourselves within social systems is something humans like and badly need to do. It’s easy to understand why: this is how we succeed as a species and as individuals. Ultimately, it’s how we find a mate and reproduce. We are designed to do it, and we invent tools to let us do it ever-more intensely.
This is why we never stop needing new pop stars, authors, and TV shows. Not because the old ones were inferior or because the payphone on the set of Cheers looks distractingly anachronistic. It’s because pop music is about sex, and is consequently best administered by pop stars who we find desirable. It’s because novelty is an important ingredient as we reify relationships through gift-giving; or as we clamber through social hierarchies of wealth or fame or cleverness by responding to new inputs rather than simply nodding in agreement with previous generations that yes, Moby Dick and Thriller are really good.
Similarly, my hypothetical executive’s so-so .docx is produced the way it is not because of what it contains but because of what it represents: countless hours of meetings, Slacks and phone calls to align the participants in the business unit around a shared understanding of goals, roles, and statuses.
The lawyer’s exclusive perch is even easier to explain. Lawyers serve as an interface to our formal system for resolving conflict, and have used their proximity to that machinery to cement their position in the hierarchy–to ensure that when there is a question about who gets to facilitate access to the law, the answer is almost always “lawyers”. Most professions don’t have this luxury. Nice work if you can get it.
Not everything in our economy is about these concerns. But a lot of the information products we exchange are fundamentally in service of our impossibly baroque system for managing simian hierarchy. Removing the human underpinnings of that hierarchy will rob many of those products of their salience. They will become uninteresting. No one wants to fuck a computer-generated pop star. Okay, almost no one.
I think we’ll probably dream up some over-complicated rationales for why we feel this way. It’d be just like us, wouldn’t it? Luddite solidarity. Spiritual mysticism. Endless appeals to safety and quality–we’re already having a great time playing gotcha! with bad ChatGPT output. But at root, the whole thing is about people, and figuring out which of them get to satisfy their animal needs, and how much.
None of this is to deny that these technologies will be powerful tools that we humans use to swing between branches of our hierarchy in new and surprising ways. But until the AIs start reading each other’s stuff, you’re still going to need a monkey attached to the enterprise somewhere. Otherwise what’s the point?