Here are a few bite-sized thoughts of my own about AI.
Maybe I’m just not making the kinds of requests that ChatGPT responds to well.
Maybe ChatGPT seems so good at writing leetcode-style code because it’s been trained on so much leetcode-like content, and its whole job is to generate statistically-likely language.
Maybe I’ve never worked the kind of job where my input was only a list of technical and technically-worded requirements and my only expected output was code that met those requirements. My inputs have been things like: we need a combination CRM and project-management system; or: make a website that looks like this mockup and works as you’d expect in every browser and at every screen size; or: design and implement improvements to our document management system such that people are willing to pay more money for it.
Who is worried about losing their job? What kind of job is at risk? Are there jobs whose whole scope is to receive technical and technically-worded requirements and to produce code that meets those requirements? Who produces those requirements? Are those people capable of verifying that the code ChatGPT produces meets those requirements? Are those people capable of updating and replacing and re-verifying that code when the requirements change? Is that what they want their job to be?
What owner of a clothing company fires the sewers after the invention of the sewing machine?
It seems like the jobs most at risk are lower-level jobs at large companies. What happens to the purported gains in productivity if the work of those roles is reabsorbed by higher-level developers? If that kind of low-level work needs to be done anyway, and if doing that work is how junior developers develop, how do we expect to develop higher-level developers?
Revising code is usually harder than writing it. Understanding a system someone else wrote, to the level of competently and confidently being able to change and improve the system, usually takes more time than writing a similar system yourself. Would you rather design and implement a system yourself, or review, revise, and verify one ChatGPT wrote?
Maybe the kinds of jobs most at risk from ChatGPT are those worked by people who don’t really want to work in development anyway. Has every bootcamp graduate unlocked a deep and abiding love of computer programming, or were they lured to the industry by the illusion of easy work and easy money? There are other high-paying, possibly more important kinds of technical work. Like plumbing.
Presumably the publishing artist thinks it’s worth their time to make their art, wants the public to know they’re the one who made it, and thinks their art is worth other people’s time to consume. A happy artist-fan relationship is mutually beneficial. Contrast that with the kind of relationship between readers and the people who caused Clarkesworld to close submissions.
I don’t want to read a story generated by an AI. If I’m going to invest hours of my time in a novel, I want to know that at least the author thinks it’s a worthy investment.
I’d prefer not to read (or listen to) anything generated by an AI, at least nothing serious. If you don’t feel it’s worth your time to write a thing, you can’t in good conscience feel it’s worth a fellow human’s time to read it.
If a thing isn’t worth a human’s time to write, but we feel words are needed anyway, that’s pure spectacle: it’s not the words themselves that are important, but the image of the words. The words aren’t needed to convey the information the words convey, they’re needed to convey to some people the idea that other people thought about conveying an idea to them.
Or the words are SEO. I talked with a marketer who routinely uses ChatGPT in his work, generating many thousands of words for webpages he knows will only ever be read by search engines.
The language ChatGPT produces is, by design, statistically most likely to follow from what came before, according to its training data. In other words, its work is to make things like what it was trained on. It seems like great artists actively work on making things unlike what came before.
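The “statistically most likely to follow” idea can be sketched with a toy bigram model; this is an illustration of the principle, not of how ChatGPT actually works, and the tiny corpus here is made up:

```python
from collections import Counter, defaultdict

# Hypothetical training corpus; the point is that the model can only
# recombine what it has already seen.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count, for each word, which words followed it in the corpus.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    """Return the word that most often followed `word`, or None."""
    options = follows.get(word)
    return options.most_common(1)[0][0] if options else None

print(most_likely_next("the"))  # "cat": it follows "the" most often
```

A model like this can never emit a word pair absent from its training data, which is the essay’s point in miniature: its work is to make things like what it was trained on.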
Maybe the most significant thing to come from ChatGPT will be novelty. Not novelty in the sense of memes and other throwaway things, but in the sense of genuine newness. ChatGPT is a historical achievement, and people are already pledging to produce things AI-free.
How often do you find yourself questioning whether the article you’re reading was written by an AI? How often do you find yourself worrying that an AI could do your job better than you? How are you responding to these questions?
How can you trust that that article wasn’t generated by an AI? (This one wasn’t. Trust me.) How can you trust that that Not By AI badge wasn’t placed by an AI? How can you trust that the people you talk to online aren’t bots, or that the long tail of comments on posts by your favorite celebrities weren’t all written and posted by bots? How can you trust anything online anymore?
How can you trust that next month’s bestseller won’t be generated by an AI? Would you choose to believe it wasn’t, and why?
I’m not in the camp that thinks ChatGPT shouldn’t exist. I’m glad it does and I’m glad it’s us that created it. There’s power in creating a new thing and I’d rather live in a country that creates powerful new things and adapts to its achievements in its own way than in one that is forced to adapt, or one that merely observes.
That said, I’m really only impressed by the technical achievement ChatGPT represents, not by its actual product. I’ve had one fun session. My code-related sessions with it were disappointing (prompt: write a string-reversing function in Gerbil Scheme). I feel like I get a better return on investment from a good book or good documentation. Engaging with a culture is more enriching than engaging with a bot.
Computers are tools. Software is a tool. What problem does ChatGPT solve? Who is asking for this kind of AI, and what do they hope to do with it? It feels like tech for tech’s sake.
Maybe my disappointment in ChatGPT comes from a basic misalignment of expectations. Its job is to produce convincing language, not the truth.