February 27, 2023
The emergence of generative AI reminds me of mobile phones. At first, I thought, 'well, that's cute, but what would I actually use it for besides calling people from the ocean and asking if they can hear the waves?' Then, in the blink of an eye, I felt like I couldn't leave the house without mine.
Like most people, I spent my first days with ChatGPT asking it to do tricks: explain difficult ideas like quantum mechanics to me "like I'm five," or write me a poem about chickens. But now I find myself, if not dependent on ChatGPT and GitHub Copilot, leaning on them regularly and heavily.
Starting out, I never worked at a place like Google or Facebook, largely because the minute they'd have asked me to sort a linked list, I'd have passed out trying to remember how that worked, much less the syntax for such a thing. I can build, and have built, large, complex systems that function flawlessly and efficiently for years, but the path to get there involved a LOT of drawing and Stack Overflow and dead ends until I got the scaffold in place and found my groove in that context.
I've coded in languages from BASIC to Scala, and everything in between. It's all stuffed into my ADD brain, mashed up in a big pile of mental laundry. Now, that's great in that I've seen a lot and can synthesize solutions for a world of problems. It's not so great when I have to remember the syntax to reduce an array in TypeScript.
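If you've never had the pleasure, here's the sort of incantation I mean. This is a throwaway example, not from any real project, but the accumulator-and-initial-value dance is exactly the part that never sticks:

```typescript
// A toy example: summing an array with reduce.
// The (accumulator, value) => ... shape plus the trailing
// initial value is the syntax that falls out of my head.
const nums: number[] = [3, 1, 4, 1, 5];
const sum = nums.reduce((acc, n) => acc + n, 0);
console.log(sum); // 14
```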
Likewise, if I have to write a document explaining an idea, I can spend hours thinking of the perfect metaphor, writing and rewriting so it's clear, and then deciding it's terrible and starting over. OR, I can ask ChatGPT to write it, regenerate it if I don't like it, and then take that and edit it into what I want in half the time. And it's good. Definitely good enough for most cases. And, guess what? I'm not creatively burnt out.
In both situations, I don't feel like AI is doing my job. It feels like a tool, and it's a tool that's particularly useful for people, like me, who think in big, beautiful, swirly messes of spilled colors rather than linear, precise, organized zero-inbox hellscapes. Which, I'm sure, is nice for those people.
I love craft. I build things that I could just buy. I love that people still make wood-chopping swords. (It's true. I saw it on Instagram.) But my thinking at the moment is that if AI can just spit it out, it's probably not worth committing to memory. If we can't tell the difference between an email that I wrote and one that AI wrote, is there any? AI is like the Ikea of the mind. It's convenient. It's cheap. It's not the best, but it's good enough.
My phone has Google Maps, so I don't have to call and ask for directions and then get in an accident looking for "the second red building on the left." It doesn't tell me why I need to go or what to do when I get there.
Copilot doesn't create my systems. It puts in brackets and callbacks where they're necessary and fills out my tedious form validations for me. ChatGPT doesn't drive my thinking or figure out who might need more information. It just plops down some words so I don't have to come up with all of them myself and give myself creativity fatigue.
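To give you a flavor of the tedium it saves me, here's a toy sketch of the kind of thing it happily fills in. The form fields and rules are invented for illustration, not pulled from anything real:

```typescript
// A made-up example of the validation boilerplate Copilot fills in.
interface SignupForm {
  email: string;
  password: string;
}

// Returns a list of human-readable errors (empty means valid).
function validateSignup(form: SignupForm): string[] {
  const errors: string[] = [];
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(form.email)) {
    errors.push("Please enter a valid email address.");
  }
  if (form.password.length < 8) {
    errors.push("Password must be at least 8 characters.");
  }
  return errors;
}
```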
Even if their code doesn't work, which is actually pretty common, it gets me a large part of the way to my destination by doing the part of the work that I have no desire to do. I still have to know how to make it do what I need it to do, but I don't have to type 90 percent of the arbitrary syntactical hieroglyphs to get there.
At the moment, I'm a fan, and I would be sorely disappointed to have to go back into the cave: folded maps, a bookshelf of language reference books to find out what order I pass those parameters in, hours spent coming up with yet another interesting way to explain phishing. I can save my thinking for things that AI can't, and I suspect never will, do: real, creative, human pursuits.