On stardust and snake oil
You’ve probably been reading a lot about “AI” these days. Articles run the gamut, from “AI is terrible” (The Guardian) to “AI is awesome and we should embrace it” (The Washington Post). But aside from articles about how much money people are pouring into AI (it’s a lot) and how much energy AI consumes (it’s a lot), you don’t see much in the way of results. Or so you think. Beyond chat-like services, AI is popping up first in the most predictable way: time-saving.
AI can be trained to save you from arduous tasks. That’s the promise, anyway. But the early examples are the cheapest and most dehumanizing ones you can think of: AI can save you from... effort.
Here’s my argument: the effort is the reward. Granted, I am one of those people who read every word. I like to listen to music that requires long attention spans. I will write out music with a pencil. I still read books. You know, brain exercise. But I also think that the experience of taking in the information is the real point. Yes, you can focus only on the result – art made me sad, book was about X – but a summary and a feeling are not the point of the information. The point of the information is to make you think. A lot.
Example: YouTube’s handy algorithm for some reason served me up an AI rendition of Pink Floyd’s Animals. Yes, that’s right: the entire album, recreated by AI trained on music supposedly like – and including – Pink Floyd’s Animals, to give me a... version of Animals. Now, these versions of the songs are academically interesting for how they differ from or warp the original, sure – line everything up to 4/4 here, put the accent on the one there, etc. – but they are not aesthetically consistent. In some spots, you’re hearing actual bits of Animals, verbatim; in others, you’re hearing bits of Animals massaged to be more... predictable? conventional? something; in still others, you’re hearing a generic “progressive” guitar solo that is completely out of place stylistically. But you only notice that if you know the style. So you actually get less from this than from the original work. You get less of the art of the art.
And that’s my gripe about this. Instead of listening to this AI rendition, you could, you know, listen to the actual Animals album, as conceived and created by humans. The art this derivative work was trained on. What is the benefit of skipping the original? None. Zero benefit. This is technical masturbation, and nothing more.
OK, at this point you might be saying, “But art is a complicated area for AI!” So let’s look at a more pedestrian example: assistants.
The promise of the robotics revolution is to make things better for humans! Reduce waste. Save effort. Do the menial things so humans can focus on the “higher functions” of sentient lifeforms. So it’s no surprise that digital assistants are the first wave of everyday AI. But these assistants are... weird. Here’s one (of several) that will summarize your email for you. It will save you the time of having to read what someone else wrote. (There’s a strange ouroboros here if that email was written with yet another AI assistant, but who knows? Maybe those cancel each other out in a very complicated no-op.) This reduces reading (and writing) to some basic subset of information that has been determined to be necessary, but it takes the personality out of what’s been written. Sure, some emails are just a means to an end, but not all emails. How will you know which emails you should summarize and which you really need to read for nuance and expression? You could, of course, read the email yourself, but you’re busy with the advancement of the human race, so... understandably, time’s a-wastin’.
Here’s another assistant that will give you an answer to a complex question. You just need to verify it. So you check its sources and make sure all the information it spits out is correct; you check it for accuracy. That thing you’re doing? That’s called research. It’s the same work you would do if you were answering the complex question yourself.
So, yes, you can trust the AI and save time, minus the time you spend double-checking the AI; or you could just do it yourself in the first place. Both paths get you to the same place: done. One of them exercises your brain and makes you smarter and/or wiser; the other is a dopamine hit for tech enthusiasts.
As you can tell, I'm not a fan of AI or the race to the bottom for humanity in general. In society’s drive for ultimate convenience, we’re not making better people or a better world. And to make a better world, what we need is better people. We need more critical thought, more creativity, more truth; less shock, less opinion, less blind acceptance of peer pressure and fear of missing out. Life isn’t a quantitative equation. The thing that makes something worthwhile isn’t just completing it; it’s how it makes you think, how it changes you, how you grow from having put in the effort.
You’re a marvel of stardust; don’t waste your time on snake oil.