August 4, 2023
At first it’s like, “Oh yeah, this is useful information!”
Looking at the neat little lists with clean, bold headers, you get the satisfaction of seeing the information you were looking for. There it is!
But as I read, as I probe and explore, and as I attempt to engage it in a dialogue, I lose the thread. Slowly, what it’s “saying” all starts to blend together.
The thoughts stop and start. And they rarely lead anywhere.
For quickfire answers, it’s fine. Just make sure to verify them with outside sources, please. But for anything sustained, anything that needs to take a thought beyond the sudden halting of the next most probable word, we’re left with scattered beans.
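To be concrete about what I mean by “the next most probable word”: these systems score every word that could come next, and the simplest decoding just keeps grabbing the top one until it halts. Here’s a toy sketch of that loop in Python. The vocabulary and the probabilities are invented, and real models compute theirs with an enormous neural network, but the shape of the process is the same.

```python
# A toy of "the next most probable word": greedy decoding over a made-up
# probability table. Real language models compute these numbers with a
# neural network over a vocabulary of tens of thousands of tokens.
toy_probabilities = {
    "the":     {"thought": 0.40, "word": 0.35, "<end>": 0.25},
    "thought": {"stops": 0.50, "dances": 0.30, "<end>": 0.20},
    "stops":   {"<end>": 0.90, "again": 0.10},
    "word":    {"stops": 0.60, "<end>": 0.40},
}

def most_probable_continuation(word, max_words=10):
    """Keep taking the single most probable next word until the text halts."""
    output = [word]
    for _ in range(max_words):
        choices = toy_probabilities.get(word, {"<end>": 1.0})
        next_word = max(choices, key=choices.get)
        if next_word == "<end>":
            break  # the sudden halting
        output.append(next_word)
        word = next_word
    return " ".join(output)

print(most_probable_continuation("the"))  # -> "the thought stops"
```

Nothing in that loop carries a thought forward; it only ever asks what’s likeliest next, and then it stops.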
The thing that scares me the most is that I can’t help but wonder whether people love it so much because that’s what they actually want.
Do they not want to go further, push harder, and engage with words that may tilt, dance around, or end up somewhere unexpected?
Or are they just hoping to save money? Is it my own bias? Is my view clouded by my own love of words and my vocational devotion to them? (Read: self-interest.)
When I read AI-generated words for too long, my skin starts to crawl, and I feel a pressure in my chest that turns to a drop in my stomach. Of course, there’s the fear of being “automated into obsolescence,” but there’s another uncanny feeling as well that’s hard to put a finger on.
It reminds me of a mask.
Somewhere in a data center, an algorithm clicks away. Little pulses of electricity go to and fro, yes and no, and there’s enough order in nature for us to harness them. We use technique to shape these systems, to mold them to look just like us. It’s an attempt to create something in our image—or at least an image that resembles the insides of our brains.
These neural networks go back and forth: yes and no; yes and no; dog and cat; cat not dog; until these systems have built up enough experience to form a memory.
We named it machine learning, and we’ve trained it to talk.
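If you want to see the yes-and-no, dog-and-cat of it at its smallest, here is a toy sketch in Python: a perceptron that nudges two numbers every time it guesses wrong. The “features” (ear pointiness, bark loudness) and the data are invented, and real systems juggle billions of parameters rather than two, but the corrective back-and-forth is the same in spirit.

```python
# A toy of the yes/no, dog/cat loop: a perceptron that nudges its weights
# whenever it guesses wrong. Features and labels are invented for illustration.
examples = [
    ((0.9, 0.1), 1),  # pointy ears, quiet -> cat (1)
    ((0.8, 0.2), 1),  # pointy ears, quiet -> cat (1)
    ((0.2, 0.9), 0),  # floppy ears, barky -> dog (0)
    ((0.3, 0.8), 0),  # floppy ears, barky -> dog (0)
]

weights = [0.0, 0.0]
bias = 0.0
learning_rate = 0.1

for _ in range(20):  # go back and forth over the examples
    for (ears, bark), label in examples:
        guess = 1 if (weights[0] * ears + weights[1] * bark + bias) > 0 else 0
        error = label - guess  # 0 when right, +1 or -1 when wrong
        weights[0] += learning_rate * error * ears
        weights[1] += learning_rate * error * bark
        bias += learning_rate * error

# The settled numbers are the "memory" the loop has built up.
print(weights, bias)
```

Guess, get told yes or no, adjust, repeat. That’s the whole of the “experience” at this scale.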
It has the knowledge contained within the internet’s vastness (one trillion parameters!). But, like a child learning language for the first time, it’s currently immature and lacking wisdom. And like most children, it’s not yet developed to the point where it can fully, fluently, or meaningfully converse on a lot of the things that we want to talk about.
Forgive a Dungeons & Dragons reference, but it’s like a high-intelligence hero who makes their wisdom rolls with a -2 modifier.
Yet don’t we delight in automagical words that relieve us of our obligation to find them? There’s no need to sort them, express them, or occasionally endure the pain of pulling them out against their will. It’s so easy, so tempting, so time-saving.
The biggest shadow over me is the fear of what will happen when so many people are willing to hand over the written word to the machine.
We’ve already loaned out so much. Credit-worthiness. Hiring decisions. Risk of recidivism. And these are just some of the less visible ones. Sure, people are still involved in these decision-making processes, but that doesn’t make these “Weapons of Math Destruction” any less dangerous.
But honestly, the most frightening thing is that so many social places—places that should be full of words spoken out loud, full of voices flowing and bouncing around—are often instead full of people craning over their glowing screens. Well, at least we have social media, “the new town square.” In service of what?
We’re being tracked, watched, followed, and targeted in the name of “personalization.” Just watch out. The true threat of artificial intelligence, Evgeny Morozov writes for the New York Times, is that “[AI] will never overcome the market’s demands for profit.”
What happens when we automate the words to point that out?
Human experience does not boil down to statistical probability.