May 30, 2023
There’s a big difference between hyperbole and the reality of AI content creation.
Technology is rapidly evolving, but anyone who’s worked in the industry can tell you that buzzwords and big promises rarely live up to the hype.
That’s why, when a potential client came to me with a project to edit content that they created with ChatGPT, I told them that I’d rather start from scratch, go through my full content creation process, and focus on quality.
“50 years ago you likely would have been clutching at your typewriter decrying the PC as a disgrace,” they responded. “I guarantee that before long you will be left behind as the rest of the world has moved forward.”
Ouch!
But it got me thinking. What would I have done 50 years ago? So I looked into computers from the 1970s.
Even going beyond the mainframes that come to mind when we picture early computing, the Apple II, one of the first mass-market personal computers, didn’t arrive until 1977. It cost $1,300 at the time, the equivalent of about $6,475 today.
In 1973, I probably would have continued on with my typewriter.
I’m relating this anecdote because it’s an analogy for what’s happening right now. Just as the PC wasn’t ready for writers like me 50 years ago, today’s AI isn’t ready to replace skilled writers.
I wouldn’t be surprised if this changes in 50 or even 20 years. But, in 2023, AI that’s based on machine learning (ML) excels at handling specific tasks under predefined conditions. Any semblance of critical thought or cognition is little more than a veneer that results from our human inclination to anthropomorphize.
This can change. Especially if quantum computing pans out, I expect AI capabilities to expand rapidly. We may even see the singularity in our lifetimes.
But that’s not what this article is about. We’re here to talk about the state-of-the-art in 2023.
To be sure, AI has its strengths. Today’s content writers can and should incorporate this technology into their work.
So, let’s keep content marketing strategy top of mind as we move forward. At the end of the day, what’s this all for? Driving traffic, generating leads, building trust and authority, and converting those leads into customers.
These form the criteria under which we will judge the relative strengths and weaknesses of AI for content creation.
In a word, AI works when “good enough” is good enough.
It’s fast and it’s cheap. Just like that drive-through burger, it may not be so nourishing in the long run, but it has its place on occasion or in a pinch.
When you’re on a tight deadline and you just need to get something out there, generative AI can expedite the content creation process without having to pay a premium for a rush order.
Especially for short pieces on simple subjects, AI users can expect a level of quality that’s about on par with offshore content writers who may not be fluent speakers but are willing to create a 1,000-word article for $20.
It may be harder to detect AI’s defects with your ear, but you’ll still end up in the same ballpark. And to be sure, I’ve spent enough of my career on Upwork to know that lots of businesses offshore their content writing.
What’s more appealing, however, is using AI tools within a greater content writing process. This technology can help skilled writers to generate blog titles, do keyword research, or get the ball rolling during a fit of writer’s block.
Some writers may also rely on AI to double-check their grammar or alter their style. If a writer isn’t certain—or at least intentional—about every phrase, word, and punctuation mark, then these tools could offer some value during editing.
Finally, while humans may struggle with illness, fatigue, or disaffection, all of which can lower the quality of the final content piece, AI is consistent. What this ultimately amounts to is that a human writer has a lower floor but a higher ceiling.
In the end, it comes down to your priorities as a content marketer. Will the content be good enough to bring in traffic, offer readers value, and build trust?
The biggest problem with generative AI is that it often doesn’t get the facts straight.
First off, we need to worry about the “hallucinations” of these digital minds. Chatbots fabricate information and present these falsehoods as a matter of fact. Without extensively researching each claim they make, it’s impossible to sort truth from conjecture.
“If you don’t know an answer to a question already, I would not give the question to one of these systems,” advises Subbarao Kambhampati, a professor and AI researcher at Arizona State University.
At least a human can recognize their ignorance. Whether that means deciding to look more into a topic to improve one’s understanding or even conceding Socratic ignorance, it’s essential for a writer to know what they don’t know instead of making wild claims that have no basis in reality.
Second, an ML model can only create output based on its training data. In the case of ChatGPT, the training cutoff was September 2021, meaning that it won’t be able to generate content on anything that’s happened since.
For topics that deal with trends or current events, this is a deal breaker.
For instance, when my clients ask me to write about trends in the fast-moving technology industry, they aren’t looking for information from a few years ago. They want to talk about what’s happening right now so that they can offer perspective on where tech is going to head next.
Third, algorithmic bias is well-documented: from facial recognition struggling with Black faces, to Amazon scrapping an AI recruiting tool that showed bias against women, to a recidivism risk algorithm biased against Black defendants, there’s no such thing as an unbiased AI system.
“Bias creeps in far before the datasets are collected and deployed, e.g. when framing the problem, preparing the data, and collecting it,” explain researchers from the University of Amsterdam and the Delft University of Technology. “Language corpora actually contain human-like biases.”
The big problem is that these biases are deeply entrenched and hard to spot. Not only are AI systems “black boxes” that are opaque in how they turn inputs into outputs, but people are likely to see AI as less biased. How could mathematics, statistics, and computer code be biased?
“Even the people building these systems don’t understand how they are behaving,” says Emily Bender, a professor of computational linguistics at the University of Washington.
Bias creeps in and shrouds itself in a cloak of 1s and 0s. It’s difficult to see and even harder to correct.
At least with human writers, we can recognize that there’s no such thing as an unbiased person or publication. You can recognize that it’s in my interest to promote writers over generative AI, just as I’m capable of recognizing that bias in myself.
At the very least, that reflection helps us both understand the nature of this content. It’s editorial, an informed opinion. AI cannot make this recognition. And most people won’t stop to question it.
Finally, AI content writing tools cannot produce genuine or unique insights. It comes down to how they’re built. These algorithms are built on architectures like the transformer (and, before it, Long Short-Term Memory networks), which predict what word should come next in a sentence based on what’s come before.
This statistical analysis, by its very definition, puts strict limits on what’s possible. Generative AI cannot synthesize divergent concepts or imagine something genuinely new, because anything that’s never been done before isn’t in the training data.
By boiling thought down to simple probability, we’re left with nothing but a regression to the mean: a reversion to mediocrity.
Creativity is impossible.
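To make that limitation concrete, here’s a minimal sketch in Python of next-word prediction stripped to its bare essence. The probability table is invented purely for illustration; a real model learns billions of parameters from enormous training corpora, but the underlying move is the same: pick a statistically likely next word given the words so far.

```python
import random

# Toy "language model": a lookup table of next-word probabilities.
# These numbers are made up for demonstration; a real model learns
# them from its training data rather than being handed them.
next_word_probs = {
    ("the", "cat"): {"sat": 0.6, "slept": 0.3, "ran": 0.1},
    ("cat", "sat"): {"on": 0.9, "quietly": 0.1},
    ("sat", "on"): {"the": 0.95, "a": 0.05},
    ("on", "the"): {"mat": 0.7, "sofa": 0.3},
}

def generate(prompt: str, steps: int = 4) -> str:
    words = prompt.split()
    for _ in range(steps):
        context = tuple(words[-2:])          # condition on the last two words
        options = next_word_probs.get(context)
        if not options:                      # context never seen in "training"
            break                            # the model has nothing left to say
        choices, weights = zip(*options.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the cat"))  # e.g. "the cat sat on the mat"
```

Notice what the sketch can and cannot do: it strings together sequences that resemble what it has already seen, and the moment it hits a context outside its table, it simply stops. That, in miniature, is the ceiling described above.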
Content marketing relies heavily on search engine optimization (SEO). Using AI to create content can actually be counterproductive to ranking well in search results.
Google and other search engines actively filter machine generated content out of their search results, unless such automation is genuinely helpful to readers: sports scores and weather forecasts, for example.
Using AI for the primary purpose of boosting your site’s rank in search results violates Google’s spam policies.
Instead, Google’s algorithms determine content quality according to their “E-E-A-T” acronym, which stands for Experience, Expertise, Authoritativeness, and Trustworthiness.
They point to trust as the most critical component of E-E-A-T. This is a big problem for generative AI.
As we just discussed in the section above, AI content for all but the simplest of subjects cannot be considered trustworthy. Publishing the “hallucinations” of a chatbot can lower the ranking for your post and even your whole website.
Although AI generated content can rank on search engines, this is only true for questions that an AI can actually answer. At that point, people can use the AI tool to answer these questions by themselves.
If you do want to go down this route, you’re going to have to compete with Google’s Bard chatbot, Microsoft’s integrated Bing chatbot and search engine combo, and whatever comes next.
Because of this constraint, your content marketing efforts will be most effective when you answer questions and address topics that generative AI can’t or won’t answer well.
Another giveaway is that AI struggles with idioms, so their presence or absence can signal whether a piece of content was generated by a machine.
In fact, a group of researchers from the University of Ottawa and American University found that “idiom features retain the most predictive power in detection of current generative models.”
To put it simply, not using any idioms makes you sound like a robot.
Who wants to read that?
SEO has come a long way since the days of keyword stuffing. Today’s search engines prioritize metrics like how long a reader stays on the page, whether they click through to other pages, and whether they share it elsewhere on the internet.
That’s why AI content in 2023 cannot perform to the same degree as content from a skilled professional. Writers are sensitive to style and word choice. We inject our content with humor, imagery, metaphor, and other elements that we use to entertain and inform you, the reader.
People can tell the difference. That difference translates into actions they take that directly affect a website’s ranking.
Developing a unique and consistent voice is an essential aspect of brand identity.
Your readers need to feel like they’re getting to know you and that you’re speaking directly to them. They want someone who recognizes their struggles to offer them guidance, for instance. They want to interact with companies who are relatable and who will give them the time of day that they deserve.
That’s why emotional intelligence (EQ) is so important.
“Many business owners and marketers consider EQ to be one of the most critical factors that drive sales and conversions,” writes John Turner for Forbes. “Mastering emotional intelligence means you’re able to pay attention to things like your brand tone and the way your audience engages with similar brands.”
EQ lets us build rapport with potential customers. It helps us deliver a brand image that is at once approachable and professional.
AI doesn’t have emotional intelligence.
That’s why human writers can build authority and trust in ways that generative AI simply cannot. We know, for instance, when to shift between educating and persuading because we take the time to get inside our readers’ heads, feel what they’re feeling, and then directly address those feelings.
Most of all, this human touch is what enables us to write content that people actually want to read. We can infuse our work with narrative, humor, idiom, and other stylistic and rhetorical devices that bring content to life in the reader’s imagination.
Sure, you can survive off bland food alone. But what’s life without a little spice?
Now let’s go back to the criteria that we established above.
We set out the underlying strategy of why we want to publish content in the first place. This will help us to get a clear picture of how AI generated and human content stack up.
Both AI and human created content can drive traffic and even boost your site’s rank in search engine results.
However, AI generated content has problems when it comes to trustworthiness and creating engagement. Both of these factors can negatively affect SEO.
AI is better suited to answering simple questions, while humans are better at answering more complex, abstract, or multifaceted ones. It’s possible that visitors will come to your site for answers to their simple questions, but it’s also becoming increasingly likely that they’ll just ask a chatbot directly.
Verdict: AI can generate traffic, but not as well as a human writer.
Not only is generative AI incapable of producing authentic insights; readers, especially when they come to your site looking for help with sticky or intractable problems, need to feel seen and understood.
Since AI lacks emotional intelligence, it can’t deliver the same lead generation capabilities as a human writer.
Generating a lead comes down to a potential customer wanting to take that next step with your business. They’re interested.
I don’t know about you, but there have been plenty of occasions where I took my business elsewhere because a company immediately wanted to route me through some automated system. This is even more frustrating when we’re trying to find answers to our questions.
You know what makes a much better impression? A person at the other end.
Verdict: Humans do a better job of generating leads than AI chatbots.
It’s possible to edit and fact-check AI generated content so that we can at least trust that the content is getting the facts straight. It could end up being more effort than it’s worth, especially if it takes nearly as much time as just writing the piece from scratch, but it’s an option.
The bigger problem is that machines don’t even have an ounce of heart. People do business with people they like, and that comes down to a gut reaction.
They want to feel a human connection.
Verdict: Properly edited AI content can build trust and authority, but not to the same degree as human writers.
Using EQ to build connections is a huge part of what it takes to be an effective salesperson. While salespeople can use AI content to answer a lead’s questions and build confidence, the impact will be less than it could be.
When a salesperson provides a lead with a compelling asset that has a strong brand voice, it makes their case that much stronger.
Verdict: Human writers can help salespeople reinforce the connection that they’re building with a lead. This makes them more effective at converting leads than AI.
It may seem like “everyone else is doing it,” but remember: it pays not to be a lemming.
There’s a reason that reputable publications haven’t fired their staff and replaced them with machines.
Hiring a quality writer may not be in everyone’s budget, especially for early-stage companies that just need to start getting the word out. If you’re strapped for cash, the fast food of content can be good enough, at least in the short term.
But, if you’re like our B2B technology clients, you need content that speaks directly to both the hearts and minds of intelligent decision-makers. You need content that’s well-researched, accurate, and up-to-date with the latest trends. You need to engage your audience to get the results that you’re after.
Don’t miss out on results because you want to save a bit of time and money. Biting into that juicy burger feels good in the moment, but you might just end up with a belly ache.
Just like in technology R&D, shortcuts lead to debt. When you do something, do it the right way.
Normally I would ask you to learn more about our products at this point in the article. Instead, I want you to take a look at another version of this article.
Seeing is believing. That’s why we’ve reproduced this same article, header by header, using generative AI.