It was Ray Bradbury who said, “People ask me to predict the future, when all I want to do is prevent it.” That line came to mind yesterday, when this ad interrupted my morning scrolling:
Now see, I run an organization called Pro Rhetoric. Essentially, we help writers become better at helping leaders become more persuasive communicators. So how violently can I object to a program that promises, in however annoying a style, to help people who can’t afford speechwriters be more persuasive communicators?
Because it’s not just the style of this ad that’s wrong here. It’s the premise.
There’s always a moral component to the teaching of persuasive writing, at least as our instructors go about it. It can’t be too prescriptive or pushy or even quite explicit, of course. We don’t tell writers what to think, so we can’t tell them what to write. But we do operate on the basic assumption that writers are writers because they want to describe the truth—not because they want to cleverly lie. We don’t teach people how to hornswoggle anybody.
Our company’s carefully crafted tagline is, “Professional leadership communication to promote greater social understanding.” What does that mean, exactly? It stops short of, “Helping leaders tell the truth, the whole truth and nothing but the truth.” But it’s beyond just, “Let’s persuade this person to do whatever I want them to do,” as our young Grammarly customer says. Or as Joseph Goebbels might put it, “Lassen Sie uns diese Person davon überzeugen, das zu tun, was ich von ihr möchte.”
When I talk about communication, I am also talking about culture and community and relationships and the commonweal. U.S. Senate Chaplain Barry Black gave a keynote address at our World Conference a few years ago on what he called “responsible rhetoric.” Yes, that’s it. Our teachers try to help communicators and leaders be more compelling, yes; but implicit in all our lessons is that they try to do so responsibly, faithfully following their own moral lights.
Of course, AI is as moral as a claw hammer, and it’s responsible for nothing.
So proponents of generative AI, whether they are tech-sanguine speechwriters or these Grammarly creeps, would do well to build at least a moral premise into their pitches.
Or so it seems to me.
Also, using AI to write things and persuade people simply adds fuel to the general mistrust of the written word encouraged by Trump and his wingnut followers.
That quote by Goebbels (I looked up the translation; I haven’t studied German in about 52 years) reminded me of countless corporate speeches, when executives would call for “educating the public” (or, as we say today, “informing the masses,” an old Bolshevik phrase). What they meant, of course, was “Let me do what I want, because I know better.” It was always a victory when a speaker agreed with me and we omitted that phrase from a speech.