In recent years, I’ve watched AI weave its way into our daily lives. It’s written and directed movies, acted as a therapist, and visualized alternate realities. But I was curious to learn if AI is now smart enough to be an “intelligent writing assistant.”
It’s not too far off. As Microsoft points out in its Future of Work report, “AI is good at learning and scaling patterns, meaning for these activities people can instead focus on doing things in new ways and generating novel ideas. For example, someone might write a document by merely listing the ideas it should include. The details can be fleshed out automatically, much like developers use Copilot to flesh out ideas through code.”
But how realistic is that for the average would-be writer? We tried Jasper, Rytr, and HyperWrite to see if artificial intelligence can give our writing an edge.
Test 1: Jasper.ai
Jasper, which bills itself as “your AI co-writing pilot,” boasts that you’ll be able to “create content 5x faster with artificial intelligence.” Speed is useful, especially when one is freelancing, so I signed up with one of my Gmail profiles.
I’m always wary of handing over personal information upfront, but Jasper wanted to know my company and domain name. The site had grayed-out placeholder text in both fields: company name [SpaceX] and domain [SpaceX.com]. Wish fulfillment, or just another Elon Musk admirer?
When asked how I was planning to use Jasper, I noticed the options were primarily marketing functions (emails, social posts, blogs, ads, etc.). This made me realize Jasper isn’t an AI writing assistant for professional writers so much as a rules-based marketing automaton, designed to churn out copy and keep costs down. Fair enough.
I selected “other,” but it didn’t ask me for more information. Suddenly, Jasper cut straight to the chase and asked for cash. Beyond a five-day money-back guarantee, there’s no freemium option; plans start at $40 per month, with a Boss Mode tier at $82 per month.
Verdict: This didn’t quite fit my needs, so I went in search of something else.
Test 2: Rytr
Rytr is less flashy than Jasper in terms of UX, and even uses the writing hand emoji (in original yellow) as its brand motif, so no expense spared there. I logged in again with Gmail.
When it encouraged me to “start ryting,” my inner grammarian winced. A blank document opened inside Rytr’s dashboard. I didn’t like the interface at all. I have a file-naming routine and Google Drive folders, so it felt uncomfortable writing inside something that was not stored properly in my own setup. But I persevered for the sake of journalism.
On the dashboard’s left column, I was offered a bunch of options that would automatically trigger certain rules to guide Rytr’s AI in helping me write. There were 22 tone options, including Formal, Inspirational, and Worried (I’m not sure why anyone would want that last one).
There are 33 use cases, including SEO Meta Description, Tagline & Headline, Google Search Ad, Keyword Extractor, and Blog Section Writing. Those with premium accounts also get Create Your Own Use-Case.
I switched between different tones and use cases to see how it modified the text I typed in. It also had language-translation options, so I tested it in French, and it wasn’t half bad. Translation engines have improved enormously over the years.
Verdict: In the end, I was disappointed with it as a writing assistant, though I could see its value for assisting (or replacing) marketing departments.
I fed PCMag’s site description into Rytr and played around with ways to turn it into usable landing-page and website copy. It was fine, not great; there was no flair. Then again, it makes sense for an AI to create copy that is mostly read by machines in order to serve up SEO-friendly sites. Why would someone pay humans to write copy for machines to read? But it wasn’t a collaborative experience. I wanted an AI to “chat” with me as I wrote.
Test 3: HyperWrite
At first glance, HyperWrite seemed just like the other writing assistants I’d tried, and I started to feel slightly disappointed with the whole process. I logged in with Google, but this time it launched the Chrome Web Store and told me to install a Chrome extension.
I don’t like extensions; I find they slow my Chromebook down. However, for the purposes of experimentation, I installed it.
A pop-up appeared asking me to turn off Google Smart Compose, which I found amusing. It’s funny that HyperWrite’s AI felt competitive with Mountain View’s technology, but then I realized that by installing HyperWrite’s extension, I’d given it permission to root around and see what else I had installed. In the end, I did not turn off Google’s widget. At the risk of anthropomorphizing HyperWrite, I felt a virtual sniff of disapproval.
Just like Jasper and Rytr, it asked me what sort of writer I am (early career, professional, marketer, student, etc.) and what I wanted to use the service for, in terms of content (beat writer’s block, write English confidently, improve writing speed, etc.).
Again, this is clearly training the AI down specific branches of a decision tree in order to improve its accuracy and support.
Once I’d selected my keywords, it launched its own dashboard, which looked like it was built for students. I started to long for my Google Workspace and neatly labeled docs and folders. When the strange little AI icon on the right side of the screen attempted a lame joke, I sighed.
I started writing and waited for the AI to suggest something better in my choice of words, but nothing happened. Then I scanned the instructions and saw I had to prompt it to help. I highlighted a bunch of text that had deliberate grammatical errors in it, and the AI spotted them and made some useful adjustments.
After a while, I started to appreciate HyperWrite’s AI more and more, especially when I selected Custom Rewrite and a sub-menu suggested ways to entirely rework my paragraph in, say, a more “polite” manner. I could see how this would be useful for non-native speakers, or for people on the spectrum who often miss social cues.
Verdict: HyperWrite was the best of the three I tried, but I wouldn’t use it again, mainly because I had an uneasy feeling that it was reading everything I had ever written (and stored in the cloud) in order to learn more about my writing style. In fact, it draws on OpenAI’s language models, using JSON to generate the desired outputs, and scans everything from what you post on social feeds to what you’re typing in the moment.
CoAuthor: The Future of AI Writing Assistants?
I’m still keen to find an AI writing assistant, but the current crop aren’t up to it, IMHO. What I’m looking for doesn’t exist quite yet—but I know it will, soon.
Basically, I’m looking for an AI version of what we used to call the Subs Desk. When I trained on a national broadsheet newspaper off Fleet Street in London in the mid-’90s, during the magical print days, there was a group of smart people who sat in a room adjacent to the editorial floor.
The “subs” (sub-editors) were the bane of a rookie reporter’s life. As the last stop before our articles went to press, their job was to cut everything to fit, write headlines and sub-heads, and polish our copy. They forced us newbie journalists to up our game through mockery and sometimes outright derision, especially when someone overwrote or used highfalutin words. But they made us better writers.
To that end, my hopes are pinned on Mina Lee, a doctoral student in computer science at Stanford University. Together with her advisor, Professor Percy Liang, and their collaborator from Cornell, Dr. Qian Yang, she developed CoAuthor, an interface that records, and learns from, writing sessions between humans and language models, specifically OpenAI’s GPT-3 (Generative Pre-trained Transformer 3).
With CoAuthor, Lee and her collaborators collected a large corpus of human-AI collaborative writing sessions: 830 stories written by 58 writers and 615 essays written by 49 writers. Each story or essay starts with a prompt drawn from the subreddit Writing Prompts or The New York Times. The dataset won an Honorable Mention at CHI ’22.
If you want to go deeper, and are familiar with Python, Lee and her collaborators publicly released the dataset, alongside tutorials and a collection of sample writing sessions. It has real promise, because it can be used to understand the co-writing process and to train models on the sessions to better support collaboration.
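To get a feel for what such session logs enable, here’s a minimal Python sketch of replaying a keystroke-level event log to reconstruct the final text. The event schema used here (eventName, textDelta, offset, length) is hypothetical, invented purely for illustration; the released CoAuthor dataset defines its own format, so consult its tutorials for the real field names.

```python
def replay_session(events):
    """Apply a sequence of insert/delete events to an empty buffer.

    Each event is a dict with a hypothetical schema:
      - "text-insert": insert `textDelta` at `offset`
      - "text-delete": remove `length` characters starting at `offset`
    """
    text = ""
    for ev in events:
        off = ev["offset"]
        if ev["eventName"] == "text-insert":
            text = text[:off] + ev["textDelta"] + text[off:]
        elif ev["eventName"] == "text-delete":
            text = text[:off] + text[off + ev["length"]:]
    return text

# Example: a human types a phrase, then an accepted AI suggestion is appended.
events = [
    {"eventName": "text-insert", "offset": 0, "textDelta": "The subs desk "},
    {"eventName": "text-insert", "offset": 14, "textDelta": "made us better writers."},
]
print(replay_session(events))  # → The subs desk made us better writers.
```

Because every intermediate state of the document can be recovered this way, researchers can study not just the finished text but who wrote which span, human or model, and when.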
Tools like Jasper, Rytr, and HyperWrite are designed for automating copy, and taking the pressure off writers to generate SEO-ready materials. As such, they appear to be doing a good job. But writers need a smart AI, and CoAuthor feels like it’s the beginning of something truly clever.
Alongside her work on CoAuthor, Stanford’s Mina Lee was one of the official organizers of In2Writing, the first-ever workshop on intelligent writing assistants, which took place in Dublin in May. Sessions included differentiating between AI assistants that correct, write, and rewrite. Others reported back on generating song lyrics using character-level recurrent neural networks.
I’m going to carry forward my experience with the current crop of AI assistants, keep an eye on how CoAuthor develops, and (depending on its location) report back in 2023 from the second In2Writing workshop to see how good the AIs are by then.