Quality vs. Quantity of Corrections: Choose Your AI Writing Tool Wisely! 

by Govind Ravi Sulekha

When it comes to academic writing, grammar and spelling are not the only things that matter. A great academic article should also be clear, concise, and engaging. However, achieving this can be a challenge for many, especially for non-native English speakers. To help, a wide range of English editing tools is available online, each offering its own features and capabilities, and each claiming to be the best. As a researcher, you are spoilt for choice! But how can you tell the best AI writing tool from the rest, especially when English isn’t your first language to begin with?

Independent studies are a good place to start when assessing AI writing tools. In a whitepaper by Dr. Alexopoulou of Cambridge Language Sciences,1 which tested seven language correction tools, Paperpal and InstaText were estimated to make the highest number of proposed corrections, at about 100 and 175 suggestions per 1,000 words, respectively. However, according to a comparison of AI writing tools based on expert human assessment, Paperpal has a much lower percentage of incorrect suggestions: 13%, as opposed to 23% for InstaText.2

How do the numbers above come into play in a real-life scenario? Let’s examine a sample passage to find out (Figure 1).  

Figure 1. Sample passage with suggestions by (a) InstaText and (b) Paperpal.

Figure 2. Sentence #1: (a) InstaText and (b) Paperpal.

Enhance your academic writing skills from the first draft. Try Paperpal today!

The first sentence in the sample above contains two language errors: “predominating” and “power up.” While InstaText corrects both errors, it also suggests an unnecessary change: “transfer” to “transmission” (Figure 2a). A quick Google Scholar search reveals that “wireless power transfer” is used nearly three times more often than “wireless power transmission,” suggesting that the original term was the better choice. Paperpal’s AI language tools have been trained specifically on corrections made by professional copyeditors to academic papers, and this training is reflected in its restraint, a quality that experienced human editors develop over time: Paperpal fixes both errors but avoids any changes that would alter the author’s voice (Figure 2b).

Figure 3. Sentence #4: (a) InstaText and (b) Paperpal.

Now, let’s look at sentence #4 from the sample. The sentence does not have any grammatical errors, but the transition phrase, “on the other hand,” is somewhat out of place. While sentence #1 states that near-field inductive coupling is the predominant technology for wireless power transfer, sentence #4 states a disadvantage of inductive coupling: it cannot be used with miniaturization, a new requirement in this area. An appropriate transition phrase for introducing this disadvantage is “However,” which is what Paperpal suggests (Figure 3b). InstaText ignores this issue and instead changes “as” to “because,” which completely changes the original meaning and intent of the sentence (Figure 3a). This is particularly concerning for authors who are non-native speakers of English, given that InstaText does not provide any explanations for its changes.

Figure 4. Sentence #7: (a) InstaText and (b) Paperpal.

We find a similar situation in sentence #7. This sentence contains LaTeX code to generate a multiplication symbol. Additionally, the word “small” is redundant because the actual size of the Rx coil is presented. As shown in Figure 4a, InstaText mishandles the LaTeX code, misses the redundant text, and makes two further changes: one unnecessary (the addition of “have”) and one that alters the author’s meaning (“power” to “drive”). In contrast, Paperpal identifies and preserves the LaTeX code (Figure 4b). It also removes the redundancy by deleting “small.”
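For readers unfamiliar with the markup in question: multiplication symbols in manuscripts are typically typeset with the LaTeX command \times inside math mode. The coil dimensions below are purely illustrative (the sample passage is not reproduced here), but the snippet shows the kind of inline code an editing tool needs to leave untouched:

    ... a small Rx coil of size $5 \times 5$ mm$^2$ was used to power the implant ...

Altering or stripping the $...$ delimiters or the \times command would break the typeset output when the manuscript is compiled, which is why preserving such code matters as much as the language edits themselves.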

Table 1: Paperpal vs. InstaText performance on the sample passage.

Tool      | Proposed edits | Incorrect edits | Relevant suggestions
Paperpal  | 18             | —               | 93.33%
InstaText | 31             | —               | 80%

The performance of the two AI writing tools on the sample passage is summarized in Table 1. The passage has a total of 15 objective errors, and the column named “Relevant Suggestions” shows the percentage of these errors that were addressed by each AI writing tool. As summarized in the table, although InstaText proposes a higher number of edits, its edits are frequently incorrect, and its coverage of objective errors is inferior to that of Paperpal. 
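Assuming the percentages in the “Relevant suggestions” column are computed over those 15 objective errors, they correspond to roughly 14 errors addressed by Paperpal (14/15 ≈ 93.33%) and 12 by InstaText (12/15 = 80%).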

In summary, if you’re a researcher looking for an AI editing tool that goes beyond grammar and spelling to offer substantive rephrasing and rewriting with high coverage, Paperpal is evidently the best choice. Haven’t explored AI writing tools tailored to support academic writing? Take a few moments and try Paperpal now!

References

1. Alexopoulou, D. Comparison of Automated English Editing Tools. University of Cambridge, 2022.

2. AI-based editing tools for researchers – A comparative analysis. Paperpal, 2022. Accessed March 2, 2023. https://www.paperpal.com/blog/wp-content/uploads/2022/11/AI-based-editing-tools-for-researchers_White-Paper_2022.pdf
