Academic writing plays a crucial role in student success, as writing is viewed as an extension of thinking. Writing assignments are considered a reflection of students' critical thinking and analytical skills (Hutson, 2024). In 2022, generative artificial intelligence (AI) tools like ChatGPT were introduced to the consumer sector, including academia. These tools use large language models (LLMs), which catalogue and index human-generated content and draw on that information to produce responses to complex questions (Cheng et al., 2025). What these tools produce can very closely mimic human-written text (Hutson, 2024). Complete dependency on AI tools like ChatGPT can hinder a student's development in academic writing (Cheng et al., 2025), and a lack of academic writing skills affects students' success both academically and in their future careers (Ginting & Barella, 2022). Academic writing tutors are more likely to improve student success overall when compared to generative AI tools. This work aims to break down the issues with generative AI tools in the academic context and show how working with a tutor can address issues of accuracy and academic integrity, including plagiarism and ethics.
Accuracy of Generative AI Tools
A study conducted by Myriam Safrai and Kyle E. Orwig found that ChatGPT was an unreliable tool for producing scientific reviews due to its inaccuracy. In the article they produced using ChatGPT, the tool did not reference sources for twenty-seven statements it generated, and four of those statements were completely inaccurate (Safrai & Orwig, 2024). When asked to produce a list of twenty-five references, ChatGPT invented four references outright, and twelve others were real articles whose reference details were partly made up (Safrai & Orwig, 2024). Explanations produced by ChatGPT did not show a deep understanding of the article's subject and were out of date. Additionally, the authors found that when ChatGPT was asked the same question twice, it answered differently each time, suggesting it would write two different articles if given the same instructions at different times (Safrai & Orwig, 2024).
Generative AI tools sometimes produce contextually convincing content that turns out to be false or inaccurate, as they are not designed to assess the accuracy or authenticity of the original source material. This phenomenon is referred to as an "AI hallucination" (Cheng et al., 2025). Because of the way these LLMs are trained, the tools also tend to produce exact or near-exact copies of text from the original source information. Additionally, AI tools like ChatGPT may not recognize that information was plagiarized in the original source materials, creating a ripple effect that is extremely concerning from an academic integrity standpoint (Cheng et al., 2025). Other commonly used AI tools, like Grammarly, have been found to be useful for final edits but fail to address issues like organization, argument development, and voice. The feedback from these tools is not personalized to the author and is often repetitive, inaccurate, and lacking in context (Eleftheriou et al., 2025).
Academic Integrity and Generative AI Use
As AI tools improve, they become a cause for concern regarding academic integrity, as tools like ChatGPT can now generate original content on their own (Cheng et al., 2025). However, AI tools do not meet academic standards for authorship in research because they cannot be held responsible for the accuracy of a work's content. As mentioned above, current LLMs have no fact-checking processes, which means these tools are unable to identify AI hallucinations or bias in the datasets being used (Cheng et al., 2025). Additionally, students' overreliance on AI can hinder their development of analytical and writing skills (Hutson, 2024).
Plagiarism
Plagiarism can be defined as copying another's ideas or words and presenting them as one's own. It can include copying or paraphrasing work without providing source citations, a lack of transparency about collaboration, and recycling previous work (Hutson, 2024). Generative AI tools have introduced a complex discussion around plagiarism, particularly the principle of citing original authors and the words borrowed from them, which emphasizes the importance of, and respect for, intellectual labour. This principle becomes insufficient when you consider that AI-produced text can be seamlessly added to student work and can sometimes be difficult to identify (Hutson, 2024). It is not impossible to detect, however, and students can face consequences for its use. One easy identifier of AI use is an inconsistency between a student's writing and verbal communication abilities (Eleftheriou et al., 2025). As mentioned above, it is also problematic that these AI tools can produce fake or inaccurate references in support of the articles they generate. This is concerning from an academic integrity standpoint, as even a reasonably informed person can be misled by these kinds of AI hallucinations. The only way to guarantee such errors are not present is to verify everything the tool produces (Safrai & Orwig, 2024).
Ethical Use
Ethically acceptable. Use of generative AI tools in academic writing is ethically acceptable for corrections or changes to grammar, spelling, readability, and language translation, provided the writer oversees the edits to ensure they reflect their own voice and critical thinking (Cheng et al., 2025).
Ethically complex. Generating outlines, summarizing content, improving content clarity, and brainstorming ideas using AI are ethically complex uses because the appropriateness of each depends on the steps taken by the author (Cheng et al., 2025).
The following applications and their considerations need to be reviewed by the author:
- When generating outlines, the author needs to ensure what is generated by the AI tool conforms with assignment requirements;
- When generating summaries of content, the author needs to ensure that the summary reflects their own ideas and insights;
- When using AI tools to clarify content, the author needs to ensure that the edits do not change the key meaning or messages intended by the author; and
- When using AI to brainstorm, the author needs to ensure they give appropriate credit for new ideas.
Ethically inappropriate. The following uses of generative AI in the academic writing process are ethically inappropriate for reasons of accuracy and academic integrity (Cheng et al., 2025):
- To draft text from scratch;
- To develop new concepts;
- To interpret data;
- To review literature;
- To determine ethical compliance; and
- To check for plagiarism.
To use these applications ethically, students who wish to use generative AI for their assignments should vet and stand behind the information produced, ensure it contains substantial human contribution, and be transparent about their use of AI (Cheng et al., 2025). In addition, students should review their institution's policies on the use of generative AI and discuss them with their professors.
Tutor-Tutee Relationship in the Age of AI
Tutors encourage students to improve through coaching, commentating, counselling, listening, diagnosing, and activating (Liu & Harwood, 2022). Given the issues with AI accuracy referenced earlier, students who use a tutor for coaching and feedback are more likely to make meaningful changes and edits to their work that have a real impact on their academic success. Improving confidence through skill development means that as students become more comfortable with the writing process, they become more efficient and are less likely to turn to generative AI because of time constraints or a lack of confidence in their own abilities (Eleftheriou et al., 2025). Tutors can also support students' ethical use of AI throughout their academic careers by helping them locate and understand their institution's policy on the matter and by addressing any inconsistencies that exist, including AI benefits and limitations (Eleftheriou et al., 2025).
Students in the age of AI must consider their professional growth and evolution, including their ability to analyze research problems and data. Productivity, in the sense of how quickly someone can synthesize data into a research paper, must not be the focus or priority (Cheng et al., 2025). Despite some colleges and universities offering academic writing courses, students still experience a lack of guidance and feedback, resulting in low confidence (Ginting & Barella, 2022). This lack of confidence in academic writing makes generative AI tools attractive for various portions of the writing process. A tutor, whether private or through an institution's writing centre, can support students' academic writing skill development by providing a personalized approach and further explanation (Ginting & Barella, 2022). Tutors can also help students use AI tools ethically and in a way that does not hinder their development. According to Eleftheriou and colleagues, the personalized feedback and confidence boost that tutors provide help students develop efficiencies in academic writing that reduce over-reliance on generative AI (Eleftheriou et al., 2025).
If you are ready to stop relying on AI for your academic writing projects, visit my services page and fill out my contact form.
References
Cheng, A., Calhoun, A., & Reedy, G. (2025). Artificial intelligence-assisted academic writing: Recommendations for ethical use. Advances in Simulation, 10(22). https://doi.org/10.1186/s41077-025-00350-6
Eleftheriou, M., Ahmer, M., & Fredrick, D. (2025). Balancing ethics and support: Peer tutors’ experiences with AI tools in student writing. Contemporary Educational Technology, 17(3), ep587. https://doi.org/10.30935/cedtech/16554
Ginting, D., & Barella, Y. (2022). Academic writing centers and the teaching of academic writing at colleges: Literature review. Journal of Education and Learning (EduLearn), 16(3), 350-356. https://doi.org/10.11591/edulearn.v16i3.20473
Hutson, J. (2024). Rethinking plagiarism in the era of generative AI. Journal of Intelligent Communication, 4(1).
Liu, C., & Harwood, N. (2022). Understanding the role of the one-to-one writing tutor in a U.K. university writing centre: Multiple perspectives. Written Communication, 39(2), 228-275. https://doi.org/10.1177/07410883211069057
Safrai, M., & Orwig, K. E. (2024). Utilizing artificial intelligence in academic writing: An in-depth evaluation of a scientific review on fertility preservation written by ChatGPT-4. Journal of Assisted Reproduction and Genetics, 41, 1871-1880. https://doi.org/10.1007/s10815-024-03089-7
