PHS Investigates: academic integrity
December 2024

Across PHS, the use of artificial intelligence (AI) tools such as ChatGPT on school assignments has become an increasing trend, raising concerns about the future of academic integrity at the school.
While some argue these tools are simply a new form of study aid, others worry that AI is becoming a gateway to academic dishonesty. The rise of AI has made cheating easier than ever in a modernizing digital environment, where a few keystrokes can generate essays, solve math problems, and even mimic human thought. At PHS, AI chatbots have sparked concern among teachers, administrators, and student leaders, who are collaborating on new methods and policy changes to uphold academic integrity in an increasingly tech-savvy world.
“[AI is] a fast developing technology; our policy, our adaptation to it, how we manage our student expectations, [and] how we teach in [the] AI era, needs to continue to be evaluated and monitored ... perhaps regulated as well,” said PHS Principal Cecilia Birge.
There is significant concern that violations of the AI policy will become harder for teachers to catch as the technology advances and becomes more accessible. The challenge lies less in identifying AI-generated content than in addressing how these tools affect student engagement and learning. As a result, many teachers have adopted techniques to tackle the growing issue, such as using AI-detection software or monitoring a document’s editing history to prevent students from relying entirely on AI to complete their assignments.
“We have students submit assignments to Turnitin.com, but I also check,” said AP Language and Composition and English II teacher John Bathke. “[Students] give me editing permission on Google Docs and a lot of [assignments] are written in class.”
But teachers aren’t the only ones concerned about AI: student leadership representatives have also voiced their worries about its misuse. In November, Board of Education Student Liaisons Nikolai Margulis ’25 and Maya Hagt ’25 addressed the Board of Education, highlighting instances of students violating the district’s AI policy. Since then, the administration has been actively discussing ways to strengthen policies, ensuring that PHS not only keeps up with rapidly evolving technology but also upholds its commitment to fostering safe and effective AI usage that enhances individual learning.
“We can’t completely keep kids from cheating,” said Margulis. “Teachers can make rules for cheating ... [but] it’s hard to enforce those rules unless the student body is committed to it and believes it from an ideological standpoint. I think that’s really important [to] ingrain it in our school culture not to cheat.”
With AI usage often overlapping with academic dishonesty, educators believe overreliance on AI tools can stifle critical thinking and problem-solving skills, as students may be tempted to skip deeper analysis in favor of quick, ready-made AI responses. This not only impacts their ability to absorb and apply knowledge but also creates gaps in skills crucial for success in future academic and professional settings.
“Using [AI] to complete entire assignments [will] hurt you in the long run. If I plagiarize all my stuff, then I’m not actually doing the learning that is required for me to actually [understand] the information, which ultimately [will hurt me] when I get to the real world and [have] a job, since I haven’t learned that skill,” said Bathke.
The environment students are in is also believed to play a large role in shaping how they use these tools. If classrooms foster collaboration, curiosity, and accountability, students may feel more encouraged to engage authentically with the learning process. In more cutthroat, competitive environments, however, students may feel compelled to use AI tools unethically to gain an edge, prioritizing outcomes over learning.
“We live in a very competitive learning environment, society as well. An unhealthy part of our building culture is this toxic competitiveness of grades. So there’s that simple-minded focus on grades that makes some of our students make bad choices,” said Birge. “But when your grades are not truly reflective of your mastery, eventually it will catch up with you. It happens all the time in life.”
However, many teachers view the issue not as the existence of AI tools but as how they are used. When used thoughtfully, AI has the potential to support learning by offering explanations, generating ideas, and providing assistance that encourages deeper understanding.
“There [are] significant benefits that AI brings, especially when it comes to equity issues. If I speak English as a second language learner, I can say something in [English in] an okay manner, but if I ask AI to help me with certain content and language and grammar, that’s learning. That’s a positive thing,” said Birge.
Many educators believe the key is balancing the positives and negatives of AI, which involves teaching students to use these tools responsibly, as an additional source of learning rather than a replacement for doing their own work.
“AI is like a compass: it can point you in the right direction, but [real learning happens] when you put in the work yourself and make the most of the tools around you to learn,” said computer science teacher Edward Yin. “AI can be integrated into education when used thoughtfully and responsibly ... but it’s up to students to use it as a learning tool, not for shortcuts. As [educators], we must [create environments] where students don’t feel like cheating, but rather [learning].”