Digital Literacy for Students in the Age of AI
The dramatic advancement of artificial intelligence over the past few years can be felt in all areas of our lives. From content-farmed posts on websites and AI-generated ads across media platforms to monotone voiceovers on TikToks and Instagram Reels, there is no escaping AI. Although it has seen a rapid rise in public usage, it is nothing new, and likely something you have been taking advantage of for years without noticing. GPS apps, email filtering, plagiarism-detection software such as Turnitin, and the recommendation systems behind YouTube, Netflix, and other streaming services all rely on AI.
Popular public use of AI took off in 2021 with the release of OpenAI’s DALL-E, software capable of producing AI-generated images. The uncanny nature of the images, combined with the novelty of creating something from a simple prompt, saw the software explode in popularity online. Soon after, in 2022, OpenAI released their AI chatbot ChatGPT. Just over a year and a half later, ChatGPT has become one of the fastest-growing consumer apps on the market, with hundreds of millions of users. Alongside its rapid consumer expansion, OpenAI has also refined ChatGPT into a scarily human-sounding programme. This, alongside developments in AI image-generation software, has made the average person less confident in their ability to distinguish AI content from human-generated content.
While much can be said about the ways AI has wormed its way into the arts and social media, one area of growing concern is its presence in educational institutions. Schools, parents, and teachers alike are faced with the growing problem of students using AI to complete coursework. While plagiarism is nothing new in schools, the ease with which students can now generate new content with minimal effort is troubling. The question now is what educators can do to manage this change, and whether it is something to resist or embrace.
As institutions and boards continue to outline the specifics of AI usage in their policies, the best approach for educators is to teach students digital literacy. Just as students are taught what is and isn’t a reliable source when finding information online, they should be taught how to use AI as an adjunct to their education. Education isn’t static, and rather than banning a potential tool entirely, we should consider how AI could be an instrument in students’ lives. This is particularly relevant given the various ways students encounter AI, often unknowingly, in their daily lives. Simply ignoring the problem does not resolve it.
A healthy attitude towards AI is the foundation for properly integrating and managing it in the classroom. Students should be encouraged to use AI in ways that benefit them, and shown, even if indirectly, the ways in which AI is not useful or is at times dangerous. This can be achieved by demonstrating AI usage in the classroom – for example, by showing how to ask ChatGPT questions, or by using AI apps as a tool for classroom activities. These activities are also a useful opportunity to show how dangerous AI can be, perhaps by having it generate faulty information, or by sparking conversation about the implications of using AI-generated content in different sectors. They can likewise be an opportunity for students to grapple with the notion of AI use as a form of plagiarism. By working with students, having them explore AI both inside and outside the classroom, and helping them understand how AI actually works, educators can leave them better equipped to understand why copying an essay from ChatGPT is indeed plagiarism.
Additionally, to encourage a positive relationship between education and AI, educators and parents alike can consider adopting a home or classroom “social contract” for using AI. Banning AI from classrooms outright poses a similar risk to banning cell phones from classrooms (which we have also discussed in a previous post here) – a portion of students are bound to keep using AI anyway, and will never develop a healthy relationship with it. By contrast, if educators collaborate with students on the rules surrounding AI usage, students gain a better understanding of why they are being asked to follow those rules, and making the restrictions a collaborative effort generally makes them more receptive to doing so.
Another key skill educators should encourage students to build is their own voice in writing. Perhaps this is personal bias, or the consequence of having spent time using AI myself, but when apps like ChatGPT are used in articles or student essays, there is a very specific tone that can be discerned. Though this seems to be less and less detectable in later versions of the app, I have found that many students simply don’t realise their work sounds like AI – often it is obvious even before checking it with a third-party detector (ironic, I know). By working with AI rather than avoiding it entirely, students become more familiar with the way AI sounds and, in theory, less likely to rely on it to complete assignments.
Teaching digital literacy around AI should not end in the classroom. Parents can play an active role in helping their children understand the limitations of AI, as well as how to use it correctly. Encourage children to explore AI and to fact-check the information they are given; this can be done together, or as an unofficial “homework” task. Similarly, when helping your child with their homework, you could ask ChatGPT to solve a question with an explanation, then encourage your child to use this as a worked example. However you choose to expose your child to AI, ensure that they understand that apps such as ChatGPT exist as a tool, rather than a workaround.
At the end of the day, AI isn’t going anywhere, and while its fate in schools is still to be determined at a national level, for now: if you can’t beat them, join them. Educators shouldn’t shy away from the unknown because of the dangers tools like AI pose. Instead, they should be proactive in teaching digital literacy to students from an early age.
We'd love to hear from you
If you are interested in learning more about our services, please get in touch to book a call.