Ladies and gentlemen, colleagues, and fellow educators...
In the age of AI, we need a reset on how we define both literature and language in the canon. Traditionally, literature has been understood as written texts of enduring artistic or cultural value, while language has been treated primarily as a system of words and grammar. But today, meaning is made through multimodal forms: images, sounds, performances, algorithms, and digital platforms that reshape how stories are created, shared, and preserved. TikTok videos, speculative AI collaborations, or Indigenous language revitalization projects are as much a part of our cultural record as novels and poems. To prepare students for this expanded landscape, educators must reconsider what counts as literature and language, and how we engage with them critically, ethically, and creatively.
Framing the Challenge
We, as educators today, are not just transmitters of knowledge but catalysts of engagement—sparking curiosity, guiding critical thinking, and connecting learning to real-world contexts.
We are facilitators, mentors, and bridge-builders, guiding students beyond surface-level information toward deeper understanding and meaningful engagement. Engagement here means building connections to real-world issues, such as sustainability, ethics, and AI literacy, and to students’ own lives and goals.
We have the power to teach process over product (how to think, not just what to produce).
We have the power to offer students varied opportunities to show what they know through different mediums and forms of expression.
And we have the verbal, cultural, and social understanding to guide students in distinguishing between surface knowledge (what AI can provide) and deep learning.
These five points push us to redefine traditional assumptions, to question current academic guidelines and expectations, and to re-evaluate how we measure learning.
----
1. Redefine Creative Expression and Authorship
AI use reshapes our understanding of creation, ownership, and originality.
Plus: AI expands creative possibilities by generating ideas, remixing styles, and sparking innovation. It lowers barriers to entry, allowing more people to experiment with tools that once required advanced training or resources, and it encourages collaboration between humans and machines that pushes us to rethink what originality means in the digital age.
Minus: At the same time, AI blurs the boundaries of authorship, making it unclear who deserves credit for a work—the human, the machine, or both. It raises intellectual property concerns, risks producing formulaic or homogenized outputs, and poses ethical challenges when training data is drawn from the work of creators who may never have given consent or received acknowledgment.
Goal: Help students explore new creative possibilities with AI while critically examining authorship, originality, and ownership.
Example
- At the University of Toronto, students co-authored a speculative fiction magazine titled We Did It!?: Pathways to a Net Zero Canada by 2050, using AI tools and collaborative writing to translate complex research on climate policy, social change, and technology into imaginative stories, highlighting both the creative potential of AI and the power of collaboration.
- AI tools helped summarize complex research findings on environmental policy, social change, and technological developments that informed the fictional narratives, while the process itself demonstrated the value of collaboration.
Takeaway: AI use expands creativity but complicates authorship, raising questions about originality and ownership.
2. Rethink Language and Communication
Language is multimodal, encompassing words, images, music, gestures, and symbols. Communication is cross-cultural and has global reach.
Plus: AI tools such as translation apps, captioning software, and multimodal platforms expand communication across languages and cultures. They allow people to share ideas through text, sound, image, and video, making communication more accessible, dynamic, and far-reaching than ever before.
Minus: Yet the same speed and convenience can reduce depth and nuance. Automated translations may flatten cultural meaning, algorithms often privilege dominant voices, and students may rely too heavily on these tools without pausing to critically reflect on context or accuracy.
Just like with AI drafting tools, we need to pause: What message is being conveyed? Who benefits? What’s being left unsaid? Critical thinking is essential because this form of language and communication can both break barriers (sharing unheard voices) and blur meaning (misinformation, oversimplification).
Example
- TikTok redefines how we use language by blending video, captions, audio, and visuals into multimodal forms of communication. The “Learn on TikTok” initiative shows this shift in action: complex ideas and lessons are compressed into short, globally accessible clips. On the plus side, this makes education more engaging, cross-cultural, and accessible to diverse learners. On the minus side, the brevity and algorithm-driven delivery can oversimplify nuance, spread misinformation, or encourage passive consumption rather than deep critical engagement.
In short, “Learn on TikTok” is a living example of how AI-driven platforms reshape language and communication: expanding access while demanding new critical literacies.
Takeaway: AI use enables global, multimodal communication, expanding access across cultures, but its speed and automation can flatten nuance and oversimplify meaning, as seen in TikTok content and automated translation tools.
Goal: Cultivate adaptive communicators who can code-switch across modes and cultures with critical awareness and intentionality.
3. Recognize Representation and Bias
AI systems mirror and intensify existing cultural biases, influencing whose voices gain visibility and whose are marginalized.
Plus: AI has the potential to broaden representation by amplifying diverse stories and generating content that reaches new audiences. With the right prompts and design, it can highlight perspectives that might otherwise remain unheard.
Minus: However, AI systems inherit the biases of their training data, often reinforcing stereotypes and marginalizing already underrepresented voices. This can shape whose stories are told and whose are erased, limiting inclusivity rather than expanding it.
Example
- After Amanda Gorman performed her poem “The Hill We Climb” at President Biden’s 2021 inauguration, her poetry book was translated into multiple languages. But controversy erupted when some publishers chose translators who were white, male, or otherwise not aligned with Gorman’s own identity. Critics argued that these choices erased representation, raising questions about whose voices are considered “authentic” enough to carry her words across cultures.
Takeaway: AI use can broaden representation or entrench bias, depending on whose voices are chosen to create, carry, and translate the message.
Goal: Teach students to identify, question, and challenge bias in AI use, ensuring that marginalized voices are recognized and represented.
4. Reconsider Language as Cultural Memory
Machines store memory but cannot replace lived experience or authenticity.
Plus: AI can support cultural memory by archiving, translating, and revitalizing endangered languages, making them more accessible to wider audiences and preserving traditions that might otherwise disappear.
Minus: Yet language stored by machines lacks lived experience and community context. In relying on AI archives or translations, we risk flattening nuance, misrepresenting meaning, or reducing culture to data points rather than lived practice.
Example
Indonesian storytelling traditions are adapting to the age of AI by blending traditional methods with new digital and AI-based tools for preservation, creation, and distribution. AI can preserve cultural heritage, revive local languages and literature, and create new narrative forms, but it also poses challenges regarding authenticity, job displacement, and the need for responsible use and digital literacy.
Trump pressured the Smithsonian to alter its representation of slavery, showing how cultural memory can be reshaped or erased. AI risks repeating these distortions if trusted uncritically. Yet projects like FirstVoices demonstrate how AI, when guided by communities, can preserve and amplify authentic voices instead of silencing them.
Takeaway: AI use can preserve cultural memory or distort it, depending on whether it amplifies authentic voices or entrenches erasure.
Goal: Guide students to use AI responsibly in preserving cultural voices and histories, while remaining alert to distortion and erasure.
5. Recommit to Ethical Engagement
Language in the age of AI demands responsibility, reflection, and accountability.
Plus: AI creates opportunities for ethical engagement by inviting students to reflect on how language, authorship, and representation function in a digital age. It can support learning when used transparently and responsibly.
Minus: Yet without clear guidelines, AI can encourage shortcuts, obscure accountability, and raise questions of plagiarism or misuse. Ethical engagement requires constant vigilance, reflection, and responsibility on the part of both educators and students.
Shannon Vallor in Ethics and Virtue in Technology & AI discusses how emerging technologies challenge our moral compass. Generated images, memes, and stories can spread faster than truth. Students must learn not only how to spot misinformation, but also how to engage with technology ethically and responsibly.
Example: In 2023, the viral AI-generated image of the Pope in a trendy puffer jacket fooled millions of viewers, even journalists. The incident sparked urgent debates about misinformation, media literacy, and the ethical use of generative AI to spread false images, create visual propaganda, and blur the lines of reality. It became a case study in how quickly manipulated content can shape public perception, and why ethical literacy is more essential than ever.
Takeaway: Misinformation spreads when AI outputs are trusted without question, and propaganda travels fast. Ethical engagement requires users to pause, verify, and reflect, ensuring that AI is used transparently and responsibly rather than carelessly amplifying distortions. In short, we need to be tech-wise, not merely tech-savvy: AI demands transparency, reflection, and accountability to prevent misuse and misinformation.
Goal: Model critical thinking and reinforce ethical engagement with AI by emphasizing transparency, accountability, and responsible decision-making.
In Conclusion
As educators, we can teach and model using AI to celebrate collaboration, communication, critical thinking, and creativity. These values offer a framework for navigating the shifts we see.
AI redefines authorship and originality, reshapes how we communicate across cultures and modes, exposes the need to recognize bias in representation, challenges us to reconsider language as cultural memory, and demands that we recommit to ethical engagement.
Real-world examples, from AI-generated art and Amanda Gorman’s translators to Trump’s pressure on the Smithsonian and the viral image of the Pope in a puffer jacket, show both the risks of distortion and the opportunities for preservation and innovation.