Gen AI as a Cognitive Tool
- Rachel Eunjoon Um
- Sep 9
- 3 min read
The emergence of Generative AI has rekindled longstanding concerns about the role of technology in education. Like calculators in math classrooms or word processors in writing courses, each new technology raises questions about its impact on teaching, learning, and academic integrity.

I still remember being in college in the 1990s when computers, especially word processors, were becoming mainstream. Some professors refused to allow them, insisting that handwriting itself was part of the learning process. Yet today, word processing apps (e.g., Google Docs, Microsoft Office) are default tools that enhance how we write, think, and collaborate. In my in-person classes, nearly every student brings a laptop or tablet connected to the internet, using it for note-taking or for quick activities on the course's Learning Management System (LMS).
What was once controversial has become indispensable.
Generative AI is at a similar crossroads. Some educators embrace it, others resist it, and many remain unsure how to integrate it. The question isn’t whether AI will shape education, but how we should frame it as a cognitive tool that supports, rather than replaces, human thinking.
Cognitive Capabilities and Tools
According to cognitive architecture theory (Langley, Laird, & Rogers, 2009), human cognition includes key capabilities and functions:
- Recognition and categorization
- Decision making and choice

Additional capabilities for human-level intelligence include:
- Perception and situation assessment
- Prediction and monitoring
- Problem solving and planning
- Reasoning and belief maintenance
- Execution and action
- Interaction and communication
- Remembering, reflection, and learning
These functions mirror the essence of learning. A cognitive tool, then, is one that helps us engage, strengthen, and extend these capabilities.
From Computers to Cognitive Partners
David Jonassen (1995) argued that computers should not merely deliver instruction but act as cognitive tools that help learners construct knowledge. Tools like databases, spreadsheets, semantic networks, and expert systems require active engagement, reflection, and problem solving. They don’t simply provide answers; they shape how learners represent and reorganize knowledge.
By offloading routine tasks and amplifying reasoning, these tools leave behind what Jonassen (1995, p. 45) called "cognitive residue": mental habits and structures that persist long after the technology is gone.
Generative AI as a Cognitive Tool
Viewed through Jonassen's framework, Generative AI can function in much the same way. When learners use large language models to generate hypotheses, draft arguments, simulate dialogues, or reorganize knowledge, they are not outsourcing thinking but extending it.
The key is intentional use: Generative AI should be a partner in cognition, not a substitute. When used to prompt reflection, inspire alternatives, and scaffold complex tasks, AI supports higher-order thinking. If treated merely as an answer machine, however, it risks undermining the very cognitive growth it could enable. Major AI providers are already moving in this direction: OpenAI with Study Mode in ChatGPT, Anthropic with Claude for Education, and Google with Guided Learning in Gemini, all emphasizing AI as a scaffold for reasoning, reflection, and deeper understanding rather than a shortcut to answers.
In this sense, Generative AI is best understood as part of a long tradition of cognitive tools: controversial at first, transformative in practice, and ultimately essential for shaping the way we learn, think, and create.
References
Jonassen, D. H. (1995). Computers as cognitive tools: Learning with technology, not from technology. Journal of Computing in Higher Education, 6(2), 40–73. https://doi.org/10.1007/BF02941038
Langley, P., Laird, J. E., & Rogers, S. (2009). Cognitive architectures: Research issues and challenges. Cognitive Systems Research, 10(2), 141–160. https://doi.org/10.1016/j.cogsys.2006.07.004