Part 2: Can technology be critically literate? 

  • Writer: Tabar Smith
  • Dec 10, 2025
  • 5 min read
A growing body of research on automated technologies has begun to consider the implications they have for… activities often understood as deeply, irrevocably, and centrally human, activities like writing.

Bradley Robinson | "Speculative Propositions for Digital Writing under the New Autonomous Model of Literacy," p. 127



[Image: A Buddha statue's head emerging from the trunk of a banyan-like tree]

Literacy is recognized broadly as an essential human activity. Through literacy, we grow as humans in search of personal and social meanings, and through the process of becoming literate, we develop as productive and compassionate citizens. Yet current research and discourse are concerned with generative AI’s negative impact on recognized human activities, like literacy. In his book More Than Words: How to Think About Writing in the Age of AI, John Warner discusses how generative AI denies its users “the deep pleasures of writing as a process through which you come to know your own mind in the context of the existing set of knowledge available to us all” (6). Thus, if we rely on generative AI to do our writing, we lose an opportunity to grow as meaning-makers. Other literacy scholars, practitioners, and artisans, such as Stephen King, share Warner’s perspective that we make meaning through writing about our observations and experiences (King 269-70). But it is not just “the experts” in the field of literacy who are concerned about generative AI’s impact. High school students recognize writing as an essential human trait that generative AI could be threatening. As Higgs and Stornaiuolo report:


One of their [students’] core concerns centers on the link between writing and what it means to be human—specifically, how writers’ authenticity and creativity are threatened when the labor of meaning-making can be delegated to machines. (633)


Can technology become critically literate? Can it displace human authenticity and creativity? To reflect on these questions, I turn to one of the giants in literacy praxis, Donald M. Murray. In his 1972 essay, “Teach Writing as a Process Not Product,” Murray recognizes the necessity of honoring the literacy process as part of our human development.


What is the process we should teach? It is the process of discovery through language. It is the process of exploration of what we know and what we feel about what we know through language. It is the process of using language to learn about our world, to evaluate what we learn about our world, to communicate what we learn about our world. (4)


Murray breaks the writing process into three stages, with each stage guiding literacy students towards more refined ideas, stronger comprehension, and improved literacy skills. The first stage, prewriting, requires being aware of our world, interpreting our experiences, and perceiving ways to share our thoughts (4). Murray describes the next stage, writing, as the most avoided because it demands a commitment to our ideas and to our ability to effectively communicate them (4). The final stage of rewriting encapsulates rethinking, revising, refining, and polishing (4). This stage asks that we deeply reflect on our ideas, audience, format, and tone. Through rewriting, we hone our ideas and our authentic voices. Simultaneously, we are aware and considerate of our potential audiences’ interpretations and perspectives, which at times could be at odds with our own. 


What’s interesting about Murray’s writing process is that each stage speaks to inherently human abilities and traits, which generative AI cannot currently replicate. As described by Robinson, citing Edwards et al., generative AI produces content stochastically by selecting and stringing text together based on its training data and user-entered prompts (123). When comparing this description to Murray’s writing process, we can see that generative AI lacks the human abilities and traits to autonomously write. Generative AI does not prewrite the way humans do. It cannot experience the world and form authentic perspectives based on those experiences. During the writing stage, generative AI is not concerned with being exposed as a fraud when it produces output. In fact, generative AI is often exposed as a fraud when caught “hallucinating.” And because generative AI lacks authenticity (its output being data compiled from other sources) and cannot generate revised output without guidance from a human, it cannot autonomously rewrite. Through this comparison, we see that generative AI’s content production process is strikingly different from the human writing process.


This idea that writing — literacy — requires human abilities and traits beyond what technology is capable of is also supported by research completed before generative AI was developed. The New London Group’s "A Pedagogy of Multiliteracies: Designing Social Futures" was published in 1996, decades before generative AI was introduced. Yet the group recognized literacy as an inherently human trait that cannot be replicated by technology:


… the human mind is not, like a digital computer, a processor of general rules and decontextualized abstractions. Rather, human knowledge, when it is applicable to practice, is primarily situated in sociocultural settings and heavily contextualized in specific knowledge domains and practices… a process that is acquired only through experience… (84)


So when technologies, such as the Internet, were just entering the scene, research and theory were already positioning literacy as a human skill that could not be replicated by machines. 


Literacy requires awareness and experience within a sociocultural environment. And although sociocultural systems are not exclusive to humans (other living species have their own sociocultural behaviors and activities), generative AI is not an entity equipped to autonomously participate in such systems. Generative AI is impressive in the amount of information it can process at beyond-human speeds, but it still cannot do what humans do. It cannot perceive, interpret, and comprehend the world based on unique sociocultural experiences.


Before The New London Group’s multiliteracies pedagogy, Louise Rosenblatt presented her transactional theory of reading. This theory positions the act of reading as an interactive process of meaning-making between the author, text, and reader (Soosaar 141). A text does not exist in isolation, but requires both the writer’s and the reader’s experiences, interpretations, and perspectives to be actualized (Soosaar 141). Through the transaction, the reader also becomes changed. 


Just as the reader’s background strongly affects their understanding and interpretation of texts, so each subsequent text they read further contributes to the reader’s store of knowledge, refining the reader’s education and taste and forming a backdrop for the next text, thus creating a system of “reciprocal interplay." (Soosaar 143)


Reading is an opportunity for humans to create new meanings and to grow intellectually, emotionally, and socially. Generative AI does not create new meaning based on lived experiences; it consumes data and generates statistically likely strings of text as output.


The National Council of Teachers of English (NCTE) also recognizes literacy as a human endeavor. The council’s “Understanding and Teaching Writing: Guiding Principles” position statement asserts that writing is inherently human. It is an act completed “by people, in specific situations and contexts… represent[ing] different ideologies, values, and identities” based on lived experiences. The council asserts that writing is a social transaction occurring within and for a community. Writing is the outcome of uniquely human experiences within a sociocultural environment. Again, this outcome is not something that generative AI can currently replicate. 


Generative AI currently cannot experience, make meaning, or participate in a sociocultural world the way humans do. So is our concern about generative AI’s impact on human essentialism founded in reality? Or is literacy so exclusively human that generative AI cannot displace us in the activity? Perhaps our concern is more a reflection on the potential for technology to one day displace us.

