Teaching with Technology and Critical Reflection
This post is part of a series that comprises my Penn State Teaching with Technology (TWT) Portfolio. More information on TWT, along with links to each of these posts, can be found at this hub page.
Teaching with Technology Rubric Item: Philosophy Statement
The responsible use of technology in the classroom requires two types of critical reflection on the part of the instructor. The more obvious is forward-looking: what software applications or presentation tools might I incorporate to improve my students’ learning experience? The less obvious is a sobering look at the present: what technology is already embedded in my classroom, and how do I deal with it responsibly and ethically? We need to recognize the latter before we can make good, educated decisions about the former.
Before the first day of class, even the most devoted Luddite of an instructor (one who forbids phones and computers and uses nothing but chalk and a board as visual aids) is already deeply embedded in institutional technological systems. The official class roster lives in a student information system (SIS); cases for students struggling in their courses may be tracked in a separate database; a Learning Management System (LMS) may be mandated by the institution, or de facto mandated by students’ expectations. In the classroom, students’ phones, despite being on silent, will inevitably buzz with a social media notification, an urgent text from a family member, or a message from the campus’s emergency alert system (which students have been strongly encouraged to enroll in).
All these conditions are predetermined before an instructor decides to add any element of technology to their classroom.
Effective teaching with technology requires a critical awareness of such infrastructural technologies and their attendant power structures and politics. As Langdon Winner has argued, all artifacts, including technical devices, have politics, whether in the direct effects of the devices themselves or in the particulars of the sociopolitical configuration that allows a device to exist.[^1] For example, the LMS Canvas reports to the instructor data on how students have interacted with the course site. At its simplest, as of this writing, it offers a star rating of each student’s level of activity relative to that of the rest of the class. Another set of analytics provides seemingly more precise graphs and counts of “participations” and “page views.” Such measures suggest a norm of what constitutes a good student and offer mere proxies[^2] for what we actually need to know: is the student learning, and how might we help them learn better? At a program-wide or institutional level, such tools might successfully correlate activity with learning. For the individual instructor, graphs and yellow stars paint too simple a picture. Instructors need to apply a critical digital literacy to their own tools, and then model for students how to exercise the same acumen themselves.
Adopting technologies mindfully and intentionally, and narrating that process to students, aids them not only in the particular educational pursuits of a given class, but also in their own inevitable task of mindfully adopting or refusing technologies in the future. In my own classroom, I’ve practiced this through exercises in which students use and critique technologies they will likely encounter in professional or postgraduate work,[^3] such as Microsoft Office 365 and ChatGPT.
Microsoft Office 365 is a useful platform to discuss because it is so ubiquitous as to be near-invisible. Most students will have to work with it in some way, whether writing memos on a company template or exchanging book revisions with an editor. For the undergraduate classroom, though, its utility lies in making visible to students how technology tools shape writing and how corporate and institutional decisions shape those tools. In my literature courses, I require my students to use Microsoft Word to collaborate on a shared Course Journal document. The formatting requirements are simple: they must apply Microsoft Word styles to headings and subheadings. In meeting them, students learn a basic but genuinely useful skill that they can carry into professional work.
But this also occasions a discussion about how large institutions make decisions about software. In a discussion during my Business and Literature course, students noted that they had rarely touched Microsoft Office before; they had instead used Google Docs, which, I pointed out, was likely due to Google’s push into K–12 education in recent decades. Now, at the college level, my journal assignment and others in the course were compelling them to learn (and occasionally struggle with) Microsoft Office, a platform widely supported by their institution in both classroom and administrative settings.[^4] While students might have no interest in large-scale procurement decisions, discussions like this help them recognize how administrative decisions shape the technology tools they use and how they use those tools to produce and circulate knowledge.
The speed at which generative AI has been deployed, and at which its use has been normalized among students, similarly demands critical reflective practices. In two of my classes, I’ve read with students the Chronicle of Higher Education piece by Owen Kichizo Terry, and students largely concurred with the main claim that serves as its title: “I’m a Student. You Have No Idea How Much We’re Using ChatGPT.”[^5] Terry’s piece describes the most common use cases my students reported: not wholesale generation of essays, but assistance with brainstorming, finding synonyms, and revising select sentences and paragraphs. For every student whose essays are laden with AI-sounding hyperbole and telltale nonsensical phrases, there are likely five or six who use the tool just as they’d use Grammarly, another technology whose use is ubiquitous among students (as one professor working in the writing center reminded the faculty: “by the way, all your students are using Grammarly”).
To defamiliarize the tool, I adopt Annette Vee, Carly Schnitzler, and Tim Laquintano’s call for courses to “work with LLMs [Large Language Models] as objects of study.” Rather than using the tools instrumentally for writing, students should be writing about and studying the tool itself.[^6] In an asynchronous technical writing course, I include a discussion board assignment in which students must use a generative AI tool to write a 100-word text of a kind they would likely produce in a college or workplace setting. By refining their results and documenting their process, students gain a better sense of what AI sounds like and of the use cases where it might be more or less effective. I also challenge them to consider the ethical implications of using the tools for the purpose they’ve chosen.
In his 2005 Kenyon College commencement speech,[^7] David Foster Wallace begins with an anecdote:
“There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says ‘Morning, boys. How’s the water?’ And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes ‘What the hell is water?’”
Wallace reminds the graduates that their liberal arts education empowers them to make well-informed, conscious decisions about how they engage with the world around them. Similarly, we must challenge our students to recognize the water around them and to act with deliberation and intention. Though new technologies are often (rightly or not) credited with dramatic “revolution” or “disruption” of our social world, our most effective point of intervention as teachers, and specifically as writing teachers, is to help students recognize the more subtle ways in which corporate relationships, technological systems, and programming affect what they write, how they write it, and how they circulate it.
[^1]: Winner, Langdon. The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: University of Chicago Press, 1986.
[^2]: Cathy O’Neil uses the term proxy data to describe data used as a stand-in. Proxy data may be related to an attribute a model is attempting to predict, but it can be a poor substitute, used because it is easier to work with than any direct measure of the attribute itself. For example, a model may attempt to quantify teachers’ effectiveness but rely heavily on testing results, which may be driven by factors far outside a teacher’s influence. See O’Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Reprint edition. New York: Broadway Books, 2017.
[^3]: This approach focuses on what Stuart Selber has described as functional and critical digital literacies. Selber, Stuart A. Multiliteracies for a Digital Age. Studies in Writing & Rhetoric. Carbondale: Southern Illinois University Press, 2004.
[^4]: During my time at Penn State, the institution slowly replaced a number of tools and products provided by other vendors with their Microsoft equivalents: email, two-factor authentication, cloud storage, and office telephony.
[^5]: Terry, Owen Kichizo. “I’m a Student. You Have No Idea How Much We’re Using ChatGPT.” The Chronicle of Higher Education, May 12, 2023. https://www.chronicle.com/article/im-a-student-you-have-no-idea-how-much-were-using-chatgpt.
[^6]: Vee, Annette, Carly Schnitzler, and Tim Laquintano. “It’s a Good Time to Experiment with AI in Writing Classes.” May 8, 2024. https://annettevee.substack.com/p/its-a-good-time-to-experiment-with.
[^7]: Wallace, David Foster. “This Is Water by David Foster Wallace (Full Transcript and Audio).” Farnam Street (blog), n.d. https://fs.blog/david-foster-wallace-this-is-water/.