Writing Under Surveillance

Small Wins & Teaching Tips

If students' AI use in your class conflicts with learning goals or disrupts classroom community, the quick tips below offer recourse and direction for future courses:

Progressive Disclosure

Release course material and run activities in bite-sized portions, framing learning as an ongoing, iterative process tied to students' growth both in your class and as independent learners.

Example: Marc Watkins suggests allowing students access to only the first few weeks of course content at a time, which has the added effect of making it hard for students to use an LLM to flatten the course in a single sitting.

Assignment Makeovers

Audit your assignments for tasks that AI can complete faster and more fluently than a student just learning the material. Then redesign those tasks so that AI assistance is either clearly permitted, clearly irrelevant, or clearly insufficient. An assignment that asks students to apply a concept to something specific in their own lives — a neighborhood, a family history, a job — is harder to outsource than one that asks them to explain the concept in the abstract.

Example: A research paper prompt asking students to "analyze themes of justice in Beloved" can be largely AI-generated. A revised prompt asking students to trace a single recurring image across three scenes they personally selected, and explain why they chose those scenes, requires the kind of reading and judgment that cannot be delegated.

Check-ins and Analog Opportunities

Build in low-stakes, in-class moments that give you direct contact with students' thinking. Short handwritten responses, verbal check-ins, and brief one-on-one conversations create a running record of student development that no AI can fabricate in the moment. These also shift the weight of evidence from any single high-stakes submission to a pattern of visible engagement over time.

Example: At the start of each class, ask students to spend five minutes writing a "where I am" note: one thing they understand, one thing they're confused about, and one question the reading raised for them. Collect these on index cards. Over the semester, they become a portrait of intellectual growth — and a record that makes false accusations of AI use nearly impossible to sustain.

Transparent Dialogue

Open a frank class conversation about AI early — not as a policy announcement but as an inquiry. Ask students what they already use, what they find useful, and what concerns them. The conversation typically surfaces anxieties you didn't know were there, and it repositions the instructor as someone engaged in the same questions students are, rather than someone administering a policy from above.

Example: Jesse Stommel recommends running a "syllabus day" activity where students collaboratively draft the course's AI policy rather than receiving one pre-written. Students who helped write the policy are more likely to understand its reasoning and less likely to feel surveilled by it.

Disclosure Over Detection

Replace detection with disclosure. Ask students to note when and how they used AI assistance, what it produced, and what they changed or rejected. This treats AI as a citable resource rather than a forbidden tool, builds habits of transparency, and gives you far more useful information than any detection score. It also makes clear that the problem isn't AI use — it's undisclosed substitution of AI output for student thinking.

Example: Add a brief reflection prompt to any major assignment: "If you used an AI tool at any point in this project, describe where, what it gave you, and what you did differently as a result." Students who used AI thoughtfully often write the most interesting reflections. Students who didn't use it at all have nothing to hide.

Process Documentation

Require students to show the work alongside the product. Draft submissions, revision histories, annotated outlines, and research journals make process visible. They also shift what you're evaluating: not the polish of a final artifact but the quality of thinking that produced it. AI can generate a fluent essay in seconds; it cannot generate a genuine revision history that shows a student changing their mind.

Example: For a research paper, require students to submit (1) their initial search terms and why they chose them, (2) three sources they considered and rejected with a brief note on each, and (3) one place where their argument changed during writing and what changed it. These three pages tell you more about the student's thinking than the paper itself.

Clear, Specific Disclosure Policies

Ambiguous policies create anxiety without providing guidance. A policy that says "do not use AI" leaves students unsure whether asking an AI to explain a concept counts, whether running a draft through a grammar checker counts, or whether reading an AI-generated summary of a book they were supposed to read counts. A specific policy that names the tools at issue and describes what constitutes use gives students something to work with — and gives you a defensible standard that doesn't require detection software to enforce.

Example: The Teach@CUNY AI Toolkit offers a range of ready-to-adapt policy statements, from "AI use is required" to "AI use is not permitted," each with language that explains the reasoning behind it. Starting from a template reduces the cognitive overhead of drafting from scratch and ensures students receive a policy they can actually act on.

Personalized Assignments Rooted in Positionality

Design writing tasks that ask students to draw on where they come from, what they know from living it, and what their community has taught them. A student's neighborhood, family history, language background, or work experience is knowledge no language model has access to. Assignments that require students to locate themselves in the question — to write from a position rather than above one — produce writing that is irreducibly theirs.

Example: Instead of asking students to analyze code-switching as a linguistic phenomenon, ask them to write about a time they shifted registers — at home, at work, in a classroom — and what that shift cost or gained them. The analysis follows from the experience, and the experience belongs only to the writer.