I teach technical writing at the local community college, and lately my students and I have been exploring the mechanics of AI prompting. At first, it seemed like a relevant extension of the course and a way to show students how the tools of language are evolving in the modern workplace. But the more we practice, the stranger prompting becomes. It’s not merely another interface skill, but a form of vibe writing.

I don't mean "vibe writing" in the lazy sense ("Write an essay on the theme of revenge in Hamlet"). I mean it as pragmatic co-authorship with a machine trained on an astonishing sweep of publicly available text. Like a Google search history, prompting doesn’t just express a need; it traces the contours of whoever's asking. In other words, our prompts say a lot about us.

Every prompt is a clipped imperative that reads more like an admission than a command. We think we’re instructing a language model, but in the process we’re revealing more about ourselves than we mean to: our tastes, our tensions, the paper cuts and micro-burdens of everyday life. Whether you use AI as a shortcut or as a vibe-writing companion, the prompt is the new grammar of desire.

Take vulnerability, for instance. Certain patterns surface quickly when you’ve read and written enough prompts: the pursuit of validation ("Make this cover letter sound more professional"), a hunger for intimacy ("Revise this apology to make it sound more heartfelt"), the veiled flex ("Respond to this email in a way that makes me seem generous but establishes dominance"). These aren't operational needs. They're emotional blueprints. In asking us to articulate our needs upfront, prompts function less as inputs and more as mirrors.

They demand a peculiar kind of clarity, too. Yes, the technical clarity of good instructions, but also something akin to shameless confession. To get the best possible output, we must spell out exactly what we want with a bluntness that would breach decorum under normal circumstances. The social choreography we perform in ordinary conversation—the hedging, the euphemistic doublespeak, the polite indirection—collapses into “Hey ChatGPT, make this sound smarter.”

Call it transgressive clarity: the freedom to state our vanities and petty manipulations plainly to something that won’t judge us. The prompt as a genre permits a frankness we’ve trained ourselves to suppress. Not "help me improve this" but "revise this so I sound more clever, capable, and emotionally grounded." What emerges is a new rhetorical mode that, in the name of productivity or meeting a deadline, allows us to speak without the usual filters of respectability. Machines don’t blush, so neither do we.

In this way, prompts map our collective insecurities around sounding like an imposter or being misunderstood. The corpus of tiny revelations from a billion users forms a vast meta-literature of human wish fulfillment.

After all, our prompts do not vanish into the ether. These capsule yearnings are parsed, categorized, and, unless you navigate byzantine opt-out settings, used for training purposes. Temporary chat windows are designed to imply disposability, not deliver it. The instructions you enter there aren’t merely inputs; they’re behavioral signals that paint an alarmingly accurate portrait of your inner life.

Prompts accumulate, the inner lives they disclose increase shareholder value, and the whole collection becomes an enormous archive stored in vaults beyond mortal reach. Which means prompts are never just artifacts of personal expression. They’re data, digital gold, a traceable record of who you are in moments that feel private but aren’t.

And yet there's something liberating about opening a chat. I see it again and again with my students. The prompt is an emerging literary form, and like a video game, it lets you experiment without real-world consequences. You can try out new voices, simulate different modes of expression, and A/B test competing versions of who you might become. In the process, prompting offers a space for self-fashioning. Longing becomes playable.

Naturally, this also means longing becomes commodified. Marketplaces appear overnight. Online training programs abound, promising lucrative careers. Promptfluencers and SEO writers peddle killer templates for writing better emails, resumes, and dating app openers. Prompting is often touted, somewhat exotically, as a kind of engineering and paraded as a discipline in its own right to draw in less technical audiences. The grammar of desire, it seems, will eventually require a monthly premium plan with no limits.

There’s a risk, too, that the precision prompting demands might flatten ambiguity over time. As we learn to specify our deepest wants more efficiently, we may lose touch with their shapeless edges. There's a difference between feeling lost and asking Claude to "add a paragraph to this essay that evokes existential disorientation in the style of early DeLillo." One is an experience; the other is a spec.

But then again, maybe prompting offers a novel way to express ourselves. When you ask a chatbot for help, you’re not just calibrating tone or fussing over syntax. You’re trying to put words to a thought you can’t quite define, or to a feeling you’re not quite ready to admit out loud. AI doesn’t share your struggle, but it can provide useful scaffolding, and sometimes that's enough.

In short, the prompt isn't just a technical input. It's a psychic document. A crystallized want. A prayer flung into the void. We tell ourselves we want output that’s less raw and more coherent, but what we're really after is harder to name. Some confirmation that our longings still matter. That they might, against all odds, be more clearly understood.

Or at least rendered presentable enough to earn a passing grade. Because of course, there’s a prompt for that.