I built a literary website using AI coding agents. Not a template. A full publication with original fiction, poetry, book reviews, prize coverage, interactive tools, automated reporting, structured data, and a content workflow that produces search-optimised long-form work at a pace I couldn’t sustain alone. I did it in weeks, not months, working from a laptop with no development team.
I’m also a published writer with twenty years of practice, a Pushcart Prize shortlisting, work in literary magazines across four continents, and a Master’s in Creative Writing. I care about the sentence. I care about the line break. I care about the difference between a word that’s right and a word that’s almost right.
I say both things because the conversation about AI and writing is almost always conducted by people who’ve done one or the other. They’ve either used AI tools extensively and don’t write literary fiction, or they write literary fiction and haven’t used AI tools beyond a passing experiment. The first group overestimates what AI can do with language. The second group underestimates what AI can do with everything else.
This essay is about what I’ve learned from sitting in both positions at once.
What AI can do
Let me start with the uncomfortable part: uncomfortable, that is, for writers who want AI to be irrelevant to their practice.
AI can produce competent prose. Not brilliant prose. Not prose that surprises you at the level of the sentence. But prose that’s structurally sound, grammatically clean, tonally consistent, and good enough to pass for professional writing in most commercial contexts. If you need a product description, a marketing email, a blog post about SEO strategy, or a 1,500-word article on a subject you’ve researched, AI can produce a draft that’s 80% of the way there in minutes.
AI can handle the architecture of a piece of writing. Structure, headings, logical flow, the scaffolding that holds an argument together. It’s good at organising information. It’s good at maintaining consistency across a long document. It’s good at the kind of editorial work that used to take a junior copywriter half a day: checking tone, catching repetition, flagging structural weaknesses.
AI can do the technical work that surrounds writing. Schema markup for search engines. Metadata for social sharing. Deployment scripts. Automated reporting. The entire infrastructure layer that sits between “I’ve written something” and “someone can find it on the internet.” This is where I’ve used AI most heavily, and it’s where the gains are largest. The technical work that used to require a developer, a designer, and a project manager can now be done by a writer with the right prompts.
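To make "schema markup" concrete: what an AI agent typically produces for each page is a small block of JSON-LD using schema.org's Article vocabulary. A minimal sketch in Python, with illustrative placeholder values for the title, author, date, and URL:

```python
import json

def article_schema(title, author, date_published, url):
    """Build JSON-LD Article markup of the kind an AI agent can
    generate per page. All field values here are placeholders."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": title,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "mainEntityOfPage": url,
    }

markup = article_schema(
    "The Iceberg Theory",
    "A. Writer",
    "2026-01-15",
    "https://example.com/essays/iceberg-theory",
)
# Embedded in the page head as <script type="application/ld+json">,
# this is what lets search engines read the piece as structured data.
print(json.dumps(markup, indent=2))
```

None of this requires a developer's judgement, which is exactly the point: it's boilerplate with rules, and rules are what language models follow well.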
AI can learn your voice. Not perfectly. Not in a way that would fool someone who knows your writing intimately. But well enough that with careful instruction, it can produce drafts that sound approximately like you. I’ve built what amounts to a digital twin: a set of instructions, style rules, and voice profiles that let AI produce first drafts in my register. The drafts aren’t finished work. They’re starting points. But they’re starting points that sound like me rather than starting points that sound like a machine.
What AI can’t do
Now the part that should reassure every writer who’s worried about being replaced.
AI can’t write a Carver sentence. I’ve prompted it with Carver’s rules, fed it examples, asked it to produce compressed minimalist prose in Carver’s register. What comes back is competent imitation. The sentences are short. The verbs are plain. The emotional content is understated. But the thing that makes a Carver sentence a Carver sentence, the specific weight of what’s been left out, the feeling that the silence between the words is doing more work than the words themselves, isn’t there.
This isn’t a limitation that will be fixed with better models. It’s a structural limitation of how language models work. AI generates text by predicting the most probable next word given everything that came before. Carver’s power comes from choosing the improbable word, or more often, choosing no word at all. The omission is the craft. A system built on prediction can’t reliably produce the unpredictable, and the unpredictable is where literary writing lives.
AI can’t write a Hempel ending. Hempel’s story endings work because they arrive from somewhere the reader wasn’t looking. The final line reframes everything that preceded it, but the reframing feels inevitable only in retrospect. That retroactive inevitability is the hardest thing in fiction to produce. It requires the writer to know where they’re going before the reader does, and to conceal that knowledge in every sentence leading up to the reveal. AI doesn’t conceal. It predicts. Concealment and prediction are opposite operations.
AI can’t sustain emotional truth across a piece of fiction. It can produce emotionally resonant sentences in isolation. Some of them are good. But the accumulated emotional weight that builds across a story, the feeling that you’re inside a consciousness that’s processing something real, doesn’t hold. After five paragraphs of AI-generated fiction, you can feel the absence of a living mind behind the words. The prose is correct but it’s not inhabited. That gap is the difference between writing and generation.
AI can’t surprise itself. This is the most fundamental limitation, and it’s the one that matters most for literary writing. A writer sits down to write a story and discovers, in the act of writing, something they didn’t know before they started. The sentence takes a turn the writer didn’t plan. A character says something the writer didn’t expect. The ending arrives from a direction the outline didn’t anticipate. This generative surprise, the experience of the writing teaching the writer, is what most writers describe as the reason they write. AI doesn’t have it. It can simulate surprise by producing unexpected outputs, but the unexpectedness is statistical, not experiential. There’s no one home to be surprised.
Where the panic goes wrong
The loudest voices in the AI-and-writing conversation are the ones predicting either utopia or apocalypse. AI will democratise writing. AI will destroy writing. AI will make every person a published author. AI will make authors obsolete.
None of this is happening. What’s happening is more mundane and more interesting.
AI is absorbing the middle layer of professional writing. The competent, functional, serviceable prose that fills corporate blogs, marketing emails, product pages, and content farms. That work is being automated because it was always automatable. It didn’t require a writer’s sensibility. It required a writer’s time. AI is cheaper than time.
Literary writing isn’t in the middle layer. It never was. The novel that changes how you see the world, the poem that makes you put the book down and stare at the ceiling, the story that stays with you for years after you read it, none of that is producible by a system that predicts the next word. The next word in literary writing is often the wrong word, chosen deliberately, placed precisely, and made to carry meaning that the right word couldn’t carry. That’s craft. It’s not automatable because it’s not predictable.
The writers who should be worried are the ones who were already producing work that could be mistaken for AI output. Generic prose. Safe structures. Predictable emotional arcs. Workshop-polished sentences that do nothing unexpected. If your writing is indistinguishable from what a language model produces, the problem isn’t the language model. The problem is the writing.
How I actually use it
My daily practice involves AI in ways that would have been unthinkable five years ago, and none of them involve AI writing any of my fiction or poetry. That work has been entirely mine since the day I first put ink on paper, and it always will be.
I use AI to build and maintain the technical infrastructure of my website. Schema markup, deployment scripts, automated performance reports, sitemap generation, structured data. This work is essential for search visibility and it’s work I couldn’t do without either AI tools or a developer. AI replaced the developer, not the writer.
I use AI to produce first drafts of critical essays and reviews that I then rewrite substantially. The AI draft gives me a structure, a set of arguments, and a rough tonal register. I rewrite every sentence. The final piece is mine. The process is faster than starting from a blank page, not because the AI writes well but because it writes badly in a useful direction. It gives me something to react against.
I use AI to manage the editorial workflow of a publication. Content calendars, deployment checklists, cross-referencing between pieces, internal linking strategies. The organisational work that a publication at any scale requires is exactly the kind of structured, rule-based work that AI handles well.
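The cross-referencing work is a good example of what "structured, rule-based" means in practice. A sketch of the kind of check an agent can run at deploy time, flagging pieces with no internal links (the function names, domain, and sample pages here are all hypothetical):

```python
import re

def internal_links(html, site_domain="example.com"):
    """Return hrefs in a page that point back to the site itself.
    site_domain is a placeholder; swap in the real domain."""
    hrefs = re.findall(r'href="([^"]+)"', html)
    return [h for h in hrefs if h.startswith("/") or site_domain in h]

def orphaned(pages):
    """Flag pages (name -> html) containing no internal links,
    the sort of editorial checklist item an agent runs on every build."""
    return [name for name, html in pages.items() if not internal_links(html)]

pages = {
    "review-1": '<p>See <a href="/essays/iceberg-theory">the essay</a>.</p>',
    "poem-3": '<p>No links here.</p>',
}
print(orphaned(pages))  # → ['poem-3']
```

The value isn't that any one check is clever. It's that a publication accumulates dozens of them, and an agent runs all of them on every deploy without being asked twice.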
I don’t use AI to write my fiction. I don’t use it to write my poetry. I don’t use it to produce the sentences that matter. The creative work remains mine because the creative work is the thing AI can’t do. Everything around the creative work, the infrastructure, the distribution, the technical optimisation, is where AI has changed my practice fundamentally.
What this means for writers
If you’re a writer in 2026 and you’re ignoring AI, you’re making a mistake. Not because AI will replace your writing, but because AI can handle the work that sits around your writing: the technical publishing, the search optimisation, the newsletter infrastructure, the reporting, the metadata. That work takes time. AI gives you that time back.
If you’re a writer in 2026 and you’re using AI to write your fiction, you’re also making a mistake. The output will be competent and lifeless. Your readers will feel the difference even if they can’t articulate it. The prose will be correct but uninhabited. The sentences will be smooth but unsurprising. The work will read like a simulation of writing rather than the real thing.
The craft argument about AI is simpler than either camp wants it to be. AI is a tool. It’s an extraordinarily powerful tool for the technical and organisational work of publishing. It’s a weak tool for the creative and literary work of writing. Use it where it’s strong. Don’t use it where it’s weak. The distinction is obvious to anyone who’s tried both.
The panic is wrong because it assumes AI and literary writing are competing for the same territory. They’re not. They never were. Literary writing happens at the level of the sentence, in the gap between what the writer intended and what the language produced, in the surprise of a line that the writer didn’t plan. AI doesn’t operate in that gap. It operates in the space of prediction, where the next word is the most probable word. Literary writing is the practice of making the improbable word the inevitable one. That’s a human skill. It’s going to stay that way.
For more on what literary fiction is doing in 2026, read Literary Fiction in 2026: What the Best New Books Are Actually Doing. For the craft principles that AI struggles to replicate, see The Iceberg Theory and What George Saunders Teaches Writers.
I publish fiction and poetry on Substack — gritty, minimalist, written from trains and borrowed rooms. Over 1,200 readers, free to subscribe.

Internationally published fiction writer and poet. Pushcart-shortlisted. Writing from trains, borrowed rooms, and strange cities. Publisher of Tumbleweed Words on Substack.