Rage Over Drops of Water

People have called my process lazy. They're not entirely wrong. I don't like doing unnecessary work. We just have different ideas on what's considered "necessary". For me, staring at a blank screen or fretting over the placement of a few words doesn't fall into that category.

I've seen folks label works that involve AI as "soulless," "fake," and "not art." In some cases, I agree. I also see stuff that's 100% human-made, churned out with little actual care. We're talking about algorithm-chasers here, not novice creators putting their work out into the world. You can spot the difference.

When you look at conversations on platforms like Threads, you'd think folks like me are personally responsible for the economic and environmental impacts.

Not the AI model developers. Not the tech companies. Not the businesses pushing their employees to use Gen-AI.

People like me. For being open about using it as a tool.

Look, the harms are real. I don’t claim to have clean hands. I did the moral calculus and decided that for me, the end justified the means. I needed a tool that helps my brain bypass the inner critic during the development process. More time spent with my family and community, or taking care of my other needs, and less time staring at a blank screen, my brain going “you suck as a writer, Jess.”

And if you're still not satisfied with that answer — if my brain chemistry and my need to make something feel insufficient — then let's talk about what you're actually asking me to give up, and whether you're asking that of yourself.

The saga I'm building isn't a power fantasy. It's a story about life on the fringes — people who are tired, overwhelmed, and finding ways to push back. The antagonists aren't shadowy masterminds. They're petty. They're emotionally immature. They're embarrassingly human in their cruelty, which is the point.

There are no heroes in this story. There are people who use what they have — however imperfect, however compromised — because that's what survival looks like. Power, in this story, isn't about domination. It's knowing who you are when everything around you insists you should be something else — and choosing, stubbornly, to stay that person anyway.

My protagonists didn't wait for the perfect instrument. Neither did I.

Every choice I make carries its own weight, its own ledger of consequences.

When my ferret chews up an important computer cord on a Tuesday night, do I rearrange my schedule for the forty-minute round trip to a big-box store? Or do I pull up Amazon and have it at my door by the time I wake up the next morning, at a third of the cost? I already know the answer. I've clicked that checkout button a hundred times without losing a single night of sleep.

Then there’s clothes shopping. I’m very particular about the sensory side: a certain fabric texture, how it breathes, how it drapes, and whether there are tags (no, please). My go-to brand: A New Day by Target. I’ve tried similar tops from other (also big-box) stores; they aren’t the same. So do I uphold a boycott and wear clothes that don’t feel good or fit my body? Or do I learn how to take care of the shirts I have, and buy more from Target when the stains and frays become too much?

When I'm standing in the grocery store under those cold fluorescent lights, staring at a shrink-wrapped package of chicken thighs — the cheap ones, the ones I can actually afford — do I put it back? I know what those facilities look like. I've read the reports. I've seen the footage. My hand still reaches for it.

I fill up my gas tank and drive around because it’s more convenient than taking the bus (especially with those groceries). My dishwasher runs once or twice a day because I spend my energy on cooking and don’t want to clean. I let the shower run a little longer on mornings when everything feels too heavy to face quickly.

None of this makes me a monster. None of it makes me a saint, either.

We are all, constantly, negotiating the distance between the world we want and the one we actually live in. The moral calculus isn't a clean equation — it's a running tally full of asterisks and footnotes and items we've quietly agreed to stop examining too closely.

Critics raise concerns about the consequences of AI without acknowledging that we are already making numerous moral and environmental trade-offs to maintain the systems we live in. They go “all or nothing” because a nuanced, middle-ground view of AI use in creative works is scary.

Who does this approach actually serve when we, as individuals, don’t have much sway? Those who are genuinely impacted by emerging technology in a society that’s pulled back its safety nets, or those who want to perform “goodness” by villainizing others who are just trying to get by?

I'll give the critics this much: AI is different. Not morally different — but it feels different, and that's not nothing. There's something about watching a technology colonize human expression at this speed that trips a wire Amazon never did. Maybe because language feels like the last sovereign territory. Maybe because we've already ceded so much — our attention, our privacy, our social lives, our sense of shared reality — and this feels like the final room. The one we thought we'd always have to ourselves. That unease is real, and I'm not interested in arguing anyone out of it.

It's not just that jobs are threatened or industries disrupted — it's that the thing being disrupted is meaning-making itself. The stories we tell about who we are. The particular human struggle of trying to find words for something — the hours, the false starts, the moment a sentence finally arrives and feels true. There is an identity wound in here that the economists can't quite measure.

We are creatures who have always defined ourselves, in part, by what we make and how we make it. Art, language, expression — these weren't just outputs. They were proof of something. Evidence we were here, that we felt, that the feeling mattered enough to shape into form. And now a machine can approximate that shape, at scale, in seconds, and we are asked to simply update our understanding of what creativity means.

But here's what I keep coming back to: we stopped flinching at these trade-offs long ago. Not because we resolved them; we just got used to them. Normalized them right down to the bone. So when someone holds AI to a standard of purity they've never applied to anything else they do before noon, I don't think it's because they're more ethical. I think it's because this one still feels new. Still feels like a choice. Like refusal still means something.

Maybe it does. I genuinely don't know.

What I know is that I'm creating. That the work exists that didn't exist before. And that I'll keep making the same kinds of trade-offs everyone makes — visible and invisible, clean and not — for as long as I'm doing this.

The tally runs. It always has.