I built an AI system that can write an entire book. Here’s why I won’t let it.


Let me get this out of the way from the outset. I use AI to help me create content. Every single day. And I’m proud of it.

But I’m also not sure I’ve figured it out yet. Honestly, I don’t think anyone has. And I’m starting to believe that the people who sound the most certain about AI are the ones I trust the least.

Because this isn’t a simple conversation. It’s a deeply human one. And I think it’s time we had it honestly—without pretending there’s one right answer.

The Tension I Can’t Resolve

A close friend of mine—someone I deeply admire and respect—recently made the case that AI orchestration is the new creative skill. He argues that if you build sophisticated enough systems around AI—voice profiles, quality frameworks, multi-stage pipelines—then the output is authentically yours because your methodology produced it. The recipe is you, even if a machine helped cook it.

He’s not wrong.

Another friend—someone I respect just as deeply—holds the opposite view. He argues that the act of creating should stay human. Use AI for research, data analysis, proofreading, even stress-testing your ideas. But the actual writing? The first draft? The original thinking? That should come from you. His reasoning is simple. When everyone automates the creative process, the person who still shows up and does the hard work by hand will stand out like a kid going door-to-door selling Girl Scout cookies in a world of QR codes.

He’s not wrong either.

I sit between these two people and I’ve spent months trying to figure out which one is right.

Here’s what I’ve landed on… they both are. And that’s exactly what makes this so hard.

What I’ve Built (and Why I’m Proud of It)

I’ve spent nearly twenty years creating online learning experiences. I’ve studied how people learn, what makes them engage, and why most courses fail to change anyone’s behavior. I’ve built frameworks informed by some of the best minds in instructional design. I know how to meet a learner where they are and guide them toward a genuine shift in how they think, feel, or act.

And I’ve taught AI to do a lot of that with me.

I’ve built custom skills that contain two decades of my expertise—my methodology, my voice, my understanding of what makes learning stick.

Recently, I used one of those systems to create a course. When my friend reviewed it, he said, “This isn’t just a course. It’s a book!”

I know that comment was meant to be a compliment.

Instead, it opened a door I’ve been standing in front of ever since.

Is This the Line I Can’t Cross?

Here’s the thing. I don’t have a single reservation about publishing that course for our company. It provides genuine value. It contains real expertise—mine, built over years of working with real people. Sure, AI helped me organize, articulate, and scale what I already knew deeply. And I’m 100% confident that every learner who goes through it will walk away with something meaningful.

So why does a book feel different?

When I heard the suggestion that the content is strong enough to become a book, something in me resisted. Not logically. Deeper than that. In my gut.

Because a book carries a different kind of promise.

When you pick up a book with someone’s name on the cover, there’s an unspoken agreement between you and the author. It says: I lived with these ideas. I wrestled with them at 2 a.m. when the words wouldn’t come. I chose every sentence, not because an algorithm suggested it, but because I believed it was the truest way to say what I needed to say.

A course is a service. Its value lives in the outcome… what you can do after you finish it. A book is something closer to art. Its value lives partly in the outcome, but also in the relationship between the reader and the person who wrote it.

And I don’t think I can put my name on a book that AI helped write and feel right about it.

“By Shawn Hesketh.” But is it, really?

I know not everyone feels this way. Some people I deeply respect have used AI to produce entire books, and they’re genuinely proud of the result. They’d argue that their frameworks, their ideas, and their expertise are what made the book valuable—AI just helped with the assembly. And I understand that argument.

I just can’t make it for myself. Not yet.

The Blind Men and the Elephant

There’s an old parable about a group of blind men encountering an elephant for the first time. One touches the trunk and says, “An elephant is like a snake.” Another touches the leg and says, “No, it’s like a tree.” A third touches the ear and says, “You’re both wrong—it’s like a fan.”

They’re all touching the same elephant. They’re all telling the truth about their experience. And they’re all missing the bigger picture.

That’s where we are with AI right now.

(And a LOT of other things, if we’re honest.)

The person who says, “AI is just a tool—use it for the grunt work and protect your creative process”? They’re touching a very real part of the elephant.

The person who says, “AI orchestration is the new creative skill—build systems, show your methodology, scale your best thinking”? Also touching something real.

The person who says, “I used AI to write a book, and the ideas are mine, so the book is mine”? I get it. I disagree, but I get it. They’re touching the elephant too.

And the person who feels uneasy about all of it—who uses AI every day but still wonders if something essential gets lost when we hand over too much of the creative process? That’s me. And if it’s you too, I want you to know: your uneasiness isn’t weakness.

It might be wisdom.

Nobody has the full picture yet. Nobody. And anyone who tells you they do is either selling something or hasn’t thought about it deeply enough.

Where It Gets Personal

I want to share something I’ve been watching play out in real time—not to point fingers, but because I think it reveals something important about what’s at stake.

I work alongside someone who's deeply gifted at creating content that connects with people on an emotional level. She's a fantastic content creator and designer.

But she doesn’t just “make things look pretty.” She thinks deeply about how every piece of content, every element of a design, will make someone feel, think, and act. She considers the emotional resonance—the experience of each person who encounters that piece of content. There’s an intentionality to what she does that goes way beyond aesthetics.

And it works! The content she creates resonates and compels people to take action.

She’s been told she should use AI to produce more content, faster. Dozens of articles per week. Automated graphics. Volume over depth.

And I’ve watched something happen to her that I think is happening to a lot of creative people right now… she’s starting to wonder if her gift even matters anymore.

It does. God, it does. Now, more than ever.

Because the truth about AI-generated content at scale is that it’s efficient. It’s consistent. And most of the time, it’s completely forgettable.

Content that actually changes how someone thinks—the article that makes a reader stop and say, “They’re talking about me”—almost always comes from someone who poured themselves into creating it. Someone who stayed with the work longer than was efficient because they knew the feeling wasn’t right yet.

AI doesn’t know when the feeling isn’t right. It can’t. That’s a human skill. And it might be the most important one we have left.

The Irony Nobody’s Talking About

Here’s something else I find fascinating.

Some of the loudest advocates for AI-first content creation are the same people who rely on humans to do the emotional heavy lifting in their organizations. They need someone on their team to read the room. To make sure people feel valued. To create a sense of belonging that no algorithm can manufacture.

The skills they want to automate are the exact skills they can’t function without in person.

I don’t say that as criticism. I say it as an observation about a blind spot that I think a lot of organizations share right now. We’re so focused on what AI can produce that we’re undervaluing what it can’t. Things like genuine human presence, emotional intelligence, and the kind of deep caring that makes team members feel seen, needed, and appreciated.

Those things don’t scale. And that’s precisely what makes them valuable.

Where I’ve Actually Landed (For Now)

So here’s my working framework. It’s not perfect. It’ll probably evolve. But it’s honest.

I use AI as a thinking partner, not a ghostwriter. When I sit down to create, I already know what I want to say. I know the insight. I know the story. I know the person I’m writing for and the shift I want to create. AI helps me organize, refine, and sometimes find a clearer way to express what I’ve been circling. But the raw material—the lived experience, the hard-won insight, the empathy that comes from sitting with people in their real struggles—that’s mine. That’s non-negotiable.

I draw a line at authorship. Courses, training materials, frameworks—I’m comfortable using AI as a co-creator because the value is in the outcome. But books, blog posts, personal writing, and art? That’s different for me. The act of writing and creating is part of the product. The struggle is part of the value. I’m not ready to hand that over, and I don’t feel I should have to defend that position.

I refuse to judge anyone else’s line. Your comfort level with AI is yours. If you’ve built sophisticated orchestration systems and you’re proud of the work they produce—great! If you only use AI for spell-checking and research—also great! If you’re somewhere in between, trying to figure out where the line is—welcome to the club! There are a lot of us in here… and we brought snacks.

I believe the people who feel uneasy are paying attention. Not because AI is bad. It’s not. It’s extraordinary. But because the uneasiness is pointing at something real: the difference between producing content and creating connection. Those aren’t the same thing. And the gap between them is where your humanity lives.

What I’d Ask You to Sit With

I’m not going to tell you what to think about AI. Honestly, I’m still figuring it out myself—and I’ve been thinking about it constantly for the past couple of years.

But here’s the question I keep coming back to:

When someone experiences your work,
do they meet a system or do they meet YOU?

Do they encounter optimized content that checks all the right boxes? Or do they feel like someone actually sees them—not because an algorithm identified their pain point, but because another human being has been there too?

That feeling—being truly seen by another person—is still the most powerful force in communication. It’s what builds real trust. It’s what turns a stranger into a client, a client into an advocate, and an advocate into a friend.

No AI on the planet can manufacture that. Not yet. Maybe not ever.

And here’s the part that keeps me up at night. If we automate too much of the creative process—if we optimize for volume and speed at the expense of depth and presence—we might not lose the ability to create content.

We might lose the ability to connect.

My Promise (and an Invitation)

I’ll keep using AI. I’d be foolish not to. It makes me faster, sharper, and more organized than I’ve ever been.

But I’ll never let it replace the parts of me that actually matter. The stories I tell are mine—lived, felt, and earned. The insights I share have been pressure-tested in real conversations with real people who trusted me with their real struggles. And when I write something that I hope will shift how you think or feel or show up in the world, I need you to know: a human being—this specific human being—meant every word.

We’re all still learning. Every one of us. The elephant is enormous, and we’re all touching different parts of it.

So wherever you are in this conversation—whether you’re building AI pipelines or writing everything by hand or standing somewhere in the messy middle—your perspective matters. It’s a real part of the picture.

And the world doesn’t need more content.

It needs more you.

I’d love to hear where you’ve landed. Or where you haven’t. Drop me a note—I read every one. And if you want to talk about how your wiring shapes the way you think about work, creativity, and connection, that’s kind of my thing.

Comments

6 responses to “I built an AI system that can write an entire book. Here’s why I won’t let it.”

  1. Lyn Fitzpatrick

    I read every word of your post, Shawn, and as always, was awe struck by your beautiful gift of writing and the Shawn I discover anew each time I read your work, be it technical or a good friend’s eulogy. You write from the deep places within, not just from a well supported opinion. The result is always intriguing but most importantly….connective. You let the reader know you. Who doesn’t love that?

    1. Thank you for dropping by my little corner of the web, and for your thoughtful, encouraging comment, Lyn. You’re the best!

  2. Signal versus noise. Signal is purely human. Anyone can create noise. The one-of-one content that garners attention isn’t the most “right” or perfect. It speaks to the soul. Humans crave it. And AI can’t replace it.

    1. Couldn’t love this more. Thank you for sharing, Kathy!

  3. I was glad to see this article.

    As I begin to write the new edition of “3D Technology in Fine Art and Craft: Exploring 3D Printing, Scanning, Sculpting and Milling,” I’m going to have to address machine learning and AI.

    Here’s the kicker: I’m writing for the biggest academic publisher, one that has also sold the rights to other academics’ works for millions of dollars without contacting the writers for permission or offering any compensation to the originators of the content. It took me an entire year to digest this and feel comfortable enough to sign with the publisher. Of course, I was also busy with commissions.

    I tried to put in my contract that my book would not be sold or shared to AI. And what if my artwork is featured in the book? Is that now sold too? This turns copyright on its ear. Oh, and most of my wording was not accepted by the publisher.

    Also, I began to look for artists, only the best of the best in the world, to feature. I started this in the last two years, first focusing on the artists in the first edition. I got some pushback: if I even mentioned AI in the book, they wanted nothing to do with the project. People are passionate about this subject.

    That’s impossible, because 3D scanning in general uses AI. But that was a year ago, and now AI is going much deeper than that.

    Beyond the repurposing of people’s work and the selling of it with no compensation, the point that bothered me most about AI is similar to what I wrote about in 2015 regarding 3D scanning of people’s faces for fun, using handheld 3D scanners, phones, etc. I warned people: “You’re giving away your children’s faces.” A 3D scan can be used in many ways without you knowing it, because you uploaded it. The ways can be quite horrible. As parents, we are all cautious about our children.
    This opened up an entire new way of harm.

    And finally, I keep thinking of Soylent Green. The connection in my head is a weird one. No we are not eating people, but we are removed from the process. Basically we are told this will fill a need and look how fun it is to use. It will save us time and at what environmental cost?

    My entire life I have felt that clean water would one day be like gold. Never in my wildest imagination did I think it would be from us “making things easier” or creating something funny. My soul has a hard time with contributing to that, no matter what the creative outcome. And this is not talked about.

    Many don’t know what these data centers take from our environment. It is in the background, unspoken. In my book, “One Foot In Front Of The Other: Art, Hiking, and Healing,” I wrote about the Texas aquifers and the danger we are in. But hey, let’s invite the data centers to use that water.

    Have I tried AI? Yes. I was surprised how spot-on it was, or how it gave me a jumping-off point. But what truly is the cost? I’ll be doing a deep dive in the next weeks as I add a chapter about this to my new edition of the book. Funny, as I search for citations and studies on the web, my research will be prompted by algorithms and hidden AI that were not available for the 2015 edition. Research will be faster.

    Thanks for the long rant.

    Time to jump down this rabbit hole.

    1. Bridgette! It’s been way too long. I wouldn’t call this a rant — this is the kind of honest, clear-eyed thinking we need more of right now. I remember when you warned US about 3D scanning our children’s faces back in 2015, long before most of us were paying attention. So you’re exactly the right person to write this chapter. What I admire most is that you’re not taking the easy position on either side. You tried AI, you acknowledged what it does well, and you’re still asking the harder question: but at what cost? I can’t wait to see what you uncover. Don’t be a stranger!
