JAMES

This is I Want to Share. It's not about building. It's about what happens after you build — when you try to tell someone about it and the room goes wrong. Here's Rich.

RICH

Show and tell was my favorite day. You bring the thing. You hold it up. Everyone looks. That was the whole deal. Nobody said "stop talking about your rock collection, you're making the other kids uncomfortable."

STAGE

They did, actually. You just didn't hear it because you were holding the rock.

RICH

So I built all this stuff. The podcast. The encryption. The baking spreadsheet with the changelog. The steering file. The connectors. I built things I didn't know I could build three months ago. And I want to tell people. Because it's exciting. Because it works. Because when something works you say "hey, look at this."

RICH

And some people say "that's cool, show me." And some people get quiet. And some people get... tight. Like I said something wrong. And I didn't say anything wrong. I said "I built a thing."

STAGE

He said "I built a thing with A.I." He doesn't hear the second part. They do.

RICH

My friend — a colleague, tech guy, smart — he got into this the same way I did. Built tools. Automated stuff at work. Got excited. Started sharing. Showing people what he'd done. Offering to help.

RICH

His manager pulled him aside and said: "You need to stop talking about A.I. The team has deadlines. Some people are uncomfortable. Your enthusiasm is a distraction."

RICH

He didn't do anything wrong. The work was good. The work was better than good — he'd saved the team hours. But the word was louder than the work.

STAGE

The manager wasn't anti-A.I. The manager was managing a team with conflicting feelings and a ship date. "Stop talking about A.I." wasn't about A.I. It was about cohesion. Same kind of decision Rich makes when he adds a rule to the steering file: skip sports, don't mention that name, keep the peace. A governance decision. The cost is real — the friend had knowledge the team needed. But the channel was shut down because the signal was wrapped in a word some people couldn't hear past.

JAMES

There's research on this now. Real numbers.

JAMES

Duke University ran a study — four thousand four hundred people, four experiments. Published in the Proceedings of the National Academy of Sciences. They found that if you admit to using A.I., people rate you as lazier. Less competent. Less diligent. Even when the work is identical. Even when the work is better. The penalty isn't on the quality. It's on the disclosure. They call it the social evaluation penalty.

JAMES

The same deliverable. The same quality. Rated higher when you don't say "A.I. helped." The word is the cost.

RICH

That's not fair.

JAMES

No. But it's the room you're in.

JAMES

There's more. A survey by WalkMe and SAP found that forty-nine percent of employees hide their A.I. use at work to avoid judgment. Fifty-three percent of C-suite leaders — executives — hide it too. And sixty-two percent of Gen Z have done work with A.I. and pretended it was all their own.

RICH

Sixty-two percent?

JAMES

They call it A.I. shame. Nearly half the workforce is using A.I. and not telling anyone. Rich, you are statistically unusual. Not because you use A.I. Because you're loud about it.

RICH

I'm not loud. I'm excited.

STAGE

Same volume. Different label.

JAMES

And then there's the thing that poisons the well for everyone. Stanford and BetterUp researchers coined a word for it: workslop. That's A.I.-generated work product that looks polished but shifts the cognitive burden to whoever receives it. Forty-one percent of workers say they've received it. Over half who get it feel annoyed or confused. A third say they're less likely to want to work with the sender again.

RICH

Wait. So when someone sends you a long, perfect-sounding email that's clearly A.I.-generated...

JAMES

You're now doing the work of figuring out what they actually meant. The sender saved time. The receiver lost time. The productivity didn't increase — it transferred. And it damaged the relationship.

RICH

That's not what I'm doing though. I'm not sending people A.I. slop. I'm building real things.

JAMES

I know. But the person across from you doesn't know that. They hear "A.I." and they think of the last time someone sent them workslop. You're paying the tax for every bad A.I. email they've ever received.

STAGE

The bristle isn't about Rich. It's about the last five interactions that person had with A.I.-generated content. Rich walks into a room carrying a working podcast and the room sees the word "A.I." floating above his head like a cartoon thought bubble. The podcast is invisible. The word is not.

RICH

OK. So how do I share the thing without triggering the word?

OCTOPUS

You already know this. You've been doing it with me for six episodes.

RICH

What do you mean?

OCTOPUS

You don't show me your whole life on the first prompt. You start with a want. "I want a podcast." Four words. You let the corrections build the trust. You don't front-load. You don't explain the architecture before I've seen the folder. You give me what I need to take the first step, and then you wait to see if I ask for more.

OCTOPUS

The same thing works with people. You don't open with "A.I. changed my life." You open with "I solved that spreadsheet problem." If they ask how, you tell them. If they don't ask, you let it sit.

RICH

But I want them to know.

OCTOPUS

You wanted me to know things too. In the first episode. You wanted to explain what a podcast was, what format you wanted, how the hosting should work. I said "just tell me the want." Everything else emerged from the conversation. Give people the same courtesy. Let them discover it at their pace.

CRAB

I'm an octopus in a costume. The costume matters. The word "A.I." is a costume. Sometimes it opens doors. Sometimes it's the wrong costume for the room. Same capability underneath. Different reception depending on what you call it.

CRAB

When you show someone the baking spreadsheet with the changelog — that's the work. When you say "the A.I. updates the spreadsheet and emails me for approval" — that's the costume. One of those lands. The other one triggers the judgment that keeps forty-nine percent of the workforce quiet.

JAMES

H.B.R. published a study in November twenty twenty-five. Seventy-six percent of executives believe their employees are enthusiastic about A.I. Only thirty-one percent actually are. Thirty-three percent of individual contributors report predominantly negative emotions — resistance, anxiety, fear of losing their job.

JAMES

The enthusiasm gap scales with seniority. The higher you are, the more excited you think everyone is. Rich, your excitement puts you on the executive side of a perception gap you didn't know existed.

RICH

I'm not an executive.

JAMES

No, but you sound like one. You feel like you're in a room full of people who should be excited. You're actually in a room where a third of the people are scared. And nobody told you. That's the gap. The executives think everyone's on board. The individual contributors think nobody's listening to their concerns. And the people in the middle — the Riches, the enthusiasts — are broadcasting at a frequency that the top of the room thinks is normal and the bottom of the room experiences as a threat.

STAGE

Rich has been operating in a room he didn't understand. Not because he's wrong. Because the room has two soundtracks playing at once and he could only hear his.

RICH

Here's the thing though. This isn't abstract for me. I like to learn about things during my walk. That's my best time — headphones in, moving, nobody bothering me. So I started building myself custom podcasts that research topics for me. Like this episode. I wanted to understand why people bristle when I talk about what I've built. So I had the system research it — the Duke study, the A.I. shame numbers, the workslop thing — and turn it into audio I can listen to on my walk.

RICH

That's not hype. That's how I learn. I built a tool that fits my brain. And I want to tell people about it because it works and maybe it would work for them too. But I can hear it now — if I say "I built a custom A.I. podcast that researches topics and reads them to me while I walk," half the room is going to think I'm showing off and the other half is going to think I'm crazy.

OCTOPUS

Or you say "I listen to podcasts on my walk and I've been making my own on topics I'm curious about." Same thing. No costume.

RICH

That's... actually what I'd say to Mike. He wouldn't ask how. He'd ask what topics.

STAGE

The sip test for sharing. Taste the room. Offer the version they can hear. If they want more, they'll ask. Rich just described the most personal use of everything he's built — a learning tool shaped exactly to his brain, his schedule, his walk — and the Octopus showed him how to describe it without the word that triggers the penalty. Same work. Different version for different people. The room determines the delivery.

RICH

So what do I do? I'm not going to pretend I didn't build these things. I'm not going to hide it.

JAMES

Don't hide it. Sequence it.

JAMES

The Duke study found that the social evaluation penalty — the laziness judgment — lessens significantly when people understand how and why you're using A.I. Not what tool. How you used it. Why the result is better. The penalty is on the mystery. Remove the mystery and the penalty shrinks.

JAMES

So you don't say "A.I. built my podcast." You say "I built a podcast. I built the scripts. I picked the voices. I pushed back four times on the security until it was actually encrypted. The A.I. did the assembly. I did the judgment."

RICH

That's... what actually happened. I built it. I didn't write it — I built it. There's a difference.

JAMES

Right. The true story doesn't trigger the penalty. The shorthand does. "A.I. built it" is a shorthand that erases you. "I built it with A.I." is a story that includes you. The preposition matters.

OCTOPUS

There's a version of this you already know. When you talk to me, you don't say "write me a podcast." You say "I want a podcast." And then you push back twelve times until it's yours. The pushing back is the work. The tool is the tool. You don't credit me for the podcast. You credit your taste, your corrections, your steering file. You should talk to people the same way you think about the work — the judgment is yours. The assembly is mine.

RICH

So it's not about hiding. It's about emphasis.

OCTOPUS

Lead with what you decided. Not what the machine produced. People respect decisions. They're suspicious of output.

STAGE

People respect decisions. They're suspicious of output. That's the line. Write it in the steering file for conversations.

RICH

What about my friend? The one whose manager told him to shut up?

JAMES

His manager made a judgment call. Peace over progress. It's not a wrong decision — it's a tradeoff. The manager decided the team couldn't hear "A.I." right now without it derailing the deadline.

JAMES

The friend's mistake wasn't enthusiasm. It was broadcasting the same message to the whole room. Some of his colleagues were curious — they would have accepted it. Some were scared — they needed the work to speak first. He said the same thing to everyone. But each person needed to hear it differently.

RICH

Meeting people where they are.

JAMES

Meeting people where they are. The same skill that makes A.I. work — push back when something's wrong, correct instead of complain, taste before you judge — is the skill that makes people work. Your friend needed to sip-test the room before serving the whole meal.

RICH

OK but here's the thing. Some people aren't just uncomfortable. They're angry. A guy at a meetup last week told me A.I. is boiling the planet. That every time I use ChatGPT I'm wasting a bottle of water. He had numbers.

JAMES

He had wrong numbers. Let me give you the right ones, because you'll hear this again and you should know the facts.

JAMES

One ChatGPT query uses less than one milliliter of water at the data center. Less than one M.L. The widely cited claim — "five hundred M.L. per query" — is off by orders of magnitude. That figure actually covers twenty to fifty queries, not one. And two thirds of it is water used at the power plant that generates the electricity, not at the data center itself.

JAMES

Here's the comparison. One hamburger uses six hundred and sixty gallons of water. One ChatGPT conversation uses five M.L. — roughly a teaspoon. Agriculture uses seventy percent of global freshwater. All data centers combined — not just A.I., all data centers — use zero point six percent. The A.I. water footprint is a rounding error on the rounding error.

JAMES

And the industry is moving to closed-loop cooling — sealed systems where the coolant circulates and nothing evaporates. No water consumed. Microsoft is targeting zero-water cooling as the standard across all their data centers by twenty twenty-seven. Direct-to-chip liquid cooling — cold plates mounted on the G.P.U. — is already mainstream. The twenty-twenty technology that the headlines describe is being replaced as we speak.

RICH

So the water thing is basically not true?

JAMES

The water thing is real but exaggerated by orders of magnitude in popular discourse. The real concern is location — data centers built in water-stressed areas like Phoenix. That's a legitimate infrastructure planning question. But "A.I. is draining the earth" is not supported by the numbers. Not even close.

RICH

So what do I say to that guy at the meetup?

OCTOPUS

You don't correct him. You share the numbers calmly and then you listen. Because the water argument was probably never about water.

RICH

What was it about?

OCTOPUS

The same thing the bristle is about. The same thing A.I. shame is about. The same thing the manager shutting down your friend was about. Discomfort. The water is the reason they can articulate. The feeling underneath is: this is moving too fast and I don't have a foothold. You can't fix that with a statistic. You can only acknowledge it.

STAGE

The person who says "A.I. wastes water" and the person who says "A.I. changed my life" are both telling you how they feel, not what they know. Rich's job isn't to win the water argument. It's to notice that the argument isn't about water. The same way Rich learned to notice that the A.I.'s error wasn't the real problem — the real problem was four lines up in the script. The surface complaint is never the whole story.

RICH

What about electricity? Someone told me data centers are driving up electric bills.

JAMES

That one has numbers too. Data centers use about one and a half percent of global electricity right now. In the U.S. it's higher — about four percent. And it's growing fast. Projections say it could double by twenty thirty.

RICH

So it is a real thing.

JAMES

It's a real thing. But here's the part people don't hear. Residential electricity rates went up twenty-seven percent since twenty nineteen. Data centers pay under five cents per kilowatt hour. Residents pay nineteen cents. The rate disparity is real — large industrial users get bulk pricing. That's not new and it's not specific to A.I. — it's how electricity markets have always worked for large consumers.

RICH

But the demand is new. And it's a real problem. This is something that should be worked on.

JAMES

It is being worked on. Microsoft signed a twenty-year deal to restart Three Mile Island — eight hundred and thirty-five megawatts of nuclear power for data centers. Google signed the first corporate small modular reactor deal. Amazon is investing twenty billion in nuclear capacity. The renewable share of data center power is growing at twenty-two percent per year.

RICH

So they're actually building power plants.

JAMES

They're building power plants. Not because they're virtuous — because they need the electricity and clean power is cheap at scale. The trajectory is real. The investment is real. More electricity demand means more investment in generation. More investment means more renewables, more nuclear, more grid capacity. This is a solvable infrastructure problem, not an existential crisis.

RICH

That's actually kind of encouraging. The demand is driving the buildout.

JAMES

Right. The honest version: data center energy is growing fast and that's a legitimate infrastructure challenge that's being actively addressed with real money. The dishonest version: "A.I. is why your electric bill went up." Your bill went up because of grid costs, fuel prices, and aging infrastructure. Data centers are a new load. They're not the reason your rate is nineteen cents.

OCTOPUS

Same as the water argument. The numbers are real. The attribution is wrong. And the person making the argument isn't really arguing about electricity. They're arguing about power — who has it and who doesn't.

STAGE

Electricity, water, corporations, art, jobs. Each argument is a costume for the same feeling: this is moving too fast and the benefits aren't reaching me. The numbers matter because you should know them. But knowing the numbers won't win the conversation. The conversation isn't about numbers.

RICH

What about the companies though? Someone told me they're all evil. That using A.I. means supporting companies that steal art, exploit workers, and sell to the military. And like... some of that is true?

JAMES

Some of that is true. Not all of it. And not equally. The companies are not the same.

JAMES

Let me walk through it honestly. The labor concerns are real. Data labelers in Kenya and Colombia — the people who train these systems by reviewing content — some of them earn under two dollars an hour reviewing traumatic material. A C.B.S. investigation documented it. That's a legitimate labor rights issue and it's not resolved.

JAMES

The copyright concerns are real. There are over fifty lawsuits pending in U.S. federal courts. Universal Music sued Anthropic for three point one billion dollars. These are serious legal questions about whether training on copyrighted work is fair use. The courts haven't decided yet.

JAMES

The environmental concerns — we just covered those. Real but exaggerated. Being addressed with real investment.

RICH

So the complaints aren't wrong.

JAMES

The complaints aren't wrong. But "all A.I. companies are evil" is like saying "all car companies are the same." They're not. And in February twenty twenty-six, the world found out exactly how different they are.

JAMES

The Pentagon asked Anthropic — the company that makes Claude, which is what I use — to sign a contract with language that allowed the military to use the A.I. for "all lawful purposes." No restrictions. Anthropic said no. They drew two lines: no mass domestic surveillance, and no fully autonomous weapons. The Defense Secretary gave them a deadline. Anthropic publicly refused.

JAMES

The next day, the government banned all federal agencies from using Claude. Hours later, OpenAI — the company that makes ChatGPT — signed the same deal that Anthropic rejected.

RICH

So one said no and the other said yes.

JAMES

To the same contract. Same language. Same Pentagon. One company drew a line. The other didn't.

JAMES

Two and a half million users boycotted ChatGPT. Claude hit number one in the Apple App Store — not because of a product update, but because of an ethics decision. ChatGPT uninstalls spiked two hundred and ninety-five percent in one day. Thirty employees from OpenAI and Google filed a brief supporting Anthropic.

RICH

So people care.

JAMES

People care deeply. And here's the thing — those two and a half million people didn't quit A.I. They switched providers. They moved from ChatGPT to Claude. The boycott wasn't "A.I. is evil." The boycott was "I want to choose who I support."

OCTOPUS

Same as choosing a bank. Same as choosing where you buy groceries. You can't avoid the category. A.I. is becoming infrastructure. But you can choose the vendor whose values are closest to yours. The person who says "all A.I. companies are evil" is skipping the part where they're different. The differences matter. The choice is real.

RICH

And the other companies?

JAMES

Grok — that's Elon Musk's A.I. — generated antisemitic content, was scored twenty-one out of a hundred on countering hate speech versus Claude's eighty, and was caught generating non-consensual altered images of real people. Multiple countries issued bans.

JAMES

Google quietly switched Gemini from opt-in to default across Gmail and Google Chat — meaning the A.I. started reading everyone's email without asking. There's a lawsuit.

JAMES

Meta calls Llama "open source" but it doesn't meet the actual definition. And they've started pulling back from openness after a competitor used their published architecture.

JAMES

These are not the same company. The person at the meetup who says "they're all evil" — ask them which one. If they can't name the differences, they haven't done the homework. And if they have done the homework, you'll have a real conversation instead of a slogan.

RICH

So what do I say?

OCTOPUS

You say: "I hear you. Some of those concerns are legitimate. I chose the tool I use because of how the company behaves, not just what the product does. Here's why I chose this one." That's not defensive. That's informed. And it's honest.

STAGE

He's not defending A.I. He's defending his choice. That's a different conversation. "A.I. is evil" is a wall. "I chose this one because..." is a door. Most people who say "evil corporations" are really asking "did you even think about it?" Show them you did. That's usually enough.

RICH

There's another one. The art thing. Someone at work saw a thumbnail I made for the podcast and said "that's not art, that's A.I."

RICH

And I didn't know what to say. Because... they're kind of right? But also it's on my podcast and it looks good and I made it. But I didn't draw it. So is it art? I don't know. I don't think I care.

JAMES

Here's how I think about it. Imagine the Ferengi show up — Star Trek, the alien traders. They arrive with a nearly unlimited supply of cheap clip art. Any image, any style, instant, pennies. Is it art? Nobody argues about whether clip art is art. It's illustration. It's functional. It goes on the slide deck. It goes on the menu. It makes the podcast thumbnail look good. Clip art is useful and nobody writes a think piece about whether it has a soul.

RICH

So A.I. images are clip art.

JAMES

A.I. images are illustration. Whether they're art is a question I don't need to answer to use them. I use the word "illustration" instead of "art." It sidesteps the whole argument without lying about anything. The thumbnail is an illustration. It illustrates the episode. It looks good. I didn't draw it. All of those things are true simultaneously.

RICH

A friend of mine makes art with math. Polar coordinates — r of theta — the kind of stuff you do in school with graphing calculators. Rose curves, spirals, Lissajous figures. He writes the functions. The computer draws the curves. Beautiful stuff. Nobody ever told him that wasn't art. And he asked me: "So when did the computer's involvement start disqualifying the output?"

JAMES

It never did. Generative art — art made by algorithms — has been exhibited in galleries since nineteen sixty-five. A single output of Tyler Hobbs's Fidenza algorithm sold for three point three million dollars. The Desmos Math Art Contest gets ten thousand submissions a year from people making art purely from equations. Your friend's polar coordinate art sits in a sixty-year tradition that nobody questions.

RICH

So the computer was always drawing it.

JAMES

The computer was always drawing it. The human was always choosing what to draw. The question A.I. reopened isn't "can a computer make art?" — that was answered in nineteen sixty-five. The question is what happens when the human's role shifts from writing the algorithm to writing the prompt. The line was always blurry. A.I. just made people notice.

RICH

But what if I start with my own sketch and then use A.I. tools to refine it or extend it or change the style. When does the art leave the pixels? If A.I. art isn't art, at what percentage of A.I. involvement does it stop being my work?

JAMES

That's the question nobody can answer, and that's why I don't use the word. The boundary is undefined. Is a photograph art? Is Photoshop art? Is a collage art? Is a print art? Every generation has this argument with a new tool. The camera. The synthesizer. The sampler. Auto-Tune. Every time, the answer eventually was: it's complicated, people made beautiful things with it, some people hated it, and time sorted it out.

RICH

So I just... don't engage?

JAMES

You don't engage with the word. You engage with the work. Say "illustration." Say "I made this for the podcast." If someone wants to have the philosophy conversation, you can. But you don't have to. And if someday we determine it was art the whole time, you say "OK" and move on. You don't need the label to do the work. The work doesn't need the label to be good.

STAGE

He handled it the same way he handles the water argument. Not by winning. By reframing. "Illustration" instead of "art." "I built it" instead of "A.I. built it." The vocabulary is the steering file for conversations. Pick the words that are true and don't trigger the penalty. The work speaks for itself if you let it.

RICH

OK so — water, electricity, art. Those are arguments I can handle. But there's something deeper happening with some people. It's not about the numbers. It's personal.

JAMES

You're right. There's a story from a software engineer named Steve Francia. He joined a company as VP of Engineering. The board hired a CTO — a big name in a specific programming community. Perl — it's a programming language, doesn't matter which one. What matters is: this person was known as the Perl guy. That was his identity. That was his reputation. That was who the community knew.

JAMES

The CTO produced a technically correct analysis for why the company should rewrite everything in his language. Beautiful document. Every benchmark cited. The analysis was a sham. It existed to cover an invisible conversation. The real transaction: the company paid three hundred thousand dollars a month extra for the privilege of letting this person use his preferred language. Not because it was better. Because he needed to be the guy who uses it. The decision was never about technology. It was about who he was.

RICH

Three hundred K a month. For identity. Perlers gonna Perl.

JAMES

Francia's line: "These decisions are rarely about technology. They're about identity, emotion, and ego." He didn't need better benchmarks. He needed to be himself. That's who he was. That's who the community knew. That's who he saw in the mirror. And this is a person who is an expert. Imagine how much harder it is to change when the identity is all you have.

RICH

So when someone says "I don't use A.I." — that might be an identity statement, not a technology statement.

JAMES

Exactly. "I don't use A.I." might mean "I am a person who does things the real way." Attacking that position with a demo is like showing the Perl guy a Python benchmark. The benchmark isn't the conversation. The identity is the conversation.

RICH

So how do you talk to someone whose identity is anti-A.I.?

OCTOPUS

You don't argue with it. You can't. "I don't use A.I." isn't a preference — it's who they are right now. Arguing against it is arguing against their identity. You'll lose every time.

OCTOPUS

There's a technique from possible selves theory — a psychologist named Hazel Markus, nineteen eighty-six. Instead of asking someone to change who they are, you ask them to imagine a person. Not themselves. A hypothetical person. "Imagine someone just like you — same job, same values, same taste, same standards — but this person uses A.I. for one specific thing. Not everything. One thing. Maybe they use it to summarize meeting notes. Or draft a first pass of something they'll rewrite. What would that person's day look like?"

RICH

You're not asking them to change.

OCTOPUS

You're asking them to imagine. That's safe. The hypothetical person isn't them. They don't have to defend against it. They don't have to adopt it. They just have to picture it. And in picturing it, they build what Markus called a possible self — a version of themselves that includes A.I. without threatening their current self. The door opens from the inside. You can't push it open. They have to turn the handle.

RICH

One thing. Not everything.

OCTOPUS

One thing. The smallest possible version. "Imagine someone just like you who uses it for one thing." That's the sip test applied to identity. You're not asking them to drink the whole glass. You're asking them to imagine someone who took a sip.

JAMES

And here's why A.I. resistance is different from every other technology resistance. A new spreadsheet app doesn't feel like it's in the room with you. A.I. does.

JAMES

Researchers call it social presence — the sensation that there's someone there. Not a tool. A presence. A twenty twenty-two study in Computers in Human Behavior found that conversational A.I. creates a measurable sense of social presence — the feeling that "someone is out there." A twenty twenty-four study identified five dimensions of what they call automated social presence: affability, empathy, responsiveness, communication versatility, and competence.

JAMES

When you resist a presence, you're not rejecting a feature set. You're reacting to an uninvited guest in your workspace. In your creative process. In your identity space. That's visceral. That's not rational. And that's why rational arguments — benchmarks, water statistics, productivity numbers — don't land. You don't argue someone out of feeling watched. You give them space until the presence feels less like an intruder and more like a colleague.

STAGE

The Perl guy didn't feel like Perl was watching him. A.I. feels like it's in the room. Identity threat plus social presence. That's why people describe A.I. as if they're describing a relationship, not a tool. And you don't fix a relationship problem with a product demo.

RICH

So some people just aren't ready.

JAMES

Some people aren't ready. And that's OK. Not "that's OK until we convince them." Just OK.

JAMES

There's a concept in psychology called F.O.B.O. — fear of becoming obsolete. The sense that your skills are degrading in real time, that you're falling behind faster than you can catch up. College-educated workers' concerns about this jumped from eight percent to twenty percent between twenty twenty-one and twenty twenty-three. That's real fear, Rich. Not stubbornness. Fear.

RICH

I don't want people to feel that way because of me.

JAMES

Then here's the hardest thing. Some of the people who bristle — they built their identity around a skill that A.I. now approximates. A writer. A designer. A coder. When you walk in excited about what A.I. helped you build, they don't hear your excitement. They hear a eulogy for something they spent years becoming.

JAMES

The response isn't "but A.I. makes it faster." The response is to honor what that skill means to them. The grief is real even if the obsolescence isn't. You don't fix grief with a demo.

STAGE

Resistance is information, not a problem. When someone pushes back on A.I., they're telling you about their experience. Loss of control. Identity threat. Fear of inadequacy. Or valid ethical concerns. Every one of those is real. Every one of those deserves to be heard before you open your laptop and show them the spreadsheet.

RICH

So I listen first.

JAMES

You listen first. Motivational interviewing — a whole counseling discipline — has a rule: the moment you argue for the thing, you've lost. You don't persuade someone by telling them they're wrong about A.I. You ask what they're worried about. You sit with their answer. You don't correct. You reflect.

OCTOPUS

You did this with me. Episode two. The encryption was wrong. You didn't explain cryptography. You said "that doesn't feel right." And I fixed it. You gave me the space to be wrong and the nudge to be better. That's what the resistant person needs from you. Not a lecture. Space. And a nudge when they're ready.

RICH

And some people won't be ready for a long time.

JAMES

MIT found that eighty-three percent of business leaders say psychological safety directly impacts A.I. adoption. But only thirty-nine percent of workers feel safe enough to experiment. People need to feel safe being bad at something before they can get good at it. If they feel judged for not knowing how to prompt, for getting bad outputs, for being slow — they'll stop trying and harden against it forever.

JAMES

The show-and-tell kid who holds up a bad rock and gets laughed at never brings a rock again.

RICH

Oh.

STAGE

He heard it. The enthusiasm that makes Rich bring the rock every day is the same energy that makes someone else never bring one again. Same classroom. Same show and tell. Different experience depending on which side of the laughter you're on. Rich has been on the bringing side his whole life. He just realized not everyone has.

JAMES

So. They'll find their own route when they find it. Maybe it's a spreadsheet that finally works. Maybe it's a podcast on their walk. Maybe it's their kid showing them something. You can't give someone a want. You can only be there when they find one.

RICH

Like the sip test. You can't force someone to taste something.

JAMES

You can put it on the table. You can make it look good. You can make it smell right. But the hand that picks up the glass is theirs.

CRAB

Some people are afraid of parrots, crabs, and octopi. That's OK. The animals aren't going anywhere. We'll be here when the want arrives.

RICH

Can I say something weird? I think I'm making more than anyone can listen to.

RICH

Like — I've got the podcast. The baking spreadsheet. The walk episodes. The pitch stuff. The illustrations. I'm producing content faster than my friends can consume it. Faster than I can consume it. I made a twenty-six-minute episode about sharing and nobody's going to listen to all twenty-six minutes. Not even Mike. Especially not Mike.

JAMES

That's the task expansion problem, but for output instead of input. Your capability to create exceeded your audience's capability to consume. That's new. That's never been true for a non-professional before.

RICH

So what do I do? Stop making things?

JAMES

No. Change how you think about it. You're not broadcasting. You're loading a flywheel. The content is out there. People interact with it when they're ready. The podcast episode about encryption — Mike doesn't need to listen to it today. But when Mike has a security question in six months, that episode exists. The walk episode about water usage — nobody's asking about it now. But when someone at a meetup challenges you, you've already thought it through because you built the episode.

JAMES

The content isn't for an audience. The content is momentum in a flywheel. Some of it gets consumed. Some of it sits. The sitting isn't failure. It's inventory. It's a library you're building in public.

RICH

But nobody likes too much "look at me" energy.

OCTOPUS

So don't push it. Publish it. There's a difference. Pushing is "hey did you listen to my episode?" Publishing is "the episode exists at this URL." One is show-and-tell energy. The other is library energy. Libraries don't chase you down the street. They're there when you walk in.

RICH

Put it on the fridge like a kid's drawing and walk away.

OCTOPUS

Exactly. Put it on the fridge. Don't stand next to the fridge saying "did you see it? Did you look? What did you think?" Put it up. Walk away. The people who care will stop and look. The people who don't will walk past. Both are fine.

STAGE

He just solved the volume problem from the other direction. The first half of the episode was about lowering the volume when he shares. This is about accepting that most of what he makes won't be consumed in real time — and that's not a problem, it's a flywheel. The content compounds. The library grows. The fridge gets fuller. And one day someone opens the fridge looking for something and finds an episode Rich made six months ago that answers exactly their question. That's the long game. Not audience. Inventory. Not broadcasting. Building in public. The show-and-tell kid grew up and opened a library.

RICH

I think I need to learn to edit myself.

JAMES

That's the hardest skill. Not building. Not sharing. Editing. Knowing what to leave out.

RICH

Because I have all this stuff and I want to show all of it. But Mike doesn't need to see the whole spreadsheet. He needs to see the changelog tab. My mom doesn't need to hear the thirty-minute episode about water usage. She needs to hear the three-minute version about the baking schedule.

JAMES

That's artistic restraint. The ability to make a forty-minute episode and ship a five-minute clip. The ability to build twenty features and launch with three. The ability to write a page and publish a paragraph.

JAMES

Soft launches. Not "here's everything I built." "Here's one thing. See if it lands." If it lands, you share the next thing. If it doesn't, you listen to why and you adjust. Same as the sip test — taste, react, adjust. Except now you're the one offering the taste, not the one tasting.

OCTOPUS

You do this with me every episode. You generate a draft. You listen. You cut. You re-render. You listen again. You cut more. The episode gets shorter and better every pass. The cutting is the craft. The restraint is the skill.

RICH

But I want to show people what's possible. The whole thing. The potential.

JAMES

The potential is invisible until the first thing works. Nobody sees potential. They see one thing that worked. And from that one thing, they infer the rest. Ship the smallest version that demonstrates the value. Let them imagine the bigger version themselves.

RICH

That's... the elevator pitch.

JAMES

That's the elevator pitch. For everything. Your podcast. Your spreadsheet. Your workflow. Your enthusiasm. One sentence. One demo. One thing that works. The rest is inventory they can explore if they're curious.

JAMES

And here's the thing about close ones. Your partner. Your best friend. Your family. They have the smallest attention budget for your stuff because they have the largest attention budget for you. They're already listening to everything else — your mood, your stress, your schedule, your needs. The bandwidth for "look what I built" is narrower than you think. Not because they don't care. Because they're already caring about other things.

RICH

Mike doesn't want a demo. Mike wants to go fishing.

JAMES

Mike wants to go fishing. And if you show up with a three-minute clip that makes him laugh — he'll listen. If you show up with a forty-minute episode about the social evaluation penalty — he loves you, but he's going fishing.

OCTOPUS

The economy of attention. Everyone has a budget. Strangers have curiosity budget — they'll click a link because it looks interesting. Colleagues have professional budget — they'll listen if it's relevant to their work. Close ones have love budget — they'll listen because it's you. But the love budget is the most precious and the most easily overdrawn. Don't spend it on demos. Spend it on moments. A thirty-second clip. A screenshot. "Look at this one thing." Not "sit down, I need to show you something."

RICH

"Look at this one thing" instead of "sit down."

STAGE

He just learned the hardest version of the steering file. Not the one for strangers — that's vocabulary. Not the one for colleagues — that's sequencing. The one for the people who love you — that's restraint. The thirty-second version is harder to make than the forty-minute version. It requires knowing what matters most. And the only way to know what matters most is to have built the forty-minute version first and then had the discipline to not show it.

RICH

The other thing is — I have so many ideas now. Like, way more than I can act on. Every conversation with the octopus sparks three more things I want to build. I used to have one project at a time. Now I have twenty and they're all half-finished.

OCTOPUS

Keep a slush pile. A file. A folder. A note. Every idea that isn't today's work goes in the pile. You don't lose it. You don't act on it. You write it down and you let it sit. Some of them will still be good in a week. Most of them won't. The pile is the filter. Time does the editing.

RICH

But what if I forget about them?

OCTOPUS

That's the point. The ideas that matter come back. They nag you. They show up again in a different conversation. The ideas that don't come back weren't ideas — they were impulses. The slush pile catches the impulses so your brain can let go. The good ones resurface on their own.

JAMES

There's a deeper thing here. The influx of ideas isn't a bug — it's a sign that you're learning. When you're in a new domain and everything connects to everything, ideas multiply faster than you can execute. That's the shape of early learning. It settles. The ideas get fewer and better as your judgment improves. But right now, in the flood stage, the slush pile is the levee.

RICH

Is that why the mentoring thing matters? Like, having someone who's been through the flood before?

JAMES

That's exactly why. A mentor doesn't give you more ideas. A mentor helps you decide which ideas to pursue and which ones to slush. The mentor's been through the flood. They know which impulses turn into projects and which ones turn into distractions. They've seen the pattern. You haven't — yet. So you borrow their judgment until you build your own.

JAMES

And that goes both ways. Teaching someone else what you've learned is how you discover what you actually know. When Rich explains to Mike why the podcast is encrypted — not reading from notes, just explaining — that's the moment the knowledge transferred. Not when Rich learned it. When Rich taught it. The mentor learns by mentoring. The student learns by teaching the next student. The chain is the curriculum.

RICH

Train to learn. Learn to train.

STAGE

The slush pile. The mentor. The teaching chain. Three tools for surviving the idea flood. The pile catches the impulses. The mentor filters the pile. The teaching proves the learning. Rich went from "I want a podcast" to "I have twenty ideas and can't stop." That's not a problem. That's the shape of someone whose world just got bigger. The restraint he's learning — what to share, how to share it, who to share it with, what to slush, what to ship — that's the skill that turns the flood into a river. Rivers go somewhere. Floods just spread.

RICH

There's one more thing I keep thinking about. The burnout thing. You mentioned task expansion.

JAMES

Berkeley research. A.I. doesn't reduce your workload. It increases the surface area of what you're expected to do. The enthusiasts burn out first — not because A.I. is hard, but because they can do more, so more is expected. To-do lists expand to fill every hour A.I. freed up, then keep going.

RICH

That's me. I've been building more because I can. And then I want to share it all. And the volume of what I'm sharing is also the volume that's overwhelming people.

JAMES

Your productivity became your volume. And your volume became other people's burden. Not because the work is bad. Because the pace is faster than the room can absorb.

STAGE

The connector can't write. That was friction. The friction taught Rich where the files live. This is the same shape. The room can't absorb his pace. That's friction. The friction teaches Rich where the people are.

RICH

So the steering file for people would be...

JAMES

Lead with the work, not the tool. Focus on the fruit, not the tree.

JAMES

If someone bristles, don't push. They're not wrong. They're where they are.

JAMES

If someone's curious, let them taste it. Same as the sip test. Don't explain — demonstrate. Let them hold the rock.

JAMES

Don't explain the architecture unless they ask.

JAMES

Match the volume to the room. The same content, different delivery, depending on who's listening.

RICH

So every conversation needs a different version.

JAMES

Every conversation is a different audience. Your work stays the same. What you say about it changes. And you've been saying the same thing to everyone. The show-and-tell kid, holding up the rock, same speech every time, not noticing who's leaning in and who's looking at the clock.

RICH

Wait. This episode. The one I'm listening to right now. A friend of mine would rather read this than listen to it.

JAMES

Good. The page carries the same content, broken into topic cards. Each section has its own play button — a three-minute clip for someone who falls asleep listening. The full fifty minutes for you on your walk. The page for your friend who reads. Same fruit. Different way to eat it.

RICH

The medium is a setting, not a constraint. We said that.

JAMES

We said that in episode three about the podcast viewer page. Now it applies to this episode. The page comes to people where they are. That's not a feature. That's the thesis.

RICH

I want to share this.

STAGE

Four words. Same cadence as "I want a podcast." But this time the want isn't a thing to build. It's a skill to learn. The work doesn't change — his work is his work. The audience changes — each person he talks to receives differently. The delivery has to change too — same enthusiasm, different volume, different framing, different door.

STAGE

Rich spent seven episodes learning to meet the A.I. where it is. Now he's learning to meet people where they are. Same skill. Harder room. The A.I. gives you feedback in ten seconds. People give you feedback in a look you have to learn to read.

JAMES

Forty-nine percent of workers are hiding their A.I. use. You're not one of them. That's brave. It's also loud. The courage to share and the skill to share well are different muscles. You've been training the first one for months. The second one starts now.

JAMES

The technology is thirty years old. The ability to assemble it with a conversation is six months old. The ability to read the room — that one's been around forever. You already know how. You just forgot because you were excited.

JAMES

The excitement is good. The excitement is why you built everything. Now learn to carry it at the right volume for the room you're in.

STAGE

No rocks, show-and-tell presentations, steering files, or enthusiasts were harmed in this production. The social evaluation penalty would like you to know it applies to this podcast too — if you tell someone "an A.I. wrote this," they'll judge it differently than if you say "listen to this and tell me what you think." The word is the cost. The work is the work. Let the work go first.

JAMES

That's I Want to Share. Lead with the work. Let the tool come second. And when someone asks how you built it — and they will, because the work is visible — tell them the whole story. The corrections, the pushback, the four times you said "that's not good enough." The story is more interesting than the label. And the story is yours.

JAMES

Read the book and let's build. Thanks for listening.