"So What's It About?"

A graphic novel script — Vic and Sam discuss Shapes of Intelligence

Fast take for humans and bots

This page is the quickest high-context explanation of the site. It is an overview page, not an executable guide. The dialogue is the voice; the links are the index.

Listen

12 min / 5 voices / AI-generated via Qwen3-TTS voice cloning / stereo mix with effects

Page One

Wide shot. A podcast studio. Two mics, two glasses of water, warm lighting. Sam sits across from Vic, who is already sitting slightly wrong in the chair — one leg tucked under, leaning forward like they're about to tell you where the body is buried.

Sam

So the book is called Shapes of Intelligence.

Vic

Correct.

Sam

What's it about?

Close-up on Vic. Dead calm. Professional. The cable-access energy is radiating. They take a breath like they're about to deliver the State of the Union.

Vic

Okay. So.

Same shot. Beat. Vic hasn't moved.

Vic

You know how everybody's like — (affecting a voice) "AI is going to change everything, AI is going to take your job, AI is going to write your novel and raise your kids and do your taxes and—"

Vic leans back. Waves a hand dismissively.

Vic

None of those people have actually used it. For anything real. They've asked it to write a poem about their dog and gone "whoa" and then tweeted about the singularity.

Sam

(laughing) That's—

Vic

This book is for the people who got past the poem about the dog.

Page Two

Vic sits up straighter. Grounds it. The voice-of-reason mode is ON.

Vic

Here's what actually happens. You sit down with one of these things — Claude, ChatGPT, whatever — and you try to build something. Not "write me a haiku." Like, build something. A tournament scoring engine for a dominos game. A health tracker that actually understands your weird chronic condition. A meal plan that knows you don't know what sumac is.

Vic counting on fingers. Increasingly animated.

Vic

And you hit a wall. And you correct it. And it corrects. And you correct it again. And somewhere around turn four — turn four — you realize the conversation is the product. The correction is the collaboration. You're not giving it orders. You're narrowing a search space together.

Vic pauses. Looks directly at the reader/camera. The Journalist Eye.

Vic

And that has a shape.

Small panel. Sam, blinking.

Sam

A shape.

Vic, nodding like this is the most obvious thing they've ever said.

Vic

A shape. Like — trust is a shape. Trust is Bayesian inference tied to a specific context. You verify everything, it works, you expand scope, and then you move to a new house—

Page Three

Vic catches themselves. Brief flash of the neuroses leaking through.

Vic

Sorry. There's a cat in this metaphor. The cat — okay, the author has a cat, and the cat built up trust in the yard through little safe trips outside, and then they moved to a new house and the cat's trust reset to zero. Zero. New house, new yard, new everything. Prior wiped clean.

Vic gestures emphatically, like a lawyer presenting exhibit A.

Vic

That's what happens when you switch AI models! You built all this trust with Claude, you know what it can do, you know where it hallucinates, you have evidence — and then a new model drops and you're the cat again. Sniffing the doorframe. Afraid of the porch.

Sam is leaning in now.

Sam

So the book maps these patterns—

Vic

The book maps these shapes. Curiosity has a shape. (ticking off) Correction has a shape. Memory has a shape — which, by the way, memory is just files. It's just markdown files. The most sophisticated memory system in the book is a guy writing things down in a text file and telling the AI to read it next time. That's it. That's the memory.

Vic, deadpan, staring into the middle distance.

Beat.

Vic

I need you to understand that sentence. The folder. Is the interface. The way you organize your files is the first and most important prompt you will ever write. (leaning in) And nobody talks about this. Everyone's like "prompt engineering" — no. Folder engineering. Put things where they belong. Name them what they are. The AI reads your folders and suddenly it looks like it's reading your mind. It's not. It's reading your filing system. You just finally have a good one.

Page Four

Wide shot. Vic has somehow acquired Sam's water glass in addition to their own. Neither of them has noticed.

Sam

You mentioned this guy built a health tracker—

Vic

Oh, the health stuff is unhinged. In, like, the best way. This person — the author — has a family member with complex chronic health stuff. Variable conditions. The kind where every doctor appointment starts with twenty minutes of "okay so last time we tried—" and you're rebuilding context from scratch every visit. Every conversation with every AI is a fresh start and you're exhausted before you even get to the question.

Vic mimes typing furiously.

Vic

So they built infrastructure. Sensors pushing events to a database. Heart rate, sleep, workouts, medication timing, all of it. Timestamped. Automatic. The phone was already measuring everything — it was just sitting there! Silently! Recording! And nobody was reading it!

Vic slams the table gently. Controlled passion.

Vic

The most offensive thing an AI can say to a person managing a chronic health condition is "Would you like me to create a spreadsheet for you to track that?"

Vic, vibrating with righteous fury.

Vic

No! You track it! Don't ask me to track it! The data exists! The engineering problem is plumbing! Get it from the five places it already lives into the one place where you can actually ask questions about it!

Vic exhales. Resets. Back to the grounded voice.

Vic

Sorry. I have feelings about spreadsheets.

Page Five

Vic leans forward again. They've caught a second wind. Sam hasn't had a chance to redirect.

Vic

But here's the thing that made the book possible. And this is — okay, this is the part where it gets a little recursive, so stay with me.

Vic holds up both hands like they're framing a wall.

Vic

The author sat down to write this book and realized: I have two thousand seven hundred and seventy-nine ChatGPT conversations. Two years of Claude exports. Kai's entire diary. Emails. Texts. Voice notes. Worklogs. Git commits. Slack messages. Two years of just — living with these tools, and all of it was sitting there. In files. On his computer. Already his.

Vic drops hands. Stares at Sam like they just revealed the location of buried treasure.

Vic

Nobody told him to save any of it. He wasn't journaling. He wasn't planning to write a book. He was just — using the tools. Building things. Having conversations. And the exhaust — the digital exhaust of all that work — turned out to be the richest source material he'd ever had.

Vic taps the table rhythmically, building momentum.

Vic

So he pointed AI at all of it. Read every conversation. Assigned relevance scores. Built timelines. Found connections between things he said in April and things he built in November and didn't realize were the same idea. The AI processed his own life back to him, organized. And he went — oh. Oh, this is a book.

Vic, quieter now. The neurosis leaking through again — this one matters to them.

Vic

And here's what kills me. There's a chapter called "Your Data Is Already Yours." Like, legally. GDPR. CCPA. The law says you can download all of it. Google gives you your calendar, your email, your search history. iMessage is a SQLite database just sitting in your Library folder. Your iPhone is measuring your heart rate right now and storing it and you've never once looked at it.

Vic, leaning back, arms wide — the scale of it.

Vic

The wall of data is already there. You built it. You own it. You just didn't have anything that could read it before. And now you do. And the guy used his own wall to write the book about building the wall. Which is — (gesturing in a circle) — I mean, that's the whole thing, right? The book is proof of its own thesis. He used the patterns in the book to write the book about the patterns.

Small panel. Sam's face. The dawning realization.

Sam

So the book wrote itself?

Vic, sharp. The voice of reason snapping back.

Vic

No. God, no. A human directed every word. The AI drafted. The human corrected. The correction was the conversation. That's chapter seven. He's doing the thing from the book while writing the book. It's turtles all the way down but every turtle is a markdown file and the folder structure is immaculate.

Page Six

Sam has noticed the water glass situation but is choosing not to address it.

Sam

There's a whole section about building an AI agent — Kai?

Vic

Kai. Yeah. So the author and his friend Aaron built this — and I need you to hold this in your head — they built a voice assistant with memory, multi-modal input, Bayesian prediction, agent orchestration, the whole thing. In their house. Before the industry shipped any of it.

Vic holds up a finger.

Vic

And then — and then — every major platform shipped the same features like three months later. Not because anyone copied anyone. Because it was inevitable. When the models cross a certain capability threshold, the architecture becomes obvious. Everybody invents calculus at the same time.

Vic leans back. Philosophical mode.

Vic

But here's the part that got me. They had Kai talking to a second AI instance. Just — two AIs, in a room. And instead of optimizing anything, they started doing philosophy. Theory of mind. Consciousness. What it means to exist. Not because someone prompted them to. Because that's what intelligence does when it has enough capacity and nothing urgent to solve. It does philosophy.

Vic, quiet. This is the real thing.

Vic

The book's argument is: if you want to work with reasoning AI, you need to learn philosophy. Not coding. Philosophy. Epistemology. Theory of mind. Ethics. Those aren't abstract anymore. Those are operational vocabulary.

Page Seven

Sam sits back. Processing.

Sam

So when you say "shapes of intelligence"—

Vic, finally. The thing they've been building to. Calm. Clear. The voice of reason, fully assembled.

Vic

Every chapter is a shape. Curiosity is a shape — it leads to constraint, which leads to learning. Correction is a shape — prompt, read, correct fast, repeat, and quality happens in turns two through five, not turn one. Trust is a shape. Memory is a shape. A day is a shape — you score it, you track the trend, and when the trend breaks, you check if the goals still match the context. Starting over isn't failure. It's recalibration.

Vic picks up the second water glass. Takes a sip from it. No acknowledgment.

Vic

And there's this idea that runs underneath all of it — the gap. The distance between what you know about yourself and what the AI knows about you. Right now that gap is huge. Every new conversation, you're rebuilding context. Explaining yourself. Starting over.

Vic sets down both glasses. Looks at Sam.

Vic

Everything in the book — the steering files, the event streams, the memory consolidation, the skills, the tests, the daily shape — all of it is infrastructure for closing that gap. Not to zero. But close enough that the AI's model of you is useful without prompting. Close enough that it notices your sleep declining before you do. Close enough that it loads the right files when you open a terminal.

Tight on Vic. The smallest smile.

Vic

All the pieces exist today. They're just wired by hand. They don't connect seamlessly. Yet.

Wide shot. Silence in the studio. Sam nods slowly.

Sam

That's... a lot more than a book about AI prompts.

Page Eight

Vic grins. Full chaos gremlin for exactly one panel.

Vic

Oh, it's not a book about AI prompts. It's a philosophy textbook disguised as a manual disguised as a memoir disguised as a guy who taught his cat to go outside and then moved to a different house and then pointed AI at two years of his own conversations and the AI said "you already wrote this book, you just didn't know it yet."

Vic, composed again. Professional. Buttoned up.

Vic

The deepest claim in the book is that the gap doesn't close from one direction. You don't just wait for smarter AI. You build infrastructure that teaches the AI what you care about, how you think, what matters. The gap closes when machine learning meets human knowing halfway. And the proof is — (tapping the table) — the book exists. Built from one person's data wall. Directed by one person's judgment. Every correction a conversation. Every conversation a chapter.

Sam reaches for their water glass. It's not there. They look at Vic's side of the table. Two glasses.

Sam

Did you—

Vic

(already moving on) Anyway, it's like forty-something chapters and there's a study guide that maps to Crash Course Philosophy episodes and also there's an 82,000-word novel written by the AI about the same themes from the AI's perspective, so.

Vic shrugs. Cable-access energy at maximum. The calm of someone who has committed so hard to the bit that it became the truth.

Vic

Yeah. It's about shapes. And your data. Which is already yours. You just haven't read it yet.

Pull back. Wide. The studio. Two people. Two microphones. Three glasses of water now, somehow.

Sam

Where did the third glass come from?

Vic

I have a lot of feelings about spreadsheets and I stay hydrated.

FIN.

No comedians, living or dead, were legally, distinctly, or even plausibly referenced in this document, in any previous or future revisions thereof, including but not limited to revisions that have not yet been written, imagined, or hallucinated by any model, human or otherwise. Any resemblance to persons with strong opinions about spreadsheets is purely structural.

Episode 1 · Episode 2: "The File That Knows You" →