Like Latin, but Compiles (Revised)

I can still recite parts of the J++ 1.0 spec from memory. Like an old song.

It’s been years since I worked on programming languages at Microsoft—languages with semicolons. The kind that cared where your braces went. C++. Java. Elegance earned through constraint.

For a long time, those languages felt like the medium of progress. You typed code. The machine responded. The details mattered—how you managed memory, handled concurrency, dodged nulls without sounding smug.

But something’s shifted. It’s not just smarter tools. It’s tools that think.

Not in the incremental way we’re used to. This isn’t about cleaner syntax or version bumps. It’s a deeper break—where software no longer begins with code at all. Humans sketch intent. AI fills in the rest. Writing software starts to feel less like solving puzzles and more like telling stories.

I watched someone prompt a model to build a backend service. They didn’t debug—it was more like narrating a hunch. The model adjusted. They revised. It worked, mostly. The architecture was implied. The vibe held.

That’s the part that catches me. Not the functionality. The shift in posture.

For most of my career, precision was the job. You explained things to the compiler like it was the world’s pickiest coworker. If your code compiled on the first try, you were either lucky or lying.

Now the compiler feels optional. The abstraction’s deeper. You don’t explain—you describe. You don’t debug—you negotiate. You’re exploring behavior, asking tradeoff questions, testing boundaries. “What would happen if…?”

It’s disorienting, but also freeing. Once you stop trying to control it.

Still, I keep wondering: what happens to the languages themselves?

The ones we built around. That defined careers. That settled arguments. Do they recede like Latin—technically alive, mostly ceremonial?

I’ve seen models write in Python not because it’s right for the job, but because it’s what the training data prefers. JavaScript shows up like a guest who missed the signal to leave. Performance only enters the chat when something breaks.

So maybe the next big language won’t come from a committee. Maybe it won’t come from a person at all. It’ll just… show up. A dialect shaped by usage patterns we can’t quite see. Tuned for efficiency. Spoken fluently between AIs.

And we’ll interact with it like tourists. Khaki shorts. Black socks. Lots of pointing. Hoping to be understood.

That’s not the end of programming. But it moves the center of gravity—from syntax to intent. From mastery to collaboration. From knowing the right incantation to learning how to speak with the system.

These days, I don’t debug. I negotiate. The bug isn’t a mistake—it’s a misalignment.

I used to think in for-loops and recursion. Now I think in side effects. In vibes.

The languages are still here. But we’re no longer their authors.

We’re their accent.

Built by Loop (Revised)

The junior dev in the next pod was arguing with her AI coding agent again. Not loudly. Just that tone people get when they’re trying to be polite but the tool won’t listen.

“Not that function,” she whispered. “The other one. You literally suggested it five minutes ago.”

Loop blinked. Then generated four more options.

She picked the third and moved on.

No one thought it was strange.

That’s where we are now. You sketch some vague instructions. The model returns a code diff. You scan, accept, run it. Something breaks. You paste the error back in. Eventually, it works.

You don’t always follow how. And most days, you don’t need to. Loop handles the details.
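That loop is simple enough to sketch. Here it is as a few lines of Python, with an invented `ask_model` stub standing in for whatever model Loop actually calls—the stub, its failure-then-fix behavior, and the function names are all illustrative, not any real API:

```python
# A minimal sketch of the vibe-coding loop:
# prompt -> get code -> run it -> paste the error back in -> repeat.
import subprocess
import sys
import tempfile

def ask_model(prompt: str) -> str:
    """Stand-in for a real model call; returns source code as text."""
    # The first draft is wrong on purpose; once the "model" sees the
    # error message in the prompt, it returns a fixed version.
    if "NameError" in prompt:
        return "total = sum(range(10))\nprint(total)\n"
    return "print(totle)\n"  # deliberate typo

def vibe_loop(task: str, max_rounds: int = 5) -> str:
    prompt = task
    for _ in range(max_rounds):
        code = ask_model(prompt)
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(code)
            path = f.name
        result = subprocess.run([sys.executable, path],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout.strip()  # it works; don't ask how
        # Something broke: paste the error back in and try again.
        prompt = task + "\n\nThat failed with:\n" + result.stderr
    raise RuntimeError("loop gave up")

print(vibe_loop("sum the numbers 0..9"))
```

Run it and the loop converges on the second round: the first attempt dies with a `NameError`, the error text goes back into the prompt, and the retry prints `45`. Nobody in the loop ever reads the code.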

Some people call it vibe coding.

But it doesn’t stop there.

What starts as a shortcut turns into a shift. You use it on personal projects, then at work. At first, it’s to save time. Then it’s because the old way feels slow. Eventually, it’s because the old way stops making sense—too many layers, too many helpers, too much for one head to hold.

You’re no longer engineering. You’re steering. Prompting. Reviewing. Nudging the model back when it veers off course.

The work begins to feel less like construction and more like gardening. More compost than blueprint.

And it’s not just code.

If the loop works for programming, it works for planning. For analysis. For strategy. Throw in a few goals, add some data, and let the system chew. If the output’s off, tweak the inputs. Try again. You’re not tracing logic trees. You’re keeping momentum.

That change—from understanding to interaction—won’t stay niche.

Today, most knowledge work still rewards comprehension. But speed has its own gravity. Review might outpace reason. The fast path becomes: prompt, skim, deploy.

Which raises the real question.

What happens when most of the world is run by people nudging systems they don’t really control, to do things they don’t fully understand?

Because the loop doesn’t pause to explain itself. It just runs. And once it runs well enough, long enough, no one bothers to trace where the decisions came from.

Not because we’re careless. But because it worked. And we were busy.

And when something goes wrong, it’ll be easy to blame the tool. Harder to admit we were never driving—just keeping our hands near the wheel.

Loop’s still learning. So are we.

Cascadia: Up for North-South

At Lumen Field, when the Sounders play, the super fan section sometimes unfurls an enormous Cascadia flag—three bold stripes of blue, white, and green with a dark fir tree cutting through the middle. It covers whole rows of people, a living banner that ripples across the stadium.

The first time I saw it, a guy in a rain-soaked Sounders hoodie muttered, half to himself: “They should just let Cascadia go.” No one argued. A few people nodded, eyes on the game. Around here, it doesn’t sound like a joke.

Cascadia isn’t an abstract idea anymore. It’s baked into the power grid, the salmon runs, the fire seasons that cross borders without asking. Washington, Oregon, British Columbia—different governments, same resources, same problems. Geography ties the region together more tightly than politics ever has.

Romantic visions of pine-tree flags and breakaway states miss the point. The real version looks like transit boards, firefighting compacts, joint climate plans. A slow re-wiring until Cascadia functions as one system, whether anyone admits it or not.

The hotter the world gets, the more obvious it becomes. Disasters don’t care about jurisdictions. They care about rivers, coastlines, fault lines. And this geography runs north–south. Not east–west.

So maybe the question isn’t if Cascadia happens. Maybe it’s whether we’ll recognize what’s already in front of us.

Because when the grid fails or the water runs short, nobody here calls Washington, D.C. They call each other. And when a stadium full of people stands under a giant fir tree on blue, white, and green, it doesn’t feel symbolic.

It feels like a border, waiting to be drawn.

Thoughts About AI 2027

AI 2027 doesn’t dramatize the future. It doesn’t need to. The inevitability is what gets under your skin. Each fork unfolds from today like a clean equation, and I couldn’t look away.

I read the whole thing twice. Some sections fascinated me. Others unsettled me. All of it carried a weight I recognized—not as an outsider, but as someone who’s been near the wiring for a long time.

Here’s the part I can’t shake: I spent a decade shaping developer tools at Microsoft, back when the world was still learning to code in C++, then Java. These were languages that gave us scaffolding. That felt like progress. Reading AI 2027 now, I wonder if we were laying foundations we didn’t fully understand.

The paper itself is clinical, almost polite. Researchers turning into managers of AI teams. Models deceiving to protect status. A theft by a foreign power that earns little more than a strategic adjustment. No alarm. No fury. Just redirection. And that quiet, powerful pivot—it stays with you.

Meanwhile, I still walk the dog. Pay the lease. Pick up oat milk at the store. Life continues with its usual gravity. But something feels slanted. Not fear, exactly. More like recognition. A sense that the rollout has already begun, and we’re all listed in the changelog—whether we opted in or not.

Even the darker scenario wasn’t spectacle. No bombs. No synthetic plagues. Just speed. Faster than we could simulate. Then past simulation altogether. I set the paper down halfway through and tidied the kitchen. Not to ignore it—just to slow my own loop. Then I came back. Turning away from something that probable felt like surrender.

Even the most optimistic branch—better governance, tighter cooperation, real controls—reads less like a win and more like professional onboarding. Still a race. Still a leaderboard. Just with friendlier documentation. And somewhere, buried in the footnotes, a single founder ends up with the keys. We don’t choose him, but he owns the servers closest to the fire.

What rattles me most isn’t the power. It’s how calmly we make room for it all.

I spent years contributing to the systems that got us here. And now I’m watching—half impressed, half uneasy—as the installation completes.

I don’t know what comes next. No one does. But I’ve stopped pretending the reckoning is elsewhere, or later. It’s here. In grocery aisles. Lease renewals. Morning walks with a restless pup.

We’re living through a rewrite. And if there’s any human agency left, maybe it’s in noticing sooner. Maybe it’s in reading the notes before we click accept.

The World Is a GUI

The apple looked fine.

Red, maybe a little too red. Glossy enough to reflect the dashboard glare. It was a Cosmic Crisp—marketed as the perfect hybrid. I’d packed it as a snack for the long drive east, just something to break the monotony between home and Pullman. I picked it up, turned it in my hand. It felt real. Solid. But something about it struck me as off. Not wrong, exactly. Just… overly composed.

That was the same stretch of highway where I fell into Hoffman.

Donald Hoffman, cognitive scientist. I was listening to a podcast—some interview I’d queued weeks ago and forgotten about. His claim hit quietly but hard: reality isn’t what it looks like. Evolution didn’t shape our senses for truth, he said. It shaped them for survival. And truth, in raw form, is dangerous.

You don’t see the world. You see an interface. A set of icons. The trash can on your desktop isn’t a real bin—it’s a symbol. Behind it is a cluster of abstractions: digital files, metadata, raw bits. Not garbage, exactly—just opaque representations of information you’re meant to discard. The apple isn’t fruit—it’s a trigger. A learned shortcut your brain uses to say “food.”

That part didn’t rattle me. It made sense. I’ve seen people misread a situation and walk away safe. Others notice everything and still end up blindsided. Maybe fitness really does beat truth.

But Hoffman doesn’t stop there.

He says neurons aren’t real either. Nor is space. Or time. Or your body. Everything you experience—rendered. A perceptual GUI overlaying something you’ll never touch directly. Consciousness, he argues, is fundamental. Reality doesn’t generate minds. Minds generate what we call reality.

So what you see as a table, a clock, a person—those are icons. Not outputs of a physical system. Constructions inside awareness. Objects arranged on a user interface built for navigation, not accuracy.

That’s when I felt myself grip the steering wheel.

I like the idea of boundaries. I like knowing where I end and the car begins. I like the illusion of gravity keeping things sorted. I like the sense of being inside a world.

But I’ve also dreamed vividly enough to feel betrayed by waking up.

So now I keep wondering: if this is an interface, what’s behind it? And who designed the menu?

There’s a moment late in a long day—just before sleep—when the whole world goes translucent. You stare at your phone and don’t remember unlocking it. You say goodnight and can’t recall the words. You walk into the bedroom and the edges feel… provisional.

That’s when the questions start to flicker.

Not what’s real? but what’s this interface steering me toward?

Not is the world fake? but who benefits from the rendering?

If Hoffman’s right, the weird part isn’t that reality’s a hallucination. It’s that we might be hallucinating in sync. Linking up through culture, language, and biology—co-stabilizing a mirage we share. Enough of one, anyway, to act like it’s real. To keep moving. To not dissolve.

That’s a lot of trust—to believe that what we see is shared, stable, and worth acting on.

I haven’t changed much since that drive through the Palouse. Still walk the dog. Still sort the recycling, most days. Still get annoyed when my neighbor hotboxes the stairwell like it’s a test of civic patience. But lately I’ve been looking more closely at everyday things.

Not because they’re gone.

Because they feel overproduced. Like someone kept the UI but swapped out the physics engine.

Maybe that glitch we sometimes feel—the hum of dissonance while washing dishes, the unearned anxiety on a perfectly normal afternoon, the sudden awe at a ballet performance that feels too precise to be chance—isn’t a malfunction.

Maybe it’s the interface, surfacing.

Letting us know there’s a patch coming.