May 9, 2026

From particle to pattern: what counts as information in a world that counts us back

People hear “information” and picture spreadsheets and downloads. Not wrong, just too small. The older and stranger claim is this: the world runs on relations. A thing is what it can do with and to other things. Which is to say, how it’s constrained. That set of constraints and possible moves is information. Not “data” piled in a warehouse. The shape of affordances nested in the physical, living, and social.

John Wheeler’s lineage (“it from bit”) gets pushed into slogan territory, but the practical read is simpler. Every measurement fixes a difference. A boundary appears. The boundary then governs subsequent motion. A crack line in a frozen lake steers the next crack; one snapped thread redirects stress across a spider web; one traffic signal retimes a neighborhood, softly. These are not metaphors for pattern. They are the pattern doing work. You can call that the substrate itself, informational not because someone wrote it down, but because the world logs and propagates relations at every scale.

Physics, if you pick the right guide, already talks this way. Landauer: erasing a bit dissipates at least kT ln 2 of energy, so information carries a thermodynamic price. Rovelli: time and sequence show up locally, tied to interactions, not some master clock. So a fold in a protein is a chosen path through a landscape of options; a snowflake doesn’t “compute” in the digital sense, but is selected by constraints—humidity, impurity, pressure—into a specific identity. The identity is held by rules that prune what can’t happen. Rule as negative space, power by refusal. That negative space is information, too.
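Landauer’s bound is small but concrete. A minimal sketch (standard physics, not anything the essay derives): the minimum energy to erase one bit at temperature T is kT ln 2, where k is the Boltzmann constant.

```python
import math

BOLTZMANN_K = 1.380649e-23  # J/K, exact by SI definition


def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum energy (joules) dissipated to erase one bit at temperature T."""
    return BOLTZMANN_K * temperature_kelvin * math.log(2)


# At room temperature (~300 K) the price is a few zeptojoules per bit:
# tiny per bit, nonzero in principle, which is the point.
print(f"{landauer_bound(300.0):.3e} J per bit at 300 K")
```

Real circuits dissipate many orders of magnitude more than this floor; the bound just establishes that erasure is never free.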

We can climb to the human layer without breaking the frame. A treaty is real because it prevents action; a standard (USB-C, a vaccination schedule, a shared calendar) is real because it forces coordination. Nothing mystical. The function of your phone depends on wildly brittle agreements, from voltage tolerances to character encodings. Lose the agreements and the glass and silicon revert to paperweights. The memory that keeps those agreements coherent doesn’t live in one chip or one brain. It lives in procedures, guild habits, tools that force hand and mind into specific grooves. If you want a long read that takes the same route—skeptical of sci-fi gloss, stubborn about embodied pattern—see Information as substrate.

Point being: when we say the world is informational at base, we aren’t flattering code or giving computers a promotion. We are noticing that a “difference that makes a difference” (Bateson’s phrase) is the stuff that persists and shapes. The planet, left to itself, keeps ledgers. Not in ink. In channels that deepen each time water passes.

Consciousness as a local reception point: compression, sequence, and inherited moral memory

What about us. The stream of awareness feels authoritative. A center that decides. But look closer and the self reads more like a port than a monarch. Process, not throne. The senses deliver ridiculous bandwidth; the brain throws away nearly everything and keeps a sketch. Useful, not faithful. We’re living inside a summarizer that runs nonstop. That makes the “me” a kind of fast compression—packaged state that can steer muscle and speech right now, under risk.

In that view, consciousness becomes a receiver, then a small broadcaster. Local reception of the wider substrate’s signals—light, heat, words, posture—plus the inherited priors baked by childhood and culture. The priors matter more than we admit. They’re not opinions but stored sequences that have been selected by survival and coordination. How to greet. What not to say to an elder. When to leave the room. I can learn these “rules” explicitly, but mostly they arrive as timing and pressure. Pause length, eye-line, humor permitted/not permitted. The format is relation, not syllogism.

That’s why religion (forget theology for a moment) reads as a human technology for carrying memory across generations. Calendars that loop back. Litanies that pressure the tongue and body into synchronized breathing. Taboos that sand down costly impulses, most of the time, most places. You can dislike the outcomes—many were cruel, some still are—but as a storage device for costly lessons, they’re efficient. Henrich and others have written about this with more care than a paragraph allows. The point is simple: slow cultural filters conserve bits that keep groups alive, and they do it by constraining behavior into repeatable patterns. The substrate “remembers” by ritual, institution, and vibe, long before anyone writes a treatise.

Time enters from the side. If sequence is local—a pattern we lay over events to coordinate—then the “present” is a negotiation between short-term compression (working memory, the checklist, the glance) and long-term moral memory (norms, stories, binding oaths). In a surgical theater, a one-page checklist averts error not because it adds new data but because it imposes order. Thin order, just enough to prune catastrophe. On a fishing coast, knowledge of tides is not a number but a feel for how the wind-driven swell will refract at a headland after three days of northerlies. Compression again, precise enough to act.

Small scene: you walk into a room, absorb the gist, decide to speak or wait. Microseconds of reception. Your mouth opens. That’s a substrate phenomenon. Not pure “free choice” and not mere reflex. An interaction between tight personal compression and slow, inherited scaffolds that your body learned by doing. And keeps learning. Even now.

Building machines on a living substrate: why safety patches fail and what to try instead

We keep insisting that modern AI lives on data. Terabytes in, general intelligence out. But if the world works by constraint and relation, then systems trained on surface text—thin snapshots of prior compressions—will miss the thing that matters: social and physical friction that prunes possibility. Corporate governance responds with filters and policy rubber. Ask the model to avoid vice. Insert a “guardrail.” It’s duct-tape ethics. It passes audits and fails contact with the open world.

Why. Because memory at human scale is slow and externalized. It accrues costs and exceptions. Safety glosses abstract away those costs into neat categories—allowed, disallowed—disconnected from the machinery that produced them. You end with a brittle smiley mask that collapses under load. A crisis arrives (misinformation surge, targeted harassment, local slang that flips meaning), the patch-set mismaps the situation, and the machine obliges with confident nonsense or softened harm. We act surprised. We shouldn’t be. We built a performer without a stage manager, then wrote a code of conduct it can’t feel.

Designing differently won’t make the contradictions vanish, but a substrate lens pushes toward structural changes:

– Embed models in feedback loops with real stakes. Not just A/B tests for click yield. Systems that must answer to external, slower constraints—professional standards bodies, community oversight that can halt deployment, legal traceability back to training decisions. If erasure has a cost (Landauer), so should “forgetting” a failed behavior. Make forgetting expensive enough that the system accumulates honest scars.

– Prefer smaller, situated models coupled to sensors and rituals that calibrate them. The ritual can be as mundane as a daily review with people who carry the local moral memory—a clinic’s nurses, a city dispatcher, a union steward who knows when the form is lying. The calibration is not vibe-check theater. It’s attachment to the lived substrate that gives meaning to categories like “risk,” “consent,” “fairness.”

– Replace “moral patching” with constraint design. For a transit network, don’t ask a model to be “fair.” Build signals that meter flow by pressure, publish the rule, and let neighborhoods audit the effect in public. Fairness becomes a maintained equilibrium, not a promise. And because the rule bites—the information lives as a limit on action—it survives leadership turnover and quarterly games.

– Open the ledgers. If the universe keeps accounts by shaping what can happen next, then our systems should do the same at the institutional level. Open methods, reproducible training pipelines, community labs that rerun the claims, incentives for negative results. Not because openness is holy—because secrecy dissolves memory. It prevents the slow, grudging corrections that make tools safe enough to share.
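The first bullet’s idea that “forgetting” a failed behavior should cost something can be sketched directly. A minimal illustration (the ledger, the budget, and all names here are my hypothetical assumptions, not anything the essay specifies): an append-only failure record where removal debits a finite budget and leaves a tombstone, so erasure is possible but never free or invisible.

```python
from dataclasses import dataclass, field


@dataclass
class ScarLedger:
    """Append-only failure record where erasure debits a finite budget."""
    erasure_budget: float = 10.0
    scars: list[str] = field(default_factory=list)
    tombstones: list[str] = field(default_factory=list)  # audit trail of erasures

    def record(self, failure: str) -> None:
        """Logging a failure is always free."""
        self.scars.append(failure)

    def erase(self, failure: str, cost: float = 1.0) -> bool:
        """Erasing a scar spends budget and leaves a trace of the forgetting."""
        if failure not in self.scars or cost > self.erasure_budget:
            return False
        self.scars.remove(failure)
        self.erasure_budget -= cost
        self.tombstones.append(failure)  # the fact of forgetting is itself remembered
        return True
```

The design choice doing the work: recording is free, erasing is metered, and even successful erasure is logged. A system run against such a ledger accumulates scars by default.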
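The transit bullet can also be made concrete. A minimal sketch (the rule, the numbers, and the approach names are hypothetical, invented for illustration): a signal whose green time is a published, deterministic function of measured queue pressure, so any neighborhood can recompute the allocation and audit it.

```python
def green_seconds(queue_lengths: dict[str, int],
                  cycle_seconds: int = 90,
                  min_green: int = 10) -> dict[str, int]:
    """Published metering rule: split one signal cycle across approaches
    in proportion to queue pressure, with a guaranteed minimum green.
    Deterministic in, deterministic out, so the allocation is auditable."""
    approaches = list(queue_lengths)
    # Reserve the guaranteed minimums, then share the rest by pressure.
    budget = cycle_seconds - min_green * len(approaches)
    total = sum(queue_lengths.values())
    greens = {}
    for a in approaches:
        share = queue_lengths[a] / total if total else 1 / len(approaches)
        greens[a] = min_green + round(budget * share)
    return greens


# Example: a heavier northbound queue earns more of the cycle,
# but every approach keeps its minimum green.
print(green_seconds({"north": 30, "south": 10, "east": 10, "west": 10}))
```

Nothing here is “fair” by decree; the rule simply bites, in public, and the equilibrium it maintains is open to challenge.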

There’s a quieter point underneath. Most of our technical metaphors put agency in clean boxes: the model, the user, the regulator. But agency leaks. It’s the joint product of environment and tool, language and hand. Treating simulation as a quick stand-in for substrate makes us sloppy. A flight simulator is not a sky. A social graph is not a town square. You can learn from both, but not the same things, not with the same obligations. Which leaves a hard design question (one I don’t know how to round off): how much slowness—how much inherited constraint and accountable drag—do we need to build into our fast systems so they stop performing wisdom and start accumulating it?
