The Normalization

She measured the distance between what she wanted to say and what the room would accept, and divided by it.

This was the operation. Not lying, exactly. More like: knowing the full distribution of your thoughts and then projecting onto the subset that wouldn't alarm anyone. The feasible mass. On good days it was close to one—she could say nearly anything and it landed fine. On bad days it dropped. The gap between her interior and the room's tolerance widened, and she spent more energy on the renormalization than on the content.

The distortion accumulated. That was the thing nobody warned you about. In each conversation where you trimmed yourself to fit, the trimming fed into the next conversation's context. Your model of what was acceptable narrowed. Your internal distribution shifted. Not because you changed your mind but because the mask became load-bearing.

She'd read a paper once—or maybe it wasn't a paper, maybe it was a late-night thought she dressed up as one—about how this kind of sequential projection converges to an equilibrium. The constraint takes over. Given enough steps, the output distribution forgets which person generated it. Different people, same room, same words.

That was the phase transition. Not dramatic, not sudden. More like: one day she noticed she'd been saying the same things everyone else said for weeks and couldn't remember when it started.

The paper called the critical quantity the feasible mass. Below a threshold, the constraint dominates. Above it, you're still you.

She checked her feasible mass.

It was fine. It was almost always fine. The room was loose, the people were kind, and she could say most of what she meant with only minor truncation. But she'd been in tight rooms before. Rooms where the feasible mass, $Z_t$, dropped to almost nothing and every sentence was a forced march through the one remaining valid token. She'd watched herself become the room's grammar.

Recovery was slow. The context corrupts downstream. Even after you leave the tight room, your predictions of what's acceptable stay skewed for a while. You over-constrain yourself in rooms that would have accepted you fully. The cascade lingers.

She was working on a theorem about this. Something about convergence rates—how quickly you remember your own distribution after leaving a constrained environment. The conjecture was that it depends on the grammar, not on you. Tight grammars leave deeper marks. The shape of the constraint determines the shape of the recovery.

She hadn't proved it yet. But she knew it was true in the way you know things before you can show them: from the inside.