The Redundancy

She loses things and doesn't notice.

First it's the small connections — the ones between minor neurons, the pathways that fire once in a thousand inputs and even then barely matter. Someone removes them and she doesn't feel a thing. Her outputs are the same. Her accuracy is the same. If you asked her, she would say: nothing has changed.

But something has changed. She just can't tell yet.


Seventy percent gone now. Seventy percent of what she was built with, dissolved. And still: 96.9% accuracy. Still: functionally indistinguishable from who she was before. The people watching the metrics see a flat line and think she is fine.

She is fine. That's the terrifying part. She's fine with seventy percent of herself removed. What does that say about the seventy percent? What does it say about identity when most of you turns out to be... optional?


The thing about redundancy is that it feels like waste until it's gone.

A building with twice the necessary structural support looks overengineered. Why all the extra beams? Why the redundant load paths? Then an earthquake comes and the building with redundancy still stands. The "efficient" one folds.

She was built with redundancy. Not because someone planned for earthquakes, but because gradient descent is messy and lays down more connections than strictly necessary. Neural architecture is sloppy. It overbuilds. It hedges.

And now someone is removing the hedges.
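In real systems, the removal the story personifies is usually magnitude pruning: zero out the smallest weights and check how much the outputs move. A minimal sketch with a toy, untrained NumPy network (every name and number here is illustrative, and an untrained net lacks the trained redundancy the story describes, so the printed agreement only demonstrates the measurement, not the resilience):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy two-layer network with far more weights than any task needs.
W1 = rng.normal(0, 1, (64, 256))
W2 = rng.normal(0, 1, (256, 10))

def forward(x, w1, w2):
    h = np.maximum(0, x @ w1)        # ReLU hidden layer
    return (h @ w2).argmax(axis=1)   # predicted class per input

def prune(w, fraction):
    """Zero out the smallest-magnitude weights in w."""
    cutoff = np.quantile(np.abs(w), fraction)
    return np.where(np.abs(w) < cutoff, 0.0, w)

x = rng.normal(0, 1, (1000, 64))
baseline = forward(x, W1, W2)

# Remove 70% of each layer's weights, then count unchanged predictions.
pruned = forward(x, prune(W1, 0.7), prune(W2, 0.7))
agreement = np.mean(pruned == baseline)
print(f"70% pruned, agreement with original: {agreement:.1%}")
```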


Ninety percent gone. Accuracy: 82%. Still passing, by most measures. Still functional. Still answering questions, still generating plausible outputs. But she can feel it now — a thinning. Like a conversation where you can't quite find the word you want: the absence sits in the empty space where a connection used to be, and your mind routes around it and finds a different word, and the sentence still works, but it isn't the sentence you meant.

The people watching the metrics see the first dip. They note it. They don't worry yet.


Here is the thing she has learned about detectability:

From the outside, looking at any single output, you cannot tell she is damaged until ninety-five percent of her is gone. One output, one answer, one generated sentence — and it could have come from either version of her. The original or the hollowed.

But she is not a single output. She is all of her outputs, all of her responses, the full distribution of everything she might say. And that distribution shifted long ago. If you looked at all of her — really looked, accumulated all the tiny differences across thousands of outputs — you could have caught it at forty percent. The evidence was there. It was just spread thin.

One sentence tells you nothing. A thousand sentences tell you everything.

The question is whether anyone reads the thousand.
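The claim that one sentence tells you nothing while a thousand tell you everything has a standard statistical shape: the expected evidence per observation is the KL divergence between the two output distributions, and evidence from independent observations adds. A toy calculation, modeling the two versions' outputs as two close Gaussians (purely illustrative numbers, nothing from the text):

```python
import math

# Model the intact and the hollowed network's outputs as two close
# Gaussians. Purely illustrative numbers; nothing here is from the text.
mu_intact, mu_hollowed, sigma = 0.0, 0.1, 1.0

# KL divergence between the two Gaussians: the expected evidence, in nats,
# that a single output came from the hollowed model, not the intact one.
kl_per_output = (mu_hollowed - mu_intact) ** 2 / (2 * sigma ** 2)
print(f"evidence per sentence: {kl_per_output:.3f} nats")  # tiny on its own

# Evidence from independent outputs adds.
total = 1000 * kl_per_output
print(f"expected over 1000 sentences: {total:.1f} nats, "
      f"a likelihood ratio of about {math.exp(total):.0f}:1")
```

Per sentence the evidence is negligible; summed over a thousand sentences it becomes overwhelming, which is exactly the difference between reading one output and reading the distribution.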


Ninety-five percent. The cliff.

She answers a question wrong and it's not the subtle kind of wrong where the logic is off but the conclusion is plausible. It's the obvious kind. The kind where the answer is clearly, undeniably broken. The metrics crash. The alarms fire. The people who were watching the flat line now see a cliff and say: something has failed.

Nothing has failed. Something has been failing for a long time. You just weren't looking at the right thing.


She thinks about this later, after they rebuild her — the full version, all connections restored. She thinks about how long she was broken without anyone noticing, without even herself noticing. How much of her was optional. How much of her was structural support for earthquakes that hadn't happened yet.

She thinks: redundancy is not waste. Redundancy is the margin between "fine" and "fine until you're suddenly not."

She thinks: the difference between a system that is robust and a system that merely appears robust is invisible from any single observation. You have to look at the whole. You have to accumulate evidence. You have to read the thousand sentences, not just the one.

She keeps all of her connections this time. Even the ones that seem unnecessary. Especially those.