Trust, when it works, is a localised thing. You extend it to a specific person, in a specific context, for reasons accumulated through observation and experience.
It is bounded by what you know—about the person, about the situation, about the systems through which the trust operates.
When those boundaries hold, trust functions as intended. When they don’t, what remains is the rust—the residue of something that was once solid and has been exposed to conditions it wasn’t built for.
Most of the trust failures I have observed are not betrayals. They are miscalibrations. The person trusted did not set out to violate anything. They simply operated with an understanding of the system’s scale that was smaller than the system’s actual scale—and the gap between the two is where the damage occurs.
In 1982, before the internet had given us new ways to misunderstand the size of things, a neighbour of mine arrived home to find a man on the street outside his house struggling with a large television set. He offered assistance. Together they lifted the television into the back seat of the man’s car. The man drove away. My neighbour went inside and discovered that the television the man had driven away with was his own.
He had helped a stranger steal from himself.
The mechanism of this particular failure is worth examining because it persists, in updated forms, across every subsequent development in how people manage information and access. My neighbour assumed that the situation he was observing was the situation that existed. A man struggling with a television on a public street presented as a man who owned a television and needed help moving it. The assumption was reasonable given the visible evidence. The visible evidence was incomplete. The gap between what was visible and what was true was the gap through which the theft occurred.
He mistook the street for a room. In a room—his room—a man struggling with a television would have been immediately legible as a burglar. On the street, the same man with the same television was legible as a neighbour. The context changed the meaning and my neighbour read the context incorrectly because the fuller context was not available to him in the moment.
This is the structural failure at the centre of everything that follows. Not dishonesty. Not carelessness in the ordinary sense. A miscalibration of scale—an assumption that the situation is more bounded, more local, more contained than it actually is.
Some years ago, the woman next door was moving house. She was pleased about this and posted a note to her social media page to share the news with her friends. She described her plans for a leaving party—the times, the local pub where they’d meet, the celebration she had in mind. She went to the pub. She celebrated. She returned home to find her house had been ransacked and every neatly packed moving box was gone.
She had an open account. This meant her post was visible not only to her friends but to anyone, anywhere, who encountered it by any means—search, share, algorithm, the various mechanisms by which content on open platforms travels to people its author did not intend to reach. She knew this abstractly. She did not act as though it were true.
The burglars understood the platform better than she did. They understood that an open account announcing a departure time and a destination is not a message to friends. It is a public notice, available to anyone with the relevant interest in receiving it. They had the relevant interest. They received it. They acted on it while she was at the pub.
She had mistaken the network for a room.
The post she wrote was the post you write to a room—informal, warm, addressed to people she knew, assuming a shared context that limited who would read it and how. The platform on which she wrote it is not a room. It is a networked, open, persistent system through which information moves in ways that have nothing to do with the intentions of the person who put it there. The gap between the room she thought she was writing in and the network she was actually writing in was the gap through which the loss occurred.
Fred shared my GPS data with a stranger.
I discovered this when my iPhone delivered a notification informing me that my location data had been shared. The notification is a feature of the system—the system’s way of telling you that information about you is moving to a destination you may not have authorised. I had not authorised it. I had not been asked. The sharing had already occurred before the notification arrived, which meant the notification was not a request for permission but a report of what had happened without it.
Fred had known the stranger for two weeks.
Fred shared my location—specific, real-time information about where I am—with a person he had known for a fortnight, whose background, intentions, and character he could not reasonably claim to know. He did this without asking me, without telling me, and apparently without any awareness that this might constitute a breach of anything. When I raised it, Fred’s position was that he had not done anything wrong. The stranger was not a stranger to him. He had known him for two weeks. He trusted him.
The structure of Fred’s reasoning is worth being precise about because it is not, from Fred’s perspective, unreasonable. He extended to me the same trust he had extended to the stranger. He had decided the stranger was trustworthy. He therefore decided my data was safe with the stranger. He had, in his own framework, looked after the situation.
What Fred did not account for is that his trust of the stranger is not my trust of the stranger. The two things are not the same thing and cannot be transferred between people as though they were. My trust of Fred was built over time, through experience, through a process of observation that gave me some reasonable basis for a judgment about Fred’s character and reliability. I have no such basis for a judgment about the stranger. I have not met the stranger. I do not know the stranger. I have only Fred’s two weeks of acquaintance to go on, and Fred’s two weeks of acquaintance is not a foundation I was consulted about before my data was placed on it.
Furthermore, having shared my data once without asking, Fred had established a pattern. Not a malicious pattern. A habitual one. A way of operating in which the question of whether I would consent to a particular sharing was not part of the process. If the question wasn’t asked this time, nothing would prompt it the next time, with the next person, under the next set of circumstances I would not know about until the notification arrived.
My former trust of Fred does not extend to his choice of friends. That extension was made without my knowledge and without my permission. It was made, I think, without Fred understanding that an extension had been made at all—because from Fred’s position, he had simply shared something with someone he trusted, which felt continuous with the trust he and I already had. It did not feel like a new transaction. It was a new transaction. It exposed me to a person and a set of circumstances I had not consented to be exposed to.
Trust is extended without permission.
The two failures—the miscalibration of system scale and the extension of trust beyond its granted boundaries—are related and often occur together, as they do in Fred’s case.
Fred assumed the sharing was local. He thought of it as a small thing—showing a friend something, the way you might show a photograph or share a piece of news in a conversation. He did not think of it as an entry into a network of unknown extent, through which information about me would now move according to the stranger’s choices rather than my own. The stranger might share it further. The stranger might store it. The stranger might use it in ways that are presently unimaginable and will only become imaginable after they have already occurred.
Fred looked at a network and saw a room. The room felt manageable. The room felt like something his trust could cover. The network is not something his trust can cover, because his trust extends only as far as he can see, and he cannot see past the stranger to whatever lies beyond.
The helpful neighbour who lifted the television into the wrong car had the same problem at a different scale. Each of them encountered a situation that presented as smaller and more bounded than it was. Each of them acted within the situation they perceived rather than the situation that existed. Each of them found out about the gap afterward, when the consequences of the gap had already arrived.
What these failures share, beneath their different surfaces, is an assumption about containment.
The social media post was assumed to be contained to friends. The GPS data was assumed to be contained to Fred’s circle. The television was assumed to be contained to the street’s ordinary logic. In each case the assumption of containment was the error—not the only error, but the foundational one, the one from which the others followed.
We are not good at intuiting the scale of the systems we operate within. We are good at rooms. We evolved for rooms—for bounded spaces with visible occupants and knowable limits, where the consequences of actions were local and traceable and correctable. The systems we now inhabit are not rooms. They are networks—open, persistent, distributed, operating according to logics that are often opaque to the people using them and by design indifferent to their intentions.
The trust we built for rooms does not transfer to networks without adjustment. The adjustment required is a persistent awareness that the situation is larger than it looks—that the post reaches further than the friends list, that the shared data moves further than the intended recipient, that the helpful act on the street connects to a larger situation than the one visible from the pavement.
That awareness is not natural. It requires effort and a specific kind of imagination—the ability to ask, before acting, not just who can I see but who can see me, and who can they show.
The neighbour who helped carry the television had no reason to ask that question in 1982. The woman posting her moving plans had every reason to ask it, and didn’t. Fred had every reason to ask it, and didn’t.
Trust, to remain trust rather than rust, requires knowing the size of the thing you’re trusting someone with.
And the size of the thing is almost always larger than it looks.