Nostr's Core Principle Isn't Really Censorship-Resistance, but Actually Freedom of Association

Nostr is Pro-Censorship: Decoding PoW and WoT -- Unpacking Relay Consensus and Perceived Immutability

The Misunderstood Protocol

There is a phrase on the Nostr website that confuses almost everyone who encounters it for the first time. It sits beneath a shield icon and reads: “Pro-censorship.”

This is not a joke. It is not irony. It is not some provocation designed to generate controversy among the digital freedom crowd. It is, rather, the most honest and most misunderstood statement in the entire protocol documentation.

How can a platform built by Bitcoiners, embraced by dissidents, and designed explicitly for uncensorable communication declare itself “pro-censorship”? The answer reveals everything about how Nostr actually works—and why its core principle is not censorship resistance at all.

The core principle of Nostr is freedom of association. Censorship resistance is an emergent property, a side effect, a consequence of deeper design choices. When you understand freedom of association as the foundation, the “pro-censorship” stance becomes not a contradiction but a logical necessity.


The Meaningless Banner of “Free Speech”

Before we can understand why Nostr is “pro-censorship,” we must first understand why the conventional notion of “free speech platforms” is fundamentally incoherent.

Every platform that has ever declared itself a bastion of free speech has eventually faced the same dilemma. Someone posts something genuinely awful—child sexual abuse material, explicit threats of violence, detailed instructions for committing crimes. The platform must decide: does “free speech” protect this?

If they say yes, they become complicit in criminal activity and lose advertising revenue, payment processing, and app store access. If they say no, they become censors, and the “free speech platform” label becomes meaningless.

This is not a bug in the implementation. It is a feature of the architecture. Centralized platforms, by their nature, must make content moderation decisions that apply to everyone. They are, in effect, governments of their own digital territories. And like all governments, they must either enforce rules or descend into chaos.

The phrase “free speech” in this context does real ideological work. It obscures the inevitable moment of choice. It suggests that there is some neutral position, some principled stance, that can be occupied indefinitely. There is not. Every platform eventually chooses who to exclude. The only question is whether they are honest about it.

Nostr refuses to pretend otherwise. It does not claim to be neutral. It does not claim to protect free speech. It recognizes that different people have different morals and preferences, and that each server, being privately owned, can follow its own criteria for rejecting content as it pleases.


Freedom of Association as First Principle

The American Civil Liberties Union, in its decades of free speech litigation, has consistently defended a principle that outsiders often find confusing: the right of the Ku Klux Klan to march is also the right of a private organization to exclude them. Freedom of speech and freedom of association are two sides of the same coin.

You cannot be forced to associate with those you disagree with. You cannot be forced to provide a platform to those you find abhorrent. The right to speak implies the right to choose who you speak with and where you speak.

Nostr takes this principle seriously—more seriously than any platform that has ever existed.

The protocol makes no attempt to define acceptable content. It imposes no global rules. It maintains no central list of banned topics or users. It has no community guidelines, no terms of service, no acceptable use policy. It cannot have these things, because it is not a platform. It is a protocol.

This is not an oversight. It is the entire point.

By refusing to define what is acceptable, Nostr refuses to become a government. It refuses to make the choices that every platform must make. It outsources those choices to the only entities that can legitimately make them: the individual relay operators who choose what to accept on their own servers.


Relays as Sovereign Territories

Imagine, if you will, a vast archipelago. Thousands of islands, each with its own ruler, its own laws, its own culture. Some islands welcome everyone. Some require visitors to pass tests. Some charge entry fees. Some are open only to specific groups. Some are hidden, known only to those who have been invited.

This is Nostr.

Every relay is a sovereign territory. Its operator decides who can post, what content is acceptable, how long data is retained, and who can read what. These decisions are not subject to appeal. There is no higher authority. There is no global court of relay justice. There is only the individual choice of each operator, and the individual choice of each user to associate with that relay or not.

This architecture is not a concession to reality. It is not a compromise forced by technical limitations. It is the deliberate, intentional design of a system that takes freedom of association seriously.

The relay operator who wants to run a family-friendly space can ban anything they consider inappropriate. The operator who wants to create a haven for political dissidents can accept content that would get them arrested elsewhere. The operator who wants to charge for access can do so. The operator who wants to provide free service funded by donations can do that too.
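Each of these policies is just code running on the operator's own server. As an illustrative sketch (the policy fields, relay configurations, and pubkey values below are entirely hypothetical; real relays implement this however they like):

```python
# Sketch of a relay's local acceptance policy. Every field and rule here
# is hypothetical -- each operator writes their own criteria.
def accepts(event: dict, policy: dict) -> bool:
    """Return True if this relay's policy admits the event."""
    # A paid relay only accepts events from pubkeys that have paid.
    if policy.get("paid_only") and event["pubkey"] not in policy["paying_pubkeys"]:
        return False
    # A family-friendly relay rejects content matching its own banned list.
    if any(word in event["content"] for word in policy.get("banned_words", [])):
        return False
    return True

# Two sovereign relays, two incompatible policies -- both legitimate.
family_relay = {"banned_words": ["expletive"], "paid_only": False}
paid_relay = {"paid_only": True, "paying_pubkeys": {"npub_alice"}}
```

The same event can be rejected by one relay and welcomed by another; neither decision binds anyone else.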

None of these operators is wrong. None of them is failing to uphold some abstract principle of free speech. They are simply exercising their freedom of association—and in doing so, they are creating the conditions for a network that no single authority can control.


The Paradox: Embracing Censorship at the Node Level

Nostr’s approach to censorship resistance contains a profound insight that its “pro-censorship” labeling makes explicit: the network resists censorship precisely because individual nodes are allowed to censor.

When relay operators face legal pressure or censorship demands, they can simply comply (delete specific content), thereby ensuring their own server’s survival. However, because the user’s client broadcasts the same event to multiple relays simultaneously—relays that may be distributed worldwide and subject to different jurisdictions—the information itself survives in the network. Even if one, ten, or even hundreds of relays delete a piece of information, as long as one relay retains a backup, it remains accessible.

This design creates a radically different incentive structure for relay operators. Rather than being forced to fight every censorship battle, they can comply with local demands while the overall network maintains resilience through diversity.

The system’s resilience comes from the diversity of the network, not the uniformity of its members. The system’s total knowledge is the union of data held by all relays, not the intersection. A piece of information survives as long as at least one relay somewhere continues to host it.
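The union-versus-intersection point can be made concrete in a few lines of Python. The relay names and event IDs below are hypothetical placeholders:

```python
# Sketch: the network's total knowledge is the union of relay stores.
# Relay names and event IDs are hypothetical illustrations.
relays = {
    "relay-a": {"event1", "event2"},
    "relay-b": {"event2", "event3"},
    "relay-c": {"event1", "event3"},
}

def network_knowledge(relays: dict) -> set:
    """An event is reachable as long as at least one relay hosts it."""
    union = set()
    for store in relays.values():
        union |= store
    return union

# A censorship demand hits relay-a: it complies and deletes event1 locally.
relays["relay-a"].discard("event1")

# event1 survives in the network, because relay-c still hosts it.
assert "event1" in network_knowledge(relays)
```

Deleting an event from the network would require every relay hosting it, in every jurisdiction, to comply at once.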

This is the paradox made visible: by allowing every relay to censor, Nostr creates a network that no one can censor. The “pro-censorship” label is not a contradiction. It is a precise technical description of how censorship resistance actually works.


Proof-of-Work as Anti-Spam, Not Consensus

Nostr includes a proof-of-work mechanism defined in NIP-13, but it serves a very different purpose than Bitcoin’s proof-of-work. In Nostr, proof-of-work is not a consensus mechanism. It is an anti-spam tool.

A relay can require that events include a proof-of-work—a computational effort that makes spam expensive. The event includes a nonce tag and a difficulty target. The client must find a nonce that produces an event ID with a certain number of leading zero bits.
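A rough sketch of how a client might mine such a nonce follows. It uses the NIP-01 event serialization (`[0, pubkey, created_at, kind, tags, content]`, sha256-hashed) to compute the ID, and counts leading zero bits as NIP-13 defines difficulty; signing is omitted, and the pubkey and timestamp in the usage note are placeholders:

```python
import hashlib
import json

def count_leading_zero_bits(hex_id: str) -> int:
    """NIP-13 difficulty: number of leading zero bits in the event ID."""
    bits = 0
    for ch in hex_id:
        v = int(ch, 16)
        if v == 0:
            bits += 4
        else:
            bits += 4 - v.bit_length()  # e.g. 0x1 has 3 leading zero bits
            break
    return bits

def event_id(pubkey, created_at, kind, tags, content) -> str:
    """Event ID per NIP-01: sha256 of the canonical JSON serialization."""
    payload = json.dumps([0, pubkey, created_at, kind, tags, content],
                         separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(payload.encode()).hexdigest()

def mine(pubkey, created_at, kind, content, target: int):
    """Increment the nonce tag until the event ID meets the target."""
    nonce = 0
    while True:
        tags = [["nonce", str(nonce), str(target)]]
        eid = event_id(pubkey, created_at, kind, tags, content)
        if count_leading_zero_bits(eid) >= target:
            return eid, tags
        nonce += 1
```

For a small target such as 8 bits this takes a few hundred hash attempts on average; each additional bit doubles the expected work, which is exactly what makes bulk spam expensive.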

This is optional. A relay can ignore proof-of-work entirely. It can require a minimum difficulty. It can adjust difficulty based on the user’s reputation. The choice is the relay’s.

Proof-of-work in Nostr is not about securing the network. It is about giving relay operators a tool to manage spam without making subjective content judgments. A relay that requires proof-of-work is not saying “your content is bad.” It is saying “you must prove you expended effort to post here.”

This is freedom of association in practice. The relay gets to set the terms of entry. The user gets to decide whether those terms are acceptable. Neither is imposing their will on the other. Both are exercising their freedom to associate or not.


The Web of Trust: Reputation Through Association

The Web of Trust, visualized at freakoverse.github.io/wotonnostr/, is the social layer that gives meaning to freedom of association. It is not a global reputation system. It is a personal one.

In the Web of Trust model, your feed is filtered by the actions of people you trust. If five people you follow have muted a particular account, that account's posts are less likely to appear in your feed. If five people you follow have followed a particular account, its posts are more likely to appear.

This is not censorship in the platform sense. It is curation in the community sense. You are not saying that posts from certain accounts are forbidden to exist. You are saying that you do not want to see them. Others can still run their own relays with different policies. The network accommodates both choices.

The Web of Trust can be extended. You can choose to see posts from people followed by people you follow (depth 1), or people followed by those people (depth 2), and so on. This allows you to scale your trust beyond your direct relationships without losing its grounding in personal association.
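Both mechanisms, depth expansion and mute-based filtering, can be sketched over a simple follow graph. The graph, the mute lists, and the threshold below are hypothetical illustrations, not any client's actual algorithm:

```python
# Sketch: expanding a personal Web of Trust by follow depth, then using
# mutes among trusted accounts to filter a feed. Graph and threshold are
# hypothetical illustrations.
follows = {
    "me":    {"alice", "bob"},
    "alice": {"carol", "spammer"},
    "bob":   {"carol"},
    "carol": {"dave"},
}
mutes = {"alice": {"spammer"}, "bob": {"spammer"}}

def trusted(root: str, follows: dict, depth: int) -> set:
    """Accounts reachable from `root` within `depth` follow hops."""
    frontier, seen = {root}, {root}
    for _ in range(depth):
        frontier = {f for acct in frontier
                    for f in follows.get(acct, set())} - seen
        seen |= frontier
    return seen - {root}

def visible(author, root, follows, depth, mutes, mute_threshold) -> bool:
    """Show an author unless too many trusted accounts have muted them."""
    wot = trusted(root, follows, depth)
    mute_count = sum(1 for acct in wot if author in mutes.get(acct, set()))
    return mute_count < mute_threshold
```

Note that the scores are computed from `"me"`'s position in the graph: a different root produces a different Web of Trust, which is the whole point of a personal rather than global reputation system.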

The Web of Trust is the practical implementation of freedom of association. It gives you the tools to build your own network of trusted voices, independent of any global algorithm, independent of any platform’s moderation policies, independent of any central authority.


NIPs as Possibilities, Not Mandates

NIPs, the Nostr Implementation Possibilities, are not standards enforced by any authority. They are documentation of what may be implemented by Nostr-compatible relay and client software. The name itself ("Implementation Possibilities," not "Improvement Proposals") encodes this philosophy.

A NIP becomes widely used not because it is mandated, but because it proves useful. Developers implement it because it solves a problem. Users adopt it because it improves their experience. The protocol evolves through voluntary coordination, not central direction.

This is permissionless innovation. Anyone can propose a NIP. Anyone can implement a NIP. Anyone can ignore a NIP. The protocol does not care. The protocol cannot care. The protocol is just text.

The criteria for acceptance of NIPs reflect this philosophy. They should be optional and backwards-compatible. Clients and relays that choose not to implement them should not stop working when interacting with those that do. There should be no more than one way of doing the same thing.

Even the repository itself acknowledges its potential centralizing role. It asks: “Is this repository a centralizing factor?” The answer is that a centralized index of standards exists for practical reasons, but it can be challenged, migrated, or forked at any time. The protocol does not depend on any single point of control because the protocol is just text.


The Network Effect of Sovereignty

When every relay is sovereign, the network as a whole becomes resilient. No single decision affects everyone. No single operator controls the conversation. No single jurisdiction can silence a voice.

If a relay becomes hostile, users leave. If a relay shuts down, content survives elsewhere. If a relay is captured by an attacker, the damage is contained. The network routes around failure because failure is always local.

This is the opposite of centralized platforms, where a single decision affects everyone. When Twitter bans someone, they are banned everywhere. When Facebook changes its algorithm, everyone’s feed changes. When YouTube demonetizes a channel, the creator loses income globally.

In Nostr, decisions are local. Their effects are limited. The network absorbs shocks that would shatter centralized systems.


What “Pro-Censorship” Actually Means

We return, finally, to that confusing phrase: “Pro-censorship.”

On the Nostr website, the explanation appears under a shield icon. It reads:

“The protocol is ownerless, relays are not. Nostr doesn’t subscribe to political ideals of ‘free speech’ — it simply recognizes that different people have different morals and preferences and each server, being privately owned, can follow their own criteria for rejecting content as they please and users are free to choose what to read and from where.”

This is not a defense of censorship. It is a recognition that censorship is inevitable, and that the only meaningful question is who gets to do it.

In centralized platforms, censorship is done by a single entity that answers to no one. It is opaque, unaccountable, and global in effect. A single decision silences a voice everywhere.

In Nostr, censorship is done by thousands of entities, each accountable to their own users. It is transparent—you know exactly what rules a relay enforces before you choose to use it. It is local—a single decision silences a voice only on that relay, leaving it audible everywhere else.

This is the difference between tyranny and pluralism. Tyranny imposes one set of rules on everyone. Pluralism allows many sets of rules to coexist, and lets individuals choose among them.

Nostr is pro-censorship in the same way that a city with many restaurants is pro-vegetarian: no one is forced to eat meat, no one is forced to give it up, and no single kitchen must serve every diet. Everyone eats according to their own preferences, and no one is forced to accept anyone else's choices.


The Role of Proof-of-Work in a Freedom of Association System

NIP-13 defines proof-of-work as an optional anti-spam mechanism. A relay can require a minimum difficulty, and the client must find a nonce that produces an event ID with enough leading zero bits.
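On the relay side, the check is cheap: measure the ID's difficulty and, as NIP-13 recommends, also verify the target committed in the nonce tag, so that an accidentally difficult ID does not count as intentional work. A minimal sketch (event structure per NIP-01; the example values in the usage note are fabricated for illustration):

```python
def difficulty(event_id_hex: str) -> int:
    """Leading zero bits of a 256-bit event ID given as 64 hex chars."""
    return 256 - int(event_id_hex, 16).bit_length()

def relay_accepts(event: dict, min_difficulty: int) -> bool:
    """Relay-side NIP-13 check: the ID must meet the minimum difficulty,
    and the nonce tag must commit to at least that target."""
    if difficulty(event["id"]) < min_difficulty:
        return False
    for tag in event.get("tags", []):
        if tag[0] == "nonce" and len(tag) >= 3:
            # Reject work that committed to a lower target than required.
            return int(tag[2]) >= min_difficulty
    return False  # no nonce tag: no committed work
```

Verifying costs one comparison; producing the work costs exponentially more. That asymmetry is what makes proof-of-work a usable entry fee.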

This is not about consensus. It is about cost. Proof-of-work makes spam expensive. A spammer would need to expend computational resources to post, which becomes prohibitive at scale. A legitimate user can afford the occasional proof-of-work.

But the choice to require proof-of-work is the relay’s. Some relays require it. Some don’t. Some adjust difficulty based on the user’s reputation. The system is flexible because the choices are distributed.

This is freedom of association applied to economics. A relay that requires proof-of-work is not saying “your content is bad.” It is saying “you must pay a computational cost to post here.” Users can choose to pay that cost or post elsewhere. Relays can choose to waive the requirement for trusted users. The network accommodates both choices.


The Web of Trust as Personal Curation

The Web of Trust visualization at freakoverse.github.io/wotonnostr/ shows how personal curation works in practice. Your feed is filtered by the actions of people you trust. Posts from accounts followed by many people you trust are more likely to appear. Posts from accounts muted by many people you trust are less likely to appear.

This is not a global reputation system. It is a personal one. The trust scores are computed from your perspective, not from some universal standard. Your Web of Trust is yours. It reflects your values, your relationships, your judgments.

The Web of Trust can be extended. You can choose to see posts from accounts followed by accounts you follow, and so on. This allows you to discover new voices without losing the grounding in personal trust. The extended Web of Trust scales association without diluting it.

This is the practical implementation of freedom of association. You are not forced to read content from people you don’t trust. You are not forced to accept the judgments of some global algorithm. You build your own network of trusted voices, using the tools the protocol provides.


The Immutability of Events, Not of the Network

There is a common misconception that "immutable events" imply an immutable network. Events themselves are indeed immutable: once an event is signed and published, its content cannot be changed without invalidating the signature. This is cryptographic immutability.

But immutability of events does not mean immutability of the network. Relays can delete events. Relays can block users. Relays can shut down. The network is not immutable. It is resilient.

This distinction is critical. The immutability of events ensures that you cannot be impersonated. Your signature proves that you, and only you, created an event. No relay can change that. But a relay can choose not to store your event. It can choose to delete it later. It can choose to stop serving it.

The network does not guarantee that your content will persist. It guarantees that if it persists somewhere, anyone can verify that it came from you. The persistence is a matter of relay policy. The verification is a matter of cryptography.
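The verification half of that bargain can be sketched directly: recompute the ID from the event's fields (per the NIP-01 serialization) and compare it to the ID the relay served. Full verification would also check the Schnorr signature over the ID with a secp256k1 library, which is omitted here:

```python
import hashlib
import json

def computed_id(event: dict) -> str:
    """Recompute the event ID from its fields, per NIP-01."""
    payload = json.dumps(
        [0, event["pubkey"], event["created_at"], event["kind"],
         event["tags"], event["content"]],
        separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(payload.encode()).hexdigest()

def is_untampered(event: dict) -> bool:
    """True if the served ID matches the content it claims to cover.
    (A complete check would also verify the signature over this ID.)"""
    return computed_id(event) == event["id"]
```

No relay can alter a single character of a stored event without this check failing on every client, which is why persistence is a policy question while authenticity is a cryptographic one.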

This is freedom of association again. Relays choose what to store. Users choose which relays to trust with their content. Neither can force the other. Both are sovereign in their own domain.


The Responsibility That Remains

Freedom of association does not absolve users of responsibility. It distributes it. In a centralized system, responsibility is concentrated in the platform. Users can blame the platform for failures, censorship, and manipulation. In a protocol-based system, responsibility returns to the user.

You choose your relays. You choose your clients. You choose who to follow and who to block. You are responsible for your own experience, your own safety, your own community.

This is not for everyone. Many people prefer the convenience of platforms, the simplicity of having someone else in charge. They are willing to trade control for comfort. That is their choice.

But for those who understand what is at stake, for those who have felt the weight of censorship or the fear of deplatforming, the trade is not worth it. Control is not a burden to be avoided. It is a right to be exercised.


The Architecture of Freedom

Nostr’s core principle is not censorship resistance. It is freedom of association. Censorship resistance is what emerges when you take freedom of association seriously.

The protocol does not prevent censorship. It makes censorship local. It makes censorship transparent. It makes censorship accountable. It gives you the tools to choose which rules you live under, and to leave when those rules no longer suit you.

The “pro-censorship” label is not a contradiction. It is a recognition that censorship is inevitable, and that the only meaningful question is who gets to do it. In Nostr, everyone gets to do it, for themselves, on their own servers. The result is a network that no single entity can control.

This is the architecture of freedom. Not a world without rules, but a world where you choose your rules. Not a world without judgment, but a world where judgment is distributed. Not a world without consequences, but a world where consequences are personal, not structural.

The next time someone asks you about Nostr and censorship, tell them this: Nostr does not protect your speech. It protects your ability to find people who want to hear it, and to avoid people who don’t. It protects your freedom to associate with whom you choose, and to disassociate from whom you choose.

That is both less and more than what platforms promise. It is less because it offers no guarantees. It is more because it offers something better: genuine freedom of association, genuine choice, genuine escape from the tyranny of one-size-fits-all content moderation.

You will be rejected by some relays. You will reject some relays. This is not a failure of the system. It is the system working exactly as designed.

The question is not whether you will be censored. The question is whether you have somewhere else to go.

On Nostr, the answer is always yes.


References

  1. Nostr Protocol NIPs. Available at: https://nips.nostr.com
  2. Web of Trust on Nostr visualized. Available at: https://freakoverse.github.io/wotonnostr/
  3. NIP-13: Proof of Work
  4. NIP-65: Relay List Metadata
  5. NIP-85: Trusted Assertions
  6. Nostr.com. (2025). An open social protocol with a chance of working. Available at: https://nostr.com
  7. fiatjaf. (2024). Nostr Note on Censorship-Resistant Relay Discovery.

This essay was written for those who understand that freedom is not the absence of rules. It is the ability to choose your rules.

