My idea of moderation is:
1. Outline rules for a community, make them clear and understandable
2. Enforce those, and only those, rules.
3. If something objectionable happens, and the rules don't cover them but you think they should, then change the rules to how you see fit.
4. Handle the fallout of the rules changing.
Steps 3 and 4 are where things mostly fall apart in communities. Personally, any time something happens that leads me to change a rule, the change is reactive and reflects the voice of the majority (not the loudest voices, the actual majority). Most people are happy that way.
A rule against disagreement is one I'd never have. I'd only make sure that people either have the tools to self-censor what they don't want to see, or tell them they might be happier going elsewhere.
I've seen so many moral hand-wringers try to justify silencing people by trotting out that XKCD comic as though they were "the community" that Cueball has stopped feeling like a positive hacker character to me - half the time I encounter him now, it's as an apologist for "I know best" assholery. The "Moral Majority" all over again.
In my experience it's hardly ever been the community showing someone the door; it's almost always a corporation getting cold feet over potential controversy, or some narrow-minded zealot abusing their janitorial moderator powers.
Last time this happened, it was the community that overwhelmingly wanted to show the zealot the door (a vote happened while the mod was away), but they couldn't - they were nice, normal people, not the kind of weasels who dedicate hours to janitorial work for the chance to hold veto power over others on an internet forum.
Moderation and censorship can be separated: readers can be given the option to circumvent moderation when and if they choose, or the option to "unsubscribe" from the actions of moderators they disagree with. To create genuine public spaces, we are going to have to find an approach that doesn't assume the incorruptible integrity of moderators (the role attracts weasels). Private websites can certainly behave however they want, but they are currently standing in for our public spaces - and presenting themselves as such - which means we have none.
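One minimal sketch of what that separation could look like, in Python (the Comment/Removal structures and the trust-set mechanism are my own assumptions, not any existing system's API): moderator actions become labels rather than deletions, and each reader decides whose labels to honor.

    from dataclasses import dataclass

    @dataclass
    class Comment:
        comment_id: int
        author: str
        text: str

    @dataclass
    class Removal:
        moderator: str   # who issued the removal
        comment_id: int  # which comment they removed

    class Thread:
        """Moderation as an overlay: removals are labels, not deletions."""
        def __init__(self):
            self.comments = []  # list of Comment
            self.removals = []  # list of Removal

        def view(self, trusted_moderators):
            """Render the thread, applying only removals issued by
            moderators this particular reader has chosen to trust."""
            hidden = {r.comment_id for r in self.removals
                      if r.moderator in trusted_moderators}
            return [c for c in self.comments if c.comment_id not in hidden]

A reader who trusts no moderators sees the raw thread; a reader who stops trusting a moderator simply drops them from their set, and that moderator's removals no longer affect their view.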
Moderation vs censorship reminds me of https://en.wikipedia.org/wiki/If-by-whiskey#Canonical_exampl...
A moderator --is-- a censor.
If Disqus would simply design their software to let users filter/block whichever users they don't want to see, no one would need to be censored or moderated. All the trolls, paid or not, would vanish from a user's view, and it would be up to each user whose messages they see. Problem solved.
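Disqus doesn't actually expose this, but a rough sketch of the per-user filtering being described (the comment format here is hypothetical):

    def visible_comments(comments, blocked_users):
        """Client-side filtering: each reader keeps their own block list,
        so nothing ever has to be removed globally."""
        return [c for c in comments if c["author"] not in blocked_users]

    comments = [
        {"author": "alice",   "text": "Interesting article."},
        {"author": "troll42", "text": "You're all idiots."},
        {"author": "bob",     "text": "Agreed, good points."},
    ]

    # Each user's view depends only on their own choices:
    print(visible_comments(comments, blocked_users={"troll42"}))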
Moderation: the action of making something less extreme, intense, or violent
Censorship: the practice of officially examining books, movies, etc., and suppressing unacceptable parts
Moderation is censorship.
To say otherwise is to make a distinction without a difference.
Rephrase your question to have meaning. Perhaps "When is censorship justifiable?"
Moderation is always censorship. But censorship isn't always problematic. Government censorship, or censorship by an entity with effectively the same broad control over the media of communication, is problematic.
As others have noted, moderation is censorship, at least in its effect. Neither, however, falls under the legal protections of free speech, which simply don't apply here.
The first question is: what is a given community for, and how should it accomplish those goals? Individual moderation, collaborative filtering, individual killfiles, expert ratings systems, etc., are all tools toward these ends. They're non-trivial.
https://www.reddit.com/r/dredmorbius/comments/28jfk4/content...
Community behavior is very strongly dependent on BOTH scale AND founder cohorts.
A group with one person (blog, winking in the dark) is different from one with two, or a small set of people engaged in discussion (say 3-30), or a larger group discussing common topics (say 30-100), etc. Part of this can be thought of as a cost function in which the positive contribution of members falls with scale, while the cost of each additional participant is rather more constant. Eventually, adding more participants makes the experience worse for all.
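To make that concrete, here's a toy version of the cost function (the specific functional forms are assumptions chosen only to illustrate the shape): each new member contributes less than the last, while the cost they impose stays roughly constant, so net value peaks and then goes negative.

    def net_value(n, benefit=10.0, cost=1.0):
        """Diminishing marginal benefit (the k-th member adds benefit/k)
        minus a roughly constant per-member cost."""
        total_benefit = sum(benefit / k for k in range(1, n + 1))
        return total_benefit - cost * n

    for n in (3, 10, 30, 100, 300):
        print(f"{n:4d} members: net value {net_value(n):8.2f}")
    # With these parameters, value peaks around 10 members and is
    # negative well before 100 - adding people makes things worse.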
It's really difficult for any one conversation to have more than a few key participants. Two and a moderator, or perhaps 5-6 participants who know each other well and get along.
If you're trying to arrive at some truth or understanding, it's really difficult not to have a truth-based moderation criterion.
Individual killfiles are somewhat useful, except that the killfile's user tends to see a great many one-sided discussions (others interacting with those they've filtered). Unless the system blots out both sides of the exchange, this accomplishes little.
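A sketch of that "blot out both sides" behavior, assuming a flat, parent-first comment list (again, the data shape is hypothetical):

    def killfile_view(comments, killfile):
        """Suppress comments from killfiled authors AND replies to those
        comments, so the reader isn't shown one half of an argument."""
        hidden_ids = set()
        visible = []
        for c in comments:  # assumes parents appear before their replies
            if c["author"] in killfile or c.get("parent_id") in hidden_ids:
                hidden_ids.add(c["id"])  # hide replies transitively
            else:
                visible.append(c)
        return visible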
I've started looking more at discussion tools which foster both smaller and larger groups: "warrens" and "plazas". The idea of creating persistent communities well below Dunbar's number (say 50-300 people) and promoting up material from those has some appeal (you'd also want to allow for lateral movement of individuals). A small set of tiers would easily accommodate the entire global population. (And yes, there's a social network set up on this basis, though the name escapes me.)
A huge problem with allowing noisy participants is that they draw the oxygen out of the room and tend to strongly discourage high-quality participants. There's a curiously persistent asymmetry between an individual's proclivity to participate in a group discussion and the interest of others in their doing so. Good designs balance this mismatch.
It becomes censorship when a moderator removes a comment or discussion based purely on personal beliefs and disagreements.
Moderation should be about removing the trolls, not what it has become, which is censorship.
It feels like many people personally enjoy moderating down disagreements because they don't get any actual power in their own lives.
It's one of the reasons I stay away from most online discussions these days: nobody can be open and honest. It really makes me wonder if this is why secret groups like the Masons were created. Back in those days, instead of getting moderated down, you were killed or attacked.
And Hacker News deletes it, <sigh>
... immediately.
Basically never, at least on the internet? No one owes anyone else a soapbox. My space, my rules. There's nothing stopping anyone from spinning up a site and saying whatever they want to say, but there's absolutely no reason I have to give space on my site, no matter what my site is, for anyone else's viewpoint.