HACKER Q&A
📣 patresh

Rules for a desirable, non-toxic and less exploitable social platform?


From what we have learned about the flaws of different social platforms (networks and content), can we come up with some first principles that would give a platform the following properties:

- Desirable: Most people actually want to be on it and find some use or pleasure in using it

- Non-toxic: I added this one because some people might actually enjoy being on a toxic platform, but that is not what this is about

- Less exploitable: Difficult to manipulate, which is increasingly important in the age of cheap LLMs, but this can also be a tradeoff with desirability as barriers are erected to prevent bot manipulation / vote brigading.

Take Hacker News as an example of a desirable, non-toxic and less exploitable social platform. I believe several attributes make it so:

- Voting: Upvoted content rises up, contributes towards desirability / non-toxicity of the content

- Strict rules / moderation: Keeps the content on topic, constructive, friendly, more pleasant to parse. Contributes towards desirability / non-toxicity and also makes it less exploitable as manipulation can be detected.

- Novelty / surprisal: This third one is somewhat special; it is less a mechanistic property of the platform than a content-focus choice. It contributes towards desirability but, I believe, also towards lower exploitability: it is more difficult to fake novel or thought-provoking content.

Now I do realize I could have phrased the target properties differently and semantics can always be discussed ad infinitum so take the spirit of what I mean rather than the exact wording.

What I'm specifically asking HN here is what sets of rules / mechanisms could have the above as emergent properties?

Is Hacker News a desirable, non-toxic, less exploitable platform also because it focuses on novel, thought-provoking content or could there be mechanistic rules that would allow having these properties for any kind of social platform?

I'll include some possible mechanistic rules that crossed my mind that each have their flaws:

- Member verification (ID / Credit card): Less exploitable but likely very undesirable for many

- Vouching: Start with a kernel of trusted members, include only members vouched for

- Contribution limits: Members can only contribute / vote n times per day / week

- Active discussion limits: Not everyone involved in the same conversation e.g. have two people discussing a topic, have a system for others to "raise hand" to participate in the conversation

- Exposure limits: Your post can never reach more than n random people, it has to be actively reshared by someone to spread further.
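To make the last idea concrete, here is a minimal sketch of how an exposure limit with reshare-gated spread might work. All names and the reach budget are made up for illustration; a real platform would need persistence, abuse checks, and rate limiting on reshares as well.

```python
import random

# Hypothetical exposure limit: a post is shown to at most MAX_ORGANIC_REACH
# random members; further spread requires an explicit reshare by someone who
# actually saw the post, which grants another batch of the same size.
MAX_ORGANIC_REACH = 50

class Post:
    def __init__(self, author):
        self.author = author
        self.remaining_reach = MAX_ORGANIC_REACH
        self.seen_by = set()

    def deliver(self, members):
        """Show the post to random members until the reach budget runs out."""
        candidates = [m for m in members
                      if m not in self.seen_by and m != self.author]
        random.shuffle(candidates)
        batch = candidates[:self.remaining_reach]
        self.seen_by.update(batch)
        self.remaining_reach -= len(batch)
        return batch

    def reshare(self, member):
        """Only someone who has seen the post can buy it more reach."""
        if member in self.seen_by:
            self.remaining_reach += MAX_ORGANIC_REACH
            return True
        return False
```

The key property is that organic reach saturates quickly, so spread beyond the initial batch always costs a deliberate human action.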


  👤 PurpleRamen Accepted Answer ✓
Calling HN non-toxic is really a stretch. Toxicity here is more subtle, but still around.

Voting and novelty also exist on other, more problematic platforms. I don't think simple voting really helps maintain the social health of a platform; a more complex system would probably be more beneficial than a simple count.

But what really helps is good and fair moderation, and a suitably sized group. If it's too small, nothing much happens; if it's too big, you will be drowned in noise and ground down by too many differing opinions. Size also affects how well moderation can work.

But I don't think enforcing low limits really helps here. It's just another simple mechanical solution, like voting. How big or small a limit should be depends too much on the topic, thread and people involved. Some topics need many participants; some people don't always have the time to pay full attention to something, but others could continue their part. Good discussions evolve naturally and also randomly, because you never know which expert is around and how much time they have that day.

Also, you say social platform, but social also means meaningless chat, while you seem to be aiming for meaningful, high-quality interactions.

If you aim for high-quality discussions, then maybe it would be more feasible to improve the extraction and presentation of the meaningful parts. For example, let humans and AI mark useful parts, have AI continuously create summaries, and so on. Kind of like having the discussion on one side and a result like a "Wikipedia article" on the other.
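The "discussion on one side, digest on the other" idea above can be sketched very simply: readers mark comments they found useful, and the digest is just the most-marked comments in thread order. This is a hypothetical illustration (function names and the threshold are made up); an AI summarizer could replace or augment the aggregation step.

```python
from collections import Counter

def build_digest(comments, marks, min_marks=2):
    """Assemble a side digest from reader marks.

    comments: list of (comment_id, text) in thread order.
    marks: list of comment_ids that readers flagged as useful
           (one entry per mark, so repeated ids mean repeated marks).
    Only comments with at least min_marks marks make it into the digest.
    """
    counts = Counter(marks)
    digest = [text for cid, text in comments if counts[cid] >= min_marks]
    return "\n".join(digest)
```

For example, a thread of three comments where only one is marked twice yields a one-line digest, keeping the full discussion intact on the other side.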


👤 raxxorraxor
I do think toxicity is often extremely subjective. Ironically the limits and definitions are often defined by the most toxic people, because they have the least tolerance in general. But that is just my opinion.

I don't know exactly what limits HN uses. Your username is green at first, which is euphemistically apt. You aren't allowed to downvote at first (although that restriction always applies to direct responses?). Generally I would describe the limits as minimally invasive. I would guess the average upvote score for a comment on HN is somewhere around 3?

These mechanisms are quite smart and not too invasive, but not the sole reason for HN being like this.
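Karma-gated privileges like the ones described above are mechanically trivial. A hypothetical sketch (the threshold values here are invented for illustration, not HN's actual numbers):

```python
# Invented thresholds, for illustration only.
DOWNVOTE_THRESHOLD = 500  # karma needed before downvoting unlocks
FLAG_THRESHOLD = 30       # karma needed before flagging unlocks

def can_downvote(karma, is_direct_reply):
    # Direct replies to your own comments can never be downvoted,
    # regardless of karma; otherwise the karma gate applies.
    return karma >= DOWNVOTE_THRESHOLD and not is_direct_reply

def can_flag(karma):
    return karma >= FLAG_THRESHOLD
```

The point is that the mechanism is cheap to build; the hard part is picking thresholds that filter out drive-by negativity without locking out genuine newcomers.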

For your network it highly depends on what audience you want to nurture. Do you want the classic golf club where people feel superior and exclusive to others? Use vouching and ID checking.

Do you want free thinkers? Don't moderate much, but you may have to gatekeep people looking for offence (or just don't feed the trolls and ignore them).

Do you want a broad audience or enthusiasts? Resistance to exploitation is not only a matter of audience education, but education certainly helps. If exploitation is a problem on your platform, you need to find out what type it is in order to counter it.

Not everyone is alike and will get along, there are different personalities having different expectations. If you cater to all, you probably won't be successful.

I cannot say what attracts people preferring "pleasant" (meaning?) discussions on the net. I probably more or less belong at the other end of that spectrum.