Trolls, vandalism and report abuse: a framework for online community management

Regulation

A far better model for encouraging quality contributions than gating access upfront is to allow open access and production rights to all, while restricting visibility and discoverability to the best.

1. Producers: Building a reputation system helps determine the reliability of a producer based on her history of production. A user’s karma on Hacker News or Reddit is a function of the community’s assessment of her ability to produce consistently and with high quality. Algorithms on these platforms favor reputed producers, and those with greater reputation are also accorded superior rights, as in the case of Wikipedia editors and super-users. (A rough sketch of reputation-weighted ranking follows this list.)

2. Seeds: Voting and rating functionalities enable social curation of content after it is produced. Some platforms factor in both the producer’s reputation and a quality score for the seed when ranking results or pushing content to news feeds; others factor in only the seed’s quality score.
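To make the mechanics concrete, here is a minimal Python sketch of such a ranking function, assuming a simple net-vote quality score with time decay. The field names, the decay exponent, and the reputation_weight blend are illustrative assumptions, not any specific platform’s algorithm; setting reputation_weight to zero reproduces the quality-score-only variant.

```python
from dataclasses import dataclass

@dataclass
class Seed:
    """A unit of user-generated content (a post, answer, or review)."""
    producer_reputation: float  # e.g. the producer's karma, normalized to [0, 1]
    upvotes: int
    downvotes: int
    age_hours: float

def quality_score(seed: Seed) -> float:
    """Net community vote, dampened by age so that feeds stay fresh."""
    net_votes = seed.upvotes - seed.downvotes
    return net_votes / (1.0 + seed.age_hours) ** 1.5

def rank_score(seed: Seed, reputation_weight: float = 0.3) -> float:
    """Blend seed quality with producer reputation.
    reputation_weight = 0 ranks purely on the seed's own quality score."""
    return ((1 - reputation_weight) * quality_score(seed)
            + reputation_weight * seed.producer_reputation)

# Surface the best-ranked seeds first in the feed.
seeds = [
    Seed(producer_reputation=0.9, upvotes=12, downvotes=2, age_hours=3.0),
    Seed(producer_reputation=0.1, upvotes=15, downvotes=1, age_hours=3.0),
]
feed = sorted(seeds, key=rank_score, reverse=True)
```

In practice, platforms tune the blend empirically and normalize or clamp reputation so that established producers cannot drown out strong content from newcomers.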

In general, regulation strategies need to be built for scale: they should work as effectively with ten million users as they do with ten thousand.

Cure

Finally, the platform may remove the producer and/or her seeds to prevent abuse. Two capabilities are crucial to building out such systems: letting users report abuse, and algorithmically identifying patterns of abuse (sketched after the list below).

1. Producers: Dating sites often block users who behave undesirably. In some cases, users are blocked not from the platform entirely but only from specific interactions: LinkedIn, for example, allows users to block certain users from communicating with them, and Facebook allows users to block seeds (status updates) from certain friends from showing up in the news feed.

2. Seeds: Often, the producers themselves are not removed, but their seeds are, to prevent noise in the system. YouTube removing pornographic or copyrighted content and Quora collapsing ‘joke answers’ (as the community refers to them) are examples of removing such noise. More often than not, actions that remove seeds feed back into the producer’s standing and may ultimately lead to removal of the producers as well.
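As a rough sketch of how report-abuse tooling can wire these two cures together, assume seeds are pulled once user reports cross a threshold, and producers are blocked after repeated removals. The thresholds and function names here are hypothetical; real systems also weight reports by the reporter’s own reputation and use classifiers to flag patterns of abuse automatically.

```python
from collections import defaultdict

REPORT_THRESHOLD = 5   # user reports before a seed is pulled for review
STRIKE_THRESHOLD = 3   # removed seeds before the producer is blocked

reports = defaultdict(int)  # seed_id -> abuse reports received
strikes = defaultdict(int)  # producer_id -> seeds removed for abuse
blocked = set()             # producers removed from the platform

def report_abuse(seed_id: str, producer_id: str) -> None:
    """Register one user report; escalate once reports accumulate."""
    reports[seed_id] += 1
    if reports[seed_id] >= REPORT_THRESHOLD:
        remove_seed(seed_id, producer_id)

def remove_seed(seed_id: str, producer_id: str) -> None:
    """Cure at the seed level, feeding back into the producer's record.
    Repeated seed removals eventually remove the producer as well:
    the feedback loop described above."""
    reports.pop(seed_id, None)      # seed is gone; drop its report count
    strikes[producer_id] += 1
    if strikes[producer_id] >= STRIKE_THRESHOLD:
        blocked.add(producer_id)    # cure applied at the producer level
```

The design choice worth noting is the escalation path: the cheap, reversible action (hiding a seed) happens first and often, while the expensive, irreversible one (blocking a producer) requires repeated evidence.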

Prevention of online vandalism and abuse in any form is especially crucial for startups as they scale. While most platforms struggle with solving chicken-and-egg problems in their initial days, the ones that do manage to solve those problems may lose any advantage gained if the community implodes as a result of such abuse.

Tweetable Takeaways

All initiatives to prevent vandalism and system abuse apply either to the producer or to the seed.

To design a scalable platform, think abuse prevention, regulation, and community management.

Next challenge after solving chicken-and-egg problems: community abuse management.

Photo Credits: Creative Commons/xcode
