
Trolls, vandalism and report abuse: a framework for online community management

Online platforms work on principles similar to those of offline communities. The onset of crime and vandalism in a community can lead to large-scale abandonment. The same applies to online communities and marketplaces, especially as they grow in size.

Network effects make a community more valuable as it scales. However, the onset of vandalism can make network effects work in reverse if the system doesn't scale its ability to govern user activity. When this happens, fast-growing networks start losing users en masse (think ChatRoulette), online conversations enter the realm of the ridiculous (think YouTube comments), and online decisions may even lead to unpleasant consequences offline (think of the Airbnb host whose house was ransacked).

Startups that gain traction and scale rapidly can still end up failing if they do not build abuse-control systems that prevent, regulate, and cure such abusive activity.

Understanding user-initiated activity

Every user-initiated activity online involves two objects:

The Producer: Certain users create content or information on the platform that leads to an interaction or exchange with other users. The writer on Medium and the seller on eBay are producers on their respective platforms.

The Seed: This is the basic unit of content or information created by the producer, and the basic unit of production and consumption on the platform. On YouTube, it is the video; on Airbnb, it is the product listing.

All initiatives to prevent vandalism and system abuse apply either to the producer or to the seed, as we shall see in the examples below.
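
To make these two objects concrete, here is a minimal sketch of how they might be modeled in code; the field names are illustrative assumptions, not any platform's actual schema.

    # A minimal sketch of the two objects behind user-initiated activity.
    # Field names are illustrative assumptions, not any platform's schema.
    from dataclasses import dataclass, field

    @dataclass
    class Producer:
        user_id: str
        reputation: float = 0.0               # built from the community's assessment
        seeds: list["Seed"] = field(default_factory=list)

    @dataclass
    class Seed:
        seed_id: str
        producer: Producer
        content: str                          # the video, the listing, the question
        quality: float = 0.0                  # e.g. derived from votes and ratings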

A framework to design for prevention of abuse

Borrowing a metaphor from healthcare (vandalism is to a community what disease is to an organism), there are three specific ways in which a platform can be designed to counter abuse: prevention, regulation, and cure.

Prevention

The easiest way to prevent vandalism is to cut it off right at the source. This works in two ways:

1. Producers: The platform curates producers before they are allowed to access the system. CupidCurated used a team to curate the men who got access to its system; the propensity of certain men to vandalize dating-site experiences is well understood. Quibb curates access to the community, while Medium curates access for writers but opens up the platform for readers. The problem with curating producers in this fashion is the trade-off with traction that it brings.

2. Seeds: The platform curates seeds as they are created and rejects the ones that don't meet a certain standard. App stores require apps to meet certain criteria before they can be listed. Quora requires a nominal number of credits for asking a question, ensuring that users have a minimum level of activity and reputation in the system before they start new interactions; a simple gate of this sort is sketched below. Just as curating producers can lower traction, curating seeds can lower engagement.

While prevention strategies are the easiest to execute, they may actually get in the way of the platform's growth and engagement.
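
As a rough illustration of a seed-level gate in the spirit of Quora's credit requirement, consider the sketch below; the threshold, names, and mechanics are hypothetical, not any platform's actual implementation.

    # Hypothetical prevention-style gate: a seed is accepted only if the
    # producer clears a minimum activity bar. Threshold and names are
    # illustrative, not any real platform's implementation.
    from dataclasses import dataclass

    @dataclass
    class Producer:
        user_id: str
        credits: int  # earned through prior activity on the platform

    MIN_CREDITS = 50  # assumed cost of starting a new interaction

    def submit_seed(producer: Producer, seed: str, feed: list) -> bool:
        """Curate the seed at the source: reject it if the producer's
        history does not meet the platform's minimum standard."""
        if producer.credits < MIN_CREDITS:
            return False  # prevention: the seed never enters the system
        producer.credits -= MIN_CREDITS  # spending credits deters low-effort seeds
        feed.append(seed)
        return True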

Regulation

A far better model for encouraging quality contributions is to allow open access and production rights to all but restrict visibility and discoverability to the best.

1. Producers: Building a reputation system helps determine the reliability of a producer based on her history of production. A user's karma on Hacker News or Reddit is a function of the community's assessment of her ability to produce consistently and with high quality. Algorithms on these platforms favor the reputed producer. Those with greater reputation are also accorded superior rights, as in the case of Wikipedia editors and super-users.

2. Seeds: Voting and rating functionality enables social curation: content is curated after it is produced rather than before. Some platforms factor in both the producer's reputation and a quality score for the seed when ranking results or pushing content to news feeds, while others factor in only the seed's quality score, as sketched below.
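
A minimal sketch of such a ranking, assuming a simple linear blend of producer reputation and seed quality; the weights and names are illustrative, not any platform's actual algorithm.

    # Hypothetical regulation-style ranking: blend the producer's reputation
    # with the seed's own quality score. Weights and names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Seed:
        title: str
        quality: float     # e.g. normalized votes/ratings in [0, 1]
        reputation: float  # the producer's reputation in [0, 1]

    REPUTATION_WEIGHT = 0.3  # assumed: how much producer history matters

    def rank_score(seed: Seed) -> float:
        """Higher scores surface first; reputation regulates visibility, not access."""
        return (1 - REPUTATION_WEIGHT) * seed.quality + REPUTATION_WEIGHT * seed.reputation

    def rank_feed(seeds: list[Seed]) -> list[Seed]:
        return sorted(seeds, key=rank_score, reverse=True)

Setting the reputation weight to zero recovers the platforms that rank on seed quality alone.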

In general, regulation strategies need to be built for scale: they must work as effectively at ten million users as they do at ten thousand.

Cure

Finally, the platform may remove the producer and/or her seeds to curb abuse. Providing users with the ability to report abuse, as well as algorithmically identifying patterns of abuse, is crucial to building out such systems.

1. Producers: Dating sites often block users who behave undesirably. In some cases, users may be blocked not from the platform entirely but from specific interactions: LinkedIn, for example, allows members to block particular users from communicating with them, and Facebook allows users to block seeds (status updates) from certain friends from showing up in the news feed.

2. Seeds: Often, the users are not removed but their seeds are, to keep noise out of the system. YouTube removing pornographic or copyrighted content and Quora collapsing 'joke answers' (as the community refers to them) are examples of removing such noise. More often than not, actions that remove seeds feed back into the producer's standing and may ultimately lead to the removal of the producer as well, as the sketch below illustrates.
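
As a rough sketch of such a cure mechanism, assuming a simple report threshold with escalation from seed removal to producer removal; all thresholds and names here are hypothetical.

    # Hypothetical cure mechanism: user reports accumulate against a seed,
    # seed removals accumulate against the producer. Thresholds are illustrative.
    from collections import defaultdict

    SEED_REPORT_THRESHOLD = 5      # reports before a seed is removed
    PRODUCER_STRIKE_THRESHOLD = 3  # removed seeds before the producer is banned

    seed_reports = defaultdict(int)      # seed_id -> report count
    producer_strikes = defaultdict(int)  # producer_id -> removed-seed count
    banned_producers = set()

    def report_abuse(seed_id: str, producer_id: str) -> None:
        """User-initiated report; enough reports trigger removal of the seed."""
        seed_reports[seed_id] += 1
        if seed_reports[seed_id] >= SEED_REPORT_THRESHOLD:
            remove_seed(seed_id, producer_id)

    def remove_seed(seed_id: str, producer_id: str) -> None:
        """Removing a seed feeds back into the producer's record."""
        producer_strikes[producer_id] += 1
        if producer_strikes[producer_id] >= PRODUCER_STRIKE_THRESHOLD:
            banned_producers.add(producer_id)  # the cure escalates to the producer

A natural refinement is to weight reports by the reporter's own reputation, tying the cure mechanism back to regulation.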

Prevention of online vandalism and abuse in any form is especially crucial for startups as they scale. While most platforms struggle with solving chicken-and-egg problems in their initial days, the ones that do manage to solve them may lose any advantage gained if the community implodes as a result of such abuse.

Tweetable Takeaways

All initiatives to prevent vandalism and system abuse apply either to the producer or to the seed.

To design a scalable platform, think abuse prevention, regulation, and cure.

Next challenge after solving chicken-and-egg problems: community abuse management.

