I'm Jake McKee

People call me The Community Guy

How to Develop Robust Moderation Methodology

Posted on 23 Mar 2010 | 8 comments

Moderation, at its core, is about ensuring that content published on a particular site, typically submitted by the site’s users themselves, meets the site’s Terms of Service (ToS). This function is all too often seen as an analog task: groups of moderators sit at terminals clearing content submission queues, asking simple yes/no questions like “Is this porn? Is this hate speech? Does this content contain personally identifiable data?”

The problem with approaching the moderation task as an analog, queue-clearing activity is that it simply doesn’t scale. What if you have hundreds of thousands of content pieces being submitted every minute? (YouTube, for example, has 24 hours of video uploaded every minute!) Your company simply won’t be able to afford the sheer number of bodies needed to clear those queues.

Therefore, it’s important to stop thinking about “moderation” as that analog activity of reacting to full queues and instead look at the primary objective moderation is trying to fulfill: an overall reduction in inappropriate content being published. This means reacting to bad content that’s been submitted, but it also means reducing the amount of negative content created in the first place. It’s about making your moderators smarter, as well as getting your community more active. The goal is not simply to improve queue clear rates, any more than the goal of customer service is to reduce call times.

No, the goal is to improve community health overall by providing a positive, safe environment using every method you can. To that end, here are ten methods you should be applying as you develop any community. The question isn’t which one of these methods to apply; the question is in what ratio you apply each.

Governance
The starting point, and all too often the stopping point, in preparing moderation processes is the governance piece: Terms of Service, Community Guidelines, and other formal documents meant to define “appropriate behavior”.

Alison Michalk has a great overview of how to approach the creation of such documents. By far, my favorite governance example is Flickr’s Community Guidelines. Fun, clear, and shareable.

Engagement
General community management practices: developing culture, encouraging positive and discouraging negative activity, and participation from the company.

Engagement is a common discussion around message boards/forums, but it is also highly effective, though implemented differently, at all three levels of your Presence Framework.

As I’ve written about before, the moderation activity can, and should, be used to help build your community’s culture.

Processes
Community moderation activities – generic approval processes, from one individual to multiple levels of approval.

This is where traditional in-sourced or out-sourced vendor moderation processes come into play: the straightforward concept of human approval before/after content is posted.

Positioning
Moderation is as much about providing a sense of security and safety as it is about simply deleting inappropriate content. Social experiences have a culture, and when that culture is one of positivity, the experience overall tends to be vastly more positive. It’s not enough to simply have great moderation processes; you need to prove it out as well.

Algorithmic Tech
Using technology to discover and utilize patterns of tone, structure, users, response times, and other such data points to automatically identify potential problems and/or filter those problems out before moderators even see them.
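As a rough illustration of this idea (the patterns, weights, and thresholds below are invented for the example, not drawn from any real moderation system), an algorithmic pre-filter can score each submission on simple signals and decide whether it auto-publishes, goes to the human queue, or gets held before any moderator sees it:

```python
import re

# Illustrative signals only; a real system would use far richer features.
SIGNALS = [
    (re.compile(r"https?://\S+"), 1.0),   # links often correlate with spam
    (re.compile(r"(.)\1{5,}"), 0.5),      # long runs of a repeated character
    (re.compile(r"[A-Z]{10,}"), 0.5),     # extended all-caps "shouting"
]

def score(text: str) -> float:
    """Sum the weights of every signal that fires on the text."""
    return sum(weight for pattern, weight in SIGNALS if pattern.search(text))

def route(text: str, hold_at: float = 2.0, review_at: float = 1.0) -> str:
    """Auto-publish, queue for human review, or hold, based on total score."""
    s = score(text)
    if s >= hold_at:
        return "hold"      # filtered out before moderators ever see it
    if s >= review_at:
        return "review"    # lands in the human moderation queue
    return "publish"       # never touches the queue at all
```

The payoff is in the ratios: every submission that routes to "publish" or "hold" automatically is a submission your moderators never have to clear by hand.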

UX Tech

Improved user-facing technology such as “Like” buttons, report-abuse links, on-topic buttons, and other tools that give users a chance to actively participate in identifying and reporting problems.

  • Amazon’s “was this helpful” buttons
  • Get Satisfaction’s smiley faces
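The server side of a report-abuse button can be sketched roughly like this (the class and threshold here are hypothetical, purely to show the shape of the idea): count distinct reporters per content item, deduplicate repeat clicks, and escalate to the moderation queue once enough different users have flagged it.

```python
from collections import defaultdict

class AbuseReports:
    """Track distinct user reports per content item and flag for review."""

    def __init__(self, threshold: int = 3):  # illustrative; tune per community
        self.threshold = threshold
        self.reports = defaultdict(set)  # content_id -> set of reporter ids

    def report(self, content_id: str, reporter_id: str) -> bool:
        """Record a report; return True once the item should be escalated."""
        self.reports[content_id].add(reporter_id)  # a set dedupes repeat clicks
        return len(self.reports[content_id]) >= self.threshold
```

Deduplicating by reporter matters: without it, one determined user mashing the button looks identical to genuine community consensus.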

Reputation Systems
In any online social experience, reputation is crucial. Whether that’s simply a culture reputation amongst community members, or specific points/badges collection, reputation can help with a range of activities in community building. Moderation efforts can be significantly helped by applying UX Tech and Algorithmic Tech together with reputation status. Yahoo Answers is extremely strong in this area.

For more on reputation systems, be sure to pick up the new book, Building Web Reputation Systems.
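One way to combine reputation with the flagging tools above (a sketch under assumed names and weights, not a description of Yahoo Answers’ actual system) is to weight each flag by the reporter’s standing, so a trusted member’s report escalates content faster than a brand-new account’s:

```python
def weighted_flag_score(flags, reputation):
    """Sum each reporter's reputation weight; unknown users count as 1.0.

    flags: iterable of reporter ids; reputation: dict of id -> weight.
    """
    return sum(reputation.get(user, 1.0) for user in flags)

def needs_review(flags, reputation, threshold=5.0):
    """Escalate once the reputation-weighted flag total crosses the threshold."""
    return weighted_flag_score(flags, reputation) >= threshold
```

The design choice worth noting: trusted members don’t get unilateral takedown power, they just move content up the queue faster, which keeps the final call with your moderators.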

Tool Consistency
Undedicated moderator resources (moderators who don’t work on just one property day in, day out) spend a surprisingly large percentage of their time simply wrestling with poorly designed moderation tools that lack consistency across properties. Moderators can clear multiple pieces of content per minute, so every minute lost to a struggle with bad tools is time spent in entirely the wrong way.

I would love to see a company like Yahoo or Google, or any other organization with a vast array of moderation-necessary properties, lead the industry by creating a set of moderation tool standards, a UI/UX library of sorts, that all properties using moderation tools are required to use. Yahoo already has experience with this type of concept through the YUI Library, for example. Let’s see that same thinking applied to moderation tools.

Programs
Specifically designed programs such as the Facebook Community Council that grant additional powers to select groups of partners, customers, or users.

After the AOL Community Leaders and About.com court cases a few years back, companies have been very hesitant to ask users to do anything that could be perceived as a “real job”. Even the Facebook Community Council is small and invite-only because they are beta testing before rolling it out to anyone who wants to participate. Programs can be successfully and legally implemented; they just need proper precautions and proper planning. It’s actually quite simple: if you’re going to develop a program that treats volunteers like paid staff (only without actually paying them), stop it.

The fine folks at TextsFromLastNight.com recently added a “Moderate” feature to their iPhone app. You can pay 99 cents for the privilege of moderating content submitted by users. Hey, I bought it…. what can I say? I love my TFLN nuttiness!

Gaming/Application
Moderation functions wrapped in a shell of activity that users can enjoy as a game or useful secondary application.

Remember: The question isn’t which one of these methods to apply, the question is in what ratio do you apply each.

  • http://buildingreputation.com/ frandallfarmer

    Great article!

    For readers interested in the details of the Yahoo! Answers reputation system, it is the main case study in our new book from O'Reilly – Building Web Reputation Systems, and you can actually read the draft of the book on our companion wiki: http://buildingreputation.com/doku.php?id=chapt

    Book link: http://oreilly.com/catalog/9780596159801/

  • http://www.communityguy.com Jake McKee

    D'oh! I meant to link to that article. I used it in my research, thanks for calling it out!

    And I should be getting my copy of your book any day now. Maybe I'll get you to sign it, rockstar!

  • TiaFisher

    Thanks Jake, that's really useful. I'm currently plugged in to the really interesting presentation of Yahoo Answers moderation you linked to (http://wikimania2009.wikimedia.org/wiki/Proceed…) – thanks!

    One aspect you didn't mention which can help to reduce the need for moderation in your community is the careful scripting of Terms of Use, to help prevent abuse occurring in the first place and provide back up from moderation decisions if it does. Here's a great article from Alison Michalk on community guidelines http://alisonmichalk.blogspot.com/2010/03/devel…. It's detailed but nice and clear about why and what should be included.

    Hope it's helpful :-)

  • http://www.communityguy.com Jake McKee

    Ha! THE most obvious one and I missed it completely. Great catch, Tia. Post is updated, thanks!

  • http://www.icucmoderation.com k3ith

    Great Post Jake.

    You have crafted a very thorough outline of what a high volume moderation methodology looks like. Well done.

    If I may add, I would suggest one of the most important and influential points in the article is where you write:
    “As I’ve written about before, the moderation activity can, and should be used to help build your community’s culture.”

    No question in my mind that it is the human element of experienced and well trained community managers and moderators that is the most influential way to build and foster a positive community culture.

    Keith

    @keithbilous | ICUC Moderation Services Inc.

  • http://www.mzinga.com/ Steve Brock

    Hi Jake. Great article! In my experience, it all goes back to the need for broad, but explicit language in the ToS, which sets the expectation for a user's conduct. Few will actually read it, but it still needs to be there as a reference to be consulted and referred to as you bring those who go over the edge back onto the high road: here's what you agreed to – now stick to it! I believe that moderators are great educators, as they help community members learn how to interact better (and companies learn how to interact better with their customers). Hopefully, they will all apply what they have learned to their real-world relationships.

  • http://alisonmichalk.blogspot.com/ Alison Michalk

    This is a fantastic post – it's so comprehensive and really covers a number of moderation elements so often overlooked. Many thanks also for the inclusion Jake and Tia – I'm honored :)
