[en]

I just finished reading Tamara Littleton’s (eModeration) whitepaper presenting six important techniques for building safer User Generated Content, or UGC for short. The document ended up being a nice overview of how to actively gain control over negative participation and submissions on a user generated content site, which, as we all know from experience, can ruin everyone’s experience of a particular service, site or application.

The balance between user freedom and the production of rich content is anything but simple: on one side we want users to feel in complete control of their content, certain of their right to self-expression, and on the other we have a company with a brand to take care of and a site it hopes can grow from the community. The problem arises when some of the site’s users cross “the line”. How can we prevent that from happening, and when it happens (since it’s pretty much inevitable!), how do we deal with it?

Consider the current moderation tools in the communities you’re part of. The one that pops into your mind is probably going to be ‘Flag this as offensive, illegal, etc.’. The problem in many cases is that this feature alone isn’t going to be enough, or at least the current moderation tools in many communities don’t seem to be delivering the expected results. From experience, I can attest that these tools are generally only put into practice after the project has been deployed, turning them into something closer to a ‘hack’ than to a feature.

What Tamara’s whitepaper states is that with the right design and preparation, moderation tools and the staff behind community management can deliver higher rates of success in their communities, balancing consumer freedom and brand protection.
[/en]

[en]
So I’ve summarized her whitepaper and added a few personal notes about her six techniques, but you can always read the entire whitepaper online at the eModeration site.

1. Craft Nicer Guidelines

We have all come across ‘terms and conditions’ in our digital lives; they’re present in pretty much every registration form for every online website or application. The problem is that many of those are ‘legal copywriting’ terms, filled with legal jargon that many of your end users won’t understand or simply won’t dare to read. Solution? Simplify! Craft some short and simple guidelines for your users, something that doesn’t require much time to read, yet is clear enough to express your main rules of participation.

Example? Consider the 3,848 words of the Hi5 Terms of Use versus the short (only 828 words!) Flickr Community Guidelines for user participation. Which one do you think will have a better impact on user behavior?

2. Automated Filters

Independently of your guidelines and terms of use, there will surely be cases of user malpractice or forum discussions that overheat, and spammers will want their share of attention, so right from the start you’ll have to create tools that help moderate your UGC from the very moment you deploy your site (a rough sketch of how these pieces might fit together follows the list):

  • Multiple Filters: profanity filters aren’t enough; they help reduce postings based on blacklists, but your filters should be smarter. They should, for instance, take into consideration things like repeated posting, entries containing URLs, and a user’s past history and submissions.
  • Allow Performance Tuning: filters can be very smart, but if they’re hard to change and customize, they’re more likely to help the spammers than to help you, so build simple, adaptable filters.
  • Accepted and Non-accepted Content Methods: automatic filters tend to create situations where perfectly ‘nice’ content is rejected simply because it contains a word that happens to be on a blacklist. The correct way to address this is to create a moderation queue: whenever a word is merely likely to be misused, make the system alert you instead of immediately screening out that particular piece of content.
  • Choose a reaction: your system can choose between different kinds of reactions when it comes across malicious content: block it, hold it for approval, or let it through your site’s doors and expect users to report it if something is wrong. What’s important is to let users know what your system chooses and when.
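To make this more concrete, here is a minimal sketch in Python of how several of these signals might combine into one of the three reactions. Every name, word list and threshold in it is hypothetical, my own illustration rather than anything prescribed by the whitepaper.

```python
from dataclasses import dataclass
from enum import Enum
import re

class Reaction(Enum):
    ALLOW = "allow"   # publish immediately and rely on user reports
    HOLD = "hold"     # send to the moderation queue for human review
    BLOCK = "block"   # reject outright

BLACKLIST = {"viagra", "casino"}            # placeholder word blacklist
URL_PATTERN = re.compile(r"https?://\S+")

@dataclass
class Submission:
    text: str
    author_recent_posts: int     # posts by the same author in the last hour
    author_flagged_before: bool  # previous upheld reports against the author

def classify(sub: Submission) -> Reaction:
    words = set(sub.text.lower().split())
    urls = URL_PATTERN.findall(sub.text)

    # Hard signal: repeated posting stuffed with links smells like spam.
    if sub.author_recent_posts > 10 and urls:
        return Reaction.BLOCK

    # Soft signals: a possibly-misused word, many links, or a shaky history
    # go to the moderation queue instead of being silently screened out.
    if words & BLACKLIST or len(urls) > 3 or sub.author_flagged_before:
        return Reaction.HOLD

    return Reaction.ALLOW
```

The point of keeping the rules in one small, readable function is exactly the performance-tuning item above: weights, word lists and thresholds stay easy to change once real traffic arrives.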

3. Embrace Technology

Filters and guidelines are only the starting point of any community. Successful communities always depend on some type of human moderation, whether it comes from your staff or from user self-regulation. Making these people’s lives easier depends on the moderation tools you build to help them do their job, so build algorithms that help identify potential threats. Each site has its own logic, but there are some common calculations that can help assign priorities in a moderation queue (a small scoring sketch follows the list):

  • Time: keep an eye on user-submitted reports within certain amounts and time-frames, and make your system respond automatically above certain levels.
  • User history: not everyone participates the same way, and some users are more helpful than others. With that in mind, you might create a ranking system for user reports: every time a moderator agrees with a user’s report, make that user’s future reports more valuable.
  • Traffic: in general, content that is receiving heavy traffic might be worth keeping an eye on, so if something on your site is getting a lot of attention from your users, have a look too and make sure it deserves it.
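As a rough illustration, these three signals could be folded into a single score used to sort the moderation queue; the weights below are invented for the example, not taken from the whitepaper.

```python
def priority_score(reports_last_hour: int,
                   reporter_reputation: float,  # e.g. fraction of a reporter's past reports upheld
                   views_last_hour: int) -> float:
    """Higher scores float to the top of the moderation queue."""
    return (2.0 * reports_last_hour       # Time: bursts of reports in a short window
            + 5.0 * reporter_reputation   # User history: trusted reporters weigh more
            + 0.01 * views_last_hour)     # Traffic: heavily viewed content gets a look too

# Example: 3 reports in an hour, from reporters upheld 80% of the time,
# on an item viewed 500 times -> 2*3 + 5*0.8 + 0.01*500 = 15.0
print(priority_score(3, 0.8, 500))
```

Items above a chosen threshold could trigger an automatic response (hide pending review, for instance), while everything else simply orders the queue for your moderators.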

Once again, plan these systems early in development, but make them easily changeable so they can deal with real usage once the system has been deployed.

4. Differentiate Users

As Stowe mentioned in his workshop at LIFT, not everyone is equal, or at least not everyone deserves to be treated equally. In general all users are on the lookout for a good experience (so never forget to take particular care of their user experience), but some of them are more than willing to give you a hand managing the community. Involving users not only reduces your moderators’ work, but also helps users develop a sense of ownership, which in online communities is always desirable.

Create different views and tools for different users. Take particular care of your one-percenter users, those who really produce and/or take care of your community, and engage them directly: personally email them! One thing to keep in mind is that volunteer work isn’t the same as free labour, so always, and I can’t say this enough, always have some kind of reward system for your helping users. Even if it isn’t entirely declared upfront, it’s always good to receive something for your hard work when you’re on the user side, isn’t it? We’re not talking solely about financial compensation; the whitepaper describes it and I call it karma: develop a karma system, something that allows users to distinguish themselves from the rest through their work and participation efforts.
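A karma system can be as small as a points table plus a few visible thresholds. The sketch below is my own illustration (event names and values are invented), not the whitepaper’s design.

```python
from collections import defaultdict

KARMA_EVENTS = {
    "report_upheld": 10,    # a moderator agreed with the user's report
    "post_featured": 5,
    "report_rejected": -2,
}

# Thresholds unlock visible badges so contributors stand out in the community.
BADGES = [(100, "guardian"), (25, "helper"), (0, "member")]

karma = defaultdict(int)

def record(user_id: str, event: str) -> None:
    karma[user_id] += KARMA_EVENTS.get(event, 0)

def badge(user_id: str) -> str:
    points = karma[user_id]
    return next(name for threshold, name in BADGES if points >= threshold)

record("alice", "report_upheld")
record("alice", "report_upheld")
record("alice", "post_featured")
print(badge("alice"))   # -> "helper" (25 points)
```

Whether the reward is a badge, early access to features or something more tangible matters less than the fact that it is visible and consistent.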

5. Visible Moderation

Traditionally, moderation has been one of those activities that happen behind the scenes. If your users aren’t aware of moderation actions, how do you expect them to know and conform to the guidelines? Plus, if your moderation isn’t visible, it’s like posting a ‘Free Drinks’ sign in a bar on a Friday evening: people will drink more, meaning it becomes easier for users to abuse the system.

According to Tamara’s whitepaper, some things should be taken into consideration regarding the exposure of your community’s moderation activities (a small sketch of the resulting flow follows the list):

  • Moderators must be able to communicate: being a moderator isn’t solely about approving or rejecting content, it’s also about educating and helping users use your system correctly, so you must provide ways of letting them talk to each other. Every moderator’s action should be followed by a clear explanation, one that plainly states the reasons behind that action.
  • Notifications: users should have easy and quick access to moderation messages, probably in a specific spot within their private area or as a message in their inbox on the system.
  • Edit vs Delete: moderation doesn’t come only in black or white; sometimes you have to choose between shades of grey, meaning some content might still be partially worth keeping, so it makes no sense to delete it. That leaves you two options, deleting or editing the offending content, although one is certainly preferable over the other. Be careful with the editing solution: users might feel like you’re taking control over their speech, and in that situation it’s always preferable to opt for the next solution.
  • Offer a chance to change: if you come across offending content, start by making it private and requesting that its producer actually change it, instead of immediately deleting it. Always have a positive attitude towards your users; they too make mistakes, so don’t judge them quickly.
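Putting the last three points together, a visible moderation action could look roughly like the sketch below: hide rather than delete, record the reason, and notify the author with a request to revise. All names here are hypothetical, just one way of wiring it up.

```python
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    item_id: str
    author_id: str
    text: str
    visible: bool = True
    moderation_log: list = field(default_factory=list)

def request_revision(item: ContentItem, moderator_id: str, reason: str, notify) -> None:
    item.visible = False                           # hide, don't delete
    item.moderation_log.append((moderator_id, reason))
    notify(item.author_id,
           f"Your post {item.item_id} was hidden: {reason}. "
           "Please edit it to comply with the guidelines and it will be restored.")

# Usage with a stand-in notifier that just prints the inbox message:
item = ContentItem("42", "bob", "some borderline post")
request_revision(item, "mod_1", "contains a link against our guidelines",
                 lambda user, msg: print(f"to {user}: {msg}"))
```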

6. Make Moderation Usable

Take equal care of your moderation tools as you do of your service/application in general. Bear in mind that your moderators are users too: if the moderation tools are easy to use and moderators can access them quickly, they’re more likely to use them in the first place.

Make your moderation tools part of your development cycle, which is to say, include them in your usability tests, develop with your moderators in mind, and don’t forget to test the tools against fictional data (that’s why their adaptability is so important!). Don’t just wait for the system deployment to go out and test your moderation tools.
[/en]