Community Tip of the Day: Everything in Moderation
Let's face it: everyone in America loves things, but not in moderation. Fame? More of that. Money? Second heapin' helping, please. Cool gadgets? Yeah, I have two phones, but I need to get another one. Make sure it has an "i" or an "e" at the beginning - that's more Web 3.0.
But social media applications need moderation, and people have tended to design this aspect last. They think: how can I get a user to post to this forum/wiki/tweet/blog? And then, if they are business-oriented: how can I get them to pay/watch ads/subscribe for the privilege?
They don't think, as their first or even third thing: please, let there just be a way to take that user's content back to its component pre-meme atoms.
Or as my coworker Phil says in his British accent: That's when we bring the banhammer down.
Admin powers - you know, site reporting, site monitoring, transaction logging, turning components off and on - are also often forgotten, and moderation powers tend to get lumped into that pile of admin work the team forgot. Because, let's face it, the backstage pulleys and levers are not what get you mentioned in the New York Times, the Wall Street Journal, or the hot blogs in Silicon Valley. Moderation is like the ops teams in datacenters - you only really hear about it when it doesn't work.
Spam attacks on commerce sites, the Digg user revolt, death threats received while running a popular blog - get enough of these horror stories and the public realizes there has to be a way to turn users off after you turn them on. It was interesting to me that Robert Scoble's critique of the iPhone app for WordPress centered on the ability to manage comments. Even more so that TechCrunch's recent article about Belgian startups included Mollom, a moderation/spam-protection venture.
The point at which designing moderation systems gets intriguing for me falls between two ends of a social engineering spectrum:
- Automation, which lets the software act as your defender while you do more important things
- Boutiquing, which lets you finesse your community in ways that allow for its growth
I'll also talk about the middle road...
- Empowerment, which falls between Automation and Boutiquing in terms of the effort the original app designer or community moderator has to make. This may also include a subsection on the research tools needed to support boutique moderation scenarios.
Automation
When I worked on Live QnA, members of the community let me know that Yahoo Answers seemed to take questions and answers down automatically in response to a community report. (They were upset because they felt their content had been griefed, and what they liked about Live QnA was that our team responded to them personally in the forums or by email.) You could envision a sophisticated machine-learning setup where blog comments or Web content are scanned for patterns of bad words and the offending content is taken down without any human as intermediary or judge. Automation can save a lot of time, and software doesn't get tired or jaded the way human moderators do.
The cons, of course, are that humans are experts at pattern matching, and they will find ways to say "Shut the F--- up" that evade your spam filter or auto-takedown tools. It seems to me that once people solve the search problem, they will also have solved the moderation problem: once you can give perfect answers to searchers every time, you know how to shunt away, perfectly, the content they don't want to see.
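To make that concrete, here's a minimal sketch of the naive automated approach described above: scan incoming content against banned patterns and take it down with no human in the loop. Everything here (the pattern list, the function names) is hypothetical; a real system layers on machine learning, context, and appeals.

```python
import re

# Hypothetical banned-word patterns; real lists are far larger and evolve constantly.
BANNED_PATTERNS = [
    re.compile(r"\bbuy\s+cheap\s+meds\b", re.IGNORECASE),
    re.compile(r"\bf[\W_]*u[\W_]*c[\W_]*k\b", re.IGNORECASE),  # crude obfuscation catch
]

def auto_moderate(post_text: str) -> bool:
    """Return True if the post should be taken down automatically.

    This illustrates the weakness described above: users quickly learn
    to spell around whatever regexes you ship, so pattern lists become
    a treadmill, not a solution.
    """
    return any(p.search(post_text) for p in BANNED_PATTERNS)

if __name__ == "__main__":
    print(auto_moderate("Buy cheap meds here!"))   # True: taken down
    print(auto_moderate("Buy ch3ap m3ds here!"))   # False: filter routed around
```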
Boutiquing
I made up this word because, in working on the moderation system for QnA, I realized you could have an infinite number of settings, and we could not build them all. Say you want to ban a user (a sketch of what such settings might look like follows this list):
- do you want to ban them for an hour? A day? A week? Lifetime? To what degree are you bringing the banhammer down?
- do you want to ban them from commenting on an existing thread? Posting new threads? Linking to another site? Inserting the wrong tags into your content? You gave all these powers to your community - are you removing one, some, or all of them?
- do you want to see this user's content before it goes live to the public, or after? Do you want to flag this person for your moderators to "watch" like secret agents (i.e., see all their transaction history on your site)? Do you want to see who they collaborate with (important for determining who may be gaming your reputation system)?
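Here is one hypothetical way to model a single ban as data; every field name below is an assumption, and notice how each one multiplies the settings surface your tools have to support.

```python
from dataclasses import dataclass
from datetime import timedelta
from typing import Optional

# All names here are illustrative; the point is how many axes one "ban" has.

@dataclass
class Ban:
    user_id: str
    duration: Optional[timedelta]     # None means a lifetime ban
    can_comment: bool = False         # may they still comment on existing threads?
    can_post_threads: bool = False    # may they still start new threads?
    can_link: bool = False            # may they still link to other sites?
    can_tag: bool = False             # may they still tag content?
    premoderate: bool = True          # hold their content for review before it goes live
    watch: bool = False               # flag for moderators to review their full history

hour_ban = Ban(user_id="troll42", duration=timedelta(hours=1))
lifetime_ban = Ban(user_id="spambot9", duration=None, watch=True)
```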
Once you have determined how extensive you want your moderation powers to be (say, you want to be able to limit someone's access to your forums in graduated steps: first a one-day ban, then a one-week ban, then an irrevocable lifetime site ban), you will need reporting tools to justify each decision. To decide how long to ban a user, you might need to look up (a sketch of such queries follows this list):
- all posts by this user
- all posts by this user flagged by another user as violating your terms of use
- all posts by this user on threads started by another user
- whether this person has been banned before
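As a hedged sketch of that reporting side, assuming a simple relational schema (all table and column names invented for illustration):

```python
import sqlite3

# Assumed schema: posts(id, user_id, thread_id, body), flags(post_id, reporter_id, reason),
# threads(id, starter_id), bans(user_id, start, duration). Adjust to what your app stores.

def moderation_report(db: sqlite3.Connection, user_id: str) -> dict:
    posts = db.execute(
        "SELECT id, body FROM posts WHERE user_id = ?", (user_id,)
    ).fetchall()
    flagged = db.execute(
        "SELECT p.id, f.reason FROM posts p JOIN flags f ON f.post_id = p.id "
        "WHERE p.user_id = ?", (user_id,)
    ).fetchall()
    on_others_threads = db.execute(
        "SELECT p.id FROM posts p JOIN threads t ON p.thread_id = t.id "
        "WHERE p.user_id = ? AND t.starter_id != ?", (user_id, user_id)
    ).fetchall()
    prior_bans = db.execute(
        "SELECT start, duration FROM bans WHERE user_id = ?", (user_id,)
    ).fetchall()
    return {
        "posts": posts,
        "flagged_posts": flagged,
        "posts_on_others_threads": on_others_threads,
        "previously_banned": len(prior_bans) > 0,
    }

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE posts(id, user_id, thread_id, body);
        CREATE TABLE flags(post_id, reporter_id, reason);
        CREATE TABLE threads(id, starter_id);
        CREATE TABLE bans(user_id, start, duration);
        INSERT INTO threads VALUES (1, 'alice');
        INSERT INTO posts VALUES (10, 'troll42', 1, 'first!');
        INSERT INTO flags VALUES (10, 'alice', 'spam');
    """)
    print(moderation_report(db, "troll42"))
```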
Empowerment
Empowerment, the middle road, blends some of the features you'd build to automate or boutique your moderation system with some human interaction on the part of the community's host. For example, Microsoft MVPs can also act as moderators on the creators.xna.com forums, using the forum tools to handle problems that arise there. As the community grows, our community manager may find other people she wants to empower, because they have proven themselves trustworthy to the community and to Microsoft.
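One hypothetical way to model that graduated trust in code is a permission grant a community manager widens as a member proves themselves; the power names below are invented for illustration.

```python
from enum import Flag, auto

class ModPower(Flag):
    NONE = 0
    FLAG_CONTENT = auto()   # can escalate posts to staff
    HIDE_POSTS = auto()     # can pull a post pending review
    TEMP_BAN = auto()       # can issue short bans
    FULL_BAN = auto()       # typically reserved for staff

def can(user_powers: ModPower, action: ModPower) -> bool:
    return bool(user_powers & action)

# A manager might start a proven MVP with a narrow grant and widen it over time.
mvp_powers = ModPower.FLAG_CONTENT | ModPower.HIDE_POSTS
print(can(mvp_powers, ModPower.HIDE_POSTS))  # True
print(can(mvp_powers, ModPower.FULL_BAN))    # False
```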
A few years ago I sat in an MSR session led by Tom Coates about self-governing community systems. As with any political system, the founders and the precepts upon which an online society is based determine its destiny, but I was struck by how a community's online tools go hand in hand with shaping how that society evolves.
If you create a system where powers can be extended to others easily, it will be easier to set up an online community "government" that is generous with privileges. If you create a system assuming a cabal has to hold the reins tightly, or that due process is needed to change online content (think Wikipedia), the tools will have to support that, or that mode of governance will not work.
Form follows function, but function also follows form. A community that takes shape around podcasts may need different tools than Twitter or a forums Web site, and a community where everyone is master (a true wiki) creates a different set of moderation problems (have a great backup system to revert to) than a realtime IRC channel does. Try to gauge the moderation style your community needs and how much tool budget you have to support it.
And, sad as it is, begin with the end in mind. That is, when you start your community, think about how you might need to kill it (or at least silence the voices of the trolls that have taken it over). Sometimes a subset of your members will need to spin off and do their own thing. Or they may find they are no longer interested in the main topic that brought the group together in the first place. Hopefully, using the communication tools built into your social media application, you can say a gracious goodbye with no need for moderation-tool nastiness. But if you find your community space taken over by a troll invasion (how Kathy Sierra closed her blog down comes to mind), you may need all the secret rocket launchers you have stashed away for this very day. You can rebuild a community from scratch or from a small membership - but you can't rebuild a community ethos once a negative one has taken over.
One of the most thoughtful pieces I've read on the interconnectedness of the technical and the social in community applications is Clay Shirky's "A Group Is Its Own Worst Enemy". Here he talks less about the specifics of designing moderation tools than about the human behaviors that lead to strange things happening in and to groups online. If you are trying to build a business community, and to anticipate the needs of its moderators, this is a great read to help you lay your groundwork.
Aim for a thriving community base - but make sure you stash those rocket launchers where you can reach 'em.
Live it vivid!
Update: I just had to add this SF Gate article about flickr policing their community.