Thursday, August 30, 2012

An Insider's Guide to FB moderation: Have you done your job well?

Just a quick update to share this handy post from Melissa Gassman, which I found on CPC's PR and Social Media for Businesses LinkedIn page.

In An Insider’s Guide to Facebook Community Management for Brands, Gassman shares her insights from a two-year stint as a Facebook page moderator. In light of recent tutorial discussions around brands' frequent social media stuff-ups, as regularly highlighted on Mumbrella, I found her final paragraph very interesting:

"Think long and hard before pressing ‘delete’ on any comment. Facebook is an open forum and you need to accept the good and the bad comments about your brand. Unless comments clearly violate your house (or federal) rules, leave them be. If you’ve done your job well, your community may very well step in and defend your honour."

Particularly thought-provoking was the last sentence: If you’ve done your job well, your community may very well step in and defend your honour.

If community managers/moderators have developed a free-flowing, two-way model of communication in the first place, they may find that members jump in to defend them when a negative comment appears. In his lecture last night, Jim Macnamara hinted that despite the media landscape shifting from mass media towards social media, brands using new social media platforms are doing so in much the same way they use (or used) traditional mass media - that is, simply pushing out PR messages rather than using social media to create dialogue and improve brand awareness.

Many brands using Facebook may avoid social media disasters by undergoing a complete shift in their theoretical framework and working towards a strategy that engages community members.

Monday, August 27, 2012

Facebook: Thumbs up to moderation



Smart Company recently published an 'Ultimate Guide to Facebook Moderation'. In it, Patrick Stafford raises many issues that businesses using Facebook now face but didn't just a short time ago.

"The concept of monitoring and moderating the messages made on Facebook walls is odd. The practice didn’t even exist three or four years ago. But recent legal tussles show Facebook moderation isn’t just something you should do because it’s good business. It’s a legal obligation. Smart businesses will be aware of a finding by the Advertising Standards Board, and a subsequent affirmation from the Australian Competition and Consumer Commission, that small businesses have an obligation to take down misleading or deceptive posts on their wall. These messages can constitute ads, and if you don’t take care of them, it could result in massive fines. Businesses have been irate about this, suggesting it’s even more work for them to keep track of. But James Griffin, co-founder of reputation management company SR7, says this should have been the case for many businesses when they first set up their pages – and it exposes some laziness (emphasis added)."

In the article, Stafford raises key points about setting up an online community on Facebook, points that I think a lot of businesses have not heeded in the rush to be on Facebook simply for the purpose of being 'on' Facebook. The role of moderator would be a lot easier if these companies had spent time developing a strategy before jumping on one of the latest social media bandwagons.

Some of the key points raised in the article include:
  • Asking yourself what you want a Facebook page for. Have a strategy and envisage what feedback/interaction you would like to create;
  • Setting standards and having clear guidelines for posts;
  • Warning against misleading comments - even if a comment is positive it may not be accurate;
  • Not deleting controversial comments.

Interestingly, he warns against regarding 'likes' as feedback, a point I brought up in our tutorial discussion last week. A lot of likes doesn't always correlate with engagement. Similarly, negative posts by consumers that garner a lot of 'likes' may not be the death-by-social-media, or the loss of business, that many might expect. In the case of the Target social media story I mentioned in my last post, the 44,000 'likes' of a mother's negative feedback to Target obviously demonstrate that a lot of Facebook users share a similar point of view; however, it could also simply be fellow consumers having an easy dig at a corporate entity. Sure, it isn't one to ignore completely, as 44,000 likes has spread one consumer's feedback much further than it would have travelled ten years ago through 'Word of Mouse' and brought coverage of the issue to traditional media, but if I were moderating a page I'd be more inclined to analyse comments rather than the simplistic 'like'.

But back to the point I emphasised in the first section of Smart Company's 'Ultimate Guide to Facebook Moderation': Have companies become lazy in their Facebook moderation? Has the proverbial horse bolted and now companies must play catch up to avoid or better manage their own potential social media #Fail? Should companies re-assess their presence on Facebook and perhaps opt for another platform for social engagement?

I think setting up a Facebook page is the easy part, but constant vigilance is the price to be paid for wanting to be in that space. Laziness has crept into multi-national companies AND small businesses. Social media moderators must access their platforms daily and respond, and if a response can't be supplied, at least acknowledge that the post or comment will be assessed and responded to as soon as possible.

Monday, August 20, 2012

Welcome to Social Media Moderation (SocMedMod)


Welcome to SocMedMod, initiated as part of Rethinking Media, the final subject of my Communication Management Masters degree at UTS.

Although I have been quite an active blogger for the past six years, I have never been a 'moderator' of a social media site or community; however, recent tutorial discussions focused on social media gaffes by moderators have triggered my interest in the role of the online community moderator. The two incidents were the deletion of a post written by a distraught mother on Channel 7's Facebook page and the feedback Target received on its Facebook page from a parent requesting it stock age-appropriate clothes for her daughter.

A moderator's daily work is fraught with potential issues: when to comment, whether to comment at all, and whether to delete abusive comments/feedback or maintain a transparent site. With the Advertising Standards Bureau recently deciding that brands are responsible for all comments on their Facebook pages, the role of moderator has become a position of real responsibility (especially for those brands using Facebook as their primary platform), with some questioning whether social media is worth it.

Through this blog I seek to share and generate conversation about the role of moderators and, in a way, have this blog become a live case study for creating and building an online community.

This blog will focus on:
  • the role of moderators and how their role is increasingly considered (correctly or incorrectly) as being like that of the traditional media's 'gatekeepers';
  • the issues moderators face;
  • best practice for setting-up and maintaining a community.
SocMedMod will also feature case studies of worst practice (as mentioned above) and best practice that generate online sharing and commenting for all of the right or wrong reasons. As a PR practitioner, I'm also intrigued by how the moderator's role fits into the overall PR strategy of a brand, organisation or community group.

So to get started, what do you think is the biggest mistake a moderator can make? 

You can follow SocMedMod on Twitter.