There are currently 70 million active admins and moderators running Facebook groups around the world, and Facebook wants to empower them to manage and nurture their communities. Yesterday, June 16, the company announced a range of new admin tools to help community builders moderate comments, manage conflicts, and stay on top of daily group tasks.
First, and most central, is the new Admin Home: a more intuitive interface where admins can access their various community-management tools and see the group tasks that need to be done each day.
The new Admin Home dashboard includes a ‘To Review’ listing of items that require admin attention, making it easier to stay on top of the various tasks. Tracking to-dos can be a real challenge in groups with up to 10,000 members, so a simplified listing of key alerts will save group admins a lot of time and effort in their daily activities.
Facebook also wants admins to maintain a safe and healthy culture within their communities. To that end, it has added comment moderation to Admin Assist, allowing admins to set up criteria that will automatically moderate both posts and comments. Admins of Facebook groups can find this option on both desktop and mobile devices.
As the screenshot shared by Social Media Today shows, admins can now, for example, automatically decline comments that include a link to a third-party site, which could be useful for branded communities that want to avoid promoting competing offers.
Through the updated Admin Assist, group admins will also be able to restrict people who don’t qualify to participate based on a range of options (including how long they’ve had a Facebook account and/or how long they have been a member of the group), while also gaining access to Facebook’s advanced anti-spam tools.
Facebook notes that admins will be able to use these automated rules to “maintain positive discussions and resolve conflicts within the group”. That is also the focus of the new ‘Conflict Alerts’ feature, which, according to Facebook, will use AI to detect contentious or unhealthy conversations in a group and notify admins so they can take action as needed.
Conflict Alerts will highlight interactions where Facebook’s system detects potential concern, and admins can then step in to make sure issues don’t compound. Specifically, Facebook says these tools will help admins to:
- Restrict people who don’t qualify to participate based on several options, such as how long they’ve had a Facebook account or how long they have been a member of the group.
- Reduce promotional content by declining posts and comments with specific links, with the ability to provide feedback for the author, so they can edit their post and re-submit it for review.
- Use suggested criteria from Facebook to help defend the group against spam, maintain positive discussions and resolve conflicts within the group. Admins can browse, add and edit those criteria to meet the needs of their group. Admins have the option to undo specific actions from Admin Assist or to change and refine criteria over time.
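Conceptually, the Admin Assist criteria above work like a stack of predicates checked against each new post or comment. The following minimal Python sketch illustrates that idea only; it is not Facebook's actual API, and every function name, field, and threshold here is a hypothetical stand-in:

```python
import re

# Hypothetical illustration of Admin Assist-style criteria: each rule is a
# predicate over a comment and its author's metadata. Thresholds are invented.
LINK_PATTERN = re.compile(r"https?://\S+")

def contains_link(comment):
    """Flag comments that link to third-party sites."""
    return bool(LINK_PATTERN.search(comment["text"]))

def account_too_new(author, min_account_age_days=30):
    """Restrict authors whose Facebook accounts are younger than the threshold."""
    return author["account_age_days"] < min_account_age_days

def moderate(comment, author):
    """Return ('decline', feedback) if any criterion matches, else ('approve', None).

    The feedback string mirrors the announced ability to tell authors why a
    post was declined so they can edit and re-submit it for review.
    """
    if contains_link(comment):
        return "decline", "Links to third-party sites are not allowed; please edit and re-submit."
    if account_too_new(author):
        return "decline", "Accounts must be at least 30 days old to comment here."
    return "approve", None
```

For example, `moderate({"text": "Check out https://example.com!"}, {"account_age_days": 400})` would come back declined with the link-related feedback, while an ordinary comment from an established account would be approved.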
Then there are new overall management and insight tools for groups, such as member summaries, which provide an overview of each group member’s activity. Understanding each member’s input will help admins evaluate contributions and take action where needed.
Facebook has added many other features and considerations to make group moderation as easy as possible. The clear aim is to lessen the burden on group admins and volunteer moderators alike.
I love the way Andrew Hutchinson closed his report, comparing Facebook’s efforts with Reddit’s volunteer-driven moderation model:
In many ways, this is the model that has helped Reddit become a key platform for engagement around specific niches, while also reducing the impact of spam and junk, through a dedicated army of volunteer moderators who keep each subreddit in check.
Facebook is hoping that it can facilitate the same in its groups, which are used by 1.8 billion people per month – and if these tools and features can help reduce conflict, and the sharing of controversial material like misinformation (by eliminating link sharing, for example), that could be a big step in lessening such concerns more broadly, improving Facebook’s interactions without boosting Facebook’s own moderation workload.
In other words, Facebook really needs its group mods to keep doing this stuff for free. As such, making the task as simple as possible is a key step – and for brand communities, there’s a range of potential benefits in these new updates.