I recently worked with Hemihelp (now part of Contact) to provide training for the volunteer moderators of their Facebook peer support group.
The group was established in 2007 and became more active in 2011. It has over 5,000 members and over 11,000 interactions each month. It seems to be a strong community with a lot of peer support offered (there is little need for moderator input on this front). There is debate among members over the nature of the group and how it should be used.
I offered some advice around consultation with the community to help ensure the guidelines and moderation approach are based on the needs of the group. This can also be a really good way to introduce and embed changes.
Recruiting volunteers
Hemihelp planned to recruit volunteer moderators to help manage group moderation during office hours. I am usually a little wary of using volunteer moderators, especially ones who are still active members of the group, but this depends very much on the community in question.
Initial response to the advert was quite low. This was perhaps understandable given the nature of the group (busy parents from all over the UK, many of whom couldn’t make it to London for training). I recommended that they emphasise the transferable skills volunteers might gain as well as the opportunity to help the community. This helped them recruit a few additional trainees, but I think they would need to offer comprehensive online training to really increase their numbers.
Moderator training
The training itself was run over one (very hot) day in London. We covered:
- an introduction to community moderation and the role of moderators,
- language and tone,
- listening through language,
- emotional support and empathy,
- writing responses and using templates,
- signposting effectively – including understanding barriers to support,
- dealing with conflict,
- managing difficult posts/posts that break the guidelines,
- common moderator difficulties,
- managing the balance between your moderator responsibilities and your role/needs as a community member, and
- self-care – including understanding personal needs, debrief and supervision.
I really enjoyed the training. I appreciated you accommodating my tendency to go off the immediate topic and liked how you tweaked the training as we went along and you got a better sense of our particular challenges. I thought the different segments were well paced and the activities were engaging and useful. As usual, I found some of the ad-hoc discussion the most useful. Generally, it was a very informative day and it was lovely to find that we’ve been doing a lot of it fairly well through our own instincts, but also to have areas where we can improve. I love the idea of the consultation as a way of instigating change without meeting lots of resistance!
Lizzie Salter, Hemihelp Project Manager
A few thoughts about Facebook as a platform for peer support groups
Some of Hemihelp’s issues got me thinking about Facebook as a platform for peer support.
Six years ago I wrote a post for Taskforce Digital about charity Facebook pages and how to distinguish a marketing and information space from a support forum. I used the example of Mind’s Elefriends – a marketing page that evolved into a support space and eventually into an entirely separate community platform.
In 2019 most charities distinguish between their marketing page and their support groups (which may be public or private, closed or open). Some also need to distinguish between their own ‘official’ groups and others set up by peers to help each other. Some offer private forums on their own websites too. This gives people a chance to decide what kind of space is right for them (the Miscarriage Association gives people information to help them understand the different online spaces).
There are obvious advantages to using a Facebook group as a peer support platform. People are there already. It’s familiar. Posts from your group will show up in a timeline through which people scroll many times a day (although not everyone wants this of course).
There are disadvantages too. Facebook is slowly closing all ‘ghost’ profiles (Facebook profiles that are not based on a real person). Many organisations have been using ‘ghost’ accounts to ensure they have a single, consistent and gender-neutral moderator account through which all moderation is done. Closing these accounts will mean that admins (and indeed volunteer moderators) who do not want to use their personal accounts will be forced to moderate through a related Facebook page instead. This may have an impact on access, on handover, and on assigning tasks and responding to messages.
Facebook also controls the algorithms and chooses which posts to promote within the group and on members’ timelines. I wonder if this can make group dynamics difficult. Do regular posters and commenters (who may have particular opinions or agendas) get promoted and seen more often? Are newer, shyer posters more likely to be missed? This cycle could easily become self-perpetuating. Facebook also offers badges to certain members, identifying them as a particular type of group member. Admins have no control over these badges or over how they are seen by other group members.
Facebook’s algorithms may promote a particular kind of engagement, or they may make arbitrary decisions about which discussions to show to members. Either way, admins have little control. Moderating a busy support community is a complex and often subtle process, and that lack of control is likely to make it harder still. While Facebook groups may be fantastic for some communities, there definitely seem to be some problems too – particularly when using them for peer support around sensitive issues.