One of my mantras is that social media is a setting, and like all settings, it has aspects that can damage our health and aspects that can improve it.
Tools like community guidelines and moderation are mediating factors in creating healthy settings. To establish a safe and healthy space online, you need to set clear boundaries on the conversations that occur in it.
Platforms also maintain their own guidelines, which you and all of your users should be aware of:
- Facebook: https://transparency.fb.com/en-gb/policies/community-standards/
- Instagram: https://help.instagram.com/477434105621119/?helpref=hc_fnav
- TikTok: https://www.tiktok.com/community-guidelines?lang=en
- LinkedIn: https://www.linkedin.com/legal/professional-community-policies
- Twitter: https://help.twitter.com/en/rules-and-policies
All pages and groups should also be aware that they are responsible for defamatory comments left on their posts. You can be held legally liable for a defamatory comment left by a member of the public – it is your responsibility to set guidelines and moderate.
I am also observing that more pages, media outlets in particular, are closing comments because they lack the resources to moderate them. The ability to close comments has existed in groups for some time, but it is relatively new to pages. This moderation tool significantly lowers the risk of a page inadvertently hosting defamatory or unsafe comments after the moderator clocks off for the day.
Image: Screenshot of ABC Adelaide Facebook page with closed comments.
Most of my clients are small organisations, programs or not-for-profits with a single staff member managing their social media. It’s not reasonable to expect 24/7 moderation, so some risk management is needed.
With organisational risk in mind, I have been asked whether comments should be turned off on each and every post. My response is that this approach strips social media of what makes it social, and I do not recommend it. Users expect to be able to comment and engage with content; social media is designed to build communities, not to be a notice board for one-way information sharing. You can be social, build safe communities and facilitate constructive conversations through clear guidelines and discerning use of the option to close comments.
What to include in your social media guidelines
Depending on the type of content you post, your community guidelines should address the following:
- Hate speech, profanity, obscenity, or vulgarity
- Comments/messages that could be considered prejudicial, racist or inflammatory, including transphobic or ableist comments
- Nudity or offensive imagery including in profile pictures
- Defamation of a person, people, or organisation
- Name-calling, trolling and/or personal attacks
- Spam comments/messages such as the same comment posted repeatedly
- Comments/messages that contain or promote false information
- Personal information (including identifying information, email addresses, phone numbers or private addresses)
- Fake profiles
- Business promotion without express permission
You may also wish to consider how your page will address misinformation or anti-science content. Will you engage and educate, or simply remove it? The choice is yours.
Want some community guideline inspiration?
- Simple and easy to understand:
https://adf.org.au/adf-community-guidelines/
- Great example for service/customer-facing organisations:
https://www.sapowernetworks.com.au/policies/social-media-guidelines/
- Comprehensive guidelines for media (see part 3 in particular):
https://help.abc.net.au/hc/en-us/articles/360001548096-ABC-Terms-of-Use
Explain your moderation processes externally and internally
The best guidelines not only spell out your expectations but also the steps you will take when moderating content.
Different organisations will take different approaches. For example, an organisation with a gentler, education-focused style of advocacy may choose a policy of contacting a person privately to explain why their comment was removed.
As an organisation you will need to decide what your steps are, and who is responsible for making the decision to remove content. Usually this will be the social media manager, but some organisations choose to refer the decision to a more senior manager. This internal decision-making should not appear in your public-facing guidelines, but it does need to be documented in an internal procedure.
I recommend that any comments (except for clear spam) be screenshotted and saved to your internal records before deletion, so that if the matter escalates you will have a record.
New staff will need support, training, and onboarding to learn the organisation’s moderation standards. As these standards become established, I highly recommend developing internal training procedures with worked examples to make this easier for future staff.
Host your guidelines on your website
Finally, I recommend hosting your community guidelines as a page on your website, just like the examples above. That way, if Facebook mucks around with its page ‘About’ section and cuts off half your guidelines, the full version is still safely on your site. It also means you have just one place to review and update your guidelines, rather than a different version for each platform.
Thank you for making our shared social spaces safe and constructive.
Disclaimer: This blog post is provided for information purposes only. The contents of this post do not constitute legal advice and should not be used as such. Formal legal advice should be sought in particular matters.