Practical training using DefenderNet tools and real-world moderation scenarios.
After completing this module you will understand the following key concepts.
When harmful behavior is identified, moderators need to decide how to respond. Most situations involve one or more of three actions: banning the user from the community, reporting the behavior to the platform, or escalating to external authorities.
These actions are not always separate. In some cases, moderators may need to use more than one at the same time. For example, a user may be banned from the community, reported to the platform, and escalated externally if the situation involves serious harm.
When to Take Each Action
Reporting is an important part of any moderation response. It creates a record of harmful behavior and allows the platform to review incidents that extend within, and sometimes beyond, your own community.
On Discord, suspected harmful behavior can be reported in-app to Discord's Trust & Safety team. Select the message you wish to report: on mobile, press and hold the message; on desktop, right-click it. From there, select "Report Message".
It is important to select the type of violation you have encountered. The next screen lets you further identify the specific policy violation that is occurring, and you can always click back and change your first answer, so you can select the most relevant category. For more information on controlling your experience and on how to report content or behavior to Discord, see this link:
Discord Reporting Guidance

On Minecraft, you can report either a player for inappropriate behavior or a server itself if it violates the Usage Guidelines. Depending on which version you use (Bedrock or Java), the reporting process may differ slightly; you can check the guide to reporting players from Bedrock or Java. Alternatively, using a web browser, you can use the unified Report a Concern Form, where you can specify under 'Reason for concern' that the report is about child sexual exploitation or abuse.
When content is serious or illegal, you may need to go beyond the game itself and escalate further. This is where external reporting becomes critical.
Platforms cannot handle everything alone, and in cases of child safety, harassment, grooming, or exploitation, the correct step is to contact the right authorities. In the UK, this could mean reporting to CEOP. In the US, it could be through the NCMEC CyberTipline. For other regions, you can find a local hotline through INHOPE, a global network working to eliminate CSAM from the internet.
Ban, report, and escalate are core moderation actions, but they are often strongest when paired with the right systems and workflows.
In the next module, we will look more closely at how moderation and safety tools can support this work in practice.