Ban, Report, Escalate

Practical training using DefenderNet tools and real-world moderation scenarios.

Module 5 of 8 · 12 min read · Level: Foundational · Focus: Enforcement, reporting, and escalation
Key Takeaways

After completing this module, you will understand the following key concepts:

How to decide whether to ban, report, or escalate harmful behavior
The purpose of each action and how they can work together
How to report harmful behavior through Discord and Minecraft
How to recognize situations that may require external escalation

The Decision Framework

When harmful behavior is identified, moderators need to decide how to respond. Most situations involve one or more of three actions:

1. Ban: Restrict a user's access to protect your community from ongoing or repeated harm.
2. Report: Report harmful behavior through the appropriate platform or moderation system so it can be reviewed and acted on.
3. Escalate: Involve external organizations or authorities when behavior is serious, illegal, or poses immediate risk.

These actions are not always separate. In some cases, moderators may need to use more than one at the same time. For example, a user may be banned from the community, reported to the platform, and escalated externally if the situation involves serious harm.

When to Take Each Action

Ban when:
  • A user is causing serious harm or violating community rules
  • Behavior is repeated, targeted, or escalating
  • There is a risk to other users if no action is taken
Report when:
  • The behavior violates platform policies
  • You want the platform to review or take action
  • You need a record of the incident
Escalate when:
  • The situation involves exploitation, abuse, or illegal activity
  • A user may be at immediate risk
  • The issue goes beyond what platform moderation can handle

Reporting Issues to Discord and Minecraft

Reporting is an important part of moderation response. It creates a record of harmful behavior and allows the platform to review incidents both within and, in some cases, beyond your own community.

On Discord, harmful behavior can be reported in-app to Discord's Trust & Safety team. To report a message: on mobile, press and hold the message; on desktop, right-click it. Then select "Report Message".

It is important to select the type of violation you encountered. The next screen lets you identify the specific policy violation more precisely. You can always go back and change your first answer, so choose the most relevant category. For more information on controlling your experience and on how to report content or behavior to Discord, see this link:

Discord Reporting Guidance

On Minecraft, you can report either a player for inappropriate behavior or a server itself if it violates the Usage Guidelines. Depending on which version you use (Bedrock or Java), the reporting process may differ slightly; see the guide to reporting players from Bedrock or Java. Alternatively, in a web browser, you can use the unified Report a Concern Form, where under 'Reason for concern' you can specify that the report is about child sexual exploitation or abuse.

Escalating Beyond Platform Reporting

Where content is serious or illegal, you may need to go beyond the game itself and escalate further. This is where external reporting becomes critical.

Platforms cannot handle everything alone, and in cases of child safety, harassment, grooming, or exploitation, the correct step is to contact the right authorities. In the UK, this could mean reporting to CEOP. In the US, it could be through the NCMEC CyberTipline. For other regions, you can find a local hotline through INHOPE, a global network working to eliminate CSAM from the internet.

  • United Kingdom: Report through CEOP when appropriate. (ceop.police.uk/safety-centre)
  • United States: Use the NCMEC CyberTipline for relevant reports. (report.cybertip.org)
  • Brazil: Use SaferNet Brasil for anonymous reports. (new.safernet.org.br)
  • Other regions: Find a local hotline through INHOPE, a global network working to eliminate CSAM from the internet. (inhope.org)

Immediate risk notice: In any country, if someone is at immediate risk, contact your local law enforcement without delay.

Coming Next

Ban, report, and escalate are core moderation actions, but they are often strongest when paired with the right systems and workflows.

In the next module, we will look more closely at how moderation and safety tools can support this work in practice.

Knowledge Check
