Built by the community. For the community. Practical training that makes safety actionable.
After completing this module, you will understand the following key concepts.
Online spaces like Minecraft and Discord are full of creativity, collaboration, and friendships, but a small number of users can cause outsized harm, especially through conduct targeting children. Individuals involved in grooming, harassment, scams, gender-based violence, and sexual harm often move between servers. Without effective tools and collaboration, moderators end up working alone, which allows harmful users to return and repeat their behavior. DefenderNet changes that by offering communities practical tools and training that support early detection, reduce repeat harm, and build safer spaces together.
DefenderNet is a system designed to help Minecraft and Discord communities detect, respond to, and report harmful behavior, especially serious harms like online child sexual exploitation and abuse (OCSEA) and Child Sexual Abuse Material (CSAM).
Minecraft and Discord are places where players create, connect, and grow communities. But they also face real challenges. Grooming, scams, harassment, gender-based violence, and abuse happen across platforms and often carry over from one server to another. Many moderators and server owners struggle to keep up, with limited tools, time, and support. Without a shared system, harmful users can slip through the cracks by jumping between servers. GamerSafer's mission is to help online gaming communities protect their users and thrive.
Any Minecraft or Discord community can install GS Bans (for Minecraft) and GS Defender (for Discord) for free and enable DefenderNet. Our academy is free because the goal is to give every community access to stronger child safety protections, no matter its size or resources. Communities benefit immediately from using DefenderNet, which makes it harder for harmful users to move undetected between servers.
Some communities choose to take the next step by becoming DefenderNet Reporting Partners. These partners commit to higher standards of safety, moderation, and collaboration. In recognition of this commitment, GamerSafer works with communities to help them meet eligibility requirements through onboarding, mentorship, and ongoing resources — including access to the GamerSafer Academy advanced curriculum and a private peer group with other partners and invited experts.
Getting involved in DefenderNet helps both your staff and your players in meaningful ways. For staff, it provides stronger tools, making moderation less stressful and more effective. Your team gains key knowledge on risks and potentially harmful behaviors, clear processes to follow, and the chance to collaborate with other communities. For your players, DefenderNet means safer gameplay where bad actors find it harder to move between servers unnoticed.
One of the biggest challenges in online safety is that offenders rarely stop in one community. When banned or reported on a single server, they often reappear elsewhere, repeating the same behaviors. DefenderNet breaks this pattern by using a global block list across communities. When a Reporting Partner identifies and reports harmful behavior, that information contributes to a network of knowledge other communities can rely on.
Online games blend three powerful elements: access, anonymity, and opportunity. Games bring together millions of players of all ages, including children, giving would-be offenders easy access to potential victims. The open, social nature of these spaces creates opportunity: individuals seeking to cause harm can appear helpful, offer to share rewards, and push children toward private conversations. Finally, the anonymity of many gaming platforms makes it difficult to verify identity or intent.
The impact of sexual harm is real, profound, and often long-lasting. For victims, these experiences can cause severe psychological, emotional, and sometimes physical trauma. Children may struggle with fear, guilt, shame, or confusion, and these feelings can continue into adulthood. Communities also feel the weight of these harms. When exploitation or abuse takes place in a server, it can undermine the sense of safety, trust, and belonging that players rely on.
Reference
The term "child" is any person, of any gender, who is under the age of 18 years.
Actions, or patterns of action, that seek out, target, or manipulate children for sexual, emotional, or financial exploitation. Predatory behavior often includes establishing a relationship of trust or emotional connection (known as grooming), pressuring children into secrecy, and attempting to move conversations to private or less-monitored spaces.
When a child (anyone under 18) is involved in any sexual activity to which they do not consent, and to which they cannot consent at that age because they are too young or unable to understand, it is considered sexual abuse. Child sexual abuse can take the form of contact or non-contact (e.g., livestreamed) abuse. Warning signs include pressuring children to send inappropriate photos or join calls, sending them sexually suggestive messages or adult content, and involving them in inappropriate roleplay.
When a child is involved in sexual activity in exchange for something they can gain (e.g., gifts, rewards, benefits, social attention), it is considered sexual exploitation. The element of exchange is what differentiates exploitation from abuse, though there are instances where the two overlap.
Sexual harassment refers to unwanted verbal, non-verbal, or physical conduct of a sexual nature that has the purpose or effect of violating a person's dignity. When harassment is directed at someone because of their gender, it is gender-based harassment. Girls are commonly targeted simply because they are girls.
A generic term for any form of sexual abuse or sexual exploitation of children in which technology plays a facilitating role, including when in-person sexual harm is committed with the help of technology.
Child Sexual Abuse Material (CSAM) or Child Sexual Exploitation Material (CSEM) is any content that visually or descriptively depicts child sexual abuse and/or exploitation. Producing, possessing, sharing, or linking to such material is illegal and must be reported to authorities immediately.
Digitally generated CSAM/CSEM, including AI-generated material, can be produced without contact abuse of a real child, but it creates the same effect as if real children were depicted. It is not illegal everywhere, but the harm caused by sexualizing children is very real, and it should be reported.
OCSEA refers to all sexually exploitative acts carried out against a child that have, at some point, made use of technology. It must be reported and escalated immediately, even if only suspected.
Pictures of children that are used or shared for sexual purposes but are not legally classified as CSAM/CSEM. Even if a picture does not show sexual abuse, this use is a serious violation of the child's dignity, privacy, and rights.
This refers to sexual content created by a child of themselves. When it is shared online, the surrounding context can determine whether it should be flagged as explicit content, harassment, CSAM/CSEM, or OCSEA.
Any CSAM generated partly or wholly by AI, regardless of whether a real child was involved. Without protective policies, AI tools can produce an unlimited amount of CSAM, fueling harmful fantasies and normalizing its use.
Test your understanding of Module 1 with a short knowledge check before moving on to the next module.