GamerSafer Resources

Child Safety Glossary

Developed by GamerSafer in collaboration with ECPAT International, this glossary provides a shared reference for key child safety terms used in online platforms, community moderation, and digital safeguarding.

For games, servers, and communities participating in DefenderNet, this glossary also reflects essential terms used throughout the initiative.

A


AI-Generated Child Sexual Abuse Material

This refers to any CSAM generated partly or wholly using artificial intelligence, regardless of whether a real child was involved. AI-generated CSAM includes content created entirely by AI (fully synthetic); content that alters existing CSAM, possibly to generate new abusive imagery (AI-enhanced); and content that blends two or more sources, such as placing a child's face on an adult body engaged in sexual activity or "de-aging" adults to look like children (AI-morphed). Without protective policies, AI tools can produce an unlimited amount of CSAM, fueling harmful fantasies and gender-based stereotypes and normalizing CSAM and its use. AI-powered chatbots and large language models can also be misused to facilitate grooming and sexual extortion, translate or repurpose abusive content, reinforce harmful gender roles, and generate sexually explicit "role-play" narratives involving children.

C


Child

The term "child" should be understood to include any person, of any gender, who is under the age of 18.

Child Sexual Abuse

When a child (under 18) is involved in any sexual activity to which they do not consent, and to which they cannot consent at that age because they are too young or unable to understand it, this constitutes sexual abuse. Child sexual abuse can take the form of contact or non-contact (e.g., livestreaming) abuse. It can result from explicit force (such as threats of physical violence), from other pressuring elements such as age difference or a position of authority or power, or from manipulating the child into committing the abuse according to the abuser's instructions. Warning signs of child sexual abuse include pressuring children to send inappropriate photos or join calls, sending them sexually suggestive messages or adult content, involving them in inappropriate roleplay and flirtatious exchanges, and inviting them to talk on another platform.

Child Sexual Exploitation

When a child is involved in sexual activity in exchange for something they gain (e.g., gifts, rewards, benefits, social attention), it is considered sexual exploitation. The element of exchange is what differentiates exploitation from abuse, but there are instances where the two overlap. For example, many cases of child sexual abuse also involve some kind of exchange, often to win trust or ensure silence (especially benefits like small gifts, attention, and affection). Offering rewards (including social status and benefits), gifts, or virtual or real money to children under secretive or privately communicated terms should be flagged as a potential case of child sexual exploitation.

Child Sexual Abuse Material or Child Sexual Exploitation Material (CSAM/CSEM)

CSAM/CSEM is any content that visually or descriptively depicts child sexual abuse and/or exploitation. This includes still images, videos, live-streamed broadcasts, audio, written content (such as stories), and illustrated or computer-generated sexually abusive depictions of a child that appear realistic. Producing, possessing, sharing, or linking to such material is illegal and must be reported to the authorities immediately. CSAM or CSEM is the correct term rather than "child pornography," because this material is a serious form of harm and a crime.

Digitally generated CSAM/CSEM, including AI-generated material, can be produced without contact abuse of a real child, but it creates the impression that real children are depicted. It is not illegal everywhere, yet the harm caused by sexualizing children is very real and does not stay confined to online or virtual spaces. Producing these materials harms children, and they should be reported.

F


Fraud and Scams

Manipulative tactics to deceive others for personal gain, often involving theft of in-game items, accounts, or personal data through trickery or impersonation.

H


Harassment

Targeted and often repeated behavior meant to intimidate, demean, threaten, or emotionally harm a specific user or group. It may involve personal attacks, slurs, stalking, or obsessive attention, and typically follows a pattern of behavior or shows a clear intent to harm or silence someone. Isolated incidents of rudeness or aggression may be better classified under "Hostility" unless they are part of an ongoing pattern.

Hate Speech

Language or symbolism that targets individuals or groups based on protected or marginalized characteristics such as race, gender, religion, or sexual identity.

O


Online Child Sexual Exploitation and Abuse (OCSEA)

Online child sexual exploitation and abuse (OCSEA) refers to all sexually exploitative acts carried out against a child that have, at some point, made use of technology. This includes confirmed or suspected incidents in which a child is sexually manipulated, exploited, blackmailed, or harmed online. Instances of OCSEA include building trust with a child online and forming an inappropriate relationship with them, grooming them to perform sexual acts in front of a webcam, pressuring or blackmailing them into sending private or intimate photos, and distributing or sharing CSAM/CSEM.

Signs of OCSEA are not always obvious. It can start with casual DMs, seemingly generous gifting, or playful interactions, repeated over time to build trust, dependence, or emotional pull, followed by attempts to isolate the child or move to another platform to commit the abuse. OCSEA causes severe harm and must be reported and escalated immediately, even if it is only suspected and not yet fully confirmed. Remember that it can be perpetrated by any adult, but also by any child, so behaviors, rather than particular kinds of people, have to be monitored.

P


Predatory Conduct and Behavior

Actions, or patterns of action, that seek out, target, or manipulate children for sexual, emotional, or financial exploitation. Predatory behavior often includes establishing a relationship of trust or emotional connection (known as grooming), pressuring secrecy, and attempting to move conversations to private or less-monitored spaces.

These behaviors are early warning signs that may indicate intent and can escalate into Online Child Sexual Exploitation and Abuse (OCSEA). This includes not only the perpetration of sexual abuse or exploitation but also the production of any form of child sexual abuse and exploitation material. All predatory conduct and behavior must be taken seriously, monitored, and reported without delay. Harmful or predatory conduct can come from anyone: adults of any gender, and even other children, so moderation needs to pay attention to everyone, not just certain people. To understand how sexual abuse and sexual exploitation of children happen online, it is important to know the basic concepts defined throughout this glossary.

S


Sexual Harassment of Children

Sexual harassment refers to unwanted verbal, non-verbal, or physical conduct of a sexual nature with the purpose or effect of violating the dignity of a person. The purpose might be to humiliate, degrade, or offend the person; even when this was not the intention, if the conduct has that effect on the person, it counts as sexual harassment. Unwanted sexual comments are an example: the person making the comments may not intend to violate the dignity of the other person, but that is the effect the comments may have. When harassment is directed at someone because of their gender, it is gender-based harassment. For instance, girls are commonly targeted simply because they are girls. Children whose gender identity or expression differs from what is considered "welcome" and "acceptable" may also experience gender-based harassment.

Self-Generated/Produced Sexual Content Involving Children

This refers to sexual content created by a child depicting themselves. Children and adolescents under 18 may take sexual pictures or videos of themselves. This behavior on its own is not illegal, but when the content is shared online, the context behind it determines whether it should be flagged as explicit content, harassment, CSAM/CSEM, or OCSEA. Consider factors such as potential coercion (seen or unseen) to produce it, distribution or sharing of the content by someone other than the child, and manipulating or tricking a child into producing such content. Understanding the context in which this material was produced and shared is crucial to determining the scale of harm and the legal implications.

T


Technology-Facilitated Child Sexual Abuse and Exploitation

This is a generic term that refers to any form of sexual abuse or sexual exploitation of children in which technology plays a facilitating role. It includes abuse and exploitation committed in the digital environment, but also in-person sexual harm committed with the help of technology, for instance when an individual uses technology to manipulate and gain the trust of a child whom they then harm sexually when meeting in person.

See how these terms apply in practice across gaming platforms and online communities in the 20 Core Violation Categories.