Child Sexual Exploitation Material (CSEM)
It is a crime to sexually exploit children in New Zealand.
It’s also against New Zealand law to make, possess, or share child sexual exploitation material (CSEM) – content that promotes or supports the sexual exploitation of children.
Child sexual exploitation material causes harm to many victims, whānau and communities. Creating this material involves the direct abuse and exploitation of the victims. The sharing and viewing of this content continues that exploitation and increases the risk of harm, including to future victims, by making the behaviour seem sexually appealing, easy to do, or presenting it as having no consequences.
This page tells you what New Zealand law (the Films, Videos, and Publications Classification Act 1993) says about child sexual exploitation material and what you can do if you come across it.
This information is for general understanding and awareness. It is not legal advice.
Jump ahead to:
What is child sexual exploitation material?
What does child sexual exploitation material include?
Child sexual exploitation material is extremely harmful.
Can child sexual exploitation material be digital or AI-generated?
What about pornography with younger-looking people?
Is child sexual exploitation material illegal?
What does “promote or support” mean?
About the Classification Office’s role and how the law works
Why or where might people come across child sexual exploitation material?
What is child sexual exploitation material?
Child sexual exploitation material is explicit sexual content that encourages sexual activity with children or young people or a sexual interest in them. This is sometimes called ‘child sexual abuse material’ (CSAM) or ‘child pornography’. ‘Child sexual exploitation material’ and ‘child sexual abuse material’ are more accurate terms than ‘child pornography’ because they make it clear that the content is made by exploiting and abusing children. Content that encourages or supports the sexual exploitation of children or young people is illegal.
What does child sexual exploitation material include?
Child sexual exploitation material can be images, videos, livestreams, or any online content showing sexual activity involving a child or young person. It often involves coercion, threats, manipulation, or abuse.
Child sexual exploitation material is extremely harmful.
Child sexual exploitation material is often distributed by organised crime networks that profit from child abuse. Watching child sexual exploitation material increases the demand for this type of content and encourages further abuse and exploitation.
Survivors of child sexual exploitation material describe how knowing their abuse material is still being viewed can cause ongoing harm – re-traumatising them, increasing their fear, and creating ongoing anxiety about being recognised in public.
Watching child sexual exploitation material also harms the viewer. It can negatively impact their sexual, social, and mental wellbeing – increasing safety risks both to themselves and others.
Does consent matter?
Even if a child or young person seems to consent or makes the images themselves, the content is still illegal. The law is designed to protect children from being exploited in any way.
Can child sexual exploitation material be digital or AI-generated?
Sometimes CSEM content is fake, but this doesn’t make it legal. The Classification Act treats Artificial Intelligence (AI), Computer-Generated Imagery (CGI), and animated CSEM in the same way as CSEM involving real children or young people. Even if the content is fake, it can still encourage real-life harm.
What about pornography with younger-looking people?
Many porn sites advertise ‘Teen’ or ‘Young looking’ categories – and some performers do look like young people. These performers may naturally appear younger, or they may be made to look younger by wearing clothing like school uniforms. Even porn using adult performers can be illegal if they look and act like young people.
Any sexual content depicting children or young people is likely to be illegal. To stay safe online avoid any sexual or pornographic content where people look like they are under 18.
Is child sexual exploitation material illegal?
The law that makes content promoting or supporting the sexual exploitation of children and/or young people illegal is the Films, Videos, and Publications Classification Act 1993. It is a crime, and there are serious legal consequences if you are involved in creating, accessing, or sharing this material.
What does “promote or support” mean?
In this context, promote or support means content that encourages, normalises, or presents the sexual exploitation of children and/or young people as acceptable or harmless. This includes material that approves of it, justifies it, or suggests it should be engaged in.
It can also include content that provides instruction or endorsement, and in some cases, this may include material presented as fictional, humorous, or for entertainment, depending on how it is framed and the overall message.
Are all representations of child sexual exploitation material illegal? What about fiction, satire or news reporting?
The law, and the Classification Office, places a high priority on freedom of expression. The law does not ban all content that deals with the sexual exploitation of children and/or young people.
Content that refers to child sexual exploitation in a factual, critical, or harm-focused way, such as news reporting, education, or legal discussion, does not promote or support it and is not illegal.
References to child sexual exploitation material in fiction, film, or other forms of entertainment are not illegal, as long as the content does not endorse or encourage the behaviour. In some cases, an age restriction may apply because the material could be harmful for younger audiences (for example, an R16 film).
Content that presents child sexual exploitation material for sexual arousal is almost certainly illegal.
About the Classification Office’s role and how the law works
The Films, Videos, and Publications Classification Act 1993 balances freedom of expression with protecting people from harm – especially children and young people.
Every piece of content the Classification Office sees is different. To decide whether something is illegal, the Office must consider each publication on its own merits, carefully applying the Act and the principles of fairness and natural justice. There are also serious penalties for offences under the Act. This is why decisions about whether content is illegal are made by trained experts at the Classification Office, who follow a transparent and fair process, guided by the Act, before making a decision.
You can read more about the classification process here.
Why or where might people come across child sexual exploitation material?
Like any content, people can come across child sexual exploitation material in a variety of ways – for example, on social media feeds, websites, messaging apps, or by being shown content in person. Content might be unwanted or show up unexpectedly, or people might seek it out intentionally.
How to report harmful or illegal content
If you come across harmful, distressing, or illegal content online, it’s important to report it.
Reporting helps protect you, your whānau, and others, supports safer online spaces, and ensures the right agencies can act. You don’t need to be certain whether content is illegal – different New Zealand agencies can assess it and respond appropriately. You can report content even if it’s hosted overseas.
Find out who to report to and how to report harmful or illegal content in New Zealand.
How to Report Harmful or Illegal Online Content in New Zealand
People across Aotearoa have consistently told us they are concerned about the harms of extreme content. Young people, parents and caregivers, grandparents, aunties and uncles, educators, and community leaders are worried for themselves, for young people, and for their communities.
Addressing these harms is a shared effort, supported by clear laws, dedicated agencies, and community action. Together, we can reduce harm by not creating or sharing objectionable content, protecting wellbeing, and reporting harm when we see it.
What’s next?
For parents and whānau: The Classification Office has a range of practical resources to help you support young people who may be encountering extreme content online — from conversation starters to insights shared directly by rangatahi.
For educators and those working with young people: Our resources focus on supporting young people safely and confidently, including guidance on keeping yourselves safe in those conversations and knowing what to do if concerns arise.
For young people: Learning about this content is an important step in staying safe and understanding where the legal line sits. If you need to report something, you can find out how here. If you’d like to talk to someone, our list of support helplines is available here.