Gore Content
It’s against the law in New Zealand to make, possess, or share content that promotes or supports extreme violence. This might include certain content that is described as “gore”.
“Gore” isn’t a term that’s used in the Classification Act, and it isn’t a standalone classification criterion. Instead, it’s a commonly used, informal category that people use to describe content featuring graphic or gruesome depictions of physical injury.
Gore is often associated with viral or shock content shared online. This can include real-world footage (such as accidents or violence) that circulates rapidly and may reach audiences without warning or context. Because of this, gore isn’t just a film or entertainment concept; it’s something people may encounter unexpectedly, particularly on social media.
Gore content can cause distress for people by showing suffering and injury for entertainment. Sharing and viewing this content is harmful because it can shock viewers and leave lasting impacts. Over time, sustained exposure to this type of content can desensitise viewers to the harm and suffering depicted and reduce their ability to feel empathy.
This page tells you what New Zealand law (the Films, Videos, and Publications Classification Act 1993) says about gore content and what you can do if you come across it.
This information is for general understanding and awareness. It is not legal advice.
Jump ahead to:
What does gore content include?
What does “promote or support” mean?
Are all representations of gore illegal? What about fiction, satire or news reporting?
About the Classification Office’s role and how the law works
Why or where might people come across objectionable gore content?
What is gore?
‘Gore’ is content that shows violent or graphic acts (or the aftermath of those acts), including real-life torture, assault, murder, accidents, crime scenes, or suicide. Gore content can dehumanise victims and make their suffering seem trivial or like entertainment. This can happen when injury or death is shown in a sensational way, when footage focuses closely on injuries, or when it is included in compilations of violent or harmful incidents.
Gore content has often been edited, modified or added to in a way that is meant to be entertaining or shocking to an unsuspecting viewer. Gore can be filmed, uploaded, and widely shared within minutes of harm occurring, and can appear on websites and social media feeds without warning. Gore content often carries website branding that indicates it has been made to be shared.
In film and television, gore is often linked to specific genres, such as horror (especially “slasher” or “splatter” films) and some action or thriller subgenres, where graphic effects are used deliberately to provoke shock, fear, or intensity. Audience expectations can play a role here: viewers may choose this content knowing what it contains, but the level of detail and realism still influences how it is classified.
By contrast, gore in objectionable content is more likely to involve real-world footage or highly realistic depictions of serious harm. This can include material that is exploitative, lacks context, or is intended to shock, distress, or desensitise viewers. Unlike fictional portrayals, this type of content may raise significant concerns because of its potential to cause harm, particularly when it is shared widely or encountered unexpectedly online.
What does gore content include?
Gore content often includes the real-life infliction of serious physical harm (including some types of self-harm), acts of significant cruelty, or death, but it doesn’t have to be real. The Classification Act treats Artificial Intelligence (AI) gore in the same way as real-life content. Even if content is fake, it can still encourage real-life harm. You can read more about AI and the law here.
Sometimes this kind of material shows up as a meme or a joke, shared by someone who does not realise the harmful impact. Content that shows gore can still be illegal, even when it’s presented in an edited format.
Is gore content illegal?
The Films, Videos, and Publications Classification Act 1993 says that content promoting or supporting acts of torture or the infliction of extreme violence or extreme cruelty is illegal. This can include gore content if it promotes or supports these acts. There are serious legal consequences for creating, accessing, or sharing this material.
Gore content that doesn’t promote or support what it shows can also be illegal if it seriously degrades or dehumanises a person, trivialises suffering, instructs people how to cause harm, or presents extreme violence, cruelty or torture as positive or beneficial.
What does “promote or support” mean?
In this context, “promote or support” means content that encourages or presents torture, extreme violence or extreme cruelty as favourable, acceptable or harmless. This includes material that instructs, endorses, justifies or suggests the behaviour should be engaged in.
In some cases, gore content may be presented as fictional, humorous, or for entertainment, and the classification will consider how it is framed and the overall message for the viewer.
Are all representations of gore illegal? What about fiction, satire or news reporting?
The law, and the Classification Office, places a high priority on freedom of expression. Not all content that deals with gore is illegal.
Content that refers to gore in a factual, critical, or harm-focused way, such as news reporting, education, or legal discussion, is unlikely to be illegal. While news reporting can still at times be graphic or shocking, the reporting often includes cuts, blurring or descriptions of the strongest elements.
References to gore, self-harm, torture, extreme violence and extreme cruelty in fiction, film, or other forms of entertainment are not illegal, as long as the content does not endorse or encourage the behaviour or degrade and dehumanise victims. In some cases, an age restriction may apply because the material could be harmful for younger audiences (for example, an R16 film).
About the Classification Office’s role and how the law works
The Films, Videos, and Publications Classification Act 1993 balances freedom of expression with protecting people from harm – especially children and young people.
Every piece of content the Classification Office sees is different. To decide whether something is illegal, the Office must consider each publication on its own merits and carefully apply the Act and the principles of fairness and natural justice. There are also serious penalties for offences under the Act. This is why decisions about whether content is illegal are made by trained experts at the Classification Office, who, guided by the Act, follow a transparent and fair process before making a decision.
You can read more about the classification process here.
Why or where might people come across objectionable gore content?
Like any content, people can come across gore content in a variety of ways – for example, on social media feeds, websites, messaging apps, or by being shown content in person. Content might be unwanted or show up unexpectedly, or people might seek it out intentionally.
How to report harmful or illegal content
If you come across harmful, distressing, or illegal content online, it’s important to report it.
Reporting helps protect you, your whānau, and others, supports safer online spaces, and ensures the right agencies can act. You don’t need to be certain whether content is illegal; different New Zealand agencies can assess it and respond appropriately. You can report content even if it’s hosted overseas.
Find out who to report to and how to report harmful or illegal content in New Zealand.
How to Report Harmful or Illegal Online Content in New Zealand
People across Aotearoa have consistently told us they are concerned about the harms of extreme content. Young people, parents and caregivers, grandparents, aunties and uncles, educators, and community leaders are worried for themselves, for young people, and for their communities.
Addressing these harms is a shared effort, supported by clear laws, dedicated agencies, and community action. Together, we can reduce harm by not creating or sharing objectionable content, protecting wellbeing, and reporting harm when we see it.
What’s next?
For parents and whānau: The Classification Office has a range of practical resources to help you support young people who may be encountering extreme content online — from conversation starters to insights shared directly by rangatahi.
For educators and those working with young people: Our resources focus on supporting young people safely and confidently, including guidance on keeping yourselves safe in those conversations and knowing what to do if concerns arise.
For young people: Learning about this content is an important step in staying safe and understanding where the legal line sits. If you need to report something, you can find out how here. If you’d like to talk to someone, our list of support helplines is available here.