This resource has been peer reviewed by Youthline NZ and clinical psychologist Dougal Sutherland from Umbrella Wellbeing.

It’s surprising to me how algorithms can change someone’s mindset and their entire life.

Female - Community Group, 2024 Youth Consultation

Here’s a quick overview of how algorithms work, to give us an idea of why extremely harmful content might be appearing in a newsfeed.

Algorithms are systems that decide what content to show based on what people interact with online. As soon as someone signs up for a social media platform, the algorithm starts showing them content based on their demographics. Their feed is then curated based on which images or videos they click on, like, pause on, or hover over, or even just read the comments of. The platforms aim to ‘serve’ content that will keep the user engaged, and they don’t distinguish between pausing on a video and actively liking it.
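For readers who want to see the mechanics, the idea above can be sketched in a few lines of code. This is a hypothetical illustration only: the signal names and weights are invented, and real recommendation systems are far more complex and proprietary. The point it demonstrates is that passive signals, like pausing on a video, can count toward what gets shown next just as much as an active ‘like’.

```python
# A highly simplified sketch of engagement-based feed ranking.
# All signal names and weights are invented for illustration;
# no real platform works exactly like this.

def engagement_score(interactions):
    """Score a post from a user's past interactions with similar content.

    Passive signals (pausing, hovering, reading comments) add to the
    score alongside active ones (liking, sharing).
    """
    weights = {
        "liked": 1.0,
        "shared": 1.5,
        "paused_on": 1.0,      # passive attention weighted like a 'like'
        "read_comments": 0.8,
        "hovered": 0.5,
    }
    return sum(weights.get(signal, 0.0) for signal in interactions)

def rank_feed(posts):
    """Order candidate posts by predicted engagement, highest first."""
    return sorted(posts,
                  key=lambda p: engagement_score(p["interactions"]),
                  reverse=True)

feed = rank_feed([
    {"topic": "cooking", "interactions": ["hovered"]},
    {"topic": "funny videos", "interactions": ["paused_on", "read_comments"]},
    {"topic": "sports", "interactions": ["liked", "shared"]},
])
# The 'sports' post ranks first because liking and sharing it
# scored higher than merely pausing or hovering on the others.
```

Notice that the user never chose any of this ranking: the feed reshapes itself from whatever they lingered on, which is why simply watching something can invite more of the same.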

An example shared by rangatahi was of videos that appeared in their newsfeed that began with a laughing baby and then, without warning, turned into a real-life suicide. This content would then go viral and circulate among their peers, who shared the same demographics. Rangatahi told us it was difficult to avoid this content, and that even if they tried to block it or select ‘not interested’, other content like it would appear in their feed. Young people all over the world are having similar experiences. Recent research by University College London (UCL) and the University of Kent supports these observations. Their study found that social media algorithms can rapidly amplify harmful content, including misogynistic and violent material, to young users.

Algorithms are individualised, so although a viral video is more likely to come up in our feed, as in the example above, the social media experience is different from one person to the next. Young people told us that they are aware of algorithms and how they work. The impact of the algorithm on each individual seems to depend largely on age and temperament: how resilient they are, and what counter-messaging they are getting in real life from whānau, school, peers and hobbies.

Knowing about algorithms and how they work can help us to talk with rangatahi about how they can try to control or ‘curate’ their online experience in a positive way.

Practical tip

We recommend taking a look at your own social media newsfeed and having a go at changing it. Try actively shaping the content you see by selecting “not interested” on certain posts, muting or unfollowing accounts, clearing your watch history, or searching for a topic you wouldn’t usually explore. Did you find it easy to change? Was the change instant, or did it take a while?

Talk with your child about your experience and ask whether they have had a similar one. This can give you an idea of how much they understand about algorithms. It may also give them an opportunity to share their knowledge, and open a conversation about what they might have seen online and the steps they can take to keep themselves safe.

Was this helpful?

If you'd like to know more about this topic, get in touch. We're happy to help.