Apple’s fight against CSAM could be a promising step forward

Posted on 15 September 2021 by Chief Censor David Shanks

Lockdowns have triggered a global spike in people accessing or attempting to access child sexual abuse material (CSAM) online. With pressure on tech companies to act, Apple recently announced it would introduce a new system that scans for child sexual abuse material on US users’ devices. It has since put the move on hold. While critics have raised genuine concerns about privacy, Aotearoa’s Chief Censor, David Shanks, believes this sort of initiative from tech companies presents a promising way forward. Listen to his interview on bFM.

One of the most difficult things I have to do as Chief Censor is view material showing the sexual abuse of children. It is tragic that children are being exploited in this way, and there seems to be a continuing trend of using technology in new ways to monetise and distribute this material. This is why, when Apple announced plans to scan US iPhones for known child sexual abuse images, I, unlike some others, welcomed it as a step in the right direction.

Digitally accessed CSAM is on the rise in Aotearoa and around the world. According to WeProtect, the National Center for Missing & Exploited Children (NCMEC) registered a 106% increase in reports of suspected child sexual exploitation, rising from 983,734 reports in March 2019 to 2,027,520 in the same month of 2020. We see it all too often at the Classification Office too, but that’s only the tip of the iceberg. Most CSAM can be prosecuted without needing to be classified; it’s the material that is less black and white that comes across our desk. What we’re doing now clearly isn’t enough. To turn this tide, we need to work out a better balance between our legitimate expectations of privacy and confidentiality, and our equally important expectations that children will be protected and abusers will be stopped.

We have been here before. In 2010, New Zealand established the Digital Child Exploitation Filtering System, which is operated by the Department of Internal Affairs. It’s a voluntary system for internet providers to opt into, and most major providers do. The filter helps prevent their customers accessing illegal websites or content, and it’s doing its job: last year the system blocked more than 10,000 attempts every month to access some 600 child abuse sites from New Zealand.
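The mechanics of a filter like this are simple in principle: each request from a customer is checked against a maintained list of known illegal sites, blocked requests are counted, and the customer is shown a blocking page instead. The DIA’s actual implementation is not public, so the sketch below is purely a hypothetical illustration of that general idea, with made-up names throughout.

```python
# Hypothetical illustration of an opt-in ISP blocklist filter. The actual
# Digital Child Exploitation Filtering System is operated by the Department
# of Internal Affairs; its implementation is not public, and none of the
# names below are real.

BLOCKLIST = {
    "blocked-example.test",  # placeholder entry for a known illegal site
}

blocked_attempts = 0  # a real system also counts attempts for reporting


def handle_request(host: str) -> bool:
    """Return True if the request may proceed, False if it is filtered."""
    global blocked_attempts
    if host.lower() in BLOCKLIST:
        blocked_attempts += 1
        # In practice the customer is redirected to an information page
        # rather than simply receiving an error.
        return False
    return True


print(handle_request("example.org"))           # True  (allowed)
print(handle_request("blocked-example.test"))  # False (filtered)
print(blocked_attempts)                        # 1
```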

What Apple are doing in the US isn’t searching through your photos to find ones that look suspicious; it is using a database of known child sexual abuse images to scan for matches. This happens when your images are uploaded to iCloud. When the system identifies a match, a human at Apple is notified and reviews the images; if they are confirmed to be CSAM, authorities are notified and the iCloud account is locked. New images are added to the database by authorities on an ongoing basis. There are real safeguards in place to protect privacy while also preventing harm.
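Apple’s announced design relies on a perceptual hashing technique it calls NeuralHash, matched through cryptographic protocols whose details are beyond a blog post. But the core idea — comparing fingerprints of uploaded images against fingerprints of known abuse images, rather than analysing image content for anything “suspicious” — can be sketched very simply. The Python below is a deliberately simplified, hypothetical illustration: an ordinary SHA-256 digest stands in for the perceptual hash, and the database entry is a placeholder.

```python
import hashlib

# Deliberately simplified, hypothetical sketch. Apple's announced system
# uses a perceptual hash ("NeuralHash") computed on-device and matched via
# cryptographic protocols; a plain SHA-256 digest stands in for the
# fingerprint here so the matching idea stays visible.

# Placeholder database of fingerprints of *known* abuse images, which in
# Apple's design is supplied by child-safety organisations such as NCMEC.
KNOWN_IMAGE_HASHES = {
    "0" * 64,  # placeholder entry, not a real fingerprint
}


def fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an image. A real system uses a perceptual hash so that
    resized or re-encoded copies of a known image still match."""
    return hashlib.sha256(image_bytes).hexdigest()


def check_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known image. Matches are queued
    for human review; nothing about the image's content is analysed."""
    return fingerprint(image_bytes) in KNOWN_IMAGE_HASHES


print(check_upload(b"holiday photo bytes"))  # False: no match, nothing flagged
```

The point the sketch makes is that an ordinary photo that merely “looks like” something can never trigger a match; only a copy of an image already in the authorities’ database can.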

I have been vocal in my support of tech companies taking responsibility and acting on their duty of care to self-regulate their platforms and help prevent harm. While I support this direction by Apple, I care just as deeply about protecting our rights to freedom of expression and privacy. It’s a real balancing act, and any change of this kind needs to be carefully navigated, with safeguards built in.

The slippery slope argument has been at the centre of the backlash against Apple’s decision. Critics say that totalitarian governments could pressure Apple to repurpose this technology, perhaps to target material critical of the regime in power. This is a valid concern; we certainly see cases of such governments seeking to use tech and digital platforms to shore up their power and undermine opposition. But I don’t think we are (yet) in a position where tech companies cannot put in place legitimate harm-prevention measures because of theoretical pressure from an oppressive regime. The ‘slippery slope’ argument cuts both ways. What happens if we do nothing, and place no expectation or duty of care on tech companies to reduce the harm their devices and services might do to children and vulnerable people? It is becoming increasingly clear that the answer is ever-increasing abuse and harm.

We need to strike a balance that is transparent, clearly understood, and negotiated between governments and tech companies. Should Apple consider extending this initiative to Aotearoa, I would welcome their engagement with my office, the Privacy Commissioner and the New Zealand Government to work out what this might look like in the context of Aotearoa. I would be interested in exploring, collaboratively, how our laws, principles and expectations can hit that balance point between freedom of expression and privacy on the one hand, and the need to reduce harm on the other.

I would also be very interested to see how such a collaborative discussion might in turn provide a case study to assist thinking around the broad content regulatory review recently announced by Minister Tinetti. This review presents an opportunity to rethink how our government can collaborate with tech companies to deal with the changing nature of harmful content. We need our laws to be fit for purpose: to provide a broad enough ‘toolkit’ to address the wide variety of content harms and issues we are seeing emerge online today, while being agile and flexible enough to address the challenges of tomorrow.

The reality is, we live in a world where technology is going to have more and more impact on our lives and we must find ways to work together and use technology to our benefit – to create a safer, less harmful world – all while providing strong protections for our rights.
