INTRODUCTION: Welcome to the NSPCC Learning Podcast,
INTRODUCTION: where we share learning and expertise in
INTRODUCTION: child protection from inside and outside of
INTRODUCTION: the organisation.
INTRODUCTION: We aim to create debates, encourage
INTRODUCTION: reflection and share good practice on how
INTRODUCTION: we can all work together to keep babies,
INTRODUCTION: children and young people safe.
GEORGE LINFIELD: Welcome to the NSPCC Learning Podcast.
GEORGE LINFIELD: This episode focuses on Report Remove,
GEORGE LINFIELD: an online tool from the NSPCC and the
GEORGE LINFIELD: Internet Watch Foundation that under-18s
GEORGE LINFIELD: can use to report nude images or videos
GEORGE LINFIELD: of themselves that have been shared
GEORGE LINFIELD: online to see if they can be removed
GEORGE LINFIELD: from the internet.
GEORGE LINFIELD: In just a moment, you'll hear from Chloe
GEORGE LINFIELD: O'Connor, Projects Manager within the
GEORGE LINFIELD: NSPCC's Child Safety Online Solutions
GEORGE LINFIELD: Lab, in conversation with online safety
GEORGE LINFIELD: practitioners from Childline and the
GEORGE LINFIELD: Internet Watch Foundation.
GEORGE LINFIELD: They will explain more about what Report
GEORGE LINFIELD: Remove is and how it works; the
GEORGE LINFIELD: confidentiality and security around the
GEORGE LINFIELD: tool; how professionals who work with
GEORGE LINFIELD: children should approach an incident of
GEORGE LINFIELD: nude image sharing; and how
GEORGE LINFIELD: professionals can use Report Remove to
GEORGE LINFIELD: support young people involved in these
GEORGE LINFIELD: incidents.
GEORGE LINFIELD: But first, let's hear from two members
GEORGE LINFIELD: of the NSPCC's Young People's Board for
GEORGE LINFIELD: Change about why the Report Remove tool
GEORGE LINFIELD: is important for children and young
GEORGE LINFIELD: people.
YPBCM1: I think it's incredibly important that
YPBCM1: young people have a tool like Report Remove
YPBCM1: because not every young person has a support
YPBCM1: system like their family or even trusted
YPBCM1: teachers at school to confide in about this
YPBCM1: sort of thing. I think there's a lot of fear
YPBCM1: in general about these situations: being forced to
YPBCM1: go to the police and a
YPBCM1: massive deal being made out of a situation
YPBCM1: that a young person probably already regrets and,
YPBCM1: like, isn't very happy about in
YPBCM1: the first place.
YPBCM1: So I think the fact that Report Remove is
YPBCM1: confidential and is easily accessible for
YPBCM1: so many young people is absolutely amazing and
YPBCM1: incredibly important.
YPBCM2: The fear of judgement around
YPBCM2: sexual images for young people is a massive
YPBCM2: issue that can cause lots of mental health
YPBCM2: problems among young people.
YPBCM2: And I think that this is why this service is so
YPBCM2: important. And I also think it's really
YPBCM2: important that it's linked to Childline so
YPBCM2: that young people then also know that if they
YPBCM2: need further support or if they want to talk
YPBCM2: about it, they can then easily access that
YPBCM2: service afterwards as well.
CHLOE O'CONNOR: There
CHLOE O'CONNOR: are lots of reasons why children and
CHLOE O'CONNOR: young people might share nudes, but it
CHLOE O'CONNOR: can be incredibly distressing for young
CHLOE O'CONNOR: people when they lose control of a nude
CHLOE O'CONNOR: image that's been shared online.
CHLOE O'CONNOR: It can leave them at risk of further
CHLOE O'CONNOR: abuse or exploitation, including
CHLOE O'CONNOR: financially or for further images.
CHLOE O'CONNOR: Young people can feel revictimised every
CHLOE O'CONNOR: time an image or video of them is shared,
CHLOE O'CONNOR: and the IWF are continuously seeing an
CHLOE O'CONNOR: increase in self-generated child sexual
CHLOE O'CONNOR: abuse material.
CHLOE O'CONNOR: To address this issue, the NSPCC
CHLOE O'CONNOR: partnered with the Internet Watch Foundation,
CHLOE O'CONNOR: or IWF, to find a way to support those
CHLOE O'CONNOR: young people to report nude and sexual
CHLOE O'CONNOR: images and videos of themselves to see if
CHLOE O'CONNOR: they could be removed from the internet. With
CHLOE O'CONNOR: support from age verification platform Yoti,
CHLOE O'CONNOR: the first version of Report Remove was
CHLOE O'CONNOR: developed in 2017, and since then
CHLOE O'CONNOR: the organisations have been working with
CHLOE O'CONNOR: each other and with children and young
CHLOE O'CONNOR: people to improve the tool and to make
CHLOE O'CONNOR: sure that young people know about it.
CHLOE O'CONNOR: So to make a report, children need to
CHLOE O'CONNOR: follow three steps.
CHLOE O'CONNOR: The first is to follow the instructions to
CHLOE O'CONNOR: confirm their age. If they're 13 to
CHLOE O'CONNOR: 17, they'll be asked if they'd like to
CHLOE O'CONNOR: prove their age using Yoti, with a
CHLOE O'CONNOR: form of ID to help with that.
CHLOE O'CONNOR: And then step two, they'll log in or
CHLOE O'CONNOR: create a Childline account so that they
CHLOE O'CONNOR: can receive updates on their report.
CHLOE O'CONNOR: And then step three: report and remove.
CHLOE O'CONNOR: They share the image or video or a link
CHLOE O'CONNOR: to it securely with the IWF, who will
CHLOE O'CONNOR: then view and work to have it removed if
CHLOE O'CONNOR: it breaks the law.
CHLOE O'CONNOR: Childline will let the young person know
CHLOE O'CONNOR: the outcome of their report and provide
CHLOE O'CONNOR: further support where needed.
CHLOE O'CONNOR: In this episode we'll be joined by Zara
CHLOE O'CONNOR: at the IWF and Sam from Childline's
CHLOE O'CONNOR: online service team to talk about how
CHLOE O'CONNOR: Report Remove works and how professionals
CHLOE O'CONNOR: can support young people to use it.
CHLOE O'CONNOR: I'll now pass over to them to introduce
CHLOE O'CONNOR: themselves.
ZARA: Hi, I'm Zara and I'm a senior analyst at the
ZARA: Internet Watch Foundation.
ZARA: The Internet Watch Foundation is an independent
ZARA: charity based in the UK.
ZARA: We work to make the Internet a safer place.
ZARA: We identify and remove online images and videos
ZARA: of child sexual abuse worldwide.
ZARA: We also run a hotline which offers a safe place for
ZARA: the public to report to us anonymously.
ZARA: The hotline consists of a team of 16 expert
ZARA: analysts who spend their working week assessing
ZARA: and disrupting the sharing of child sexual abuse
ZARA: material online.
SAM FIRTH: My name's Sam and I'm a website supervisor
SAM FIRTH: with the Childline online service.
SAM FIRTH: Childline is a service for children and young
SAM FIRTH: people under the age of 19 in the UK.
SAM FIRTH: I think it's most well known for its Childline
SAM FIRTH: counsellors and the support it can offer over
SAM FIRTH: the telephone. But also young people can make
SAM FIRTH: an account on the website.
SAM FIRTH: The website itself has a wealth of information
SAM FIRTH: and tips and advice on a whole range of
SAM FIRTH: subjects and topics.
SAM FIRTH: It also hosts the Report Remove tool.
CHLOE O'CONNOR: Brilliant. Thank you very much.
CHLOE O'CONNOR: So it would be really good to hear more
CHLOE O'CONNOR: about Report Remove and why it's needed.
CHLOE O'CONNOR: But taking a step back first, because
CHLOE O'CONNOR: Report Remove is there to support young
CHLOE O'CONNOR: people who have had a nude image shared
CHLOE O'CONNOR: online: what should professionals do and
CHLOE O'CONNOR: how can professionals who work with
CHLOE O'CONNOR: children approach an incident of nude
CHLOE O'CONNOR: image sharing?
SAM FIRTH: I think the primary things to remember are to
SAM FIRTH: not look at the image yourself and to follow
SAM FIRTH: any safeguarding policies that your
SAM FIRTH: organisation may have.
SAM FIRTH: In terms of interacting with that child or
SAM FIRTH: young person that has come to you with this
SAM FIRTH: problem, it's really important to stay calm,
SAM FIRTH: remain non-judgemental, and accept the
SAM FIRTH: situation for what it is.
SAM FIRTH: So it may well be that this young person is
SAM FIRTH: experiencing a sense of regret and is aware of
SAM FIRTH: the risks they might be now facing.
SAM FIRTH: So there's no need for telling off or attempts
SAM FIRTH: to educate them on the risks because that's
SAM FIRTH: already happening for them.
ZARA: I'd just like to add that it's important to
ZARA: remember that creating and sharing nudes of
ZARA: under-18s is illegal.
ZARA: However, the law is there to protect children.
ZARA: So it's important to reassure that young person
ZARA: that they're not in any trouble and just to use
ZARA: supportive language and that there is a tool out
ZARA: there to help them remove the content
ZARA: online and the law is there
ZARA: to support them.
SAM FIRTH: One option is to tell that child
SAM FIRTH: or young person about Report Remove.
SAM FIRTH: And if you've got access to the internet there
SAM FIRTH: and then, it'd be really handy if you could
SAM FIRTH: show them the tool on the website by searching
SAM FIRTH: for the URL: childline.org.uk/remove.
CHLOE O'CONNOR: So why is the Report Remove tool needed
CHLOE O'CONNOR: and why might a young person have shared
CHLOE O'CONNOR: nude images online?
SAM FIRTH: It's a normal and expected and healthy part of
SAM FIRTH: child development to want to understand the
SAM FIRTH: changing body. And then when we couple that
SAM FIRTH: with access to the internet, many children and
SAM FIRTH: young people have a phone with a camera with
SAM FIRTH: that internet access.
SAM FIRTH: Perhaps to an extent, it's almost inevitable
SAM FIRTH: that those two things are going to come
SAM FIRTH: together at times.
SAM FIRTH: There's many reasons why a young person may
SAM FIRTH: share a nude.
SAM FIRTH: They may be sharing their images in a trusting
SAM FIRTH: friendship or a trusting relationship.
SAM FIRTH: It might be an expression of body confidence.
SAM FIRTH: It might feel empowering.
SAM FIRTH: They might be doing it for fun.
SAM FIRTH: There's a darker side too. They might be
SAM FIRTH: sharing nudes because they've been pressured
SAM FIRTH: or they've been coerced or they've been
SAM FIRTH: bullied. But they might change their mind
SAM FIRTH: about what they've done. They may experience
SAM FIRTH: that sense of regret.
SAM FIRTH: It may be that trust has been broken or
SAM FIRTH: they're concerned that that trust might get
SAM FIRTH: broken. Or when they've initially shared
SAM FIRTH: a nude to try and prevent further requests,
SAM FIRTH: further pressure, it hasn't worked, and
SAM FIRTH: that's when they feel they need something to
SAM FIRTH: happen, something to change, so they can take
SAM FIRTH: some control back of the image that they've
SAM FIRTH: lost. That's where Report Remove can offer a
SAM FIRTH: solution and can offer safeguarding.
CHLOE O'CONNOR: Thank you, Sam. So, Zara, you spoke at
CHLOE O'CONNOR: the beginning about how the IWF can
CHLOE O'CONNOR: identify this kind of content and take it
CHLOE O'CONNOR: down. Why is there a need specifically
CHLOE O'CONNOR: for a service like Report Remove?
ZARA: Over the years we've seen a growth in the sharing
ZARA: of nude images online of under-18s,
ZARA: so we thought it'd be a good idea to provide a
ZARA: service that not only helps young people
ZARA: access support when things go wrong,
ZARA: but also to be able to use
ZARA: our expertise and our contacts within
ZARA: the internet industry to block these images
ZARA: and stop them from being shared online.
CHLOE O'CONNOR: Who is Report Remove for?
SAM FIRTH: Report Remove is for children and young people
SAM FIRTH: in the UK who are under the age of 18.
CHLOE O'CONNOR: And if a young person wanted to use the
CHLOE O'CONNOR: tool — they've looked for it online and
CHLOE O'CONNOR: they've found the tool — what would they
CHLOE O'CONNOR: then need to do?
SAM FIRTH: So once they get onto the Report Remove page
SAM FIRTH: on the Childline website, from there they
SAM FIRTH: create or sign into a Childline
SAM FIRTH: account. Then they have the option of proving
SAM FIRTH: their age with ID, such as a passport.
SAM FIRTH: Once they've gone through that stage of the
SAM FIRTH: reporting process, they need to attach the
SAM FIRTH: image, the video or the direct URL for the
SAM FIRTH: image or video to the report, and that will go
SAM FIRTH: to IWF who will begin their work with it.
SAM FIRTH: Childline never ever see the image or video,
SAM FIRTH: but we will make contact with that child or
SAM FIRTH: young person to keep them updated with how the
SAM FIRTH: report is going and offer them other forms of
SAM FIRTH: support as well.
CHLOE O'CONNOR: Why are young people asked if they'd like
CHLOE O'CONNOR: to prove their age?
ZARA: So children under 13 are not asked to prove
ZARA: their age. That's because our analysts have the
ZARA: visual expertise to assess that the person in the
ZARA: image is underage.
ZARA: However, young people over the age of 13 are given
ZARA: the choice to prove their age using ID,
ZARA: because that means the IWF can be certain that
ZARA: the person in the image is under 18, and
ZARA: therefore we can get the image taken down from
ZARA: a lot more places.
ZARA: But if a child does not have ID, they should
ZARA: still be encouraged to use Report Remove.
ZARA: This is because the IWF will still make an
ZARA: assessment of the age of that person in the image
ZARA: and in many cases will be able to use this
ZARA: assessment to be certain that it is a child.
ZARA: If IWF can't be certain that the content is of a
ZARA: child, then we can still ask tech companies
ZARA: to take it down.
ZARA: This means that the content can still be removed
ZARA: from lots of places and the young person can still
ZARA: choose to access emotional support from Childline.
CHLOE O'CONNOR: From the IWF's perspective, what happens
CHLOE O'CONNOR: at the point that a young person's made a
CHLOE O'CONNOR: report?
ZARA: So as Sam said, the young person will create
ZARA: an account. So once they've created their account,
ZARA: they'll be directed to the secure IWF portal
ZARA: to upload their content.
ZARA: After the report has been made, the young person
ZARA: will receive a case number in their locker and
ZARA: they can refer back to this if they have any more
ZARA: communications to make about their report.
ZARA: So apart from selecting their age range, the
ZARA: only information the portal gives to IWF
ZARA: are their images and videos or URLs and
ZARA: these are, like I said, uploaded via a secure
ZARA: portal, so only the analysts here will see their
ZARA: content. Unfortunately, IWF cannot
ZARA: view content on end-to-end encrypted apps or
ZARA: websites such as WhatsApp and Snapchat, or
ZARA: content that's saved on another person's device.
ZARA: But if the young person still has the image,
ZARA: they should be encouraged to still make
ZARA: a report and upload those images to us,
ZARA: and then we will make sure that it's assessed
ZARA: against UK law.
ZARA: And if it's found to be criminal, we will get
ZARA: that removed as quickly as possible.
CHLOE O'CONNOR: And what does the IWF do to make sure that
CHLOE O'CONNOR: it does get removed?
ZARA: So we will use the law to work out if the content
ZARA: can be removed, and if it breaks the law, IWF
ZARA: will give it a digital fingerprint called a hash.
ZARA: So IWF can either send a takedown
ZARA: notice to a website or internet service provider
ZARA: if it's a URL, or it'll be blocked automatically
ZARA: depending on the way the hash was found.
ZARA: We share our hash lists with internet companies,
ZARA: including major social media platforms, so
ZARA: if that image is uploaded to
ZARA: the internet again, it can be quickly removed
ZARA: or even stopped from being uploaded in the first
ZARA: place.
ZARA: IWF will let Childline know the outcome
ZARA: so they can contact the young person and let
ZARA: them know the outcome of their report and keep them
ZARA: updated and offer them further support.
ZARA: IWF try to review each Report Remove
ZARA: report within one working day and updates
ZARA: are sent to the locker on that day.
CHLOE O'CONNOR: That's really helpful. Thank you.
CHLOE O'CONNOR: And what happens if the image has been
CHLOE O'CONNOR: changed slightly? For example, if
CHLOE O'CONNOR: somebody has got a cropped part of the
CHLOE O'CONNOR: image?
ZARA: With the technology that we use, it doesn't
ZARA: matter if the image has been
ZARA: doctored in any way or cropped, as you say;
ZARA: we'll still be able to identify that image using
ZARA: the hash that we give it.
CHLOE O'CONNOR: That sounds really important.
CHLOE O'CONNOR: Do we know anything about how young
CHLOE O'CONNOR: people have responded to
CHLOE O'CONNOR: Report Remove and if it has made a difference
CHLOE O'CONNOR: to them?
SAM FIRTH: So I can give you an example of
SAM FIRTH: a 15-year-old male who's used Report Remove.
SAM FIRTH: So this person reported a number of items,
SAM FIRTH: a mixture of images and videos.
SAM FIRTH: IWF assessed them all to
SAM FIRTH: be child sexual abuse material and therefore
SAM FIRTH: meeting the threshold to be removed from the
SAM FIRTH: internet. So then, as Childline,
SAM FIRTH: we were able to feed that back to the young
SAM FIRTH: person and offer some emotional support via
SAM FIRTH: the counselling team, which they took up,
SAM FIRTH: and they were supported by our Childline
SAM FIRTH: counsellors for a period of about three
SAM FIRTH: months. They explained to our counsellors that
SAM FIRTH: the images and videos were sent to a female
SAM FIRTH: young person in trust.
SAM FIRTH: But then having been in school and reflecting
SAM FIRTH: on those general messages about online safety
SAM FIRTH: and nudes, they recognised their vulnerability
SAM FIRTH: and started to become a bit concerned about
SAM FIRTH: how that material might affect them in the
SAM FIRTH: future. And they told the Childline
SAM FIRTH: counsellor that they spoke to about the sense
SAM FIRTH: of relief they felt when those images
SAM FIRTH: and videos were removed, and about the
SAM FIRTH: future safeguarding it offered them as well.
SAM FIRTH: And actually after making their Report Remove
SAM FIRTH: report, that same young person got in touch
SAM FIRTH: with the platforms they shared those images
SAM FIRTH: and videos on and asked that platform provider
SAM FIRTH: to remove all other data as well, which the
SAM FIRTH: platform agreed to do.
SAM FIRTH: Yeah, so the sense of regret and anxiety,
SAM FIRTH: and the feeling of 'life was over' that
SAM FIRTH: that young person had been experiencing, gave
SAM FIRTH: way to feeling more reassured, more relieved.
SAM FIRTH: So Report Remove was really powerful for that
SAM FIRTH: young person.
CHLOE O'CONNOR: Thank you for that example of the
CHLOE O'CONNOR: positive impact that the Report Remove
CHLOE O'CONNOR: tool can have, Sam. Zara, you mentioned
CHLOE O'CONNOR: earlier that you'd be assessing the
CHLOE O'CONNOR: content against UK law to see if it could
CHLOE O'CONNOR: be taken down. What happens if you aren't
CHLOE O'CONNOR: able to take the content down?
ZARA: We will tell Childline the outcome
ZARA: of our assessment and if it doesn't meet
ZARA: our criteria of breaking the law, then
ZARA: we will let Childline know why it
ZARA: doesn't quite meet our threshold and then they can
ZARA: convey that to the child and offer support in
ZARA: other ways.
CHLOE O'CONNOR: Great, thank you. And then Sam, what does
CHLOE O'CONNOR: that look like from a Childline perspective
CHLOE O'CONNOR: when you're sending that message to that
CHLOE O'CONNOR: young person to let them know it couldn't
CHLOE O'CONNOR: be taken down?
SAM FIRTH: So I try to give them as much information as I
SAM FIRTH: can and explain why it didn't quite meet the
SAM FIRTH: threshold, because it may be that the image
SAM FIRTH: felt nude to them, or there may be nudity in
SAM FIRTH: it, but it still doesn't quite meet that
SAM FIRTH: threshold for removal. So, I explain that
SAM FIRTH: situation to them.
SAM FIRTH: I also give them the options that might be
SAM FIRTH: available to them to take other actions which
SAM FIRTH: may involve going direct to platforms where
SAM FIRTH: they know the image might be and requesting
SAM FIRTH: for it to be taken down under their
SAM FIRTH: community rules.
SAM FIRTH: But we also offer them emotional support too.
SAM FIRTH: That can be through our counsellors or it can
SAM FIRTH: be through self-help tools like the coping
SAM FIRTH: kit. We also have a Calm Zone on the Childline
SAM FIRTH: website, which is really useful too, and it's
SAM FIRTH: very popular.
CHLOE O'CONNOR: That all sounds like really helpful wider
CHLOE O'CONNOR: support for young people.
CHLOE O'CONNOR: Is that something that young people need
CHLOE O'CONNOR: to engage with to use Report Remove?
CHLOE O'CONNOR: For example, would a young person need to
CHLOE O'CONNOR: speak to a counsellor if they wanted to
CHLOE O'CONNOR: report content?
SAM FIRTH: No, there's no need to engage with any other
SAM FIRTH: part of Childline. They can use Report Remove
SAM FIRTH: and that can be it; that can be all they use from
SAM FIRTH: Childline. But if they do want that bit extra,
SAM FIRTH: there's lots of options available to them and
SAM FIRTH: they can pick and choose what feels right for
SAM FIRTH: them, what suits them as an individual, what
SAM FIRTH: fits in with how they like
SAM FIRTH: to communicate and be supported.
CHLOE O'CONNOR: Brilliant. So, if a young person is choosing
CHLOE O'CONNOR: to engage in that extra support or they
CHLOE O'CONNOR: have just used Report Remove — either way
CHLOE O'CONNOR: — is there a chance that anybody else
CHLOE O'CONNOR: would find out that that young person has
CHLOE O'CONNOR: spoken to Childline or has used Report
CHLOE O'CONNOR: Remove?
SAM FIRTH: Childline has a really high confidentiality
SAM FIRTH: threshold. You can have a look at the
SAM FIRTH: confidentiality promise on the website.
SAM FIRTH: It's beyond that of most of the services that
SAM FIRTH: work with the same age group and that same
SAM FIRTH: confidentiality promise that we offer through
SAM FIRTH: all our work applies to Report Remove as well.
SAM FIRTH: And it means we can keep so much more
SAM FIRTH: information private.
SAM FIRTH: It's only in very, very rare circumstances
SAM FIRTH: that someone who has used Report Remove would
SAM FIRTH: have any information passed on to someone
SAM FIRTH: else, should any further safeguarding be
SAM FIRTH: required. And we would always try to
SAM FIRTH: make sure the young person or the child is
SAM FIRTH: aware that we're having to do that and why.
SAM FIRTH: But again, it's in very, very rare
SAM FIRTH: circumstances.
CHLOE O'CONNOR: Zara, are you able to just talk a little
CHLOE O'CONNOR: bit more to confidentiality on the IWF
CHLOE O'CONNOR: side. I know you said before that IWF
CHLOE O'CONNOR: staff would never know who the young
CHLOE O'CONNOR: person is, they'd only see the confirmation
CHLOE O'CONNOR: that they're under 18 and receive the
CHLOE O'CONNOR: images, but in terms of making sure that
CHLOE O'CONNOR: it's being taken down, how do the IWF
CHLOE O'CONNOR: ensure that that stays confidential?
ZARA: So IWF analysts will view the content
ZARA: and assess whether it breaks the law in the UK.
ZARA: Only hashes are sent out to industry
ZARA: and our members to block the content.
ZARA: No one else will see the images.
ZARA: So
ZARA: we use our bespoke hashing tool called
ZARA: 'Intelligrade', which was built in-house at
ZARA: Internet Watch Foundation, and we will tag
ZARA: these Report Remove images with a special tag
ZARA: to show that it's been self-reported.
ZARA: This will go out to industry
ZARA: as a 'self-reported image', so therefore
ZARA: when these hashes are shared with law enforcement,
ZARA: they know that it's been self-reported and
ZARA: therefore that person should not fear a
ZARA: knock on their door for sharing child
ZARA: sexual abuse imagery.
CHLOE O'CONNOR: So the hashes are what's shared with industry
CHLOE O'CONNOR: members — the image itself is never
CHLOE O'CONNOR: shared.
ZARA: That's right. It's just the hash.
CHLOE O'CONNOR: Brilliant. Thank you. So that
CHLOE O'CONNOR: also means the police would only get
CHLOE O'CONNOR: involved if it's needed as part of a
CHLOE O'CONNOR: safeguarding response?
ZARA: Yes. If we spot any
ZARA: potential safeguarding issues, we can relay that
ZARA: to Childline, who will do their own risk
ZARA: assessment.
CHLOE O'CONNOR: And how can young people find Report
CHLOE O'CONNOR: Remove?
SAM FIRTH: So Report Remove is available on the Childline
SAM FIRTH: website using the URL childline.org.uk/remove.
SAM FIRTH: You can also search 'Report Remove Childline' on
SAM FIRTH: Google and it should appear at the top of the
SAM FIRTH: search results.
SAM FIRTH: Professionals and parents can also visit
SAM FIRTH: nspcc.org.uk/report-remove
SAM FIRTH: to learn more about the tool from an adult's
SAM FIRTH: perspective.
SAM FIRTH: We're encouraging professionals, such
SAM FIRTH: as teachers for example, to become aware of
SAM FIRTH: the tool so that they can support young people
SAM FIRTH: who might approach them.
CHLOE O'CONNOR: Is there anything out there that can help
CHLOE O'CONNOR: professionals tell young people about
CHLOE O'CONNOR: Report Remove and how to use it?
SAM FIRTH: So the NSPCC has an online elearning
SAM FIRTH: course that professionals can access to find
SAM FIRTH: out more about the subject of managing
SAM FIRTH: incidents of sharing nudes and how to support
SAM FIRTH: young people.
SAM FIRTH: There's also some printouts
SAM FIRTH: that are aimed at professionals that work with
SAM FIRTH: young people, again, accessible from NSPCC
SAM FIRTH: Learning. We've also got some videos
SAM FIRTH: which can be accessed from the NSPCC
SAM FIRTH: website or the Childline website, just to find
SAM FIRTH: out a bit more. And again, professionals can
SAM FIRTH: look at the Report Remove tool on the website
SAM FIRTH: just to become familiar with it, find out
SAM FIRTH: where it is, what it looks like, and just
SAM FIRTH: become more confident that there is a tool out
SAM FIRTH: there to support the young people they work
SAM FIRTH: with.
CHLOE O'CONNOR: Thank you. That's really helpful because
CHLOE O'CONNOR: I can imagine it can be quite challenging
CHLOE O'CONNOR: to have that kind of conversation with
CHLOE O'CONNOR: young people, especially if you're talking
CHLOE O'CONNOR: to a large group of young people about
CHLOE O'CONNOR: something that's potentially sensitive.
CHLOE O'CONNOR: So it's useful that the videos are there
CHLOE O'CONNOR: to, sort of, do the talking for them and
CHLOE O'CONNOR: then give all the extra information
CHLOE O'CONNOR: around it to help answer any questions.
CHLOE O'CONNOR: Is there any particular time that's
CHLOE O'CONNOR: important to share information about
CHLOE O'CONNOR: Report Remove, for example when an
CHLOE O'CONNOR: incident has occurred or in general?
SAM FIRTH: I think any time is a good time.
SAM FIRTH: We know that the most common age range
SAM FIRTH: for using Report Remove is the later
SAM FIRTH: teens, but we do get young people under the
SAM FIRTH: age of 13 using the tool as well.
SAM FIRTH: It's useful to share with young people before
SAM FIRTH: an incident occurs so they are aware of Report
SAM FIRTH: Remove and they know it's there should they
SAM FIRTH: need it, but also should they find themselves
SAM FIRTH: in a situation where they are worried about an
SAM FIRTH: image, to find out about it then is also very
SAM FIRTH: helpful. It's a safeguarding tool, it
SAM FIRTH: can help them either before a situation
SAM FIRTH: occurs or after.
CHLOE O'CONNOR: Okay, so if a young person has disclosed,
CHLOE O'CONNOR: either to a professional, be it a
CHLOE O'CONNOR: teacher or a coach or anyone like that,
CHLOE O'CONNOR: or even to their parent, that a nude image
CHLOE O'CONNOR: of them has been shared online and they
CHLOE O'CONNOR: would like support with that, how can an
CHLOE O'CONNOR: adult support them with Report Remove?
SAM FIRTH: So the adult can show them the Report Remove
SAM FIRTH: tool on the website.
SAM FIRTH: They can maybe show them the videos about the
SAM FIRTH: tool to help that young person build their
SAM FIRTH: confidence. But in terms of making a report,
SAM FIRTH: the young person can do that entirely
SAM FIRTH: independently.
SAM FIRTH: They don't need an adult to give any kind
SAM FIRTH: of permission or anything like that.
SAM FIRTH: Report Remove is completely free to use, so
SAM FIRTH: there should never be any exchange of money
SAM FIRTH: for nudes to be removed from the internet.
SAM FIRTH: A young person doesn't have to pay to use
SAM FIRTH: Report Remove at all.
SAM FIRTH: We've tested the tool in consultation
SAM FIRTH: with children and young people, to make sure
SAM FIRTH: that it's clear, that it's easy to follow and
SAM FIRTH: that it's user friendly, so that it can be
SAM FIRTH: used completely independently by young people.
ZARA: Many young people are using Report Remove, so
ZARA: we know we are providing a valuable service.
ZARA: So we would like you to encourage young people
ZARA: who are worried that their nude images have been
ZARA: shared online to report to us,
ZARA: and we will do our best to get those images
ZARA: removed and prevented from being
ZARA: uploaded again in the future.
CHLOE O'CONNOR: Thank you very much. That sounds really
CHLOE O'CONNOR: powerful that young people have a tool
CHLOE O'CONNOR: where they can take action themselves to
CHLOE O'CONNOR: report something that's been happening to
CHLOE O'CONNOR: them and to have an action taken from it,
CHLOE O'CONNOR: to have that content removed.
CHLOE O'CONNOR: And also especially important that it can
CHLOE O'CONNOR: help stop it being shared again in the
CHLOE O'CONNOR: future, like in that example that you
CHLOE O'CONNOR: shared, Sam, for that young person, even
CHLOE O'CONNOR: if there wasn't an immediate worry, to
CHLOE O'CONNOR: know that it couldn't be shared online
CHLOE O'CONNOR: again in the future sounds like a really
CHLOE O'CONNOR: reassuring thing for young people.
CHLOE O'CONNOR: So thank you both so much today for
CHLOE O'CONNOR: joining us to talk about Report Remove
CHLOE O'CONNOR: and how it works and how it can support
CHLOE O'CONNOR: young people. If anybody listening to
CHLOE O'CONNOR: this episode would like to find out more
CHLOE O'CONNOR: information, all of the resources that
CHLOE O'CONNOR: we've mentioned today, so those videos
CHLOE O'CONNOR: and the printouts and the wider information,
CHLOE O'CONNOR: can be seen in the show notes from
CHLOE O'CONNOR: today's episode.
CHLOE O'CONNOR: The Report Remove tool itself is
CHLOE O'CONNOR: available on the Childline website at
CHLOE O'CONNOR: childline.org.uk/remove.
CHLOE O'CONNOR: NSPCC Learning also offers an elearning
CHLOE O'CONNOR: course on managing incidents of sharing
CHLOE O'CONNOR: nudes. So thank you very much
CHLOE O'CONNOR: for your time today.
SAM FIRTH: Thank you.
ZARA: Thank you very much.
OUTRO: Thanks for listening to this NSPCC Learning
OUTRO: podcast.
OUTRO: At the time of recording, this episode's content
OUTRO: was up to date, but the world of safeguarding and
OUTRO: child protection is ever changing.
OUTRO: So if you're looking for the most current
OUTRO: safeguarding and child protection training,
OUTRO: information or resources, please visit
OUTRO: our website for professionals at
OUTRO: nspcc.org.uk/learning.