
Learning the lessons from the pandemic: keeping our children safe online (originally published on ConservativeHome)

Over the last year we have seen the best of the internet and the very worst. It has been a lifeline to many of us, allowing us to remain connected to other people and to continue working from home. This is as true for children in this country as for adults.

As the months of the pandemic have ticked by, we’ve all become very aware of the challenges the internet presents. I have some experience of those challenges: for three years I served on the funding council of the Internet Watch Foundation (IWF), and latterly spent a decade working in fraud and financial crime, well aware that the same ease and openness that makes the web so attractive also enables a frightening volume of fraud and scams. Having spent more years than I would care to remember advising people not to share information about themselves online, and aware of the risks to young people, I now find myself in a job where sharing online is a key part of the role, and with young children starting to take their own tentative steps online. To say that this makes me queasy is an understatement.

The IWF’s mission is the elimination of child sexual abuse online. They achieve this by partnering with the same technology sector I was proud to work in. When I met them recently, they told me that in just one month during the first lockdown, they and their industry partners blocked 8.8 million attempts to access child sexual abuse content in the UK.

Blocking access to child sexual abuse content is essential work but it’s only one part of the puzzle – we also need to bear down on the generation of new content. As the IWF’s CEO warned recently, “there is a fire burning in the bedrooms of our nation’s children.”

She was referring to the deeply concerning trend of “self-generated” abuse, where children are groomed into producing indecent images or videos which are then captured and re-shared. Tragically, this mostly affects very young girls (97% of the images or videos show a girl), and in most cases (81%) a child aged 11 to 13. In a further 15% of cases, the image or video shows a child aged just 10 or under.

Those statistics will give any parent pause. They represent a hidden ring of exploitation, where young people are groomed and abused online and at scale. And in response, IWF staff quite literally watch children grow up online at the hands of their abusers, removing content as soon as they find it. But we can’t keep playing whack-a-mole with this material – we need a system which is geared towards prevention from the outset. The upcoming Online Safety Bill should enable just that, but there are challenges it must meet to be truly effective.

First, the legislation must be flexible enough to deal with the changing landscape in which it operates. Whatever new regulatory regime is formed must be able to adapt to new technologies and allow industry the space to innovate to meet new threats. Working with those already dealing with these issues is going to be key to ensuring an effective regime.

Second, there must be equivalence for encrypted platforms. Understandably, many social media companies are moving towards encrypted platforms or private messaging services. There are clear benefits in this for privacy and cyber security reasons, but we must ensure that these systems are not being exploited to view or share images and videos of abuse.

I was concerned by recent comments from the industry at the Home Affairs Select Committee on the potential damage to child safety. We must ensure that companies continue to scan their platforms for child sexual abuse material so that it can be removed. We have already seen the disastrous consequences when legislators get this wrong, as is currently playing out in the EU with the temporary derogation from the ePrivacy Directive.

It may seem counterintuitive to believe that you can have both privacy and protection against abusive content being shared on the very platforms which enable that privacy, but it is entirely possible. Existing mechanisms allow tech companies to remove vast amounts of child sexual abuse imagery before it makes its way onto their platforms. This technology enables companies to “match” a photo at upload with a known illegal image using a unique digital fingerprint of letters and numbers. These same mechanisms must be able to function on encrypted platforms, or alternative solutions that are equally effective, or ideally more so, must be in place before a platform attempts to encrypt. The bar should certainly be higher for services specifically aimed at children. We should incentivise and legislate so that tech companies are driven to innovate effective ways of doing this.
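For readers curious how this kind of matching works in practice, the following is a minimal, purely illustrative sketch in Python. The function names and the fingerprint value are invented for the example, and real systems (such as Microsoft’s PhotoDNA) rely on perceptual fingerprints that survive resizing and re-encoding rather than the plain cryptographic hash used here.

    # Purely illustrative sketch: the hash value and names below are hypothetical.
    # Real matching systems use perceptual fingerprints (e.g. PhotoDNA) that
    # survive resizing and re-encoding; a plain SHA-256 only matches
    # byte-identical files, so this shows the general idea only.
    import hashlib

    # Hypothetical set of fingerprints of known illegal images, of the kind
    # supplied to platforms by organisations such as the IWF.
    KNOWN_FINGERPRINTS = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def fingerprint(image_bytes: bytes) -> str:
        # The "unique digital fingerprint of letters and numbers" described above.
        return hashlib.sha256(image_bytes).hexdigest()

    def should_block_upload(image_bytes: bytes) -> bool:
        # Block the upload if its fingerprint matches a known illegal image.
        return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

The point of the sketch is simply that matching is done against fingerprints rather than the images themselves, which is why, in principle, the same check can be run before content is encrypted and sent.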

Finally, we must invest in education and in equipping children with the skills they need to navigate life online. From a very young age there must be open conversations about the internet, in school and at home. We must ensure young people have confidence in reporting mechanisms, that they will be believed, and that action will be taken.

School staff need to be trained effectively and parents must have the digital knowledge to facilitate these conversations.

We cannot afford to waste time tweaking the online regulatory regime over the next few years. We must prioritise getting this right from day one. 

As the COVID-19 restrictions begin to lift, many of us feel we are approaching a new chapter and an opportunity to shape the future. I urge everyone who reads this to reflect on the shocking figures I quoted earlier and to consider how best to prioritise children’s safety online as the upcoming Online Safety Bill is developed.
