Children risk greater exposure to extremist content on encrypted platforms, according to Jonathan Hall, the UK’s independent reviewer of terrorism legislation
The UK’s terrorism watchdog has criticised Mark Zuckerberg’s company Meta for lowering the minimum age for WhatsApp users from 16 to 13, a “remarkable” move that may expose more young people to extremist content.
According to Jonathan Hall KC, the change gives more children access to material that Meta cannot control, such as sexually explicit or terrorism-related content.
Hall, the independent reviewer of terrorism legislation, told LBC radio that because of WhatsApp’s end-to-end encryption, which keeps messages visible only to the sender and recipient, Meta is powerless to remove offensive material.
“By lowering the user age from 16 to 13 for WhatsApp, they are essentially exposing three more years of that age group to content that they cannot regulate,” he said. “So, to me, that’s an extraordinary thing to do.”
Citing a record number of arrests last year, Hall also said that minors were becoming increasingly susceptible to terrorist propaganda.
“Last year, we had 42 juvenile offenders arrested, the greatest number ever and a notable increase. It’s clear now that youngsters are especially vulnerable to terrorist content, especially those who are sad,” he said. “They discover the significance they are looking for in their life. An extremist identity could result from it.”
The age change for users in the UK and EU, which WhatsApp announced in February, took effect on Wednesday. The platform said that safeguards were in place and that the revision brought the UK and EU age limit into line with other countries.
Child safety campaigners also criticised the decision, which, according to the group Smartphone Free Childhood, “contradicts the increasing national call for big tech to take greater steps to protect our children”.
End-to-end encryption was a central focus of the Online Safety Act because of concerns about illegal content on WhatsApp and other messaging platforms. The act gives the communications regulator, Ofcom, the power to require messaging providers to use “accredited technology” to identify and remove content that promotes child sexual abuse.
The government has sought to play down the significance of this provision, saying Ofcom would intervene only if content scanning was “technically feasible” and if the process met minimum standards of privacy and accuracy.
Meta announced in December that it was rolling out end-to-end encryption on its Messenger service, and Instagram is expected to follow.