Tech firms to scan private messages for child abuse images under EU plan

Kenneth Fox

Child protection groups have welcomed radical EU proposals which will force tech companies to scan people’s private online communications, including encrypted messages, for child sexual abuse imagery and grooming.

However, civil rights advocates say the proposals amount to “indiscriminate mass surveillance” which would “destroy the right to privacy” — and open encrypted communications to attack by cyber gangs and totalitarian states.

As the Irish Examiner reports, the European Commission has announced detailed proposals which, it said, are aimed at responding to an “overwhelming increase” in child sexual abuse material (CSAM) online and in solicitation of children into sexually abusing themselves or even meeting perpetrators offline.

It said that in the first months of the Covid-19 pandemic, the demand for CSAM rose by up to 25 per cent in some member states and that reports of grooming increased by 16 per cent from 2020 to 2021.

It said groomers were contacting children on social media, gaming platforms, and chats, with US figures showing a three-fold increase in “self-generated” imagery of seven- to 10-year-olds.

The commission said that, currently, certain online service providers detect such material on a voluntary basis, but that many companies take no action.

“Voluntary action is, therefore, insufficient,” the commission said.

Risk assessments

In terms of what the EU proposal entails, companies will be obliged to “prevent” CSAM by assessing the risk of their service being used to share this imagery and taking action to reduce that risk.

Member states must set up national authorities to review those risk assessments and, where a significant risk remains, issue a detection order to address that risk.

Encrypted communications will also be covered, with the commission saying that a “large portion” of child sexual abuse material is shared through them.

A new independent EU centre on child sexual abuse will be set up, which, among other things, will create a database of indicators allowing for the reliable identification of CSAM.

Member states will also need to set out rules on “dissuasive penalties”, and fines “should not exceed 6 per cent of the provider's annual income or global turnover”.

The commission said the new EU centre will facilitate access to “reliable detection technologies”.

On the key issue of whether it is technically possible to scan encrypted communications, the commission said a separate consultative process has shown that “solutions exist” but added that they “have not been tested on a wide-scale basis”.

© BreakingNews.ie 2022, developed by Square1 and powered by PublisherPlus.com