By Charlotte Munns
LONDON, Jun 29 2020 (IPS)
Global upheaval caused by the COVID-19 pandemic has left society’s most vulnerable exposed. Instances of child sexual exploitation material (CSEM) found online have increased at an alarming rate in recent months.
The incidence is higher, the abuse is worse, and the children are younger. Self-regulated social media companies are dragging their heels on implementing reforms that would bolster the safety of their youngest users.
This recent upturn comes after decades of rapid growth in CSEM. INTERPOL, a global policing organisation, reported a 10,000% increase in the amount of CSEM on the Internet since 2004.
Since lockdown measures were put in place, the Internet Watch Foundation (IWF) has blocked nearly 9 million attempts by UK internet users to access child sexual abuse websites. The vast majority of victims identified are 7 to 13 years old.
Many instances of abuse originate on social media platforms. Private messaging services and children’s broad access to the Internet have facilitated contact between victims and perpetrators. Each photo published online is evidence of a crime occurring, yet much goes undetected.
At a United Nations briefing in April concerning the effects of the pandemic on children, the European Union representative Walter Stevens noted that the scale of online abuse “continues to expand at an alarming rate.” In some countries, such as Australia, the amount of detected material has doubled in recent months.
With the COVID-19 pandemic keeping more children at home, and Internet connectivity increasing as teaching turns virtual, the most vulnerable members of society are being delivered into the hands of abusers.
UN Secretary General Antonio Guterres said earlier this year that “governments and parents all have a role in keeping children safe,” adding that “social media companies have a special responsibility to protect the vulnerable.”
Maud de Boer-Buquicchio, former Special Rapporteur on the sale and exploitation of children, criticised tech companies’ intentions: “the respect of children’s rights and dignity, if at all, continues to come as an afterthought.”
No company is more central to this discussion than Facebook. According to the New York Times, of the 18.4 million reports of child sexual abuse material last year, 14 million came from Facebook’s platform. Its messaging service facilitates contact between victims and perpetrators, and is where images and video are easily sent and disseminated.
Following most other social media organisations, Facebook’s plan to implement end-to-end encryption in its messaging service represents a significant step backwards in combating CSEM globally. The measure responds to users’ calls for greater privacy, yet it would prevent anyone, even Facebook, from identifying exploitative messages and media sent in conversations.
Currently, hash-matching technology such as Microsoft’s PhotoDNA allows known CSEM to be detected across platforms. If end-to-end encryption is introduced, this scanning will no longer be possible.
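In broad strokes, server-side detection of this kind works by computing a fingerprint (hash) of each uploaded image and comparing it against a database of fingerprints of known abuse imagery. The sketch below is purely illustrative and not PhotoDNA itself: PhotoDNA is a proprietary perceptual hash that matches visually similar images even after resizing or re-encoding, whereas this stand-in uses a cryptographic hash (SHA-256), which matches only byte-identical files. All names and data here are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a database of fingerprints of known abuse
# imagery. Real systems (e.g. PhotoDNA) use perceptual hashes that survive
# resizing and re-encoding; SHA-256 here matches byte-identical files only.
KNOWN_IMAGE = b"placeholder bytes standing in for a known image"
KNOWN_HASHES = {hashlib.sha256(KNOWN_IMAGE).hexdigest()}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """Flag an upload whose fingerprint appears in the known-hash database."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(is_known_match(KNOWN_IMAGE))     # True: fingerprint is in the database
print(is_known_match(b"a new image"))  # False: unseen content is not flagged
```

The point of contention in the article follows directly from this design: under end-to-end encryption, the platform’s servers see only ciphertext, so a check like `is_known_match` cannot be run on message content at all.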
Andy Burrows, NSPCC Head of Child Safety Online Policy, told IPS that this service “will make content moderation virtually impossible and make it easier for offenders to groom children.”
Leaders from countries including the United States, the UK and Australia have criticised Facebook’s haste to encrypt its platform, calling for a delay until child safety can be guaranteed.
Susie Hargreaves, Chief Executive of IWF, told IPS, “we are asking Facebook to give assurances that child protection will not be hampered and that children and victims will be protected in some way, and as yet, none of us have seen any of those assurances.”
FBI Director Christopher Wray has expressed his concerns that end-to-end encryption would prevent law enforcement’s ability to track down perpetrators of child sexual exploitation.
In October 2019, US Attorney General William Barr sent a public letter to Facebook CEO Mark Zuckerberg echoing this concern, calling on Facebook to “embed the safety of the public in system designs” and “enable law enforcement to obtain lawful access to content in a readable and usable format.”
Fred Langford, IWF Chief Technical Officer, praised the social media company in an interview with IPS for engaging with the issue of CSEM. Facebook has partnered with other tech companies such as Google and Microsoft to discuss embedded protections for children, yet many criticise these measures as providing little real change.
At a meeting last month, shareholders noted the severity of CSEM on Facebook’s platform, stating, “Facebook’s plans to expand end-to-end encryption will make it unable to track CSEM on social media enabling more offenders to evade detection.”
Past measures to protect children on the platform have not been effective enough, they said. Shareholders requested that a report be compiled detailing how Facebook would address the issue before imposing end-to-end encryption. The Board of Directors voted against the proposal.
Without effective measures to protect children while ensuring user privacy, end-to-end encryption will make detecting and prosecuting offenders nearly impossible. Many remain unconvinced that Facebook has such measures in place.
“Tech companies have proven time and again that they are failing to make their self-regulated platforms safe for children,” NSPCC’s Andy Burrows told IPS. The ongoing pandemic, and Facebook’s sluggish response to concerns for child welfare on its platform, may further endanger our most vulnerable.
The post Safety of Children an Afterthought for Tech Companies appeared first on Inter Press Service.
Charlotte Munns is a freelance writer based in London who studied English Literature and Middle Eastern Studies at Columbia University, New York.
Source: African Media Agency (AMA)