Online Child Abuse: Google, Meta, Discord, And More Collaborate

A new program called Lantern aims to fight online child sexual exploitation and abuse (OCSEA) through cross-platform signal sharing between companies like Meta and Discord. The Tech Coalition, a group of tech companies working together to fight OCSEA, wrote in a recent announcement that the program is an attempt to keep predators from evading detection by moving potential victims to other platforms.

Lantern serves as a central database to which companies contribute data and against which they check their own platforms. When a company spots a signal, such as an email address or username tied to known OCSEA policy violations, a Child Sexual Abuse Material (CSAM) hash, or a CSAM keyword, it can flag it in its own systems. The announcement noted that while the signals don't strictly prove abuse, they help companies investigate and possibly take action, such as closing an account or reporting the activity to authorities.
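Lantern's internal design isn't public, but the flow described above (one platform contributes a signal, another checks its own accounts against it) can be sketched in a few lines. Everything below is illustrative: the `Signal` type, the `to_signal` and `check_account` helpers, and the use of SHA-256 hashing are assumptions, not Lantern's actual API. The sketch also reflects the caveat that a match is a lead for investigation, not proof of abuse.

```python
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class Signal:
    """A shared signal: a type plus a hash of the value."""
    kind: str    # e.g. "email", "username", "csam_hash", "keyword"
    digest: str  # SHA-256 hex digest of the normalized raw value

def to_signal(kind: str, raw_value: str) -> Signal:
    """Normalize and hash a value so platforms never exchange it in plaintext."""
    digest = hashlib.sha256(raw_value.strip().lower().encode("utf-8")).hexdigest()
    return Signal(kind, digest)

def check_account(email: str, username: str, shared_signals: set[Signal]) -> list[Signal]:
    """Return any shared signals matching this account's identifiers.

    A match only queues the account for human review; it is not treated
    as proof of abuse and triggers no automatic enforcement here.
    """
    candidates = [to_signal("email", email), to_signal("username", username)]
    return [c for c in candidates if c in shared_signals]

# One platform contributes a signal; another checks against it.
shared = {to_signal("email", "flagged@example.com")}
hits = check_account("flagged@example.com", "someuser", shared)
if hits:
    print(f"Queued for review: {len(hits)} matching signal(s)")
```

Hashing here stands in for whatever privacy-preserving representation the real program uses; the design point is simply that platforms match against shared indicators locally rather than exchanging raw account data.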

A visualization showing how Lantern works.

Meta wrote in a blog post announcing its participation in the program that, during Lantern's pilot phase, it used information shared by one of the program's partners, Mega, to remove “over 10,000 violating Facebook Profiles, Pages and Instagram accounts,” and reported them to the National Center for Missing & Exploited Children.

The companies participating in Lantern so far include Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Members of the coalition have spent the last two years developing Lantern, and the group said that, besides building technical solutions, it had to put the program through “eligibility vetting,” ensure it meets legal and regulatory requirements, and confirm it is “ethically compliant.”

One of the big challenges for a program like this is ensuring it is effective without creating new problems. In a 2021 incident, police investigated a father after Google flagged photos of his child's groin infection as CSAM.

Several groups warned that similar issues could arise with Apple's now-canceled automated iCloud photo library CSAM-scanning feature. The Tech Coalition wrote that it commissioned a human rights impact assessment from Business for Social Responsibility (BSR), a larger coalition of companies focused on global safety and sustainability issues. BSR will offer ongoing guidance as the program evolves.

The coalition will oversee Lantern and said it is responsible for setting clear guidelines and rules for data sharing. As part of the program, companies must complete mandatory training and routine check-ins, and the group will review its policies and practices regularly.
