South Korea to ask Telegram, other social media firms to help tackle digital sex crimes


Image used for representational purpose only | Photo Credit: Getty Images

South Korean authorities said on Wednesday (August 28, 2024) that they plan to ask Telegram and other social media platforms to more actively help remove and block sexually explicit deepfake content, part of measures aimed at tackling a growing problem.

The move comes amid public and political outrage after several domestic media outlets reported that sexually explicit deepfake photos and videos of South Korean women were frequently found in Telegram chatrooms.

The Korea Communications Standards Commission said a 24-hour hotline for victims will also be set up and the number of regulatory personnel monitoring digital sex crimes will be doubled from the current 70.

The Korean National Police Agency also said it would launch a seven-month campaign to crack down on online sex crimes.

Ryu Hee-lim, chairman of the media regulatory body, said at a meeting on the issue that the commission plans to set up a consultative body to improve communication with social media firms on removing and blocking sexual deepfake content.

For companies that do not have offices in South Korea, the commission wants to establish face-to-face channels for regular consultations.

“The creation, possession and distribution of deepfake sex crime videos is a serious crime that destroys personal dignity and individual rights,” Ryu said.

Besides Telegram, the commission said it would seek cooperation from X as well as Meta’s Facebook and Instagram and Google’s YouTube. None of the companies responded to a Reuters request for comment.

Criticism of Telegram in South Korea coincides with the arrest of the platform’s Russian-born founder, Pavel Durov, as part of a French investigation into child pornography, drug trafficking and fraud on the encrypted messaging app.

The number of deepfake sex crime cases in South Korea has risen from 156 in 2021 to 297 so far this year, with the majority of offenders being teenagers, according to police data.

The victims are usually women and include schoolgirls and female soldiers in the South Korean military.

South Koreans have made more than 6,400 requests to the Korea Communications Standards Commission this year for help removing sexually explicit deepfake content, compared with about 7,200 cases in all of last year in which the commission agreed to help remove such content.


