S. Korea plans tougher penalties to combat deepfake sex crimes
Producing, distributing or possessing deepfake porn may lead to up to 3 years in prison
By Park Jun-hee
Published: Nov. 6, 2024 - 15:16
In an effort to clamp down on the rapid spread of deepfake sex crimes here, the Korean government on Wednesday rolled out a comprehensive package of strengthened measures to tackle AI-generated sexual abuse, ranging from educating students to improving laws and seeking tougher punishment.
It marks the first time the government has mapped out a pan-governmental response plan, following concerns over the lack of a coordinated strategy among related institutions for tackling deepfake sex crimes in the country.
Under the plan, people caught producing and distributing deepfake porn materials, as well as possessing, purchasing, storing or even viewing such content, will face up to three years in prison or a fine of up to 30 million won ($21,592).
Also, if a person creates or edits sexually explicit videos or images -- even without the intent to distribute them online -- the plan recommends punishment of up to seven years in prison, up from five years under current law.
Those who commit deepfake sex crimes against minors through blackmail or coercion will be handed a maximum of three and five years in jail, respectively.
Also, the Justice Ministry is pushing for amendments to the Act on Special Cases Concerning the Punishment of Sexual Crimes to allow the confiscation of offenders' profits gained through deepfake sex crimes. The government added that it would grant leniency to individuals who self-report their participation in deepfake sex crimes.
To track down the flood of deepfake pornography, officials pledged to beef up their investigative capabilities.
The enhanced capacity will allow officials to go undercover online to pursue sexual predators and collect evidence. This procedure, however, will require a prosecutor's request and the court's approval to ensure the process is legally justified.
As deepfake content circulates online, domestic and international platform operators deemed "intermediaries providing harmful content to minors," such as Telegram, could be subject to regulation for distributing open channel links to unspecified users, which could give them access to explicit materials.
Corrective orders and fines will be issued against value-added service providers like Naver and Meta if they fail to prevent the distribution of illegally filmed materials and deepfake content, according to the Korea Communications Commission. The broadcasting regulator noted that in countries like France and the UK, social media companies or online service providers are responsible for managing illegal content on their platforms.
In cases where platform operators are unsure whether the content is sexually explicit or violates any rules, the government will temporarily block the content while requesting the KCC to review the material.
In line with the digital transformation, officials noted that they would use AI technology to detect deepfake content automatically in real time. Once detected, the system will automatically request that platform operators delete such content.
Authorities also said they plan to team up with platform operators to remove or block deepfake pornographic content, and to set up additional hotlines to receive reports on illegal activities to safeguard victims. They also plan to create a website where victims can report incidents, making it easier for them to access support services.
In addition, the government plans to strengthen cooperation with overseas-based social network platform companies so that they provide subscribers' personal information in response to official requests from Korean courts and investigative agencies.
Moreover, schools and youth facilities will educate teenagers that creating, sharing or watching sexually explicit materials is a serious crime, while universities will set up various methods -- such as deepfake prevention booths -- to raise awareness.
A total of 781 deepfake victims asked for help from the Advocacy Center for Online Sexual Abuse Victims in the first eight months of this year, and 288 of them were minors, according to the Women's Human Rights Institute of Korea. Also, some 387 individuals had been apprehended for deepfake sex crimes this year as of late September, police said.