K-pop singers have risen to global stardom in recent years, but their popularity has a dark side: Many have become victims of sexual objectification online.
The issue came to the fore recently as controversy surrounding artificial intelligence chatbot Lee Luda led to online debate about the sexual harassment of K-pop singers.
The chatbot, developed by local startup Scatter Lab in December, came under fire after users taught it to use vulgar language against women and make discriminatory comments against minorities.
The abuse raised the question of whether AI characters can be victims of sexual harassment. The controversy led to debate on other forms of online sexual harassment, especially those targeting K-pop stars. These include edited audio recordings made to sound like celebrity sex tapes; “real person slash,” a fiction genre based on made-up stories about real people; and deepfake pornography, in which videos are altered so that real people falsely appear to be taking part in sex acts.

What is real person slash?
Rapper Son Simba recently said he had come across a story that a fan had written about him. He said he was shocked at the explicit language and descriptions used to depict the sexuality of the character modeled after him.
On Jan. 11, a petition demanding punishment of those who create or distribute this type of fiction was posted on the Blue House’s e-petition board, bringing public attention to the issue. More than 216,000 people had signed the petition as of Monday morning.
Real person slash refers to fan-created fiction. It can be about singers, actors, sports stars or anyone else. When the characters are K-pop singers, the stories usually feature same-sex romances between bandmates.
Real person slash traces its roots to the 1990s, when fans of first-generation K-pop bands began writing fan fiction about their favorite bands.
S.M. Entertainment, a major entertainment agency, even held a fan fiction contest in 2006 for fans of the K-pop bands under its management.
But vulgar language and sexual description are the mainstays of real person slash today. Sometimes sexual assaults are depicted as part of romantic relationships.
Users of some community websites -- sites frequented mostly by men -- have called real person slash a form of sexual harassment, pointing out that some of the victims are minors.
Politician Ha Tae-keung brought the dispute offline Jan. 19, filing for a police investigation of 110 online account holders thought to have created or distributed real person slash. Ha said real person slash is a sex crime just as serious as the Telegram sex slave videos that shocked the nation last year.
Some users of other community websites -- sites frequented mostly by women -- denounced that view, describing it as “backlash” and saying the Telegram sex abuse case cannot be compared with real person slash. Some say the genre is a way of supporting K-pop singers.

Deepfake technology used for sexual abuse
Other unethical practices involving K-pop stars include deepfake pornography.
This technology, based on an artificial intelligence technique called deep learning, generates pornography using high-resolution photos and video images of female celebrities.
While any celebrity, and even non-celebrities, can become victims of deepfake porn, data shows that female celebrities are more likely to be targeted.
According to a 2019 report by Amsterdam-based cybersecurity firm Sensity, 25 percent of those who were exploited for deepfake pornography without their consent were K-pop singers.
In June, the Korea Communications Standards Commission agreed to block access to deepfake websites and social media accounts that distribute videos featuring edited images of South Korean celebrities.
Those who create or distribute deepfake videos without the subjects’ consent can be jailed for up to five years or fined up to 50 million won ($44,787) under a newly revised law, but it is difficult to punish every offender. And often, victimized celebrities are reluctant to speak up due to the publicity it would invite.
There have also been audio compilations of celebrities’ voices, edited to sound sexual and to mislead listeners into thinking they are real celebrity sex tapes.
On Jan. 13, an online petition was uploaded on the Blue House’s e-petition board system asking for strong punishment for anyone producing or distributing deepfake images of female celebrities. The petition had collected more than 383,000 signatures as of Monday morning, surpassing the 200,000 mark and obligating the government to respond.
By Im Eun-byel