
After the uproar in South Korea over artificial intelligence-generated “fake sexual content images,” it has been announced that 434 such cases affecting underage students and teachers have been reported in schools this year.

According to Yonhap’s report, the Ministry of Education shared nationwide data on cases of “fake sexual content images” reported in schools. Of the 434 cases reported since the beginning of the year, 243 came from high schools, 179 from middle schools, and 12 from elementary schools.

Investigations have been launched into 350 of these cases affecting a total of 617 individuals.

Furthermore, the Ministry of Defense announced that 24 military personnel were victims of such incidents. The Ministry also reported that educational programs have been launched to raise awareness and prevent similar crimes.

The Ministry had previously announced that photos of soldiers, as well as army and ministry officials, had been removed from the internal communication network due to the risk of their being used to produce “fake sexual content images,” and that only authorized personnel would have access to these photos.

It was determined that many victims, mostly women and girls, had their voices and images manipulated into “fake sexual content images” using a method known as “deepfake” and spread through chat groups connected to schools and universities in the country.

The Chief of the National Investigation Office, Woo Jong-soo, reported that a preliminary investigation had been opened into Telegram, the platform where these images were spread, on the grounds of “aiding and abetting a crime.”