Jan 28 (Reuters) – China’s cyberspace regulator issued draft rules on Friday for content providers that alter facial and voice data, the latest measure to crack down on “deepfakes” and mould a cyberspace that promotes Chinese socialist values.
The rules are aimed at further regulating technologies such as those using algorithms to generate and modify text, audio, images and videos, according to documents published on the website of the Cyberspace Administration of China.
Any platform or company that uses deep learning or virtual reality to alter any online content, what the CAC calls “deep synthesis service providers”, will now be expected to “respect social morality and ethics, adhere to the correct political direction”.
The regulations provide for people to be protected from being impersonated without their consent by deepfakes – synthetic audio, images or video that are virtually indistinguishable from the original and easily used for manipulation or misinformation.
“Where a deep synthesis service provider provides significant editing functions for biometric information such as face and human voice, it shall prompt the (user) to notify and obtain the individual consent of the subject whose personal information is being edited,” Article 12 of the draft says.
The rules stipulate fines of between 10,000 and 100,000 yuan ($1,600 to $16,000) for first-time offenders, but violations can also lead to civil and criminal prosecution.
The draft also provides for a user complaints system and mechanisms to prevent deepfakes from being used to spread false information. App stores will be required to suspend or remove providers of deepfake technology where necessary.
“Deep synthesis services are also used by some criminals to produce, copy, publish and disseminate illegal information; slander and degrade people’s reputation, honour; as well as impersonating others’ identities to commit fraud and other illegal acts – not only damaging the vital interests of the people, but even endangering national security and social stability,” the draft rules say.