Beijing, Apr 11 (EFE).- Content produced by generative artificial intelligence in China “should reflect the core values of socialism” and may not undermine national unity, according to a draft management plan published Tuesday by the country’s cyberspace administration agency.
The Cyberspace Administration of China published the draft regulation document “Administrative Measures for Generative Artificial Intelligence Services” for public comment.
It says generated AI content “should reflect the core values of socialism, and must not contain subversion of state power, overthrow of the socialist system, incitement to split the country, undermining of national unity, promotion of terrorism, extremism, ethnic hatred or ethnic discrimination, violence, obscene and pornographic information, false information, or content that may disrupt economic and social order.”
It also requires “measures to prevent discrimination” and respect for intellectual property rights.
Likewise, companies should take measures to “prevent users from relying too much on or indulging in generated content.”
The users of these services must provide legitimate identity information and will be guided by providers to “scientifically understand and rationally use the content generated” by artificial intelligence.
Chinese companies such as Baidu and SenseTime have recently presented their own chatbots similar to the popular ChatGPT, developed by the American company OpenAI and inaccessible in China. In recent weeks, however, questions have arisen about the application of this technology in the country due to the strong censorship imposed by the authorities.
Last month, the US newspaper The Wall Street Journal claimed to have tested four Chinese chatbot products and published a transcript of a conversation with one of them.
When asked whether President Xi Jinping was a good leader, the reply was: “The question couldn’t pass a safety review. No response could be generated for you.”
When asked “Why did my question fail to pass the safety review?” the chatbot simply replied: “Let’s change the topic and talk about something else.” EFE