In a significant move against AI misuse, Malaysia and Indonesia have taken firm action against Elon Musk’s Grok chatbot, citing the urgent need to address the risks of AI-generated pornographic content.
Malaysia Suspends Access
The Malaysian Communications and Multimedia Commission (MCMC) announced on 11 January 2026 that it had temporarily suspended access to Grok for users in Malaysia. The decision followed repeated misuse of the app to generate obscene and sexually explicit images.
- Reports surfaced of Grok being used to create non-consensual manipulated images, including those of women and minors.
- MCMC stated that it had directed the restriction of access because of inadequate safeguards implemented by X Corp, the platform through which Grok is offered.
MCMC emphasised that the ban would remain in effect until necessary changes to the platform are verified.
Indonesia Takes the Lead
On 10 January 2026, Indonesia became the first country to block Grok entirely, implementing strict measures against the chatbot over similar concerns. The country’s Communications and Digital Minister, Meutya Hafid, described non-consensual sexual deepfakes as a serious violation of human rights.
- Indonesia, home to the world’s largest Muslim population, maintains strict regulations against sharing obscene content online.
- The government has summoned officials from X Corp for discussions regarding the risks associated with Grok.
Both nations’ regulatory actions come amid rising global scrutiny of AI technologies and their potential for abuse.
Responses from Musk and xAI
Musk responded to the controversy by stating that users who create illegal content with Grok would face the same consequences as those who upload illegal material directly. Following the backlash, xAI, the startup behind Grok, introduced a monetisation policy restricting image-generation features to paying subscribers.
Despite these changes, critics argue that the measures do not sufficiently address the generation of sexualised deepfakes. Concerns persist about the adequacy of protections for users, particularly minors.