Social media apps such as WhatsApp and TikTok have been attracting the wrath of the Indian government for spreading misinformation. While the government has been mulling strict norms for these platforms for more than a year, it is now in the final stage of notifying the guidelines.
The government will amend the intermediary guidelines under the Information Technology (IT) Act, 2000 to hold social media apps responsible for the content on their platforms.
Citing anonymous government officials, an ET report said that the law ministry has tweaked the existing draft prepared by the Ministry of Electronics and IT and the guidelines are awaiting approval.
The new norms will hold social media apps accountable for the creation of content as well as its dissemination. Once the guidelines are enforced, the apps will have to check the virality of unlawful content both on their own and on other platforms.
For instance, content created on one platform frequently goes viral only after being distributed through other apps.
According to the draft, the government wants WhatsApp, TikTok, Helo, ShareChat and other social media apps to develop automated tools to identify unlawful content, appoint an officer for round-the-clock coordination with law enforcement officials, and take action against anti-social or inflammatory content within 24 hours.
Companies such as TikTok and Facebook will be required to deploy better techniques and allocate more funds to remove unlawful content immediately. “WhatsApp can’t give end-to-end encryption as an excuse for not removing such content,” a government official told ET.
To check the spread of misinformation on its platform, WhatsApp recently limited the forwarding of frequently forwarded messages to only one chat at a time.
The move came at a time when misinformation around Covid-19 was spreading fast. Recently, ByteDance-owned TikTok also faced heat from users and the media over videos claiming that followers of a particular religion can’t be affected by the deadly virus.