Elon Musk’s privately held Twitter is likely to increase the level of toxicity the social media platform tolerates, in line with the new owner’s professed opposition to censoring tweets, on grounds of freedom of expression. The challenge is to regulate social media on par with mainstream media, so that national policy rather than owner predilection determines the quality of the public discourse, of which platforms like Twitter are an increasingly vital part.
Musk considers Twitter to be the public square of the 21st century, a very important place for views and ideas to be aired, exchanged, disputed and clarified, that contention creating the shared public discourse that serves as the basis of democracy. From the altitude at which a hawk watches little twittering birds flit from tree to tree, this looks grand. Closer to the ground, disharmony becomes discernible in certain kinds of tweets: lies, hate speech, instigation, libel, revenge porn. How do we respond to such harmful content?
Free speech fundamentalists would hold that people should be free to say whatever they feel. Most people would side with curbing harmful speech. But there is no consensus, nor is there likely to be one, on where to draw the line when it comes to harmful speech.
Twitter and Facebook banned former US President Donald Trump after the January 6, 2021 assault on the Capitol, which Trump is widely believed to have instigated. Should that ban continue? Musk’s answer might be different from that of Twitter’s incumbent management.
The world is culturally diverse; social mores and sensibilities vary extensively. In some countries, women are gearing up for a battle to ‘free the nipple’; in others, the uncovered ankle of a woman is considered provocatively lewd. In some cultures, burning a religious book might be dismissed as a crazy stunt that primarily damages the environment. In others, it could lead to riots and violence. That raises the question of whether the norms upheld by globally available social media platforms are culturally acceptable everywhere they operate. Is moderation according to the sensibilities of North America appropriate for Kandahar, Karachi or Kolkata?
The solution is to end the treatment of social media as platforms that cannot be held responsible for what they carry. Traditionally, such intermediaries have been deemed not responsible for the content created by their users. India’s social media intermediary guidelines go to great lengths to stipulate the removal of objectionable content once an intermediary has been alerted to its objectionable nature, and to require intermediaries to institute compliance mechanisms.
The actual solution is much simpler. Social media platforms are not dumb publishers of user-generated content. They sort the user-generated material and feed some streams to some users and other streams to others, based on their analysis of user taste and preference. They also resort to content moderation, removing some content from their platforms altogether, deeming it unfit to be carried. Such activity qualifies as editorial intervention. There is no reason not to treat a social media platform whose content is subjected to editorial intervention on par with mainstream media, as regards compliance with norms relating to hate speech, libel, instigation and social disharmony.
Why should Indians defer to the putative wisdom of Facebook’s Oversight Board as to the suitability of posts in the Indian context? Indian law and regulation should prevail. It is far better for the norms and regulations applied to social media to be the same as those for mainstream media; such uniformity would also insulate social media from more onerous compliance requirements.
Once India puts in place uniform norms for mainstream and social media, with regard to their obligation to protect free speech and guard against its abuse, Indians would no longer have to worry about who owns or ceases to own Twitter or any other social media platform.