By most accounts, the big bad boys of global tech finally did the right thing by shutting out United States President Donald Trump from major social media platforms, in the face of an ‘armed insurrection’ to effect a ‘regime change’, and by pulling the plug on an untamed platform called Parler, threatening to amplify calls for violence.
Even though questions swirl over why it took them so long to act, the world is gratified by the eventual action from Apple, Facebook, Google and Twitter — in simple alphabetical order — and later Amazon. Such concerted action was unprecedented. Big tech has rarely expressed shared views on how to police its platforms, and has mostly preferred to go its own individual way, evidently to little good. Often enough, these companies have let things drift — splitting hairs to deflect criticism and hoping the troubling issues would disappear over time. As we know now, things only came to a boil, as they did in the seat of power in Washington DC.
What the ‘coordinated’ action did not do is let the tech companies off the hook. If anything, their action now faces even greater scrutiny over a number of issues, and two plain points that command bipartisan agreement in the US. One, the tech platforms are way too powerful for their own good or for that of society. Two, they can no longer be allowed to exercise this power without a proper set of accepted standards.
All the companies have terms of service for users, but no equivalent to govern their own conduct. Would the tech companies, for example, have acted the same way had the same events taken place not in Washington but in, say, Moscow or any other world capital? Where do they draw the line on free speech? Or regime change? Would today’s social media allow a new Arab Spring like the one we saw a decade ago, when Twitter, notably, won praise for aiding the uprising against the oppressive regime in Egypt?
Do the platforms require a proper code of conduct, and perhaps due process as well, before they can shut out individuals, groups or apps? Should the code differ from country to country, depending on local laws? Don’t aggrieved parties deserve some form of redress?
What can we do about it, and exactly how?
The talk right now is confined to the US because it is home to the influential tech platforms, and has direct power to control the companies. US President-elect Joe Biden is expected to move quickly on the issue, likely building on a prior effort to scrap Section 230, which exempts the tech platforms from legal liability for users’ content.
But other countries are affected by Google and the others too, perhaps more profoundly, given the absence of the rule of law and other independent institutions in many of them. In fact, many are decisively worse off because the tech companies have chosen to abandon any pretence of principles and allowed racism, bigotry and gender discrimination to flourish on their platforms in these countries, including in India.
Broadly, we need oversight of the app ecosystem and oversight of content. Consider the app ecosystem, which controls access to several billion devices. Powerful as Twitter and Facebook are, Google and Apple might be even more so because they control this access in a manner that can make or break an idea or business, without real accountability or redress.
The two companies’ recent action on Parler is a poor test case, given the social media platform’s incendiary content, but what if a similar thing happened to a minor offender or an innocent user? Similarly, Twitter’s or Facebook’s decision to temporarily or permanently boot out an individual or an organisation ought, in all fairness, to involve due process and a forum for redress.
What might do the trick is an oversight panel like the one constituted by Facebook, but one that is truly independent. It should have authority over the most critical platforms and oversee the elements most vital to society and its accepted values. That could mean oversight of the app stores run by Google and Apple, to ensure everybody gets a fair chance to enjoy the benefits of the ecosystem, and nobody is unfairly singled out for punishment.
It could mean an authority that can moderate content on platforms such as Facebook and Twitter, even police them. Finally, perhaps a ‘supreme court’ — as Facebook has fashioned its panel — that will deliver justice for aggrieved parties.
If it sounds like a new United Nations for technology, maybe that is what we need, though preferably one without the evident shortcomings of the existing global body. It is hard to see how else we could govern the crucial elements of our tech-centred world. So far, India has shown no inclination to assume global leadership on technology issues, but this might be the time to jump in and show some statesmanship.