The Delhi High Court on Tuesday laid down directions for the removal of offensive content from the internet and for preventing errant parties from re-posting and re-directing such content. Observing that “the internet never sleeps; and the internet never forgets,” Justice Anup Jairam Bhambhani said that despite orders passed by the court, online platforms have not been able to “fully and effectively remove” such content.
The judgment was passed in response to a petition by a woman who complained that photographs she had posted on her private Facebook and Instagram accounts were taken without her knowledge or consent and reshared on a pornographic website. While the photographs themselves are not objectionable, they have become offensive by association, making the publishing and transmitting of this material an offence under Section 67 of the Information Technology Act, 2000 (IT Act), the court held.
Cannot play a cat-and-mouse game: Delhi HC
During the hearings, the petitioner complained that there was “brazen and blatant disregard” of the court’s order and that “the errant respondents and other mischief-makers had re-directed, re-posted and re-published the offending content onto other websites and online platforms, thereby rendering the orders of the court ineffective.”
The court cannot play “a cat-and-mouse game of errant parties evading court orders by re-posting offending content,” Bhambhani noted. If the court is not in a position to pass “effective and implementable” orders, then subsequent judgments on the matter will be “rendered infructuous,” he added.
What should intermediaries and search engines do?
- Remove offensive content within 24 hours: Intermediaries (websites or online platforms) are required to remove any offensive content once they receive “actual knowledge” of it by way of a court order or upon being notified by the appropriate government agency. Such an order may be passed on the basis of a grievance brought to the court, as contemplated by the Information Technology (IT) Rules, 2021, or by other means. The content must be removed within 24 hours, as mandated by the IT Rules, 2021.
- Deploy proactive measures: Intermediaries are expected to deploy proactive measures, such as automated tools that can detect and remove content that is “exactly identical” to the content the court has deemed offensive (a minimal sketch of one such approach follows this list). The court further clarified that “none of this would impose upon the website, online platform or search engine(s) any obligation to generally monitor content or to adjudicate the illegitimacy of any content or operate as a prior restraint or a blanket ban or censorship of content generally.”
- Inform users not to publish offensive content: The Delhi HC refers to the provisions of the IT Rules, 2021 and reiterates that intermediaries are required to inform users to “not host, display, upload, modify, publish, transmit, store, update or share any information that belongs to another person and to which the user does not have any right or which is inter alia invasive of another’s privacy.”
- De-index and de-reference offensive content: Search engines must make offensive content unavailable by de-indexing and de-referencing it from search results. “It cannot be overemphasised that even if, given the nature of the internet, offending content cannot be completely ‘removed’ from the world-wide-web, offending content can be made unavailable and inaccessible by making such content ‘non-searchable’ by de-indexing and de-referencing it from the search results of the most widely used search engines, thereby serving the essential purpose of a court order almost completely,” the court ruled.
- Remove from search results globally: Citing the precedent set by the Supreme Court of Canada in Google Inc. v. Equustek Solutions Inc., the Delhi HC ruled that search engines, upon being issued an order to remove or disable access to certain content, must “block the search results throughout the world since no purpose would be served by issuing such an order if it has no realistic prospect of preventing irreparable harm to a litigant.”
- Preserve records: The intermediary hosting the offending content must preserve all records relating to it for 180 days, or for a longer period if so required by the court, for investigation purposes.
- Safe harbour provisions: Failure to comply with these directions will result in the intermediary forfeiting the exemption from liability (the safe harbour) guaranteed under Section 79 of the IT Act.
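The judgment does not prescribe how such automated detection should work. For content that must be “exactly identical”, the simplest approach is a byte-for-byte comparison using a cryptographic hash. The Python snippet below is a minimal, hypothetical sketch rather than any platform’s actual system: the function names, the BLOCKED_HASHES set and the digest shown are all illustrative assumptions.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical set of digests of files a court has already ordered removed.
BLOCKED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_exact_duplicate(upload_path: str) -> bool:
    """Return True if an uploaded file is byte-for-byte identical to blocked content."""
    return sha256_of_file(upload_path) in BLOCKED_HASHES
```

A plain cryptographic hash only catches exact copies; even a trivial re-encode or crop changes the digest, which is consistent with the court confining the expectation to content that is “exactly identical” rather than merely similar.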
What can the aggrieved party do?
- The aggrieved party must furnish to law enforcement agencies all available information it possesses relating to the offending content, such as the file name, image URL and web URL.
- The aggrieved party should also be permitted to notify law enforcement agencies to remove offending content from any other intermediary on which the same or similar offending content is found to be appearing.
- The court may direct the aggrieved party to make a complaint on the National Cyber-Crime Reporting Portal, if it has not already done so, to initiate the process of grievance redressal.
What should law enforcement agencies do?
- Law enforcement agencies are required to obtain all information concerning the objectionable content, including all unique identifiers relating to it, such as the URL (Uniform Resource Locator), account ID, handle name, Internet Protocol (IP) address, the hash value of the actual offending content, metadata, subscriber information, access logs and such other information as the law enforcement agency may require, in line with Rule 3(1)(j) of the 2021 Rules. The enforcement agencies must obtain this information within 72 hours (a sketch of how these identifiers might be collected into a single record follows this list).
- Upon being furnished with the information of the objectionable content by the aggrieved party and the orders from a court, law enforcement agencies must notify intermediaries to take down the objectionable content.
- Upon being notified by the aggrieved party about any other intermediary hosting content that is the same as or similar to the content deemed objectionable, the law enforcement agency must notify the intermediary to take down the content.
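Neither the judgment nor Rule 3(1)(j) prescribes a format for this information. The sketch below simply illustrates, with hypothetical field names and placeholder values, how the identifiers the court lists (URL, account ID, handle name, IP address, hash value, metadata, subscriber information and access logs) could be gathered into a single structured record for preservation or for attaching to a takedown notice.

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class OffendingContentRecord:
    """Hypothetical record of the identifiers listed by the court (Rule 3(1)(j))."""
    url: str
    account_id: str
    handle_name: str
    ip_address: str
    content_hash: str  # e.g. SHA-256 of the offending file
    metadata: dict = field(default_factory=dict)
    subscriber_info: dict = field(default_factory=dict)
    access_logs: list = field(default_factory=list)

# Placeholder values for illustration only.
record = OffendingContentRecord(
    url="https://example.com/post/123",
    account_id="123456789",
    handle_name="example_handle",
    ip_address="203.0.113.7",
    content_hash="9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
)

# Serialise the record, e.g. to attach to a takedown notice or preserve for 180 days.
print(json.dumps(asdict(record), indent=2))
```

JSON is just one convenient choice here; any format that preserves all the identifiers for the 180-day retention period would serve the same purpose.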
What do the respondents have to say?
During the preliminary hearings of the case, the Cyber Prevention Awareness and Detection Unit (CyPAD), the cybercrime unit of the Delhi Police, submitted that while it was ready to comply with the removal order issued by the court, it could not assure that offending content would be entirely effaced from the internet owing to “technological limitations and impediments.” Other respondents echoed this, saying that the only option for an aggrieved party is to obtain fresh orders from the court for each platform and website the content has been shared on.
Google argued that its role in complying with court orders is limited to “disabling access to specific URLs by effacing or removing such URLs from the search results” and that it cannot make content non-searchable, since only the owner of the website or online platform can do that. The search giant added that its image-based results are harder to identify and block than text-based results.
Facebook submitted that it has robust policies to protect the privacy of its users and that the woman who filed the case in question does not claim relief from Facebook/Instagram despite her photographs being taken from the two platforms.