The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (the "Intermediary Rules") will fundamentally alter how Indians use the internet. While Part I of the Rules primarily defines terms, Parts II and III contain the actual compliance obligations. Part II governs the regulation of intermediaries, including social media intermediaries; these range from messaging services such as WhatsApp, Signal, and Telegram to media platforms such as Facebook, Instagram, and Twitter. Part III deals with the regulation of digital news media (though there is a lack of clarity on exactly which news media these Rules apply to) and OTT platforms. The Rules divide social media intermediaries into two categories: social media intermediaries and significant social media intermediaries. They emphasize the importance of intermediaries performing due diligence, and they tie compliance to the safe harbour provisions outlined in Section 79 of the Information Technology Act, 2000. Among other things, an intermediary must publish details of its privacy policy, its use of personal data, and other information on its application or website, as the case may be. 

Pros of IT Rules, 2021 

One of the most notable features of these rules is the more granular approach they take to intermediary regulation. The prior draft, like the parent statute, treated 'intermediaries' as a single undifferentiated category, defined under section 2(w) of the IT Act, which drew much of its legal terminology from the EU E-Commerce Directive of 2000. Under the Directive, intermediaries were treated as "mere conduits" or "dumb, passive carriers" with no active involvement in the information they carry. While this may have been true of the internet at the time these laws were passed, the internet today has changed dramatically. Not only do intermediaries now offer a wide variety of services, but there is also a huge issue of scale, with a select few companies dominating either through centralisation or the sheer size of their user bases. As a result, a broad, generic mandate would overlook many of these nuances, resulting in unsatisfactory regulatory outcomes. 

As a result, the new regulations envision three categories of entities: 

1. "Intermediaries" as defined by section 2(w) of the IT Act. This is the overarching term for all entities subject to the regulations. 

2. "Social media intermediaries" (SMIs), which enable two or more individuals to engage with one another online. 

3. "Significant social media intermediaries" (SSMIs), which are intermediaries whose user numbers exceed a threshold set by the government. 

The obligations differ based on this hierarchy of classification. An SSMI, for example, is held to a greater degree of transparency and accountability towards its own users: it must publish six-monthly transparency reports outlining how it handled requests for content removal, how it used automated content-filtering systems, and so on. 

Other aspects of this transparency principle include notifying users whose material has been blocked and enabling them to appeal the deletion, among other things. 

The proactive filtering mandate in the previous draft, which effectively required intermediaries to screen for all unlawful content, had caused a great deal of anxiety. It was problematic on two levels: 

  • Machine-learning technologies are simply not advanced enough to make this workable, which means there is always the risk of legitimate, lawful material being suppressed, producing a chilling effect on digital expression in general. 
  • The technical and financial burden this would place on intermediaries would have harmed market competitiveness. 

The new rules appear to ease this burden in two ways: first, by downgrading the mandate from a mandatory requirement to a best-efforts obligation; and second, by narrowing the targeted content to material depicting rape, child sexual abuse material (CSAM), and content duplicating material that has already been disabled or taken down. This precision should aid the implementation of such technologies, since prior research has demonstrated that training a machine-learning tool on a corpus of CSAM or abuse imagery is far easier than training one on more contextual, subjective categories such as hate speech. 
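Of the three categories, the "duplicate of previously removed content" case is the most tractable to automate, because it reduces to matching new uploads against a store of known items rather than classifying novel content. A minimal sketch of that idea follows; all function names are hypothetical, and it uses exact SHA-256 hashes for simplicity, whereas deployed systems (e.g. PhotoDNA-style tools) use perceptual hashes so that re-encoded or slightly altered copies still match.

```python
import hashlib

# Hypothetical store of fingerprints of content already taken down.
removed_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    # Exact fingerprint of the uploaded bytes; any change in the
    # file (even re-compression) yields a different hash.
    return hashlib.sha256(content).hexdigest()

def mark_removed(content: bytes) -> None:
    # Record a taken-down item so byte-identical re-uploads are caught.
    removed_hashes.add(fingerprint(content))

def is_known_duplicate(content: bytes) -> bool:
    # Flag an upload that is byte-identical to removed content.
    return fingerprint(content) in removed_hashes

mark_removed(b"previously removed post")
print(is_known_duplicate(b"previously removed post"))  # True
print(is_known_duplicate(b"a fresh, unrelated post"))  # False
```

The lookup itself is cheap and involves no machine learning; the hard engineering problems in practice are robust fingerprinting and governance of the hash list, not the matching step.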

Cons of IT Rules, 2021 

The Intermediary Rules have far-reaching implications for online privacy, free speech and expression, and access to information. The intermediary's user agreement or privacy policy must state that the user is responsible for not hosting, displaying, publishing, transmitting, modifying, or storing any information that is unethical, illegal, derogatory, defamatory, injurious to the dignity of an individual, misleading, and so on. Furthermore, the intermediary must ensure that information jeopardizing the nation's unity, integrity, or sovereignty is not hosted on its platform. If such information is transmitted, the user must be notified, and if the material violates the intermediary's policy, the intermediary can have the information removed or the account deleted. 

If an intermediary is made aware that information does not comply with these rules, it must act quickly to remove the information. The intermediary is also required to retain the information in question for 180 days for investigative purposes. If the information transmitted is defamatory of or provocative towards a person, the intermediary must remove it within 24 hours of receiving the complaint. 

Intermediaries must establish a grievance redressal mechanism. The name of the grievance officer, along with their contact information, must be published on the intermediary's website or application, together with the mechanism through which a user can file a complaint about a violation of any of the Rules' provisions. The grievance officer must acknowledge a complaint within three business days and resolve it within one month. The Rules also require a chief compliance officer, a nodal contact person, and a resident grievance officer, and the intermediary must publish a monthly compliance report detailing all complaints received, the action taken on each, and the content removed from its application. The grievance officers appointed under the Rules must be Indian citizens. This will undoubtedly erect a significant entry barrier for new entrants to the sector, who will now have to devote a substantial portion of their resources to building such a compliance mechanism. While behemoths like Facebook, Twitter, and Instagram will have no trouble doing so, smaller Indian startups will face greater financial strain, and once a startup crosses the 50 lakh user mark these obligations reinforce the higher entry barrier. Simply put, less competition means less innovation and less value for users. 

The provision dealing with the first originator of information is one of the Rules' most contentious. The Rules require intermediaries to enable identification of the first originator of information when so ordered by a competent authority or a court under section 69 of the Information Technology Act, 2000. This has been a source of friction because intermediaries like WhatsApp use end-to-end encryption to protect messages sent and received by users. To identify a message's first originator, intermediaries would have to weaken or disable end-to-end encryption, posing a threat to the fundamental right to privacy. 

Under Rule 4(4) of the Intermediary Rules, significant social media intermediaries must deploy 'automated tools', essentially AI technology, to identify and remove child sexual abuse material, content depicting rape, or any information identical to information that has previously been removed. This is an example of function creep, in which extreme technological measures intended for a narrow and very serious purpose are gradually and imperceptibly extended to other, less serious uses, inviting still further creep. 

Another point of contention is voluntary user verification: intermediaries must give users an appropriate mechanism to voluntarily verify their accounts. Critics argue that this will erode online anonymity. The 2011 Rules specified no penalties for intermediaries who failed to comply; the consequence was simple loss of immunity. The Intermediary Rules, 2021, by contrast, expressly state a loss of immunity and spell out the severity of the consequences, which may include criminal prosecution under the IT Act and the Indian Penal Code. This is a significant barrier for newcomers to the sector, who may lack the financial resources to acquire the legal expertise such obligations demand, or who may be deterred by the threat of criminal prosecution. In any user-centred, community-driven technical environment, it will undermine social sharing and conversation features. 

The Rules also delegate authority excessively. For example, they establish a non-judicial adjudicatory process for resolving complaints about content published by digital news media and OTT platforms, and they create an adjudicating body called the "oversight committee". This is despite the fact that the IT Act does not expressly authorize the government to do so.

B.A., B.L. (Hons), IPDP (London)
PGD IPR, PGD Cyber Law, MSc (IT)
Advocate, Madras High Court

Karthikeyan is a renowned cyber law expert and the Managing Partner of the Law Office of Karthikeyan, a reputed law firm based in Chennai.
