Keeping It Real: India Tightens the Reins on Synthetic Media
On February 10, 2026, the Ministry of Electronics and Information Technology (‘MeitY’) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 (‘Amendment Rules’), which amend the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (‘Intermediary Rules’). The Amendment Rules will come into force on February 20, 2026.
By way of background, the Intermediary Rules prescribe, among other things, due diligence measures that intermediaries (as defined below) must implement to avail of safe harbour from liability for content generated by third parties. The Amendment Rules follow a public consultation process initiated in October 2025, during which MeitY released draft rules (‘Draft Rules’) for stakeholder feedback. Pursuant to the feedback received, MeitY has made significant changes to the Draft Rules, including a revised definition of ‘synthetically generated information’.
What is Synthetically Generated Information?
The Amendment Rules primarily seek to govern the misuse of ‘synthetically generated information’ to spread misinformation, deepfakes, and other unlawful content. Synthetically generated information (‘Synthetic Media’) is defined to mean audio, visual, or audio-visual information that is artificially or algorithmically created, generated, modified, or altered using a computer resource, in a manner that such information appears to be real, authentic, or true, and depicts or portrays any individual or event in a manner that is, or is likely to be perceived as, indistinguishable from a natural person or a real-world event. The Amendment Rules expressly clarify that audio-visual information arising from the following does not constitute Synthetic Media: (i) routine or good-faith editing, formatting, or enhancement that does not materially alter, distort, or misrepresent the substance, context, or meaning of the underlying content; (ii) routine or good-faith creation, preparation, formatting, or design of documents, presentations, and educational materials that does not result in the creation of any false document; or (iii) the use of computer systems solely for improving accessibility, clarity, quality, translation, or searchability, without generating, altering, or manipulating any material part of the underlying content.
This definition of Synthetic Media focuses on artificially or algorithmically created content that is unauthentic, misleading, or deceptive. It is narrower in scope than the all-encompassing definition proposed under the Draft Rules, which covered any information that was ‘artificially or algorithmically created, generated, modified or altered’ and would therefore have captured routine, benign uses of such tools, with unintended consequences.
Applicability of Amendment Rules
The Amendment Rules apply to intermediaries, including significant social media intermediaries (‘SSMIs’). An intermediary is defined under the Information Technology Act, 2000 (‘IT Act’) to mean an entity that, ‘on behalf of a third party’, receives, stores, or transmits data or content, or provides any service with respect to such data or content. Examples of intermediaries include web-hosting service providers, online marketplaces, search engines, and telecom service providers.
An SSMI is a social media intermediary (i.e., an intermediary that enables online interaction between two or more users and allows them to create, upload, disseminate, modify, or access content) with more than five million registered users in India. Given the significant volume of content handled by SSMIs, the Amendment Rules prescribe certain additional obligations for them.
Key Obligations under Amendment Rules
The Amendment Rules require intermediaries offering a computer resource that enables or facilitates the generation or modification, and transmission, of Synthetic Media to ensure that: (i) they deploy reasonable and appropriate technical measures to prevent users from creating and transmitting unlawful Synthetic Media; and (ii) Synthetic Media (not covered under (i)) is prominently labelled or embedded with permanent metadata or other appropriate technical mechanisms (including a unique identifier), to the extent technically feasible, to identify the computer resource of the intermediary used to create, generate, modify, or alter such information. Such Synthetic Media must be labelled in a manner that ensures prominent visibility in the visual display and is easily noticeable and adequately perceivable, or, in the case of audio content, through a prominently prefixed audio disclosure that immediately identifies the information as synthetically generated. Although the Draft Rules had mandated that such labels cover at least 10% of the visual display (or, for audio, the initial 10% of its duration), following concerns raised by industry stakeholders, the Amendment Rules afford platforms greater latitude in determining how to implement these labelling requirements. An intermediary is also obligated to ensure that it does not enable the modification, suppression, or removal of any such label or identifier.
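By way of illustration only, the sketch below shows one way a generation tool could embed a permanent provenance identifier in an image's metadata and stamp a prominent visible label on it. The Amendment Rules do not prescribe any particular format or tooling; the metadata field names, the banner text, the identifier scheme, and the use of the Pillow imaging library are all assumptions made for this example.

```python
# Illustrative sketch only: the Amendment Rules do not prescribe a labelling or
# metadata format. This assumes a platform chooses to (a) embed a provenance
# identifier in PNG metadata and (b) stamp a visible banner, using Pillow.
# All field names and the identifier scheme are hypothetical.
import uuid
from PIL import Image, ImageDraw
from PIL.PngImagePlugin import PngInfo

def label_synthetic_image(src_path: str, dst_path: str, tool_name: str) -> str:
    """Embed a unique identifier in metadata and draw a prominent on-image label."""
    image = Image.open(src_path).convert("RGB")

    # (1) Permanent metadata: record the generating tool and a unique identifier.
    identifier = str(uuid.uuid4())
    metadata = PngInfo()
    metadata.add_text("SyntheticMedia", "true")
    metadata.add_text("GeneratingTool", tool_name)
    metadata.add_text("ProvenanceID", identifier)

    # (2) Prominent visual label: a banner across the top of the image.
    draw = ImageDraw.Draw(image)
    banner_height = max(24, image.height // 10)
    draw.rectangle([(0, 0), (image.width, banner_height)], fill=(0, 0, 0))
    draw.text((10, banner_height // 4), "AI-GENERATED CONTENT", fill=(255, 255, 255))

    image.save(dst_path, "PNG", pnginfo=metadata)
    return identifier
```

A comparable approach for audio or video could prefix an audible or visible disclosure and carry the identifier in container-level metadata; how exactly to do so remains a design choice left to each platform.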
Additional Obligations for SSMIs
An SSMI that enables the displaying, uploading, or publishing of ‘any information’ on its computer resource must: (i) require its users to declare whether such information is Synthetic Media; (ii) deploy appropriate technical measures, including automated tools or other suitable mechanisms, to verify the correctness of such declarations and to identify any unlawful Synthetic Media; and (iii) where such user declaration or technical verification confirms that any information is synthetically generated, ensure that it is prominently labelled as Synthetic Media. In effect, an SSMI cannot rely on user declarations alone: it must verify them through appropriate technical measures and ensure that no Synthetic Media is published without a prominent label.
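As a minimal sketch, assuming a hypothetical detection function and confidence threshold (neither of which the Amendment Rules prescribe), an SSMI's upload flow might combine the declaration, verification, and labelling steps along the following lines:

```python
# Minimal sketch, not a prescribed implementation: one way an SSMI might combine
# a mandatory user declaration with automated verification and labelling at the
# point of upload. The detection function and the 0.5 threshold are hypothetical.
from dataclasses import dataclass

@dataclass
class Upload:
    content_id: str
    declared_synthetic: bool            # (i) declaration collected from the user at upload
    labelled_as_synthetic: bool = False

def detect_synthetic_media(content_id: str) -> float:
    """Placeholder for a platform's own detection tooling; returns a confidence score."""
    return 0.0  # a real deployment would call an actual classifier here

def process_upload(upload: Upload) -> Upload:
    # (ii) verify the correctness of the declaration with automated tools
    detection_score = detect_synthetic_media(upload.content_id)
    is_synthetic = upload.declared_synthetic or detection_score >= 0.5

    # (iii) ensure Synthetic Media is prominently labelled before it is published
    if is_synthetic:
        upload.labelled_as_synthetic = True
    return upload
```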
Shorter Timelines for Content Takedown, Grievance Redressal and User Intimation
The Amendment Rules also introduce materially compressed timeframes for the removal of unlawful content. Where content is determined to be unlawful pursuant to a court order or a reasoned written intimation from the Government to the intermediary, the intermediary must take down such content within three hours (down from the previous timeline of 36 hours). For particularly sensitive content or material involving non-consensual intimate imagery or deepfakes, the takedown timeline has been reduced from 24 hours to two hours after receipt of a complaint from the affected individual. Additionally, the timeline for grievance redressal has been reduced to seven days, and for grievances relating to certain types of unlawful content, including pornographic content or content harmful to children, the prescribed redressal timeline is 36 hours. Further, the timeline for periodically intimating an intermediary’s users of the consequences of violating its rules and regulations, privacy policy, and user agreement has been reduced from once a year to once every three months.
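Purely as an illustration of how these timelines might be tracked internally, the sketch below models them as a simple service-level configuration; the category names and the deadline helper are hypothetical and not drawn from the Amendment Rules.

```python
# Illustrative only: a hypothetical internal SLA configuration reflecting the
# compressed timelines under the Amendment Rules. Category names are our own.
from datetime import datetime, timedelta

TAKEDOWN_SLAS = {
    "court_or_government_order": timedelta(hours=3),                     # previously 36 hours
    "non_consensual_intimate_imagery_or_deepfake": timedelta(hours=2),   # previously 24 hours
}

GRIEVANCE_SLAS = {
    "default": timedelta(days=7),
    "pornographic_or_harmful_to_children": timedelta(hours=36),
}

USER_INTIMATION_INTERVAL = timedelta(days=90)  # previously once a year

def takedown_deadline(category: str, received_at: datetime) -> datetime:
    """Return the latest time by which the content must be taken down."""
    return received_at + TAKEDOWN_SLAS[category]
```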
Safe Harbour Implications
Where an intermediary becomes aware, or it is established, that the intermediary knowingly permitted, promoted, or failed to act upon unlawful Synthetic Media, or where it otherwise violates its obligations under the Amendment Rules, the intermediary will be deemed to have failed to exercise due diligence under the Intermediary Rules. This may in turn jeopardise its safe harbour from liability for third-party content under the IT Act.
Way Forward
The Amendment Rules constitute India’s first substantive law governing artificial intelligence (AI) generated content. With the amended law coming into force on February 20, 2026, intermediaries should take immediate steps to ensure compliance. In doing so, they may face operational challenges, particularly in meeting the compressed timelines for takedown of unlawful content and grievance redressal, which may require them to overhaul their technical systems and internal content adjudication processes.