Republic strikes back; data reset

Thursday, 11 December 2025 | Kaushik Moitra | Nandini Tyagi

India’s data-protection landscape is undergoing a necessary reset. The Digital Personal Data Protection (DPDP) Act, 2023, and the DPDP Rules, 2025, now partially in force, represent a significant shift in how personal information is processed, protected, and governed. For those familiar with the mechanics of data, and the ease with which ambiguity has historically been exploited, this transition is overdue. The framework is the country’s first comprehensive digital privacy regime, one that is citizen-centric, consent-driven, and innovation-aligned. It is built on seven principles: consent and transparency, purpose limitation, minimisation, accuracy, storage limits, security, and accountability.

The Government has opted for a phased implementation. Provisions establishing the Data Protection Board are already operational; the Board will investigate violations and impose penalties. The remaining obligations, such as consent notices, retention standards, safeguards, breach reporting, grievance handling, and protections for children and persons with disabilities, will take effect within 18 months of notification. This staggered rollout is intended to give citizens and companies realistic adjustment timelines.

The new regime seeks to transform personal data from something passively surrendered to something users actively control. Platforms must explain, in a user’s preferred language, what data they collect, why and how they will use it, and how consent may be modified or withdrawn. However, the DPDP Rules stop short of mandating disclosures on recipient categories, cross-border protections, or retention periods, thereby undermining fully informed consent. The concentration of appointment powers for the Data Protection Board in the Executive has also drawn criticism for diluting the Board’s independence.

Firms must promptly report breaches to both the Board and affected users, with fuller details to follow within 72 hours. Users must be told what happened and how to mitigate harm. Additional safeguards apply where users lack the capacity to give consent: processing children’s or disabled persons’ data requires verifiable guardian consent and is subject to narrow welfare-based exceptions. Platforms cannot monitor children’s behaviour or target them with advertising.

Research and archiving enjoy limited allowances but must meet baseline privacy standards. The Rules remain silent on whether AI (Artificial Intelligence) training qualifies as ‘research’, leaving a significant issue unresolved. Significant Data Fiduciaries, entities processing high-risk or large-scale personal data, face heightened compliance obligations, including annual impact assessments, independent audits, and scrutiny of systems, algorithms, and hosting infrastructure. Cross-border transfers are permitted in principle, but the Government may restrict datasets processed by these fiduciaries and impose conditions on access requests from foreign authorities. This reflects a shift toward risk-sensitive data mobility.

The Act introduces strict retention logic. Data must be erased when consent is withdrawn or the purpose ends. Organisations must retain personal data and logs for at least one year to support legitimate state functions; after this period, erasure is the default unless another law requires retention. E-commerce, gaming, and social-media platforms must delete users’ data after three years of inactivity, with notice and an opportunity to re-engage. This curbs extreme data accumulation but may entrench prolonged monitoring by both corporations and the State.
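
These retention rules translate naturally into executable checks. The Python sketch below shows how a platform might encode them; the record fields, the 30-day re-engagement window, and the function name are hypothetical illustrations, not terms drawn from the Act or the Rules.

```python
from datetime import datetime, timedelta

# Hypothetical thresholds echoing the retention rules described above; the exact
# durations, notice periods, and exemptions are set by the Act, the Rules, and sector law.
# (Logs supporting legitimate state functions must separately be kept for at least a year.)
INACTIVITY_LIMIT = timedelta(days=3 * 365)   # erasure after three years of user inactivity
NOTICE_WINDOW = timedelta(days=30)           # illustrative re-engagement window after notice


def should_erase(record, now=None):
    """Return (erase, reason) for a hypothetical user-data record.

    `record` is assumed to expose: consent_withdrawn (bool), purpose_active (bool),
    last_active (datetime), notice_sent_at (datetime or None),
    retention_required_by_law (bool).
    """
    now = now or datetime.utcnow()

    # Another law can override the default of erasure.
    if record.retention_required_by_law:
        return False, "retention required by another law"

    # Erasure is the default once consent is withdrawn or the purpose ends.
    if record.consent_withdrawn or not record.purpose_active:
        return True, "consent withdrawn or purpose exhausted"

    # Inactivity-based erasure: notify the user first, then erase absent re-engagement.
    if now - record.last_active >= INACTIVITY_LIMIT:
        if record.notice_sent_at is None:
            return False, "send inactivity notice and await re-engagement"
        if now - record.notice_sent_at >= NOTICE_WINDOW:
            return True, "three years of inactivity and notice window elapsed"

    return False, "retain"
```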

The State may seek information for reasons ranging from national security to compliance reviews, sometimes without notifying users, which raises concerns over broad surveillance powers. A new rights ecosystem obliges organisations to explain how users may access, correct, or erase data, with a uniform 90-day limit for grievance resolution. Consent managers, modelled on India’s account-aggregator framework, will allow users to centrally view and revoke permissions across platforms while themselves remaining data-blind and audit-compliant.
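
To illustrate what a ‘data-blind’ consent manager could look like in practice, here is a small sketch under assumed names: it stores only permission metadata (platform, purpose, timestamps), never the underlying personal data, and records withdrawals centrally. The classes and methods are hypothetical, not part of any prescribed specification.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional


@dataclass
class ConsentRecord:
    # Only permission metadata is held here; the manager never sees the data itself.
    platform: str
    purpose: str
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None


@dataclass
class ConsentManager:
    user_id: str
    records: List[ConsentRecord] = field(default_factory=list)

    def grant(self, platform: str, purpose: str) -> None:
        self.records.append(ConsentRecord(platform, purpose, datetime.utcnow()))

    def view(self) -> List[ConsentRecord]:
        # One dashboard view of every live permission across platforms.
        return [r for r in self.records if r.withdrawn_at is None]

    def revoke(self, platform: str, purpose: str) -> None:
        # Withdrawal is recorded centrally; each platform is then expected to act on it,
        # for example by triggering erasure under the retention rules sketched earlier.
        for r in self.records:
            if r.platform == platform and r.purpose == purpose and r.withdrawn_at is None:
                r.withdrawn_at = datetime.utcnow()
```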

Modern platforms behave like industrial-scale data refineries, aggregating vast behavioural datasets with little transparency or consent. Dark patterns continue to push users toward maximal data sharing. Apps rely on third-party tools that transmit their own streams of information. Cross-border cloud routing adds opacity. Even pricing models are influenced by surveillance logic: reports suggest differential pricing based on device type or battery level.

Courts are beginning to address these practices. In the US, a ruling accepting Meta’s use of books from ‘shadow libraries’ for AI training as fair use highlights the asymmetric treatment of individuals and large platforms. Indian regulators have flagged this imbalance: WhatsApp’s 2021 policy update was deemed abusive because users lacked real choice. Taken together with opaque AI training datasets, the ANI versus OpenAI dispute, and government advisories on dark patterns, it is clear that several entities continue data practices they can neither defend nor trace.

Surveillance extends into hardware. Smart TVs monitor viewing habits, phones continuously generate telemetry, wearables store intimate health metrics, and cars generate behavioural data, much of which is monetised abroad. Across the ecosystem, data capture is continuous, consent fleeting, and deletion rarely meaningful.

The Act requires organisations to justify what they collect, why they collect it, who accesses it, and how long it is retained. This challenges the default-to-accumulate models that have prevailed over the past decade. Sectoral frameworks intersect with these expectations: the Reserve Bank of India mandates strict security, storage, and audit standards, and KYC (Know-Your-Customer) data cannot be repurposed without fresh consent. Regulatory practice in the European Union and the US already treats lawful basis, retention limits, and algorithmic transparency as routine obligations.

Healthcare and the related platform economy face additional scrutiny. Medical data is already fenced by multiple laws and guidelines that require explicit consent and encrypted exchange. Children’s data now faces heightened protection under the DPDP regime, which must be harmonised with the IT Act’s obligations on harmful and algorithmically-curated content. Proposed amendments to the IT Act would require labelling of synthetic content and impose provenance duties on platforms distributing AI-generated media.

Connected devices, which silently ingest telemetry, audio, and location trails, must now implement layered permissions, real deletion mechanisms, and strict purpose limitation. Engineering teams will need to redesign systems on the principle that legality, and not technical capability, governs what companies may collect.
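
As a sketch of strict purpose limitation on a connected device, the snippet below releases a data stream only when the requested purpose matches one the user has consented to; the stream and purpose names are invented for illustration.

```python
# Hypothetical consent map for a single device; stream and purpose names are illustrative.
CONSENTED_PURPOSES = {
    "location": {"navigation"},
    "audio": {"voice_commands"},
    "telemetry": {"crash_diagnostics"},
}


def read_stream(stream: str, purpose: str) -> str:
    """Release a data stream only for a purpose the user has consented to."""
    allowed = CONSENTED_PURPOSES.get(stream, set())
    if purpose not in allowed:
        # Legality, not technical capability, decides whether collection proceeds.
        raise PermissionError(f"'{stream}' may not be processed for '{purpose}'")
    return f"<{stream} data released for {purpose}>"


# Example: a consented read succeeds, an advertising read is refused.
print(read_stream("location", "navigation"))
# read_stream("telemetry", "advertising")  # would raise PermissionError
```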

India’s privacy regime is evolving and leaves major questions open, relating to AI governance, the boundaries of State access, regulatory capacity, and the copyright implications of training on protected works. Yet the shift is unmistakable. For an economy where behavioural data has operated as an unpriced asset, and where digital infrastructure underpins both growth and governance, the new DPDP framework marks the beginning of a disciplined, purpose-bound data order. It does not settle every debate, but it sets the first durable boundaries within which India’s data economy must mature. The era of data drifting without form is giving way to one defined by structure, intention, and accountability.

Kaushik is a Partner & Practice Lead for Regulatory, IP-TMT & Practice Development; Nandini is an Associate in the Regulatory & IP-TMT Practice. Both work for Bharucha & Partners, a leading full-service law firm with offices in Mumbai, Delhi, and Bengaluru that caters to a wide array of clients in India and overseas. Views are personal.
