
Data Privacy Consultant

The DPDP Act, 2023 and the Draft DPDP Rules, 2025: What Do They Mean for India’s AI Start-Ups?

  • Writer: Davies Parker
  • Apr 21
  • 4 min read

Introduction

The Digital Personal Data Protection Act (DPDPA), 2023 and the Draft DPDP Rules, 2025 have ushered in a new era of data privacy in India. With its emphasis on enforcement, this framework has created a sense of regulatory urgency. While it signals a significant shift towards stronger privacy protections, it has also created a wave of uncertainty, particularly among startups and emerging technology companies.

AI startups often house a wealth of sensitive data, making them prime targets for attackers seeking to exploit that information. The challenge is to understand the needs of each startup and adopt compliance practices that are not overly resource-intensive. For startups navigating lean resources and rapid scaling, compliance with the DPDP framework is a daunting task. AI startups in particular must make compliance a priority, since AI systems depend on enormous datasets, many of which contain personal data, to operate efficiently. AI has repeatedly demonstrated the ability to scan and analyze data and uncover sensitive facts that people might not want to disclose, which raises major privacy concerns. The absence of adequate guardrails may result in abuse of, or illegal access to, confidential data.

From appointing Data Protection Officers to managing consent, startups are grappling with how to operationalize compliance under the DPDP framework in a manner that aligns with their business needs.

Informed Consent

Under Section 4 of the DPDPA, data fiduciaries are required to secure explicit consent from individuals before processing their personal data. This consent must be freely given, specific, informed, and unambiguous. The Draft DPDP Rules emphasize that consent notices must be clear, specific, and understandable independently of any other information. Startups must therefore establish systems to securely collect, manage, and document user consent. This starts with providing standalone notices outlining the types of personal data collected, the purposes of collection, and the uses enabled by the processing activity.

These notices must also detail how users can withdraw consent and exercise their rights under the DPDPA. The Rules provide for a consent manager framework to facilitate user consent management, and startups must implement systems that enable users to transparently give, review, and withdraw consent at any time. AI startups can meet these requirements by building user-friendly consent mechanisms, offering sufficiently granular consent options, and ensuring transparency in AI training and decision-making.
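For illustration, the sketch below shows one way a startup might record purpose-specific consent and its withdrawal against a particular notice version. It is a minimal sketch assuming a simple in-memory store; the class, field, and method names (ConsentRecord, ConsentLedger, give, withdraw) are hypothetical and are not prescribed by the DPDPA or the Draft Rules.

```python
# A minimal sketch of a consent ledger, assuming a simple in-memory store.
# Class and field names are illustrative, not prescribed by the DPDPA or the Rules.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    principal_id: str           # the Data Principal who gave consent
    purpose: str                # one specific, notified purpose (e.g. "model_training")
    notice_version: str         # the standalone notice against which consent was given
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def is_active(self) -> bool:
        return self.withdrawn_at is None


class ConsentLedger:
    """Records per-purpose consent so it can be reviewed and withdrawn at any time."""

    def __init__(self) -> None:
        self._records: list[ConsentRecord] = []

    def give(self, principal_id: str, purpose: str, notice_version: str) -> ConsentRecord:
        record = ConsentRecord(principal_id, purpose, notice_version,
                               given_at=datetime.now(timezone.utc))
        self._records.append(record)
        return record

    def withdraw(self, principal_id: str, purpose: str) -> None:
        for record in self._records:
            if record.principal_id == principal_id and record.purpose == purpose and record.is_active:
                record.withdrawn_at = datetime.now(timezone.utc)

    def has_consent(self, principal_id: str, purpose: str) -> bool:
        return any(r.principal_id == principal_id and r.purpose == purpose and r.is_active
                   for r in self._records)
```

In practice, such a ledger would sit behind the consent notice interface and be checked before any processing or model-training job touches a Data Principal's records.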

Data Security Measures

AI startups handle large-scale, structured, and unstructured data, making them prime targets for cyber threats. The DPDPA requires organizations to implement robust organizational and technical safeguards to protect personal data against unauthorized access, use, disclosure, alteration, or destruction.

  • Startups must prioritize the adoption of advanced security measures. This includes implementing encryption to secure sensitive information during storage and transmission, as well as deploying strong access controls to ensure that only authorized personnel can access personal data.

  • The reasonable security safeguards contemplated under the Draft DPDP Rules include encryption, obfuscation, masking, and the use of virtual tokens mapped to specific personal data (a minimal sketch of masking and tokenization follows this list).

  • Further, regular security audits, vulnerability assessments, and penetration testing to identify and address potential risks form part of the organizational measures that may be undertaken.
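As a rough illustration of the masking and tokenization measures above, the sketch below swaps personal identifiers for opaque virtual tokens before data enters a training pipeline and partially masks values for display. It assumes an in-memory vault; the names (TokenVault, tokenize, mask_email) and the token format are illustrative choices, not requirements of the Draft Rules.

```python
# A minimal sketch of pseudonymization via virtual tokens, assuming an in-memory
# token vault; names and token format are illustrative, not mandated by the Rules.
import secrets


class TokenVault:
    """Maps personal identifiers to opaque tokens so raw values never enter the pipeline."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Access to this method should itself sit behind strict access controls.
        return self._token_to_value[token]


def mask_email(email: str) -> str:
    """Partially masks an email address for display or logging (e.g. a****@example.com)."""
    local, _, domain = email.partition("@")
    return (local[:1] + "****@" + domain) if domain else "****"
```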

It is equally crucial that AI startups take sufficient security measures to protect the AI models themselves.

Apart from the security measures, it is also important for organizations to have a strong breach response plan. Since AI systems continuously learn and process data, breach response strategies must be tailored to dynamic AI models.

The draft rules also lay down timelines for the intimation of a breach, which must be adhered to. In the event of a breach, AI startups, as data fiduciaries, must take timely action and notify the Data Protection Board as well as the affected Data Principals. Data fiduciaries must then provide further information, such as the facts of the event, the circumstances and reasons behind the breach, the remedial actions taken, and a report on the notifications given to affected Data Principals.
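By way of illustration, the sketch below collects the breach-report fields described above and tracks a reporting deadline. The 72-hour window used here is an assumed, configurable example of the kind of timeline the draft rules contemplate; the exact timelines should be confirmed against the rules as notified, and the class and field names are hypothetical.

```python
# A minimal sketch of tracking breach notifications; the 72-hour window for the detailed
# report to the Data Protection Board is an assumed, configurable example, not a value
# confirmed from the final rules. Class and field names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Optional


@dataclass
class BreachReport:
    detected_at: datetime
    facts: str                        # facts of the event
    circumstances_and_reasons: str    # circumstances and reasons behind the breach
    remedial_actions: str             # remedial actions taken
    principals_notified: list[str] = field(default_factory=list)  # affected Data Principals informed
    board_report_deadline_hours: int = 72  # assumed window for the detailed Board report

    def board_report_due_by(self) -> datetime:
        return self.detected_at + timedelta(hours=self.board_report_deadline_hours)

    def is_overdue(self, now: Optional[datetime] = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return now > self.board_report_due_by()
```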

Cross-border Data Transfer

The global nature of tech and AI operations adds another layer of complexity to compliance. Different countries enforce varying data protection regulations, making it challenging for organizations to ensure compliance while operating across international jurisdictions. AI startups with global operations that engage in cross-border data transfers must adhere to the requirements and standards prescribed by the Central Government under Section 16 for the transfer of personal data outside the territory of India. They must also comply with sectoral regulations governing the cross-border transfer of personal data.

Data Retention and Deletion

The Act requires organizations to retain personal data only for as long as necessary to fulfil the purposes for which it was collected, and to establish and implement clear retention policies that reflect this principle. The draft DPDP Rules provide specific retention periods based on the purpose for which the data is collected and processed. Once the data is no longer needed, organizations should ensure its secure deletion or anonymization to prevent unauthorized access or misuse, and Data Principals must be informed 48 hours before their data is to be erased. This process can include automated systems for tracking data lifecycles, regular audits to identify redundant data, and secure erasure in line with industry best practices. By adopting these measures, startups can reduce data-related risks and demonstrate accountability in handling personal information.
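To make the retention lifecycle concrete, the sketch below checks whether a record has outlived a purpose-specific retention period, triggers the 48-hour advance notice to the Data Principal, and only then marks the record for erasure. The retention periods, purposes, and function names are placeholder assumptions for illustration, not values taken from the Rules.

```python
# A minimal sketch of a retention check with the 48-hour pre-erasure notice mentioned
# above; the purposes and retention periods are placeholder assumptions, not values
# taken from the Rules.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical purpose-specific retention periods; actual periods must follow the
# final rules and the purposes notified to the Data Principal.
RETENTION_PERIODS = {
    "account_management": timedelta(days=3 * 365),
    "model_training": timedelta(days=180),
}

NOTICE_BEFORE_ERASURE = timedelta(hours=48)  # inform the Data Principal before erasure


@dataclass
class StoredRecord:
    principal_id: str
    purpose: str
    collected_at: datetime
    erasure_notice_sent_at: Optional[datetime] = None


def retention_expired(record: StoredRecord, now: datetime) -> bool:
    return now >= record.collected_at + RETENTION_PERIODS[record.purpose]


def process_record(record: StoredRecord, now: datetime) -> str:
    """Decides the next lifecycle step for a record: keep, notify, wait, or erase."""
    if not retention_expired(record, now):
        return "keep"
    if record.erasure_notice_sent_at is None:
        record.erasure_notice_sent_at = now
        return "notify_principal"      # send the 48-hour advance notice
    if now - record.erasure_notice_sent_at >= NOTICE_BEFORE_ERASURE:
        return "erase"                 # securely delete or anonymize
    return "await_notice_window"
```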

 
 
 
