The Data Governance and Protection Standard of India (DGPSI) is now becoming available, and it will facilitate compliance with privacy laws, especially the Digital Personal Data Protection Bill of India. In this episode, Punit Bhatia has a conversation with one of India's finest data privacy experts, Vijayashankar Nagarajarao, commonly known as Naavi. They talk about various aspects like the evolution of India's privacy landscape, the DPDP, the Data Governance and Protection Standard of India (DGPSI) and even FDPPI.
Data Governance Protection Standard of India (DGPSI) with Naavi & Punit Bhatia
Transcript of the Conversation
Punit Bhatia 00:00
Data Governance and Protection Standard of India, or DGPSI. This is supposed to be the path breaker when it comes to compliance with privacy and AI legislation, especially in the context of India, as India has recently passed the Digital Personal Data Protection Bill (DPDP).
There are various ways in which you can comply with the law, but putting in place an effective data governance program is key. And what better than having a data governance and protection standard, which allows you to do it in a standardized way that is recommended by experts.
This is what the DGPSI, or the Data Governance and Protection Standard of India, does. And we are getting these thoughts from none other than the brain behind it. He is a thought leader and pioneer in India's privacy landscape. He is none other than Naavi. Can you describe to us how your journey has been and what the current privacy landscape is? Because people think that now that the DPDP has come in, India has a privacy law. While most knowledgeable people like you know that it is not quite the case: we now have a formal data protection law, but privacy was already enshrined in the Constitution and, even otherwise, in many other laws for the last so many decades.
Punit Bhatia 12:11
Yeah, that makes sense. So, you make a very clear distinction between India's Digital Personal Data Protection Bill covering digital data, while the EU General Data Protection Regulation talks about data in general, so it covers personal data in all forms, both digital and physical. That's very interesting. But I have also seen that in the DPDP they don't talk about sensitive data, which was the case in the draft. Do you have a view or opinion on that?
Naavi
The earlier drafts took a data-centric approach. So, they used the term sensitive personal data. Now, what has happened is, this particular Act looks at the organization which is processing the data as the object of the regulation. So, when they define a significant data fiduciary, at that time they will take into account whether a data fiduciary is processing sensitive personal data or ordinary data. So instead of defining data as sensitive and non-sensitive, and then saying that whoever handles sensitive personal data will have additional obligations and responsibilities, what this Act may try to do is focus on the organizations and distinguish the organizations which carry a higher risk to privacy. For various reasons, those will be defined as significant data fiduciaries, who will have additional responsibilities like a DPO, DPIA, data audit and other things. So, I think the same objective may perhaps be achieved by different means, by shifting the focus to classification of the data fiduciaries rather than classification of the data.
Punit Bhatia 14:19
That's interesting. So, in your opinion, the objective behind sensitive data or sensitive personal data will be achieved by classifying the data fiduciary as significant. So all hospitals or healthcare companies will presumably be classified like that, and many others, and then you will achieve the same objective in the end, but without burdening the usual companies with the responsibility of having to segregate data into sensitive personal data and personal data. Now that the DPDP is here, we are already talking about this, but there is also talk about an Artificial Intelligence Act. Is there something being contemplated on artificial intelligence?
Naavi
Because the DPDP focuses on digital personal data, it automatically takes into account what GDPR calls automated processing, because all data covered by the DPDP is automatically processed. Without the use of a computer it does not become digital data; therefore, the use of a computer is implied. Now, the computer may be used at the base level, where a programmer says something for the computer to do and it executes, which is, let's say, the base level of programming which all computers do, or the software may be sophisticated enough to be classified as artificial intelligence. Artificial intelligence is also software; the instructions that are given include instructions to correct the code based on observations. It is like a lift or something like that, which has a sensor, and based on the sensor input, the lift may get adjusted in terms of its position and other things. We don't normally recognize that as artificial intelligence; it is a simple software issue. So, we feel that the requirement of regulating artificial intelligence is, to some extent, already there in Indian law, because what the Information Technology Act did was introduce one very much neglected section, Section 11, which said that the activity of an IT system which gives out an automated decision is attributed to the person who causes that device to behave in a particular fashion, which means that the owner of the software is responsible for the action of the software. This is embedded in the Information Technology Act as if it is a part of contract regulation. Because of that, if there is an algorithm and it does some automated decision making, the responsibility and accountability for that automatically lies with the owner of the AI algorithm. Now, you may ask whether the owner of the AI algorithm is the licensee, or whether it is the licensor who has not properly disclosed the default functioning.
I say that the licensor should take the responsibility, because he has not disclosed all the activities of the software. But anyway, there is a human who is the owner of the AI algorithm, whom Indian law today can recognize as the person responsible for the end action of an artificial intelligence algorithm. Okay, maybe the next question is, do we consider artificial intelligence a juridical person? When we go to courts, sometimes people say that the law should be more specific, so you get more and more clarification. So what I am talking about is an interpretation of one innocuous Section 11, and extending it to the world of artificial intelligence, which was not there in the year 2000 or 1998, when this Section 11 was actually thought of. But because of the generic usage of the language, I am saying that today I can stand in a court and argue that Section 11 says automated decision making is attributable: a deepfake or any other artificial intelligence algorithm, you should attribute it to the owner. Therefore, the owner has to exercise due diligence. I can also extend the concept of due diligence and the concept of intermediary to this algorithm; I can consider the algorithm as an intermediary and the owner as the person who is giving instructions or passing the message. Therefore, he has to exercise due diligence, and if he does not exercise due diligence, then as a consequence he may even be accountable personally. So, the Information Technology Act is so designed that for many of these modern-day problems we can find solutions. But if the government thinks that, because worldwide acts are being built for AI, India should have one too, that is fine, I'm fine with that. Basically, I focus on accountability and the normal ethical behavior of any organization. Now, even if we look at the EU law, basically it tries to classify AI based on the risks, and it tries to say some can be banned, some may be regulated, some can be free.
In fact, the EU framework seems to say that deepfake is not really very risky, whereas in India, I don't know. All this has to be based on accountability. We are saying that the publishing organization, which is YouTube or something, should be able to identify the origin of a particular video, similar to what we have been talking about with WhatsApp being able to identify the origin of a particular message. And therefore there is accountability. Our criminal jurisprudence does not allow cases to be built up if you cannot identify the particular person who committed the offense. In those cases, we have to ask the intermediary to take vicarious liability; the Information Technology Act has got vicarious liability also. So, they can say the organization will be deemed the perpetrator of the crime unless they are able to tell us that some XYZ customer of their platform was the person responsible. So, in a way, Indian law is robust enough to use the current provisions to meet some of these requirements. And let us see, if the government still wants to come up with an EU AI Act kind of thing, it is welcome, as long as it does not create some kind of loophole. For example, you take the humanoid robots; today a humanoid robot is allowed to be the CEO of a company. Now, if it takes a wrong decision and there is some adverse impact on an individual, the question is who has to be held responsible.
So, there the EU law or the US law may have to define the juridical entity and treat that as responsible. But if you do that, then you cannot put a robot in jail; you cannot give a death sentence to a humanoid robot which may cause a murder. But if you recognize the human behind it as the person who is accountable, then under the concept of due diligence, up to a particular point, you can protect that person against very high levels of punishment. But if it becomes necessary, if we start using humanoid robots to commit terrorist activities, our law, which recognizes the person behind the automated activity, allows that person to be convicted, even for a death sentence. Whereas if you define the AI algorithm or the robot as a juridical entity, you can only punish that; at best, you can deactivate it and dismantle it, and nothing more than that. So I think defining a juridical entity will actually dilute the provisions; that is my current view. So when the government comes up with an act, we have to see whether they will be innovative enough to take these kinds of concepts into account, or whether they will just go by the international trend, just as people used to say GDPR is the gold standard, so India has to follow that. But India did not do that; India had the right to do something of their own. Similarly, when India adopts an artificial intelligence law, they should look at it independently, not being tied down to, say, the EU.
Punit Bhatia 25:19
In fact, India, I think, is part of the OECD group, where they agreed on a principle-based approach for AI. And there is an AI framework which all member states have agreed to. And underneath it, the subsequent legislation will come. But whether we talk about AI or privacy, there's one important term, or one important way, in which companies will need to manage. Of course, we still need to define what the jurisdiction is, and who is the entity executing it or going to be held accountable.
But still, that's called data governance. Because at the end of the day, whether it's AI or whether it's privacy, it's about data. And when I last spoke to you, you were talking about this data governance framework that you have put in place. Can you enlighten us about what this data governance framework is? How can it help? And how can people access it? Because it sounded to me like a very powerful tool which, just like the ISO standards, can enable companies to execute or manage data privacy, or rather not privacy only but the entire set of data, in a very effective and responsible way.
Naavi 26:38
See, we have called it the Data Governance and Protection Standard of India, or DGPSI. What drove us to work on this cause is that this law is directed towards data fiduciaries, so data fiduciaries need to be compliant. Now, for compliance, there should be a data auditor who is willing to actually evaluate the system of data management in an organization and then say whether it is compliant with the DPDP or not. So, this requirement meant that companies need a proper framework for being in compliance, to help them during implementation, and then perhaps even to invite a third-party auditor to certify. If you look around at the available frameworks, we had ISO 27001, which was for the security, confidentiality, integrity and availability of information; whether that information is personal or non-personal, 27001 was applicable, and in the 2022 version they may have used some words related to privacy. Then there is 27701, which was specifically created for GDPR compliance; but GDPR is not the same as the DPDP, so you cannot adopt 27701 for DPDP compliance. Therefore, we needed a particular framework exclusively for DPDP compliance. Also, in an organization, data is to be looked at as an asset, and everything should be done to see that the asset is made more productive.
So, the BIS, the Bureau of Indian Standards, has created a particular standard for data governance, and within that they have added one module with about 25 controls; we can call them controls, though they call them expected outcomes. Those 25 controls were directed towards data privacy. If data is an asset, it has to be valued, and the value of data should be visible in a corporate environment; otherwise you cannot have adequate funding, and the top management will not be able to provide support. Because of these things, we have included a lot of governance issues in our DGPSI framework. So what we did was, we took the DPDP and we recognized that the IT Act 2000 is still relevant, so there are several aspects of the IT Act 2000 which have to be added to this governance model. I hope this will be very, very useful for organizations. We have also taken another thought into consideration: there is a system of assessment which goes on from a quality perspective or something like that, and we wanted to say that when an auditor does a thorough evaluation of controls, he also gets an idea about assessment. So we said that the system called Data Trust Score can be integrated with this certifiable audit system. Therefore DGPSI becomes an implementation framework, a third-party certifiable audit framework and a third-party assessment framework. We feel that with this, the number of compliances which organizations need to work on will be reduced; to that extent, they can focus more on business rather than working on compliances throughout the year.
And one other very important aspect which I personally feel the DGPSI will accomplish is a compliance requirement in which the CISO, the DPO, as well as the CFO and the marketing manager are all involved, because it involves management. So I think there is a greater integration of resources, and I feel that in the long run companies will find that it becomes something like a goal for the entire management team rather than just the DPO. So that is what we have done. And the DGPSI is being given as an open standard; the standard itself we want to share. Of course, how it has to be rolled out in terms of templates and other things is something which consultants can customize based on the particular context. But otherwise, the system is considered an open standard by FDPPI, which I represent, the Foundation of Data Protection Professionals in India, an NGO created by professionals who have been into data protection for years. That organization will provide the backend support in terms of accreditation of auditors, providing certification and capacity building for professionals. So that is how the DGPSI can be rolled out.
Punit Bhatia 33:24
I think that's a very good idea. With DGPSI and FDPPI, you will be able to support companies and people in implementing the Digital Personal Data Protection Bill as well as any other act that comes around relating to data management and data governance. And the thought process you have in the DGPSI is also very similar to what they are talking about at the UK ICO, because they're saying having a data protection officer role is burdensome, or that it also puts the responsibility on somebody who's outside the business, and you need to put it in the business. So that is a very similar thought process. And that's where I think data protection is going to head in the next 10-20 years, because you cannot put data protection outside the business; it has to be in the business and with the business. So there has to be a collective responsibility. So I'm very happy to hear that thought. So, in essence, if somebody wants to become a member of FDPPI, do they have to be in India?
Naavi
Not necessary; it's not necessary. They can. Remember, we have a few members from London, basically, and also the Middle East. So it is open to everybody. It is what we call a Section 8 company, and a Section 8 company means it is a not-for-profit company. Secondly, this is not a company where the liabilities are limited by shares; there are no shares in this, it is limited by guarantees. Members will have the voting powers; it doesn't happen entity-wise, each person will have only one vote. And it is still a private limited company, which means that we have a limitation on the total number of voting members. So, what we have done is, we are taking general members in a different category; we call them basic members, and it is a non-voting membership. We have also created a category called supporting members, and supporting members are actually the building blocks of FDPPI, in the sense that FDPPI will execute projects in association with the supporting members. So, supporting members will come together as a consortium and work under the brand name of FDPPI, but the execution will be done by the supporting members; we have a revenue-sharing arrangement. Many of our members who come into the supporting member category are basically entrepreneurs and business people. Otherwise, we have in total about 400 people who are supporting us today; not all of them are in the supporting member category, they are just there to consume the services of FDPPI. Even corporate members consume the services of FDPPI, like when we do a training or something like that. Supporting members are not consumers; they actually support FDPPI. So there's a small difference. They execute projects, sometimes brought in by them, and they take the assistance of other members in the organization. Or I get an inquiry and I ask a couple of people who are suitable for that particular thing whether we can build a consortium for that particular project.
I don't think many other organizations have this kind of setup. I felt that for an NGO of this type, formed by professionals who essentially have conflicting businesses themselves, this is the kind of structure that works. So far, it has been working well. We are five years old now.
Punit Bhatia 38:00
Congratulations. I'm very happy that you are not only working with the government but also working with the professionals and also thinking about how the laws get implemented. So I would say, in the essence of time, it has been very enlightening and very enriching to learn about the Indian data protection landscape, the thought process that is going on behind the AI Act, as well as the DGPSI, which I think will help companies in implementing GDPR, the DPDP or the PDP as we call it, even the AI Act or any other new act that will come up. It will holistically allow for that, and maybe it will become a standard equivalent of ISO in the coming days.
And FDPPI will also I think, grow with the new Act and new law. So all the best for whatever you're doing, and keep helping the data protection world as you've been doing.
ABOUT THE GUEST
Vijayashankar Nagarajarao, commonly known as Naavi, is a thought leader who pioneered several concepts in Cyber Jurisprudence in India and also introduced services such as Cyber Evidence Archival, Online Dispute Resolution, etc.
Naavi specializes in Consultancy related to Regulatory compliance in the areas of ITA 2000/8, HIPAA, GDPR, CCPA, DPDPA (Digital Personal Data Protection Act of India) etc. Naavi is also a Registered Independent Director with Ministry of Corporate Affairs, Government of India.
Authored several books including the first book on Cyber Laws in India in 1999. Also authored a book titled "Personal Data Protection Act of India (PDPA 2020)" to mark the beginning of a new era of data protection in India.
The website www.naavi.org codifies the 20+ years of Naavi's journey in the field of Cyber Laws in India. The website www.fdppi.in reflects the activities of the Foundation of Data Protection Professionals in India.
About Punit Bhatia
Punit Bhatia is one of the leading privacy experts who helps CXOs and DPOs to identify and manage privacy risks by creating a privacy strategy and implementing it through setting up and managing your privacy program and providing scenario-based training to your key staff. In a world that is digital, AI-driven, and has data in the cloud, Punit helps you to create a culture of privacy by establishing a privacy network and training your company's management and staff.
For more information, please click here.