Jan 31 / Punit Bhatia and Katrina Destree

Provide Choice, Control & Consent to Create Trust

Imagine a world where every click, search, or online purchase feels safe and trustworthy. In today’s fast-paced digital age, trust has become the foundation of our online interactions. But how is this trust built, and what role does privacy play in it?

Trust plays a key role in how people and businesses connect. Digital trust is built on three important pillars: choice, control, and consent. Privacy expert Punit Bhatia and industry leader Katrina Destreé discuss the strong connection between privacy, trust, and technology, including artificial intelligence. They explore what people expect from businesses, how companies can meet these expectations, and why following privacy laws and ethical practices is essential for earning and keeping trust.

Transcript of the Conversation

Punit 00:00 
Provide choice, control and consent to create trust. Well, that's very concisely said, because when we talk about trust, there are user expectations, and then there is the aspect of AI. That's when it starts to get blurred. And then we ask: does privacy play a role in creating trust, and what role do privacy laws play in creating trust? And if all that is put in perspective, organizations need to take initiatives to create trust. And finally, if you do all that right, you win trust. But what do you do in all that? You provide choice, you provide control, and you ask for consent. And when you do all that, you create digital trust. Well, that is very simply said, and very complex, very ambitious, to do. And it's a dynamic thing, because you continue those actions on an ongoing, continuous basis. How about talking about all this in detail with a privacy colleague who's very well known in the circles? I'm talking about none other than Katrina Destreé, so let's go and talk to her.

FIT4Privacy 01:16 
Hello and welcome to the Fit4Privacy podcast with Punit Bhatia. This is the podcast for those who care about their privacy. Here, your host, Punit Bhatia, has conversations with industry leaders about their perspectives, ideas and opinions relating to privacy, data protection and related matters. Be aware that the views and opinions expressed in this podcast are not legal advice. Let us get started.

Punit 01:45 
So here we are with Katrina Destreé. Katrina, welcome to the Fit4Privacy podcast.

Katrina 01:51 
Thank you so much, Punit, it's a pleasure to be here. Thank you for having me.

Punit 01:55 
It's a pleasure to have you. So let's start with a basic question. We talked about privacy a few years ago; before that, we talked about security, then risk and compliance, and so on and so forth. And then we were saying, AI will come, it will take away our jobs, and we'll have nothing to do; we can go on holidays. And nowadays we are talking about what we call digital trust. So in your view, how would you describe this concept of digital trust?

Katrina 02:28 
Digital trust is about confidence. Perhaps we are thinking about that as we're on holiday, thinking about all the processes it took for us to get there: taking our flight, or getting in our car and going through toll roads, our hotel and our accommodations, and all of those transactions. So we're thinking about confidence in those organizations and how they've processed our personal data responsibly. We're having, hopefully, confidence and trust that all the data elements we've shared have been properly processed: they're being used for the purposes for which they were requested, and they're managed along the way in a secure manner.

Punit 03:05 
That is very well said. I would usually not be able to add anything to that, but let me ask about the user's perspective. We as privacy pros or security pros can define things like that, but what would a user's expectation be? A typical user, let's say any of our children, family, friends or colleagues who are not privacy pros: what would their expectation be in terms of digital trust?

Katrina 03:36 
That's a great question, because I think their expectations are much higher than ours. They're much greater. They've grown up with iPads in their hands very early on, I'm guessing at age six, perhaps even younger, and they start, if not at home, then in their educational setting. So they're familiar with the digital world, and they start to ask questions earlier as well. I'm basically surrounded by Gen Z, so I have a lot of questions coming to me from my sons and their friends; I actually have three sons, so I have them coming from every angle, and they are asking me sets of questions. You know: Mom, do I need to give this information? What happens when I get this form? Should I really fill this out? Where is it going? Is this necessary? And I am surprised and impressed. I'm thinking, have they really been paying attention to me and my work, or have they gathered this kind of intelligence, this level of awareness, through osmosis? I think it's probably all of the above. It's also conversations they have within their own groups; they're asking themselves, do I need to give this information? And they're becoming more and more aware. So those expectations are increasing, and I think it's a dynamic exercise between those that are requesting the information and those that are providing it. Expectations are greater.

Punit 05:04 
That does make sense, because these people have some tech affinity, so iPads and iPhones and these things are natural to them, but they also have some digital savviness. They're not overly concerned: like when Instagram introduced a paid option for those who don't want their data used, they are open to letting their data be used. But when they are asked to share too much, then they also get sensitive: why do you need to know that? Is that really necessary? That's the language we privacy pros use, yet here it's the non-privacy pros using it. But in all this context of trust, or digital trust, and when we talk about user expectations as well, where does the concept, or the technology, called AI fit in?

Katrina 05:52 
AI is right there with us, and it can be both great and cautionary, right? AI can fit into the area of really being an effective tool; it's what we make it. From the user perspective, an everyday perspective, we think of AI in the bots, and we think of it as a way for companies to be efficient and also for us to be efficient. We're now used to having customer service situations over chat: we're multitasking, they're multitasking, and the bot options create efficiencies. And I think customers like that efficiency. I've also read the recent ISACA report, and it lends credibility to that statement: we as customers want to know that there's a high level of efficiency without the potential for human error. At the same time, we want human oversight, because we don't want to lose complete control of that processing. So it's very important to have that combination of AI plus human oversight when it comes to the bots. As for where AI fits into the picture, I think staying in the driver's seat is very important for organizations, seeing that bot for what it is. It's a tool, so it's not going to assume the risk. It's going to help with the efficient side of processing, providing the options. I think there's another area that can be very useful for us to consider, and that is using it as a tool to convey an organization's privacy practices, privacy familiarity, and commitment to privacy. For example, if you're in a chat session, the bot states: okay, choose your option. Is it your travel? Is it your account? Is it a question? Is it this or that? There can also be an option to say: would you like to know more about our privacy policy? Would you like to know how useful your data is going to be for the next series of questions? Here's how you can benefit from sharing these particular data elements. And I think as consumers, we like to have a lot of these personalized options.
I know, for example, I'm going to appreciate the fact that my car might guess where I'm going tonight. I'll get in, I'll have five minutes to spare, and I'll be happy: okay, great, it guessed the location where I might be going based on my patterns, and it saves me time. So I like that personalization. When I log on to my music and there's a recommendation for songs I might appreciate, maybe a new artist I haven't explored before, I like that personalization. But I want it to be a tool whose purpose is being efficient. I don't want to feel that I've lost control. I don't want to feel in the dark about where this information is going. I don't want to feel manipulated into giving more information than is necessary.

Punit 08:58 
I'm fully with you. I think "tool" is the right word you've used. It helps us get things done better; it helps us get things we don't know or understand. But it should ask us, give us the choice, give us the control. A few months back, I was listening to a podcast with a monk, and somebody asked him, can you use email? Are you tech savvy? And he said yes, and then he cracked a joke, saying: we use email without being attached to it. And that's the thing. It's the same here: as long as we are able to use AI and we feel that we are using it, that's fine, right? For it to use us, that's not okay, because we want to be confident, we want to have the control and the feeling that we are running it. And if the suggestions are annoying, we should also be able to turn them off; that's where the human oversight and control come in. Now, that's a fascinating conversation already, and I would like to take a turn towards the privacy aspect of it. So there is digital trust and there is AI, both elements: AI making suggestions, AI making life easier, AI making it efficient. But does privacy have a role to play here, especially in the creation of digital trust?

Katrina 10:28 
Absolutely. Privacy has a role to play in the area of controlling your personal data: how it's being used, how it's being transmitted, how it's being handled across the entire life cycle. So it's about staying in control; I think that's the key word. When I think of privacy, I think of a lot of words that begin with C. We have consent, right? We have choice, we have control. Probably many other words will pop up, but a lot of them seem to begin with the letter C. Choice, control, consent: very important in privacy. And so when we think of the privacy laws and regulations that have appeared over the past few years, this is giving us assurance, and it's giving us the feeling that there will be consequences, another word with a C, if these principles are not followed. And so we want that level of confidence, and we want that assurance that this is the direction being taken, that these are the messages being conveyed to the organizations that, we understand, need this data. And I should qualify: personal data, because privacy is about personal data. I also think that when we think of personal data, we need to remember that word personal, making it personal to us, so that we can stay in control of our personal data.

Punit 12:01 
That makes perfect sense: choice, consent and control over personal data create trust. That's a very powerful message. Now, to protect privacy, we have two things. One, you talked about the principles, which are self-imposed or sometimes imposed by law. But then there are also the privacy laws themselves. Do you believe these privacy laws are also helping create trust, especially, say, the GDPR, the CCPA, or the Delete Act in California?

Katrina 12:33 
Absolutely, because let's step back outside of our own circle of privacy and security professionals. The laws are helping create trust because the regulators are making a lot of their agendas public. For example, in California, the California Privacy Protection Agency, which is relatively new, and in Europe, the European Data Protection Board: when they have public meetings, they'll publish the agenda, which anyone can review, and anyone can register to join. There's actually a meeting this Friday in California that can be joined virtually. And if people feel perhaps self-conscious that they're not a privacy pro, thinking "I'm just, you know, an average person, but I'm curious," right? You can join anonymously. You don't have to reveal your personal information to join these meetings, and it's a chance to become educated. So that's the first level of awareness: that there are these laws to assure that personal information is being processed in a responsible manner, and they're educating people. That is the first step, this level of awareness, and then, going forward, how to sustain it. It's not like you're compliant one day and good, we're done, we're good to go, right? It's dynamic, it's ongoing. So with these laws, as we've seen, there are rules that come along afterwards; there are regulations: here's how you do this, here's how we activate these laws, and we go forward. There are opportunities for public comment as well, and oftentimes the entities that submit the comments are associations, so they're representing entire sectors. So again, we're stepping outside of our own circle of privacy and security professionals, we're getting into the general public arena, we're raising levels of awareness, and we're raising this conversation. Perhaps that's why Gen Z is tuned in and becoming aware.
Coming back to that topic, I also think Gen Z might have had more negative experiences with their personal information being compromised. And I'm not speaking necessarily about a breach; I'm speaking about misuse. They've posted on social media, and perhaps it got a negative reaction, and they questioned: why did I share that information? They step back and say, oh, that was personal to me; maybe I should be a little more cautious about what exactly I share. Or they're seeing news reports: here's what happened, and it was because there was access to this personal information. So I think there's a much higher level of awareness than we've ever had before.

Punit 15:20 
I agree with you. So if we look back at our conversation: there is the concept of digital trust; users expect it; privacy contributes; and AI is there, and when you use it with the right consent, choice and control, it creates trust. And then we talked about how laws are also there to help create trust, like you mentioned in California, where the authorities' meetings are public and people can join in. All of that is creating trust, and the laws and the authorities are there to guide organizations so they can sell their products while creating comfort for the consumer: comfort that the products are trustworthy, that nothing untoward is being done with their personal data, and that they remain in control of it. So in this situation, if you look at it from an organization's perspective, how would an organization demonstrate the leadership, or demonstrate the actions, to create or inspire, let's say, a digital trust environment from a consumer standpoint?

Katrina 16:30 
Yes, that's a great question, because organizations are the ones processing the personal data. So to demonstrate leadership, organizations need to be proactive. And I've seen some very good examples of this, especially in the way privacy policies have evolved. I use the word policy because that's what the public knows: the privacy policy. The correct word, of course, is privacy notice, because that's the external-facing document; policies are internal-facing documents. But for the sake of simplicity, I'll just say the privacy policy, the external one. I've seen these policies get better and better. I've seen them take on a more conversational tone: not just "we take your privacy seriously," that's the old formula. Now it's like FAQs: here's what we process, do you have any questions? And they will list questions, they will list scenarios, they will use icons to point to more information. I do believe that companies demonstrating leadership will be straightforward: yes, we are processing your personal data; yes, we are processing it for these particular purposes; here are some scenarios of our processing of personal data. Some companies even publish their policies on dealing with third parties: if you are a third party or vendor, we expect you to adhere to the same standards that we determine are necessary for proper processing. I've been impressed with certain companies that have made such statements public. That inspires trust. That's an example of leadership, of thinking in advance and really getting out there and being proactive: having training, digital trust training, privacy training, not just a small segment once a year that people will probably forget about, but really making it part of the DNA. Not just "we need to say this for our customers," but also for employees, who have their personal data processed too.
They want confidence, so think of every stakeholder the organization is working with: that can start internally with employees; it can extend, obviously, to investors; to NGOs, non-governmental organizations, and watchdogs; to regulators; it can be part of media relations. That can be a wide range of stakeholder engagement: being proactive, anticipating what those stakeholders expect regarding the processing of personal information, and already being prepared to make a statement on that. That demonstrates a level of accountability for how they are processing personal data. So, being very proactive, and again, I'll use that same visual, staying in the driver's seat. Having that keyword, accountability, is super important, and that's how they'll demonstrate leadership.

Punit 19:41 
Absolutely. And I think the distinction you made between the notice and the policy matters. You do publish a notice, but it's also okay for a company to be upfront, to take the leadership and publish the policy, saying: this is how we are protecting data internally. Now, I was working with a company, and I was blessed to be with such a transparent company, one that published the notice and, along with it, said: this is our data protection policy across the globe; if you want to read it, read it. And I thought everyone was like that. Then I was consulting for another company, and I said, let's publish that. They have transparency in their values, but when I said, publish the policy, they were scared. They were worried that others would know what they do. No, others will be happy to know that this is what you do; it's a competitive advantage. But they were concerned. So it was a bit awkward to say: you have transparency in your values, but you are not open to sharing your policy. So you are right: you need to be proactive, you need to take it on, make things happen and share things proactively. But let's imagine you do all these proactive things: you publish a notice, you share your policy, you give control, you ask for consent. Is that going to be good enough to win customer trust in this, what shall we say, world of AI, digital tech and so on?

Katrina 21:03
 
The phrase "good enough" sounds like stopping, right? It sounds like a checkbox. It sounds like: hey, we're good, turn the page, on to the next agenda item. We have to think of it differently. Instead of saying "done" or "good enough," we have to think "dynamic." What's the next step? Again, being proactive, anticipating. The world is changing around us, right? I mean, everything is changing, so we need to stay on top of that and anticipate the change. If we have a list of FAQs, anticipate that it will grow as we use AI. Anticipate that more is around the corner as we offer options to help customers control their personal data. Anticipate that there are going to be more requests, more options, more, etc. So I think the key word is dynamic, not good enough.

Punit 22:05 

And I think dynamic and continuous, because on a continuous, dynamic basis you stay alert: you keep watching the market, you keep watching consumer behaviors, you keep checking your risk posture, and then on a continual basis you try to adapt, try to change where you need to improve and fine-tune, whether it's your policy, your actions, your risk status, or the assessments you've done. As you do that, you create digital trust, and as you create that digital trust, consumers come to you. But then you still need to remain dynamic and continue doing those things. That leads us towards the end of the conversation, because it's been a fascinating one, and we have covered much ground in a very short time. But I would like to ask you to describe what you do on a day-to-day basis, because it's been a very profound conversation, and people would like to understand what you do and how they can reach out to you if they want to talk to you.

Katrina 23:07 

Well, thank you so much for that question. It has been a great conversation, Punit; I really do appreciate it, and I appreciate your thought leadership as well, as we were discussing a little before we started the recording. So yes, I advise public and private organizations on how to improve their privacy awareness and sustain their activities, and a key part of that is in the area of training. As you know, I do privacy training: I'm on the IAPP faculty for the CIPM program, and I've really enjoyed that, both in person and online, and it reinforces my own understanding of all the things that are changing in the area of privacy. It changes every day. As I like to joke, I go to sleep at night thinking that I'm aware and current, and I wake up asking which new privacy law got passed while I was sleeping, and what happened overnight in other parts of the world in the area of privacy, because it's very dynamic and very exciting. And in my off time, with those who will listen, I talk about privacy and security and artificial intelligence; I think these areas are quite fascinating. In terms of reaching out to me, it's definitely LinkedIn: my name, Katrina Destreé, on LinkedIn. I am on there every day, and that's the best way to send me a message. And of course, also through my website, agreaprivacyesg.com.

Punit 24:47 

That's wonderful, and I would say it was wonderful to have you, Katrina. Thank you so much for sharing your knowledge.   

Katrina 24:54 

Thank you very much, Punit, I really appreciate the work and your leadership. Thank you.   

FIT4Privacy 25:00 

Thanks for listening. If you liked the show, feel free to share it with a friend and write a review. If you have already done so, thank you so much. And if you did not like the show, don't bother, and forget about it. Take care and stay safe. Fit4Privacy helps you to create a culture of privacy and manage risks by creating, defining and implementing a privacy strategy, including delivering scenario-based training for your staff. We also help those who are looking to get certified in CIPP/E, CIPM and CIPT through on-demand courses that help you prepare and practice for the certification exam. Want to know more? Visit www.FIT4Privacy.com, that's www dot FIT, the number 4, privacy dot com. If you have questions or suggestions, drop an email at hello@fit4privacy.com. Until next time, goodbye.

Punit 26:16 

Now, for those of you who have been in privacy, or who are in the privacy field, you know that January 28 is coming up, and January 28 is Privacy Day, or International Data Privacy Day. So for those of you who are in privacy, take this opportunity to pass on a message to your staff and your colleagues, and emphasize how privacy can help create digital trust. And if you need any help, do not hesitate to reach out, because we are doing this with many other clients and companies, supporting them in creating this awareness and these campaigns. So enjoy Privacy Day.

Conclusion

Earning digital trust is not a one-time task—it’s an ongoing effort. Organizations must go beyond simply following rules; they need to actively address people’s concerns, stay honest, and give users real control over their data. By focusing on clear communication, genuine consent, and user empowerment, businesses can meet the growing demand for privacy.

As advanced technologies like AI continue to evolve, staying committed to privacy will help organizations balance innovation with responsibility. This discussion emphasizes that accountability and adaptability are key for businesses to lead in today’s digital age.

ABOUT THE GUEST 

Katrina Destrée is an experienced privacy and sustainability leader with a proven record in developing and implementing privacy programs, sustainability initiatives, and reputation-building strategies for technology and financial services firms. She is based in San Diego, California, after having previously lived and worked in Europe for 20 years (Belgium, France, the Netherlands, and Poland). An IAPP Fellow of Information Privacy (FIP), she holds IAPP's CIPP/E and CIPM certifications and recently joined the IAPP Faculty. She is an independent consultant at Agréa Privacy & Sustainability Communications after working in global privacy departments at Dell Technologies, Silicon Valley Bank, and PayPal in Silicon Valley. Prior to privacy, she held senior management roles in sustainability at Nokia and the Global Enabling Sustainability Initiative (formerly a UN initiative) in Brussels.

When data protection and privacy became the leading issue facing the entire telecom sector from a sustainability point of view, she increased her focus on data privacy. Her work in the telecom sector includes enabling technologies, machine learning, and AI. She loves Privacy Impact Assessments (PIAs) because they bring people together and prove to be far more interesting as the stories behind the processing of personal data evolve.

She holds a Master of Arts degree from the Fletcher School of Law and Diplomacy at Tufts University in Boston and a Bachelor of Arts degree from the California Polytechnic State University in San Luis Obispo. She also continues her life-long passion in classical ballet and teaches beginning ballet.    

Punit Bhatia is one of the leading privacy experts. He works independently and has worked with professionals in over 30 countries. Punit works with business and privacy leaders to create an organizational culture with high AI & privacy awareness and compliance as a business priority, by creating and implementing an AI & privacy strategy and policy.

Punit is the author of the books "Be Ready for GDPR", which was rated the best GDPR book, "AI & Privacy – How to Find Balance", "Intro To GDPR", and "Be an Effective DPO". Punit is a global speaker who has spoken at over 50 global events. Punit is the creator and host of the FIT4PRIVACY Podcast, which has been featured among the top GDPR and privacy podcasts.

As a person, Punit is an avid thinker and believes in thinking, believing, and acting in line with one's values to have joy in life. He has developed a philosophy named 'ABC for joy of life', which he passionately shares. Punit is based in Belgium, the heart of Europe.

For more information, please click here.

RESOURCES 

Listen to the top ranked EU GDPR based privacy podcast...

Stay connected with the views of leading data privacy professionals and business leaders in today's world on a broad range of topics like setting global privacy programs for private sector companies, role of Data Protection Officer (DPO), EU Representative role, Data Protection Impact Assessments (DPIA), Records of Processing Activity (ROPA), security of personal information, data security, personal security, privacy and security overlaps, prevention of personal data breaches, reporting a data breach, securing data transfers, privacy shield invalidation, new Standard Contractual Clauses (SCCs), guidelines from European Commission and other bodies like European Data Protection Board (EDPB), implementing regulations and laws (like EU General Data Protection Regulation or GDPR, California's Consumer Privacy Act or CCPA, Canada's Personal Information Protection and Electronic Documents Act or PIPEDA, China's Personal Information Protection Law or PIPL, India's Personal Data Protection Bill or PDPB), different types of solutions, even new laws and legal framework(s) to comply with a privacy law and much more.