Mar 14 / Punit Bhatia and Luke Mulks

How do you Protect Privacy in an AI World?

As artificial intelligence continues to shape the digital landscape, the challenge of protecting user privacy has never been more pressing. The rise of data exploitation, intrusive tracking, and growing consumer distrust has led to new regulations like GDPR and a demand for privacy-first solutions. But how can businesses adapt to this new era without compromising efficiency and monetization? Can AI enhance privacy while still delivering personalized experiences? And what role do users play in shaping a more secure digital future?

This episode explores the evolution of privacy-preserving technologies, the shift towards user-centric business models, and the challenges and opportunities in building sustainable, privacy-first solutions. From AI-driven search engines to new advertising approaches, the discussion highlights how innovation can balance personalization with user control.

Transcript of the Conversation

Punit 00:00 
How do you protect privacy in an AI world? Yes, AI is coming. New technologies are coming, and privacy is becoming more and more of a challenge. Now the key question here is, how do you protect privacy, and how do you keep trust in this digital world? Is advertising going to take or analyze all your data? Are there any ways in which your privacy is protected? What are some of the practical steps you can take? When we talk about practical steps, we often talk about using a safe browser, a privacy-protecting or privacy-respecting browser. Sometimes we talk about using a safe or privacy-protecting search engine, and sometimes we also talk about using ChatGPT or similar tools, and then we raise concerns about privacy. So is there a privacy-protecting AI or GPT? Now, all these are fascinating questions, and who better to answer them than someone who has been a pioneer in this space of privacy, crypto, and protecting privacy. You may have heard of the brand Brave, through the Brave browser, The Brave Technologist podcast, or the Brave search engine. And I also learned there's Leo AI that they've come up with. So I'm talking about none other than Luke, who is the VP of Business Operations at Brave Software, and he's the man behind the action. Let's go and talk to Luke Mulks. 

Fit4Privacy 1:35 
Hello and welcome to the Fit4Privacy Podcast with Punit Bhatia. This is the podcast for those who care about their privacy. Here, your host, Punit Bhatia, has conversations with industry leaders about their perspectives, ideas and opinions relating to privacy, data protection and related matters. Be aware that the views and opinions expressed in this podcast are not legal advice. Let us get started. 

Punit 2:04 
So here we are with Luke. Luke, welcome to the Fit4Privacy podcast. 

Luke 2:08 
Thanks for having me on. I'm excited to be here. 

Punit 2:12 
It's a pleasure to have you. And let me start with a fundamental question. Let's go back, say, 10 years. We were talking about data, data being the new gold, everything being on the internet. And then came privacy concerns, because privacy was being exploited, or data was being exploited in different ways. And then came GDPR, and thanks to that, we were talking more about privacy. Some were saying there's more regulation than necessary, and some were saying it's necessary. That debate kicked in. Then the debate about a culture of privacy kicked in, and the idea that privacy leads to consumer trust. And then, all of a sudden, we were talking about AI, or emerging technologies, and people said AI will kill us. Who knows? Will it? Won't it? But the dust is settling down, and now we are starting to talk about digital trust, or consumer trust in the digital world. So how would you describe this digital trust concept? 

Luke 3:10 
Yeah, it's a great question. And I think Brave's approach on this, and my own personal take, is that you've really got to build the software for the user first. And that's really been the missing piece of the puzzle for a long time. When the web came out, initially people thought of it this way: the user's agent is the browser, and that agent should be working on behalf of the user. The problem is that as this technology scaled, as the web scaled, as advertising scaled, as all these things scaled, they scaled exponentially. And unfortunately, the methodology used while they scaled required the user handing over their privacy as kind of the cost of doing business. So what's been missing from the equation are tools designed to operate by putting that user's interest first, which means you're protecting their privacy. It means you're making the software as secure as possible. It means you're building a piece of software where you don't need to know everything about the user, so the focus is actually on the service you're providing, and not on getting as much of that user's data as possible to use for other things. And so that's really been a void in the market. In a lot of ways Brave was really early to bringing solutions there, and in a lot of ways it was really late too, right? If you think about programmatic advertising, and how Google and Facebook really drove a lot of the monetization of the internet and became this empire over advertising and monetization, it was doing so off of methodologies that capture users' data. So in a lot of ways we were early on the privacy discussion, but in a lot of ways we were late as far as addressing the problem before it had manifested. That said, we are where we are, and we've been building out these tools for years now. And I think with AI it's even more important, because it is a very intimate relationship between a user and AI. A lot of personal data, your thoughts, your information, all of that data you're inputting that's being used in these models, is very, very close to users, and it can be very identifiable too. So Brave is really continuing to lead in these new emerging areas, both on the Web3 and crypto side and in AI, asking: how can we provide user-first options there for users? Because when we started, this stuff was very abstract. We would go into meetings, and people would be like, look, no one cares about privacy. Back in 2016 it wasn't even really something people were interested in. Then, like you mentioned, GDPR rolled out, and people were like, okay, we have to care about it now, so what are we going to do? And everybody was trying to put a square peg in a round hole, trying to make something that was designed to work off of collecting user data all of a sudden safe, and it just doesn't work. 
I mean, we saw this with Google, with FLoC, and with some of these other solutions where they tried to make something that basically works off of user data be compliant, and it just is kind of incompatible at the end of the day, whether that's players in the industry being pretty difficult forces to work with and move toward that change, or just the fact that, if your whole targeting model is based off of users' data in the cloud, you're using a model that's going to require collecting all that data. So one thing we've done as a new player in the space, or a newer one at least, is try to introduce ways that businesses and users can work together where users don't have to have their data be lost at all. That user-first principle is kind of our big mantra and our big call to the public. And what you're seeing now too is that it's not just us. Proton has got 100 million users on their privacy products, DuckDuckGo is becoming, you know, a good player in the space, Signal has just gotten huge adoption, right? There's a lot of privacy tooling and options now, a lot of tools that people are using that weren't even in the realm that long ago, yeah. 

Punit 7:25 
And I think a few years back, if we talk about the privacy concerns, as you were mentioning, one of the concerns was around advertising. The tech players said, for advertising, we need the data. And consumers were like, how can you have my data and use it in, maybe, say, unethical or unlawful ways? That debate has not settled down, but maybe you can give us a perspective: how do you see the future of advertising in a more, what do we say, privacy-focused way, so that consumers would trust that their data is being used in a lawful way and only if they allow it to be used?

Luke 8:08
Yeah, no, that's a great question too. One of the things that we came to market with really early was privacy-preserving advertising, and it leveraged our position as a browser: the user's information is in the browser, so why have everything that's matched based off of the user's data be done in the cloud when you can actually serve an ad from the browser, based off of the full corpus of data that's there? By moving the way ads are targeted and matched onto the user's local client, onto their browser, it provided a way of targeting ads that was actually better and didn't require all these companies to use that data. And also, there's just such an excessive amount of waste and inefficiency in the advertising model that proliferated and became the core model of the, quote unquote, free web. It's so inefficient. I think people see it now, when you've got these cookie consent dialogs that are required in Europe: in some cases there are 200 or 300 companies listed on them, and we don't even know who they are. You don't know what they're doing. But even worse, I think, is the interesting timeline here. Edward Snowden had all of his revelations around 2012, 2013-ish, right? After that, you see this proliferation of business intelligence, data warehousing, and all these things starting to happen in the business world. And now the government can just go and buy the data. They don't even have to do what they were doing back when the Snowden revelations came out; they found a way to do it commercially and make it legal enough. So you really have to be thinking about, one, how can I make it effective for advertising? Because do I really need to know every coordinate you're located at, at all times, in order to be able to target you with a pair of Nikes? Probably not. I don't think so. I think you need to know if the user has an interest in getting new shoes, and if so, let's reach them while that interest is hot. Or if they need to know about Nike, let's make sure we can put Nike in front of them, if they're an addressable user. There are problems to solve, but people should not have to give up so much data. And I think you'd be hard pressed to find somebody now, compared to ten or eight or five years ago, who hasn't been impacted negatively by data breaches and a lot of these new threats that have emerged, and the best policy you can have is the one that collects as little to no data about the user as possible. The problem has always been that there aren't very many options that allow you to make that business decision while still being able to make money with it, right? So I think what we're starting to see now is an emergence of new companies, or older companies, that are starting to look at new tooling and new ways to target things. 
But on the advertising side, I think there still needs to be a wake-up call, and I think it's going to take a while longer for big tech to move on this. You've seen it before, where Google spent years saying, okay, we're going to sunset third-party cookies in Chrome, just a couple more years, a couple more years, and now they're like, we're not going to do it. It's like they can't kick the drug; they're so dependent on it. And if you look at what's happening with YouTube, where they're really putting as many ads in there as possible, the dynamics there too, it's not just about privacy. It's also come into this realm of censorship, and how much a creator has to alter what they're trying to put into the world because they don't want to lose money either. These issues have all gotten so big that it's going to take them a little while, I think. But that's all opportunity for folks that want to do something different and novel and bring something that's effective and private to market. And that's kind of what we're doing here. With Brave, what we've done is basically block all of that old-world third-party cookie stuff by default, clean the slate, and then show that, look, you can have advertising if it's first party and you're not throwing third-party tracking into it; Brave won't block that. And if you have new, privacy-preserving ways of advertising, that's allowed too. Our search engine has ads in it, and in Brave, users can control it beyond that as well. But you've really got to put that user's interest first and make sure their privacy is protected. If you go to a site, and that site is advertising things without third parties, the site-to-user relationship is still intact, and you're not pulling the wool over their eyes and having 300 or so other companies scraping their data without them knowing about it. And I think, too, there's just a practical part of this where, you know, it's 2024, we're coming on to 2025, and you go to some of these sites and the wheel never stops spinning on the loading even though the content is all loaded. These things just keep collecting and collecting all this data. And it's like, really, are we moving toward a better web for users? Or are we going to keep playing these stupid games that are not just high risk to users, but now basically a big liability for the publishers and the content creators too? Do they really want to have to collect all this data, or let other people do it? I think the public is starting to turn on that in a major way. But advertising is going to be slower to move on it, from what we've seen at least. 
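To make the on-device matching idea Luke describes a bit more concrete, here is a minimal sketch of the general pattern: the same ad catalog is shipped to every client, matching happens locally against interest signals derived from the user's own browsing, and only the chosen ad is displayed. Every name, category, and data structure below is hypothetical and exists only for illustration; this is a sketch of the approach, not Brave's actual implementation.

```python
# Sketch of on-device ad matching: nothing about the user leaves the device.
# The catalog is identical for everyone; only the selection is personalized,
# and it is computed entirely on the client. (Illustrative names only.)

from collections import Counter

# Hypothetical ad catalog, downloaded periodically by every client.
AD_CATALOG = [
    {"id": "ad-1", "advertiser": "RunFast", "categories": {"shoes", "running"}},
    {"id": "ad-2", "advertiser": "CloudVPN", "categories": {"software", "privacy"}},
    {"id": "ad-3", "advertiser": "TravelCo", "categories": {"travel", "flights"}},
]

def local_interest_profile(page_categories):
    """Build an interest profile from categories observed locally while browsing."""
    return Counter(page_categories)

def match_ad(profile, catalog):
    """Pick the ad whose categories best overlap the local interest profile."""
    def score(ad):
        return sum(profile[c] for c in ad["categories"])  # Counter returns 0 for unseen keys
    best = max(catalog, key=score)
    return best if score(best) > 0 else None

if __name__ == "__main__":
    # Categories inferred on-device from pages the user visited; never uploaded.
    profile = local_interest_profile(["running", "shoes", "news", "running"])
    print(match_ad(profile, AD_CATALOG))  # the match is computed client-side
```

The point of the design is that the advertiser learns an ad was shown (typically via aggregate or anonymized reporting), not who saw it or what they browsed.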

Punit 13:59 
Yep, it does make sense. But if we have to protect privacy, especially in advertising, does technology like, say, encryption or anonymization have a role to play? Can that be a game changer, saying, let's encrypt the data and then it's not accessible for advertising? Is that a real possibility? 

Luke 14:23 
You're starting to see efforts around this, like trusted execution environments, or other methods that are out there. I think these are going to be explored, and you're going to see lots of shades of gray on this, partially because, if you look at Europe, Europe has a clear definition of what user privacy is, but in America it's still very much the case that privacy is whatever the big companies say it is, right? And so you've got this world where Apple's campaigning on privacy, but if you make a transaction through Apple, Apple knows what you're doing, right? So there are all these shades of gray, and I think companies will need to innovate on this front further. Where Brave sits on the spectrum is going to be more on the aggressive side, like zero knowledge: we do not want to know. But you're going to see shades of gray, and you're also going to see companies getting into the realm of, look, let the user consent to giving data to us. And that's not bad either; if the user says it's okay, then the user is in control. And I think that's a model that's starting to get more traction now than before, because data is pretty difficult to explain to users. From our experience, a lot of the time users didn't know the difference between what their browser was and what their search engine was, because of how tightly all these things are integrated, right? Data privacy and these things just have to work, and they have to work well, and you can't expect the user to manage it all. That's, I think, part of the problem with the regulation: we're really dumping all this stuff on users, and it's better than nothing in a lot of ways, but they're just going to blindly click through everything. So the real way to innovate here is to build stuff that works well and works well with privacy. And I think we're finally getting to the point where the tooling is good enough that this is doable, and examples like Brave and others have gone to market and shown that, one, there's market demand for this, and two, there are technical ways to implement this at scale that work. Those are big hurdles. When I started at Brave, back in 2016, the big effort was around getting domains to adopt HTTPS, right, and have that level of security going. And that's scaled pretty well. But these things take a while to roll out at a larger scale. You've got to have big anchor partners that are going to do it or change it on the tech side, or have something with such demand that it already works well. Eventually these things will meet in the middle, but we're going to be the standard bearer for the ones growing, the users that are becoming militant around this. 

Punit 17:09 
Yeah, but you mentioned that users have to be doing a few things, or have to be in control of their privacy. So apart from, say, using the right browser or the right search engine, are there any practical steps individuals can take to enhance their digital privacy, maybe without sacrificing convenience, as we call it? 

Luke 17:35 
Yeah, and that's been the other gap. In the past, having privacy tools meant there was some trade-off on convenience or on web compatibility. That's a big thing we've dealt with too, especially as, over time, a lot of these monetization services get built into functional things on websites. There are practical things people can do. One, there are services you can use for anonymous email forwarding. There are a couple of those out there, so you're not having to use the same email address for everything. Some people use that for convenience too, because they don't like getting spammed at their main email account. Another practical measure is using fewer native apps on your device and using more web services; that's a big way to limit your digital footprint, because once those native apps are installed on mobile, they have access to a lot of your information that you might not be aware of. Also, just be careful with practical measures around this stuff. We won't get into the browser stuff, but there's the search engine you use, there are email services; we're getting to a point where there's almost a privacy product or privacy tool for most things, and those things are getting easier to use. So, for a first step, you can follow Naomi Brockwell; she's really great at listing out different privacy tools and reviewing them. A lot of it is about finding the right fit for you, and then finding something that's easy and that other people are also on. But there's also a lot of privacy lipstick out there, where services like Telegram will claim that they're private and they really aren't, right? There are a lot of these hooks that get used as more of this becomes marketable, and people should watch out for them. But a lot of it is just being mindful of what you put out there when you're posting on social. Don't post stuff that you wouldn't want people to save; if you're traveling, maybe don't post that you're getting on the plane to go somewhere, right? Things like that. Also be mindful of how things like location services are enabled, and try to use options like that more sparingly, because a lot of the time, once these things are on, they're on. So a basic thing you can do is just remove a lot of the native apps from your phone that you don't use; that's a big win on the privacy side too. And then, of course, like I mentioned earlier, our products are helpful on that front. It's getting easier. It's not at the point where you can pay with your palm at Whole Foods yet, but that's been the challenge, right? To be able to build something that works in that way. And from a lot of what we've experienced, that means you have to do a lot more work as a builder in the space up front, because there are a lot of things that have become convenient that come at the cost of privacy. 
So doing something that normally takes an hour or two for someone to implement with the incumbent technology takes a little bit longer for us to engineer initially. And then, when we're engineering it to be private, we have to figure out the next step: how do we make this easy for the implementer on the other end? Because they have limited time, and a lot of the time they're really not going to move unless you've got enough demand for what they're trying to do. 

Punit 20:59 
Yeah, and I think you yourself are doing a lot of work in this area through Brave. I mean, you have a browser, you have a search engine, you have a podcast, and I think you also now have an AI assistant. Maybe you can talk about that and explain how these things help protect privacy? 

Luke 21:17 
Yeah, absolutely. So, like I mentioned earlier, Brave is a user-first browser, so we block all of that third-party tracking and scripts and all that as you navigate the web. We also do a lot with the core of the browser. It's a Chromium fork, but we go down to the core browser code and disable and proxy a lot of the calls that go back to Google, which is surprisingly a lot. So we harden at that level too. What we've also done on the AI front is this: a browser with an AI assistant can be a super powerful thing, right? A lot of what we use ChatGPT for today is kind of separated from the browser. But if you have a browser with something like ChatGPT, or at that level, you can have something that helps you with everything you would use a browser for. So if you're at YouTube and you want to summarize the video, you can have Leo, our browser assistant, do that for you. You can have it transcribe video conferences through our Brave Talk product. You can use it for all sorts of things. We also, like you mentioned, have Brave Search, a private search engine, and that's our own index, so we're completely independent from Google and Bing. And one thing we've been able to do with that is offer an API for our search index that allows real-time data to be used with AI. A lot of people get blocked on ChatGPT by having, you know, Common Crawl data that's a couple of years old or whatever; with Brave Leo, and with what we're doing with the API service, you can get real-time data in your outputs. And beyond that, there are some other things too. We're seeing all sorts of different people using AI for different things. We've added the ability to bring your own model to Leo, so you can use Mixtral or Claude or one of the Llama models, one of these other models that we include, but you can also bring your own model. And we're playing with things like using local models as well from the browser, because the browser is a pretty data-rich thing, and you could use it for way better personalization locally, et cetera. So we're doing all of that with that user-first principle. We're making sure that users' information is just processed and forgotten quickly; we're not storing the data. A lot of what you'd expect from Brave, we're applying at that AI level, which a lot of folks aren't. So it's one of those nice things where having that assistant in the browser can be useful if you're a developer, useful if you're a user, useful for all sorts of different things. Beyond that, our search engine has an AI answer engine built in. I think about 40% of user queries will return a generative AI answer at the top of the search results, with all sources cited and attributed properly, et cetera, so the publishers get credit for that. We also have the ability for a user to override any query and get an AI answer. So, as part of that convenience question you had earlier, you've got to meet people where they are, right? People know search engines; they go to them every day, right? 
If you want AI to get adopted, bring it there, bring it to the browser, bring it to where the people are, and make it useful for them. And I think we're at a stage with AI more broadly where everybody's talking about it being ubiquitous and everywhere, and every Fortune 500 company is trying to figure out how to get their AI thing going. We have these platforms where we can bring it to people right away and make it really useful. But it's also one of those things where the concept of a browser can completely change over time. With an AI assistant built in, you could have the assistant ordering your tabs in certain ways; you could have it doing all sorts of different things to help you out that make it super useful. So we're right on the cusp of getting to that level with it, but first you have to start at ground zero. That's what we're doing: first, let's get something in that's like a prompt people are using, that's privacy preserving, and let's do this in search too. Those are big anchors for us to start from, a launch pad, and we'll see where we can go with it. And we're giving users options: bring your own model if you want, or use what we've got; have a free version, or pay for a premium version, all these different things that let people use it based on what their needs are. 
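For readers who want to see the real-time-data pattern Luke describes in code, below is a minimal sketch of grounding an AI answer in fresh search results: query a search API, build a prompt that cites the sources, and hand it to whichever model you bring. The endpoint and header names follow Brave's publicly documented Search API at the time of writing, but treat them, along with the environment variable and the placeholder generate_answer() function, as assumptions to verify against the current documentation; this is not Brave Leo's implementation.

```python
# Sketch: search-grounded AI answers. Fetch fresh web results, build a prompt
# that cites each source, then pass it to whatever model you choose.
# Requires the `requests` package and a search API key.

import os
import requests

# Documented Brave Search API endpoint at the time of writing (verify before use).
SEARCH_ENDPOINT = "https://api.search.brave.com/res/v1/web/search"

def fetch_results(query, api_key, count=5):
    """Fetch recent web results to use as grounding context."""
    resp = requests.get(
        SEARCH_ENDPOINT,
        params={"q": query, "count": count},
        headers={"X-Subscription-Token": api_key, "Accept": "application/json"},
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape per current docs; adjust if the schema changes.
    return resp.json().get("web", {}).get("results", [])

def build_prompt(query, results):
    """Number each source so the final answer can attribute its claims."""
    sources = "\n".join(
        f"[{i + 1}] {r.get('title')} - {r.get('url')}\n{r.get('description', '')}"
        for i, r in enumerate(results)
    )
    return (
        "Answer the question using only these sources and cite them by number.\n\n"
        f"{sources}\n\nQuestion: {query}"
    )

def generate_answer(prompt):
    """Placeholder: call your own local or hosted model here."""
    raise NotImplementedError("plug in your chosen model")

if __name__ == "__main__":
    key = os.environ.get("BRAVE_SEARCH_API_KEY")  # hypothetical variable name
    if key:
        results = fetch_results("best privacy-preserving browsers", key)
        print(build_prompt("best privacy-preserving browsers", results))
    else:
        print("Set BRAVE_SEARCH_API_KEY to run the example.")
```

The same pattern works with any search index and any model; the privacy angle is simply that the grounding step queries an index that does not profile the person asking.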

Punit 25:43 
That's good. So people have the Brave browser, Brave Search, the Brave podcast and Leo AI, which they can use. But when you launch things like this, and I know you've been doing it for more than 10 years now, the question usually is: in the free market, when free products are introduced, people know that money is being made through the ads, so the product is commercially viable or sustainable. But in this context, when you say you're not using the data for ads, and you will not do anything that gives you leverage on the data, or, say, monetization of the data, the question is, how sustainable is it commercially? Is it really sustainable? 

Luke 26:27 
Yeah, well, it's a great question. Part of this is that we have monetization of ads through the search engine; that is a revenue source. We also have premium AI: Brave Leo has a premium version, right, that has higher rate limits, more model selection, things like that, for more advanced users. And it's an open question, right? Right now we're trying to see what the usage is like and how that usage scales. What we're finding is that people use it. There are some people that use AI on a similar cadence to what they use a regular browser for, and there are some people that use it way less often, right? So we've got a dynamic where, over time, the cost of doing it is going to decrease; currently it's pretty high. But you've also got this rush of adoption and people trying these different things out, so it's still a pretty experimental phase. How we approach the premium model might change over time too; we might say, look, make all the models available for everybody and just set rate limits at a cost, right? And we know that we can monetize it to such a level because we've got, you know, a shopping assistant or something like that that will bring us revenue share on things that are sold, if we're putting a helpful service out there. These types of things will come with time, and right now we're getting the building blocks in place while there's demand in the market to try new things with them. So I think it's an open question, but I think we'll be fine on it. Just looking at our early numbers, and at where the world's going and how powerful these tools are, it's going to be like most things, where you'll have a free and a premium tier. And there might be ways, since we have an independent index, where if users want to help us build the index, we could potentially offer a free version for that, or something like that. There are lots of ways we can adapt this over time, but right now, having this free leg that's not as robust, and then having a premium option there too, is helping us navigate this initially. And monetizing the search engine is something the teams have been working on for several years, getting that bootstrapped and going, and we're at a stage where that's picking up as well. So for most of those AI responses we generate in search, we can put ads below them that help monetize the whole ship, basically. 

Punit 28:47 
Yeah, and I think from a business perspective, you're also offering ads, so I'm presuming that's done in a privacy-protected way, but you still offer ads on your platform.

Luke 28:59 
Yeah, on search we have search ads that are based on what the users are querying. So if they're querying something like autos, we can show them a commercial result at the top of the search that's just as good as the non-commercial result they'd otherwise get as a search hit for that. We have our Brave Ads. We have a new tab takeover unit that's like a big billboard for everybody. All of these are privacy preserving by default, and they still live in that realm of Brave not wanting to know who the individual users are, but putting them in touch with the businesses that want to get their products to the right people. 

Punit 29:35 
That's wonderful, because that means you're doing a lot of good work to help people protect privacy and have trust in the digital world, and I'm glad to learn about it. But would you have one final message about, basically, how people can protect privacy? 

Luke 29:53 
Yeah, I think a huge part of it is around education: learning about what tools are out there, learning about what projects are out there. And one of the ways we wanted more people to learn about this was by launching a podcast. So we launched The Brave Technologist podcast about a year ago, and we have experts on, whether they're business leaders or academics or regulators and policy makers, folks from all around the world, a lot of people from privacy software too, and even people from our own privacy research team, each coming on for about a half-hour episode. The idea is to let people know what's out there for them. What practical measures can they take to protect their privacy? How are people in the industry thinking about privacy? Because, like you mentioned at the beginning, a lot of people are like, oh, is this going to wipe out the population? There's a lot of this hyperbolic kind of fear out there, but one of the things we really want to surface is, what are the practical concerns from the people who are building, or the academics who are researching and developing? And a lot of this comes down to privacy concerns and accessibility concerns. So we have these two legs with AI, where a lot of people are saying, look, if a country is not making AI accessible to its people, they're going to fall a generation behind as this stuff scales. So the challenge becomes: how can we do this in a good, privacy-preserving way? So yeah, if you go to brave.com/podcast, that's where you'll find our podcast, and the education is there for that. And then, yeah, check out Naomi Brockwell too. She does a lot of great privacy product and tool reviews and analysis, things like that, to help you build your toolkit. 

Punit 31:36 
So thank you so much, Luke. And one final question: if people want to reach out to you or connect with you, what's the best possible way?

Luke 31:45
Yeah, I'm on X. Luke Mulks, L-U-K-E-M-U-L-K-S, is my handle on X, and you can DM me; my DMs are open. And then, if you want to try all our stuff out, just go to brave.com. It's free, super easy to use, and I love all feedback too. So don't be shy about DMing me on X with anything good, bad or ugly. Especially bad or ugly; I'd love to make sure we're fixing stuff and putting things out there that people want to use. 

Punit 32:11 
So that's wonderful, Luke. We will put that detail in the show notes as well. So with that, I would say thank you so much for your time, and it was wonderful to have you. 

Luke 32:21 
Likewise, man, thank you for having me on. Really appreciate it. 

Fit4Privacy 32:25 
Thanks for listening. If you liked the show, feel free to share it with a friend and write a review. If you have already done so, thank you so much. And if you did not like the show, don't bother and forget about it. Take care and stay safe. Fit4Privacy helps you to create a culture of privacy and manage risks by creating, defining and implementing a privacy strategy that includes delivering scenario-based training for your staff. We also help those who are looking to get certified in CIPP/E, CIPM and CIPT through on-demand courses that help you prepare and practice for the certification exam. Want to know more? Visit www.fit4privacy.com, that's www.FIT, the number 4, privacy.com. If you have questions or suggestions, drop an email at hello(@)fit4privacy.com. Until next time.

Conclusion

Privacy and innovation can coexist—but it requires a fundamental shift in how companies approach data, advertising, and AI. Businesses don’t have to rely on invasive tracking to succeed. Instead, models that prioritize user consent, encryption, and local data processing can create a more ethical and effective digital ecosystem. Another key takeaway is that trust is the foundation of the future digital economy. Consumers are becoming more aware of how their data is used, and they are demanding better alternatives. Companies that embrace transparency and put users first will not only gain trust but also set new industry standards for privacy-preserving technology.

Ultimately, the conversation highlights an important learning: privacy isn’t just a feature—it’s a necessity. As AI continues to evolve, it’s up to businesses, policymakers, and users to ensure that digital innovation respects individual rights and fosters a safer online experience for all.

ABOUT THE GUEST 

Luke Mulks is the VP of Business Operations at Brave Software (maker of privacy-respecting browsers and the Basic Attention Token) and the host of The Brave Technologist Podcast. At Brave, Luke works on creating new blockchain-based revenue verticals and bringing proofs of concept incorporating privacy and crypto to market, scaling them into business operations. With over 20 years in digital media, advertising, publishing and startups, Luke specializes in transforming business with new tech. Joining the pioneers at Brave in 2016, Luke began applying his experience from working with Google, the NFL, Warner Brothers, and Comcast to co-author the Basic Attention Token white paper and to develop new Web3 business models. Luke continues growing the business of Web3 and educating people on these new technologies as host of The Brave Technologist podcast and the live weekly BAT Community Call. 

Punit Bhatia is one of the leading privacy experts who works independently and has worked with professionals in over 30 countries. Punit works with business and privacy leaders to create an organizational culture with high AI & privacy awareness and compliance as a business priority, by creating and implementing an AI & privacy strategy and policy.

Punit is the author of the books “Be Ready for GDPR”, which was rated as the best GDPR book, “AI & Privacy – How to Find Balance”, “Intro To GDPR”, and “Be an Effective DPO”. Punit is a global speaker who has spoken at over 50 global events. Punit is the creator and host of the FIT4PRIVACY Podcast, which has been featured amongst the top GDPR and privacy podcasts.

As a person, Punit is an avid thinker and believes in thinking, believing, and acting in line with one’s values to have joy in life. He has developed a philosophy named ‘ABC for joy of life’, which he passionately shares. Punit is based out of Belgium, the heart of Europe.


RESOURCES 

Listen to the top ranked EU GDPR based privacy podcast...

Stay connected with the views of leading data privacy professionals and business leaders in today's world on a broad range of topics like setting global privacy programs for private sector companies, role of Data Protection Officer (DPO), EU Representative role, Data Protection Impact Assessments (DPIA), Records of Processing Activity (ROPA), security of personal information, data security, personal security, privacy and security overlaps, prevention of personal data breaches, reporting a data breach, securing data transfers, privacy shield invalidation, new Standard Contractual Clauses (SCCs), guidelines from European Commission and other bodies like European Data Protection Board (EDPB), implementing regulations and laws (like EU General Data Protection Regulation or GDPR, California's Consumer Privacy Act or CCPA, Canada's Personal Information Protection and Electronic Documents Act or PIPEDA, China's Personal Information Protection Law or PIPL, India's Personal Data Protection Bill or PDPB), different types of solutions, even new laws and legal framework(s) to comply with a privacy law and much more.