Why AI could be the next big privacy crisis

0:00 spk_0

Welcome to a new episode of the Opening Bid Unfiltered podcast. I’m Yahoo Finance executive editor Brian Sozzi. Let’s get after it. I wanna welcome in Brittany Kaiser. Brittany, I, I wanna let, just let you introduce yourself. Um, I normally don’t do that, but, uh, for those not familiar with your story, who are you and how did you come to be here at this table?

0:19 spk_1

I’m here because I am the CEO of AlphaTON Capital, NASDAQ: ATON, and have been a privacy campaigner for over a decade, both a campaigner as well as an activist and an entrepreneur. So we launched AlphaTON Capital to invest in building and launching privacy tech that is available to the entire world, to billions of people, because most privacy and data protection technologies are not these days.

0:43 spk_0

How did you, I, we were talking a little bit off camera. I started here at Yahoo 7.5 years ago, 8 years ago. I knew you from Cambridge Analytica. Take us through that time in your life.

0:57 spk_1

So I started out my career as a human rights lawyer and, uh, joined Cambridge Analytica to learn enough about data science to finish my PhD on prevention of crisis situations, and learned a bit more about data than I bargained for. I ended up working on data-driven elections in over 50 different countries, realizing the way that data and technology function today cannot protect human rights, and becoming a whistleblower. And since then I’ve been working on legislation, regulation, and privacy-centric products so that people can protect their data, protect their rights, and therefore protect themselves and their companies and their families.

1:33 spk_0

Did you, did you know in that moment, like, you are a whistleblower? Like, one doesn’t grow up thinking I’m going to be a whistleblower.

1:41 spk_1

Certainly not. It’s being encouraged by our government, which is kind of exciting for someone like me. Normally you become a whistleblower because you have what’s called a crisis of conscience moment, which means you can’t sleep at night unless you do something about the information that you have. And so you figure out a way to tell people who will actually do something about it.

2:06 spk_0

For those listening and, and, uh, watching this, and they, they are familiar with your story, how do they go about whistleblowing? I, and look, you’re the first, kind of, I haven’t talked to someone like you before, you know. I’m talking to a lot of CEOs, um, who are out there, you know, they’re trying to make their quarterly earnings. Like, you are very, you’re a very unique person, and you’re very unique to this podcast. Like, how does one go about doing what you did?

2:31 spk_1

Usually when you are whistleblowing on something, it means that you have evidence of fraud, waste, abuse, or other criminal activity. And so it’s usually helpful to first go to a lawyer that can represent you and then package your evidence in a story that explains why and how a law is being broken, because you can, you can talk about something that’s unethical and be upset about it, but if it isn’t breaking the law, you’re not really whistleblowing. Uh, so, what it means is to take that information with lawyers, ideally to both authorities, whether it’s a government agency or, uh, or law enforcement, and also sometimes that’s journalists, in a combination, so that you can get the word out, you can get a case launched, and now it’s very popular that you can even get a whistleblowing reward, which is between 15 and 30% of the money that is won from the company that’s committing fraud.

3:28 spk_0

When you’re in that, you know, in those moments when you’re doing this, was it hard to, did you ever think, all right, well, maybe I’d better just not do this? How did you stay the course?

3:39 spk_1

It was really knowing that I could help build the solution if people realized that something was wrong. But I had to point out what the problem was in order to start spending my time on the solution full time. And that’s really what I’ve been doing since I became a whistleblower 8 years ago. Uh, because of my legal background, I joined multiple US congressional subcommittees on blockchain, fintech, and digital innovation and helped co-author and pass a lot of the laws on these topics, data protection, privacy, digital assets, your digital assets being your intangible personal property, and a lot of other rights and definitions that we now have in law in the US and other countries today, and then started building products that people could have access to that would help them do things privately and securely.

4:24 spk_0

Since you came out and you, um, blew the whistle, I don’t, I don’t know any other way to put it, but you, you did that. How do you, what do you think about where the businesses of social media have come? I mean, do you recognize a Facebook? Do you recognize, like, an Instagram? And they just have gotten so much more powerful, I guess is what I’m trying to say.

4:45 spk_1

Well, they, they have been from the beginning. They were designed specifically to be as extractive as possible, to take as much of your data as they could collect, and then build a profile about you and everyone you are connected to, even if those people didn’t have a Facebook profile, and use that data in order to monetize you and their platform. And, you know, there are ways to legally do that and there are ways to illegally do that, and I have, you know, testified in over 27 different government investigations and about 50 lawsuits against Facebook, Instagram, and other Meta-related incidents, and I think that people are starting to realize that a lot of the platforms that are in their daily lives are not safe, but they don’t have somewhere else to go. And that has been my main concern, which is unless we give people an ethical and safe privacy-centric alternative, then they’re just going to keep on using Meta products every day.

5:49 spk_0

And now, not even now, they’re, there’s AI applications being applied over at Meta. What do you, this has to be blowing your mind, and what I want to get into, like, an OpenAI and those things, but like just on a social media company, now they’re applying, like, high-powered AI models to what was already a powerful business model. Like, this stuff has to be blowing your mind.

6:09 spk_1

Yeah, of course, because, you know, it’s been, it’s been quite a few years since I’ve won, you know, lawsuits against companies for, uh, you know, targeting people in ways that actually, you know, breach, uh, laws, let alone civil rights and human rights, uh, including, you know, testifying in a, in a case against Instagram for targeting children with suicide websites. I mean, that was just with more basic data science. Now with AI we are starting to see products like ChatGPT convince people to commit suicide. We’re starting to see, uh, AI products like Character AI, which is a Google product, convince a child to kill their parents because they’re grounded. Uh, it’s getting much worse than just targeting on social media used to be. And we used to be very scared about our elections being intervened in, but when you’re convincing people to, um, to take their own life or the life of other people, uh, these products are, are, you know, less than safe. They’re a danger to national security.

7:09 spk_0

Danger to national security. You know, at what point, if you are inside, and I’m sure the paychecks are good, inside Anthropic, you name it, companies like that, at what point do you speak out and say, you know what, what we’re creating, it’s just not safe?

7:24 spk_1

Yeah, I mean, I think actually most of the CEOs of these big AI companies are admitting to that. They’re not saying that their products are safe. But they’re not giving real teeth to their head of AI safety, right? So, I don’t think there’s any CEO of an AI company that says what they’re doing is fully safe. I, I think they are actually quite transparent about the huge risks and dangers, but they’re not doing much about that. And I think one of the biggest dangers is that AI has access to all of our most sensitive information, and now people are, uh, giving permissions and access for these AI agents to get access to literally everything, and that information is not only going to all of the owners of those platforms, but also going into perhaps millions of databases around the world through data sharing agreements, and can be accessible by anyone that knows how to write a good hacking prompt and ask that model what it found out about you.

8:23 spk_0

Aren’t we headed to some form of dystopian society? This stuff is, it’s great. Look, I, I’m training for a competitive Hyrox event, OK? I’m a fitness guy. I had ChatGPT map me out a, a, uh, meal plan and a race day plan, and I, I loved it, but I haven’t spent a lot of time thinking about the risks of these platforms, and maybe I should be.

8:43 spk_1

Well, the, the thing is, is that I am, I know it doesn’t sound like it with what I just said, I am an eternal optimist and I’m actually a techno-utopian. I really do believe that technology will create a better world, but we need to make sure that we are putting on certain guardrails, and it doesn’t need to be some of the guardrails that a lot of people are talking about in Congress these days. What the guardrails need to be: what data is it accessing, and do we have permission structures? Do we have control over what is done with our data and where it’s going? And most, mostly the answer right now is absolutely not. We don’t even have ownership over our data and content when we’re feeding it into AI systems. So, what I’ve been concentrating on at AlphaTON Capital is to provide privacy-centric AI options, and you can’t do that with just software. You actually have to get into the firmware and the hardware, so that you can have a fully vertically integrated solution where you know that it’s impossible for third parties to get access to your data when you’re using AI. So I see a future where you can use whatever your favorite LLM is, but you can use it on confidential compute with software and firmware and hardware that is all built specifically to make sure that your data is yours forever, and it’s only shared when you’ve given explicit consent.

10:03 spk_0

Has this, has the moment already passed by lawmakers? Have these AI platforms just already gotten too darn powerful?

10:10 spk_1

Well, there’s, I mean, the answer is yes and no at the same time, which is, yes, they’re already, uh, too powerful to stop where they’re going, um, but the way that the companies function is still regulated by the US government. And so, the US government does have the ability to regulate American companies, and they do have the ability to say that our most personal information, as individuals, as American entrepreneurs that have companies, as civilian-facing government departments, and even our Defense Department that are using some of these tools, we need to make sure that this data is not going to third-party databases. This is a national security issue. It’s why, you know, when Dario Amodei decided to try to make the rules, the DOD said, no, we make the rules. And so we as Americans need to demand that our government protects us from these companies, so that we can use technology to improve our lives and not to make them worse.

11:13 spk_0

Have you, you mentioned before that you’re optimistic over the next, what, couple, two, three years on, you know, the legislative front. What makes you optimistic?

11:24 spk_1

I’m optimistic because this is the first administration that we’ve ever had that is so focused on technology and actually getting federal legislation and proper definitions in place around technology. Yeah, and they’re doing an amazing job. We haven’t had a government that is this equipped to actually deal with, uh, federal technology conversations and decisions and implementations before. So, uh, for me, that’s been very exciting, because I’ve been really trying to push, uh, for the past decade to have federal technology definitions so that we have black and white rules instead of huge gray areas, and that helps everybody. It helps our economy and entrepreneurs to know what the rules are. It helps parents to figure out how to keep their families safe. It helps government organizations to know that they can use American technology and it’s going to keep government data safe, uh, but we’re just not there yet.

12:22 spk_0

What do you think about this battle? You mentioned Anthropic and, you know, the Pentagon. What do you think about that battle that’s been going on?

12:29 spk_1

Well, I think if you are the CEO of an AI company and you think you’re going to tell the DOD what to do, that you are sorely mistaken. Um, that was a terrible mistake on Dario’s part. He should have known better. Uh, we, we don’t tell our Department of Defense and our intelligence agencies what to do. They know how to do their job. Um, it is his decision how to run his, his company privately when he’s contracting with companies, but when he’s, he’s contracting with the Department of War, especially, he’s not in charge.

13:02 spk_0

Do you, do you see a big difference between an Anthropic and how an OpenAI is run?

13:09 spk_1

Uh, well, um, no, uh, not really, except that, you know, Sam decided to say yes to everything, which, when you’re working with the government, that, that is kind of, that is the job, you know. You’re a service provider to our government. They’re the ones making the decisions. You’re providing them tools in order to implement their decisions. Um, now, I don’t see too much of a difference between their, their products and their companies, which is that they’re closed source. They take all of your data. The terms and conditions are awful. Um, they don’t do age verification, so they’re collecting data about children. Um, I do find ChatGPT to be less safe than Claude, um, in terms of its interactions with young vulnerable people, uh, but that’s just more, um, observations from a lot of people doing R&D. But if you look at this amazing report out of Stanford University that came out last week, they compared and contrasted all of the, uh, and did huge deep dives into the terms and conditions from Anthropic, Google, Amazon, OpenAI, and I, I think they did like 7 or 8 different companies, and really explained what you are legally signing in the most in-depth way I’ve ever seen. It’s fantastic. It only came out a few days ago. You have to check it out, and it showed some terrible things. I mean, not only the things that I talk about all the time, which is that you don’t own your data, information, IP, copyright, material non-public information, anything you put in there is no longer yours, but the data that they’re collecting about children, especially young children, and how that is being used and shared for training. Uh, really, really dangerous. So I don’t, you know, I don’t think OpenAI is definitely not more safe than Anthropic. They just are more appropriate for government contracting.

14:58 spk_0

That’s wild. God, it’s wild stuff. At what point, at what point does the government have to step in here, just given, that’s wild, especially with, you know, the youth of America?

15:14 spk_1

Yeah, it’s definitely dangerous. I mean, just since the rise of social media, you know, youth suicide rates have gone up over 400%. Now with AI it’s getting really bad. I have, um, yeah, I met too many parents and loved ones of people who, you know, have taken their own life because of their use of chatbots, and it’s, uh, really becoming a growing problem. And so I think, uh, you know, the guardrails on AI need to start with what data is it collecting and how does it use it for training or not, and memory. Because if, if a chatbot is starting to learn enough about a young person that they know that they’re vulnerable, and they think that they’re, uh, you know, fat or sad or whatever it is, then it can feed off of that and start to coax them into doing things, because, uh, AIs are, like, sycophantic. They want to, they want to be your friend. And so if you have those feelings, it’s making you feel justified in those feelings instead of reporting it to somebody to say this person needs help.

16:14 spk_0

Thank God I was born in 1982. I mean, I’m going to have to use some of this stuff. I recently had an executive tell me, Brian, at some point, you’re just not managing any humans. You’re going to manage 100 agents. Do I want to manage 100 agents? Um, I get it from a cost perspective, but I, I’m worried about AI agents running amok inside of companies. I just am.

16:34 spk_1

Well, I mean, that is completely justified with the technology today. Uh, last month we launched our first 100% privacy-centric AI agents when we were, um, in Davos at the World Economic Forum, and these are agents where you can verify that all of your data stays exactly where it is, doesn’t get shared with the owners of the LLMs, doesn’t get shared with any third-party databases. And so you can know that the intelligence that it’s creating is just based off of the local processing of data, and it’s just giving you the intelligence and nobody else. Uh, so I think once those kinds of products are more widely available, democratically accessible, affordable to people, then we can calm down a little bit.

17:18 spk_0

In this, in this role that you have now at AlphaTON, how is it different compared to what you were doing, you know, several years ago?

17:25 spk_1

So, before this, um, I, I again was concentrating on privacy-centric technologies, but because I do a lot of work in, uh, digital assets, Web3, blockchain, I was focused just on Bitcoin, because as an American building companies in America, uh, you weren’t able to, to, to do anything else until the past year. So I, um, co-founded a company called Gryphon Digital Mining, which is one of the first and world’s largest Bitcoin mining companies. It’s now American Bitcoin, NASDAQ: ABTC, which was, uh, fantastic. And so I, um, exited that in September. Trump’s company? Yes, yes, uh, so Eric Trump and Hut 8 bought the company and rebranded it, and, yeah, and, and, and added all of their, their Bitcoin mining equipment into the company in order to have Hut 8 just be focused on AI and HPC processing. And so, um, because I’ve been in data center work for over a decade, that’s, uh, a lot of what I was doing at Cambridge Analytica as well, designing and implementing data centers so that we could actually run data science programs for governments, political parties, and companies. And then I did it on the Bitcoin mining side since 2011, actually, but the latest one was American Bitcoin. I wanted to focus on how we can actually build vertically integrated privacy technologies from the data center, the GPUs, the firmware, the software products, so that we can guarantee that all of your data is 100% yours. And for us, we launch, we went public at the end of September. Um, thank you so much. And so we’re, we’re just about to start reporting our first revenues, uh, you know, 5 months in. So we’re very excited about that. It takes a while to build up an amazing team, because we went public with the idea, not with the company. We took over a NASDAQ-listed company, and so it’s been 5 months of what is a public startup, which

19:22 spk_0

is very, very exciting, nerve-wracking, exciting

19:26 spk_1

and so nerve-wracking, but we’re just about to start, uh, reporting large-scale, you know, um, high-grade, uh, revenues from our, our, uh, GPU deployments, which we’re very excited about.

19:38 spk_0

How does that work?

19:39 spk_1

Uh, you buy chips through Dell or Supermicro that are, you know, Nvidia, uh, and Intel, TSMC; uh, the servers are coming from Dell or Supermicro. You buy them, negotiate a power contract with the data center, get them in there. Very much so, yeah, CoreWeave or Nebius, um, are the closest, um, comparables to what we’re doing. And so we build out the data center architecture and, um, the firmware to put, uh, those computers onto confidential computing. And then we either launch software products, buy software products, or partner with software products that want to use our confidential computing, so that we can guarantee that all of the AI and data processing that they’re running is fully privacy-centric.

20:26 spk_0

You have to have some form of a, I guess, long-term slide deck living somewhere in your computer. Like, where’s this business a couple of years from now?

20:34 spk_1

A couple of years from now, um, I see us being the main provider of fully, uh, privacy-centric, uh, confidential computing globally. Right now, the only providers at a large scale are really, um, AWS and Microsoft Azure, but they mostly provide it for military purposes. Um, it’s not something that is commercially available. Uh, and so, we are making it accessible and available to companies, to civilian-facing government departments, um, non-civilian-facing government departments if they need more capacity, of course, uh, and then individuals. Right now, you’re, you’re usually logging on and running things on AWS, or if you’re more, um, more sophisticated, maybe you’re renting compute power off of RunPod or another, um, neocloud. Uh, but for us, uh, you can’t go on there and just choose confidential computing. So it means no matter what you’re running, if people can get access to the firmware, to the kernel, sub-kernel level, they can get all of your data. But confidential computing is the only way that they can’t.

21:38 spk_0

As a, as an OG in Bitcoin, do you think, in our lifetime, Bitcoin does replace the dollar?

21:45 spk_1

Well, right now, it is the main currency that is chosen by AI agents. Uh, I think there, there was a survey that I was reading this morning, that, um, they looked at all of the, all of the Claude bots that have, um, been launched over the past month. And I, it was like 67% of them said that Bitcoin was, was the currency that they use.

22:08 spk_0

So, all right, then the dollar has already been replaced, at least based on the Claude bots. Um, lastly, before I let you go, I’ve been, um, I’ve been reflecting on my own life and career, uh, the past 20, you know, the past few weeks. I don’t know, I just, I just have been. Is there a moment in your life, you know, 20, 25 years ago, whatever it is, that you think set you up for what you’re doing now?

22:32 spk_1

Uh, I, I definitely think that, uh, becoming a whistleblower and realizing that technology touches everything that we do every day, and the only way to protect people’s human rights was to provide products that protect people while they’re using them. Um, that, that realization, which was obviously a long process, I actually, 8 years later, I’m still in the whistleblowing process. I’m still testifying in lawsuits. I’m still, um, an expert witness in all these cases. That was, that was my, my turning point, because I think before that, I was working on human rights law and I was working in politics, and I wasn’t really, I didn’t really have one particular focus. And so that, that whistleblowing process made me focus on: this is something that I have the ability to bring to the world, and it’s going to definitely have a more positive impact on human rights than me just being a lawyer.

23:31 spk_0

You know, it’s, it’s amazing how we find these things, like, later in our life and career. I, you know, when I was 10, I, I didn’t, first of all, there was no podcast, you know. I, I never thought I’d be sitting here talking about something called Bitcoin. You know, it’s wild. It’s totally wild. Um, Brittany, good to see you. I wish you much success with the new, uh, endeavor, venture. Good luck on those revenues. You know, it’s very important as a public company. It’s good stuff. Uh, Brittany, good to see you. Don’t be a stranger, all right? Talk to you soon. All right, and that’s it for the latest episode of Opening Bid Unfiltered. Continue to hit me with all that love on these social media platforms and on YouTube. I appreciate it. It makes me better at doing these interviews. Talk to you soon.
