What's News? Back to Work

This week we're back to discussing stories that have been in the news over the past few months - in particular the stories that have stayed with us and kept us thinking about them.


Transcript

00:07.57
Gus
Welcome to The Technology Pill, a podcast that looks at how technology is reshaping our lives every day and exploring the different ways that governments and companies use tech to increase their power. My name is Gus Hosein and I'm the Executive Director at Privacy International.

Caitlin
And I'm Caitlin and I'm PI's Campaigns Coordinator. Hi.

00:28.45
Speaker
So this is another edition of our What's News format, where we discuss the news that we found interesting over the last period. And in the Northern Hemisphere, where it's summer, I very much hope that you're traveling or have had time off from the various things that you normally do.

00:48.53
Speaker
And so we thought that this edition would be particularly useful for you, just in case you missed some of the stories that came up in the last few weeks, because maybe you've been on a beach or taking a break.

01:01.20
Speaker
And as ever, whenever we do these editions, all the stories we discuss today will be available in the description wherever you're listening, or on our website at pvcy.org/techpill.

01:27.05
Speaker
So the first round of stories essentially kicked off this summer in America. I don't know if you remember, but there was a lot of pressure from President Trump that by July 4th, in time for all the barbecues and all the fireworks, America would have its big, beautiful bill passed.

01:52.30
Speaker
Now, this was a problematic piece of legislation on countless fronts, with the cuts to taxes and the way that it regulates so many parts of society, including for nonprofits. But we'll leave that aside for others to comment on. What we wanted to note particularly is that it included a massive increase to the budget of the U.S. Department of Homeland Security.

02:22.37
Speaker
In particular, it was for the expansion of biometrics for immigration control, where apparently $175 billion was being given to immigration control just for the fiscal year of 2025, and $28 billion specifically to Immigration and Customs Enforcement, or ICE, as they're known, of which $5.2 billion was going to infrastructure modernization.

02:50.73
Speaker
And that's all code for AI systems and biometric data collection and facial recognition and risk scoring and all these other horrible things, according to the industry outlet Biometric Update.

03:03.48
Speaker
So yeah, that's a lot of money for toys for one of the least accountable parts of the US government. Yeah, the part of the US government which repeatedly has decided that due process doesn't apply to it.

03:15.55
Speaker
And combining an abject lack of due process with facial recognition, which is famously wildly inaccurate, seems like a recipe for obscene disaster. I mean, America has already had a couple of cases of people being falsely arrested because of bad facial recognition matches.

03:33.41
Speaker
You add that on top of ICE kind of raiding and abducting people off the street with no identifying insignia, with face masks, all that stuff, with sketchy facial recognition and sketchy risk scores... I mean, it's just a recipe for disaster. If facial recognition misidentifies you, you don't get any kind of trial. You don't get to talk to a judge. You don't get to talk to a lawyer. You just disappear. I mean, it's terrifying now, but it will be more terrifying then.

04:11.13
Speaker
And to add to that toxic mess: you know, having just returned from North America, and following the news very closely, there are all these celebrities who are now signing up to work for ICE, to be part of the immigration raids. It's just horrifying. Wait, wait, wait, wait, wait.

04:27.20
Speaker
I don't think we can call Dean Cain a celebrity. Like, this Superman who was on a B-level Superman TV show that was daytime TV when I was a kid in the UK.

04:38.16
Speaker
Like, a Superman who was on, you know, the equivalent of a really rubbish soap opera, back when superheroes weren't cool. But we both watched it. There's like a 20-year gap between the two of us, and we both watched it, and we both know who Dean Cain is, and I didn't even have to say his name for you to know who I was talking about. These are all valid points, but I think what he's doing is not so much joining ICE as desperately seeking a way back into, like, the public consciousness.

05:11.45
Speaker
Like, he's one of those people whose face I would recognize. I only know his name because of the recent stories. And calling him a celebrity, I think, gives him massively too much credit.

05:23.44
Speaker
Because what he's trying to do is become a celebrity again, right? To a very specific, niche kind of... bizarre audience. Yeah. And to this bizarre audience, he's actually becoming an American hero again.

05:35.94
Speaker
It's horrible. Just horrible. Yeah. It just dominated the TV screens for some of my time on vacation, which wasn't fun.

05:47.12
Speaker
But other stories linked to this. Over the summer, which, like, felt like a blockbuster period, Wired found that ICE would actually have login access to the medical data of 80 million people on Medicaid, including their diagnoses and procedures. And of course, they're going to use this data to track people down.

06:15.37
Speaker
And so, to your concern about facial recognition being problematic: just imagine them having access to all this Medicaid data, people's diagnoses, people's procedures, and then trying to discern something about their immigration status as a result of that. The thing about that is not necessarily that it's an effective way of tracking people, because I don't think it is. It's supposed to scare people. Like, if you're in the country and you don't have a migration status, it's to make sure that you don't seek medical attention, and to make life generally harder. Which is familiar in the UK, to a much lesser extent, because part of the hostile environment for a while has been making accessing medical care harder and scarier. Like, it's got a familiar echo, which is really depressing.

07:09.98
Speaker
Yeah, this is the thing that drove me nuts in all my travels. I went to two places this summer. But in talking to people about American politics, they're so outraged by what's going on. It's like, oh, Trump did this. Oh, Trump did that.

07:25.03
Speaker
And I have to break their hearts by saying, look, other governments have done that already, including your own government. And for some reason, their outrage didn't extend to their own governments, or to my examples of, say, the UK government already having done this.

07:44.02
Speaker
Their outrage just focused on the Trump administration doing this, which is... I think it's funny, and I think it's familiar, because we talk about these kinds of powers a lot, right? And one of the ways that you can see people understand why things are bad is the hypothetical: oh, but what if this guy got into power? What if you lived in America and this was happening under Trump?

08:04.91
Speaker
And then people go, oh, yes, I wouldn't like that. And it's, you know, very sweet that people trust their government more than they trust Donald Trump. It's a low bar, but it's, you know, I guess reassuring.

08:16.19
Speaker
But it's not reassuring when that allows kind of carte blanche for these bizarre, sketchy, disturbing powers to be granted to the government, on the basis, I guess, of hope that they remain kind of respectable.

08:32.48
Speaker
And, you know, you worry about the slippery slope argument, you know, like the classic: the argument on the internet ends when someone mentions Hitler. It's a fair critique. But also, occasionally, it is useful to think: what if the worst politician you can think of had this power?

08:50.31
Speaker
Because you can't guarantee that they won't in the future if you create the power now. But yeah, it's a tricky one. But it's also understandable. This is why we focus on tech and data, because, you know, all these billions going to the Trump administration and to the Department of Homeland Security...

09:06.62
Speaker
They're trying to build new toys, and these toys will be used by the next government, and the next government. And they won't put down these toys all of a sudden just because a new administration comes in or a new purpose arises for their use. And that's why, you know, I found it particularly chilling that The Intercept wrote a story about how

09:26.36
Speaker
the Customs and Border Protection agency ran an industry day where they basically put out a wish list for the tech that they would like to see. And industry was very willing to provide ideas and solutions. And some of the tech included the ability to use AI to do urban residential surveillance, the ability to see through walls, and more sophisticated autonomous systems, according to The Intercept.

09:53.83
Speaker
And so with their billions and industry profiting from those billions, the toys that are built are going to be used again and again and again.

10:05.62
Speaker
Which is, again, like another story that's quite familiar, because the UK had a kind of reverse version of this day, where they asked industry to come in and pitch them possible tech solutions to, you know, struggling with prison places and the size of the prison population.

10:21.09
Speaker
And that created a list of really disturbing, creepy-ass technology, like GPS trackers implanted subcutaneously under people's skin. Like, there is something uniquely awful about what's happening in America. And I think that's reasonable, because of the volume, and because of the kind of very visible implications for people day to day, and because of the spread of it, and because of the size of it, and because of the speed of it. I think the speed particularly.

10:47.70
Speaker
But you're right, like, it's not unique. It's just dramatic. And it's kind of balancing the "this is particularly bad" with the, yeah, the things that you're mad at are not particularly unique. They're just happening all at the same time. They're the logic of our times.

11:05.48
Speaker
And so what Caitlin is referring to is a Guardian article that came out in early July, talking about how the Justice Secretary of the UK government met with two dozen tech firms, including Google, Amazon, Microsoft, Palantir, IBM, Serco, all the fun ones.

11:22.45
Speaker
And apparently the firms, according to this article, suggested using these tracking devices under offenders' skin, and also robots to contain prisoners, in order to create, quote, a "prison outside of prison," and other ways of improving electronic tagging. Which, as we know at PI, because we've studied the use of electronic tagging in the UK, currently relies on cheap Chinese technology.

11:49.13
Speaker
I guess these other companies thought, hey, that's an opportunity for us. Yeah, and it's something that America's been exploring, and it's something that is coming from this direction over there. It's not that, you know, America is inventing all these fancy new ways to do disturbing surveillance and authoritarianism. It's that it's kind of picking up on the worst ideas from other places and implementing them very, very quickly, across a huge population, in a very disturbing and violent and kind of militarized way

12:22.28
Speaker
that we don't really see in the same sense in the UK, but which is still a very present undercurrent. And I think that's true of a lot of countries. You're right, because the kind of techno-solutionism and militarization trend that we've seen in the UK and the US is also present in most countries. Because, you know, a load of countries are struggling with budgets, and a load of countries are wanting to clamp down on all sorts of different things, or be seen to be clamping down on things, which sometimes is the same thing and sometimes really is not.

12:53.57
Speaker
But yeah, that's a bit fun.

13:04.31
Speaker
So, in a shift to a similar dynamic but in a completely different domain, I thought it'd be interesting to reflect on the huge finding by The Guardian in early August that the Israeli military surveillance agency, Unit 8200, has a contract with Microsoft to use Microsoft's Azure.

13:30.15
Speaker
Now, Azure is their cloud service that includes vast amounts of storage, but also the use of AI tools. And The Guardian found that Unit 8200 was using Microsoft's services to essentially store millions of communications by Palestinians, and then probably to do additional analysis using the tools. Now, little can actually be known about this, because Microsoft is claiming that even the CEO of Microsoft was unaware of what kind of data the Israeli military surveillance agency was storing on Azure.

14:13.04
Speaker
But what's interesting as a follow-on from that, first, is that apparently, according to Bloomberg, investors are irate about this development, and a group of 60 investors presented Microsoft's board with a shareholder resolution asking for a report explaining how it actually prevents the misuse of its AI tools.

14:35.08
Speaker
Which then got even more interesting, because Microsoft responded by saying it had conducted an investigation and found no evidence of misuse, according to the Bloomberg article.

14:47.82
Speaker
No evidence of misuse to harm people or violate terms of service. It blows my mind... sorry, it doesn't, I don't think it should. It just blows my mind that terms of service may include, you know, not harming people

15:00.51
Speaker
or perpetrating, you know, crimes against humanity, or Microsoft's AI code of conduct. But because of all this pressure, just recently Microsoft announced it's running a new inquiry to see what's going on. And they've commissioned their law firm to do this, so I'm not sure we're going to find anything there. But the reason I find this even more interesting is that in May, The Intercept was able to access some secret internal Google reports where, according to The Intercept's analysis, Google was claiming it was worried about its own contract with the Israeli government, because it couldn't control or monitor what Israel was doing under its deal with Google.

15:52.94
Speaker
And, you know, just another interesting thing about Google's contract with Israel is that Israel required Google to make sure that it wouldn't allow other governments to launch investigations into the Israeli government's use of Google services.

16:07.40
Speaker
So we're in this really interesting situation where Microsoft is saying nothing's going on here, no, there's no abuse. Whereas Google was worried: well, we wouldn't even know if there was abuse going on, because we don't get to have insight into what's going on.

16:21.25
Speaker
And so Microsoft is saying, oh, we're going to allow another investigation, but the investigation might find nothing. And the final thing I'll say is that when the Intercept story came out about Google's deal with the Israeli government, it also included the fact that Google had to staff a classified team, who had Israeli security clearances, in order to have this contract.

16:45.48
Speaker
And so it just creates this really difficult, dark and murky relationship between governments and industry that, yeah, is likely just to get worse and worse.

16:57.40
Speaker
Yeah, PI's got a big project looking at militarization and the cross-pollination between military and civilian industries. So military technology moving into the civilian field, and civilian technology moving into the military industry.

17:11.25
Speaker
And it is bizarre, when you're thinking about, oh, do I use Microsoft, do I not use Microsoft, you know, what do I do with my life, to think: oh yeah, as I'm playing with Microsoft Azure or doing whatever cloud computing I want to do, so is the Israeli, like, version of GCHQ.

17:33.14
Speaker
I mean, that's weird. It's just kind of bizarre to think that normal commercial technologies and software are also being used by government intelligence agencies. And, you know, that's not necessarily the weirdest part of the story, but it is a bizarre aspect that that civilian-military gap increasingly doesn't exist. Not that, you know, civilian technology companies and military technology companies have always been 100% different and separate, but it feels like there's a difference between, you know,

18:13.80
Speaker
an airplane company making a civilian jet and a military jet, versus a software company making one piece of software that those two sets of people use in very similar ways, but for vastly, vastly different purposes and aims.

18:30.88
Speaker
Yeah, absolutely. Like, this was unthinkable, say, five or eight years ago, and now it's just the norm. And that line was crossed, it must have been four or five years ago, when governments' intelligence agencies started to do exactly this, started to outsource.

18:46.97
Speaker
And these companies were always fond of saying, look, we're not defense companies, we are consumer companies. Well, they crossed the Rubicon, and now, yeah, they're too big to fail for national security purposes.

19:01.15
Speaker
The last thing I'll say about this area of stories is that there was also an article in Rest of World that covered how governments across Africa and Asia are now negotiating with these big tech firms when it comes to data storage and data processing, while trying to assert sovereignty by saying that if they're going to have contracts with big tech companies, they want to make sure that their nationals' data are kept locally. This is that policy of data localization.

19:33.44
Speaker
Well, we discussed this a little bit in our last podcast with Maria Farrell, because this is generally considered to be a positive thing, this idea of sovereignty. Except, you know, put into the light of how Israel asserted sovereignty over its use of Microsoft Azure, or Amazon, or

19:52.78
Speaker
the deal with Google. I never liked this sovereignty argument, or this data localization argument, because, yes, according to the Rest of World article, it allows governments to make sure that big tech doesn't extract data, or profits from the data, of their nationals. And the governments included, according to the article, Nigeria, Vietnam, India, and South Africa.

20:17.91
Speaker
But it also allows these governments to make sure that they can do whatever the hell they want with that data, and exploit it for their own purposes, using the very tools of Google and Microsoft and others.

20:30.79
Speaker
I never know how I feel about this, because, and it goes back a long time: like, for a long time, China insisted that companies opened offices in China, and it was because, if you have staff in a country, then it's easier to tell a company what to do. Their staff are not hostages to fortune, but, like, a little bit, because you can hold them legally liable.

20:52.74
Speaker
And this is one of those ones where I feel a little bit hypocritical, because it's bad when some governments do it and it's kind of fine when other governments do it, which clearly is not a reasonable argument.

21:05.16
Speaker
Um, but countries need to be able to hold companies liable for, like, breaking their laws. So, like, GDPR in the UK and Europe, or the various different versions of GDPR,

21:18.57
Speaker
say European data must be treated, and UK data must be treated, a certain way. There are data protection rules you've got to follow, and there are legal consequences for not following them, which have to be meaningful and enforced, which they aren't always, but which is a separate issue.

21:33.44
Speaker
And I understand trying to enforce data protection laws, for example. And Nigeria's got a new data protection law. And I understand saying, well, the thing is, you're a big tech company and you have to follow our data protection laws.

21:44.52
Speaker
And I understand, like, sovereignty, and trying to, I guess, ring-fence would be the word, but I don't know if it's the right word, that data. Saying, well, you have to abide by our legal system for the data you extract from our citizens.

21:59.38
Speaker
I understand that. The problem comes when you're not using it to protect citizens' data and to protect your kind of reasonable legal regime. It's when it's a regime of censorship, of, you know, surveillance and scary uses of data, that it becomes a problem.

22:20.20
Speaker
But unfortunately, in some respects, enforcing both those sets of laws looks very similar. And it's hard to disentangle when sovereignty, quote unquote, which is often, yeah, localization, making sure data centers exist in particular places, making sure there are staff in particular places that you can hold liable for stuff, becomes concerning. I don't know where that line is. And I find it very difficult to pin down what I think about it, if that makes sense.

22:51.16
Speaker
You've nailed so many of the dynamics, and, like, the only two things I would add to that are: you know, that's why we take the fundamental rights approach. No matter where your data resides in the world, your rights should apply.

23:04.41
Speaker
And it shouldn't be limited to the measly rights that your own government may be willing to pass in their legislature for you, but should apply to your human rights, which are not bound by jurisdiction.

23:20.56
Speaker
And on the inverse side, this is what's so worrying about the idea that Google now has to hire staff with Israeli clearance, like, you know, national security clearance.

23:33.05
Speaker
That means that every government that wants this is gonna wanna make sure that these companies hire, not just, you know, local offices, but local offices staffed by intelligence agents, or by those who are approved by intelligence agencies.

23:49.99
Speaker
And again, we raised this in the last podcast with Maria. That's essentially what happened to the telephone system, where telephone companies had secret rooms that were staffed by law enforcement and intelligence agencies, and not just domestic intelligence agencies, even foreign government intelligence agencies.

24:08.77
Speaker
And that is the world that we woke up to with the telephone infrastructure, which nobody likes anymore. And that's the world we're ending up with again when it comes to cloud infrastructure and AI. And it wasn't supposed to be this way.

24:22.19
Speaker
No. And it's just another, like, weird cross-pollination, where it's like, does that mean they're just an outpost of the intelligence service? Like, is that what that is?

24:34.10
Speaker
If so, then it's not really a private company, and it shouldn't be treated as a private company, and it should be as transparent as... well, I was going to say as transparent as the equivalent government agency, which is wildly untransparent, so it's a terrible, terrible example. Yeah. But you're nailing it. Because, you know, last month one of our colleagues, Yanis, gave a presentation where he showed the history of UK surveillance. And he was also mapping the history of the UK telecommunications infrastructure, going back to the Post Office, and how it was all government-run until it got privatized. But still, there were elements of it that were not privatized, such as the surveillance regime. And yeah, this is how it's been done. And this is how...

25:19.11
Speaker
It's just, for a moment we thought that the internet was going to be different. And that's what Maria is talking about: well, we can rewild the internet back to that concept, that it doesn't have to be like everything else that we've made horrible.

25:32.20
Speaker
But it sure as hell feels like it's rushing that way. Yeah.

25:48.67
Speaker
So the next one I thought maybe you might want to run with, Caitlin, which is there was a federal jury in the United States that found that Meta had violated California's wiretap law.

26:00.00
Speaker
Oh, yeah. This was a really interesting one. So we've done a ton of research on menstruation apps. In fact, we started out just doing apps. I think we picked, like, the 100 most popular apps, like, you know, 5, 10 years ago.

26:11.17
Speaker
This was 5, 10 years ago; we didn't pick 100 apps that were popular 5, 10 years ago. And when we, and I say we, it's slightly before my time at PI, but when we went through them, using the cool tool we'd just created, the data interception environment, to see where data was going out of these apps, partly, I think, because we'd just created this cool new tool, it turned out a ton of them had implemented a software development kit from Facebook.

26:35.20
Speaker
And that meant you could log in with Facebook. It was, like, you know, oh, a cute little feature, whatever. And it was really popular at the time. You still see it on some apps. It's like continue with Google, continue with whatever.

26:45.82
Speaker
But it was Facebook. And a software development kit is basically, like, code that you can kind of copy-paste. It's just a shortcut for a developer: this code block will get you this thing,

26:57.15
Speaker
like logging in with Facebook. The thing is, the default on this code block was: send all information to Facebook. And so a ton of information from all of these apps was being sent to Facebook. And we contacted a bunch of those apps and we said, did you know that you were doing this?

27:09.74
Speaker
And a ton of them said no, we didn't realize that was turned on by default. They changed the default, way less data getting sent to Facebook. Huge win for us. We followed it up with a specific look at some menstruation apps, like, a year or so later.
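The default being described here is configurable, for what it's worth: per Meta's developer documentation, an Android app can opt out of the SDK's automatic event logging in its manifest. A sketch; the exact flag names should be checked against the SDK version in use.

```xml
<!-- Sketch: AndroidManifest.xml entries disabling the Facebook SDK's
     automatic initialization, automatic app-event logging, and advertiser-ID
     collection. Flag names per Meta's docs; verify for your SDK version. -->
<application>
    <meta-data
        android:name="com.facebook.sdk.AutoInitEnabled"
        android:value="false" />
    <meta-data
        android:name="com.facebook.sdk.AutoLogAppEventsEnabled"
        android:value="false" />
    <meta-data
        android:name="com.facebook.sdk.AdvertiserIDCollectionEnabled"
        android:value="false" />
</application>
```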

27:22.08
Speaker
And a lot of them had the same problem. And this has ultimately come back round. So there was a lawsuit against a couple of different parties: Flo, who are a period app, Meta, and I think a couple of others, including an analytics company called Flurry. And basically everyone settled except for Meta, who took it all the way through to a jury case.

27:46.32
Speaker
And the jury finding was that Meta had violated California's wiretap law by collecting data from the Flo period tracker app without users' consent, which... yeah, they 100% did.

27:57.23
Speaker
We know that they did. And this was mostly focused on that period of time, the 2017-2018 period when we were initially looking, where a ton of data was being transferred to Facebook from the minute you opened the app, like, without any requirement for consent, because of this weird default.

28:12.82
Speaker
And yeah, like, we've done this a couple of times since. So... this was the original one, which you can still find on our website, privacyinternational.org forward slash, I think, appdata, A-P-P-D-A-T-A.

28:25.39
Speaker
And then we did it again with menstruation apps, I think around 2019, 2020, and we found a lot of similar issues. We did a weird one in the middle with some DSARs, but we did it again more recently with the data interception environment.

28:38.76
Speaker
And what that found is new problems. Although, interestingly, a couple of apps were still desperately trying to transmit information to Facebook, and Facebook was blocking it. Facebook was like, we don't want this, stop sending this, and was just blocking the requests, which is really funny.
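For a sense of what that kind of traffic analysis looks like in practice: stripped of the interception plumbing, the core of it is just checking which hosts an app was observed contacting against a list of known tracker domains. This is an illustrative sketch, not PI's actual tool, and the domain list and app name here are made up for the example.

```python
# Toy version of the check a traffic-interception analysis performs:
# given the hosts an app was observed contacting, flag any that belong
# to known tracker domains. The domain list is illustrative only.

TRACKER_DOMAINS = {"graph.facebook.com", "app-measurement.com", "api.flurry.com"}

def flag_tracker_hosts(contacted_hosts):
    """Return the contacted hosts that match a tracker domain (or a subdomain of one)."""
    return sorted(
        host
        for host in contacted_hosts
        if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS)
    )

# Hosts captured while exercising a hypothetical period-tracking app:
captured = ["cdn.example-app.com", "graph.facebook.com", "data.api.flurry.com"]
print(flag_tracker_hosts(captured))  # → ['data.api.flurry.com', 'graph.facebook.com']
```

The suffix check matters: matching on exact hostnames alone would miss subdomains like `data.api.flurry.com`, while a naive substring match would wrongly flag unrelated hosts.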

28:56.03
Speaker
But they had a lot of other kind of interesting issues. Increasingly, they're, I would say, based on the trend that we've seen with the most recent set of research, which is pvcy.org forward slash period data, let's go with that one,

29:08.91
Speaker
because I can't remember the really long learning-resource link. It'll be in the description. But I think the trend is they're kind of slowly coming into compliance with data protection law. It's crazy that it's taking this long, but, like, they are slowly coming into compliance with data protection law.

29:22.78
Speaker
The issue being that the environment we're in has changed quite significantly since 2017, 2018, when it comes to the importance of the privacy and security of your menstruation data.

29:33.64
Speaker
And that's, you know, particularly true in the US, but is actually true of lots of places. In fact, if you listen to our episode called, I think, Cycles of Control, about periods, from a couple months ago, you'll know a bit more about the specifics. There was a weird period in the UK where it became increasingly dangerous,

29:51.41
Speaker
very important, and then it got decriminalized, so it's a slightly odd story in the UK. But yeah, all the menstruation apps seem to slowly be coming into compliance with data protection law. But the question is: is that adequate? And when you're looking at period-tracking apps, what are the things that you should be looking for? So if that's something you're interested in, I will link the podcast, and I'll link the research, and we'll link the story about this particular court case. But it is funny, and it's also kind of funny that it's taken this long. The case started in 2021, but everyone but Meta settled, and the actual violations were, like, 2017, 2018, a couple of years before 2021.

30:31.53
Speaker
Financial penalties have yet to be decided, I think, but I imagine they'll be quite large. And people are expecting Meta to appeal, which, good luck with that.

30:41.81
Speaker
I mean... couldn't happen to a better bunch of people. I mean, if you're interested as well, actually, in the tool that we used, we have a podcast on that too from a few years ago. And the older version of the data interception environment, I believe, is still up on GitHub, though it's a little bit out of date and a little bit complicated now.

30:56.83
Speaker
So I guess the theme of these last few stories is, yeah, we kind of told you that this was going to happen. And apart from the Meta and period tracker story, there's also... well, this summer has been a fascinating summer for the AI industry, mostly because it's been a bit of a quiet fail.

31:17.13
Speaker
Everybody was looking forward to the launch of GPT-5, and it ended up not being as exciting as the world was hoping. And then OpenAI also moved a little bit more into agentic AI, and it

31:33.28
Speaker
also wasn't as exciting, although we did a bit of a write-up on it you might want to have a look at. But when it comes to AI assistants, there are two things that we've been harping on about. First, of course, the privacy and security issues around assistants and agents, but leaving that aside for now.

31:50.20
Speaker
The second was that we've long been waiting for the moment the big tech AI firms would start doing what the big tech firms became billionaires and trillionaires over, which is the shift to advertising.

32:07.60
Speaker
They've been so focused on trying to sell themselves as great innovators when it comes to AI in order to get a permissive legal environment for it, that they haven't been as overt about the likely shift to advertising.

32:22.24
Speaker
using your data to profile you in order to advertise to you. But that started to shift in the last few weeks. There was a story from TechCrunch, coming out of Amazon's second-quarter earnings call, where the CEO of Amazon, Andy Jassy, said that their Alexa Plus assistant could derive new revenues by incorporating ads, to apparently help people "find discovery."

32:51.34
Speaker
Not quite sure what that means. A very sensible, insane sentence that definitely has meaning. Yeah, exactly. And that was a quote. I didn't make up those words. And not to miss the wave, Elon Musk, just days later,

33:04.78
Speaker
in a live discussion with advertisers for X, said that he would allow marketers to pay to appear in suggestions from X's AI implementation, Grok.

33:17.59
Speaker
And so, yeah, this will get interesting, and it will get dark. It's interesting because these AI assistants will increasingly know more and more about you from your interactions with them.

33:30.88
Speaker
And you are trusting these AI assistants to provide you with trustworthy information. But when that trustworthy information is brought to you by whichever company that wants to advertise to you, then you have to even further question the integrity of the data being presented to you.

33:50.40
Speaker
I also think it is different when it's presented as a conversation. Like, I know I don't love large language models, because they are functionally like autocorrect gone on a wild weekend.

34:03.72
Speaker
But they are presented as a conversation, and I do think that feels different and is harder for people to evaluate and think about. And part of the thing that I thought was disturbing with the GPT-5 thing was not that people did or didn't like it. It was the fervor with which some people felt that the previous iteration of ChatGPT had mattered to them personally, and that they had a personal relationship with it and they liked it.

34:33.40
Speaker
And the way that GPT-5, the updated version, talked didn't feel as friendly, and their personal relationship was being disrupted. And that freaks me out, because it's autocorrect gone wild. It's not a person. It doesn't have thoughts or feelings or relationships. But it feels like, if you're vulnerable or if you're seeking connection, then it can sneak in there into your psyche in a way that freaks me out a little bit, I think in the way that Instagram and Facebook disrupt

35:16.69
Speaker
relationships by making you feel like you're keeping up with your friends, when in fact you're just following them from a distance and not interacting with them. I think to some extent, for some people, LLMs and these chat assistants and chatbots do something similar with human relationships more generally. And that, combined with this replacement of search and enshittification of search, combined with advertising, is just really creepy.

35:48.26
Speaker
Do you know which AI company said they were going to start using AI to make ads that you could talk to? Oh no, I didn't hear that one. I can't remember which company it was, but they were suggesting that, yeah, they were going to make ads that you could have a conversation with, which...

36:02.98
Speaker
Seems bizarre on two levels. One, obviously, that conversation gathers a lot of data and is based on a lot of data about you. But also, for that company, that is a high-risk strategy, given the way that these chatbots have really struggled with guardrails, with being sane, and with coming up with random nonsense, including making things up. Like, I don't know how...

36:31.59
Speaker
If you're a company and your chatbot advertisement claims certain things about your company, I don't know how that doesn't fall into false advertising and fraud kind of laws, because surely it must.

36:44.39
Speaker
But yeah, it's a high-risk strategy, certainly. Yeah, and linked to what you were just talking about, there were some stories this summer about how Meta was trying to make its chatbot more kind to teenagers. Yeah.

37:00.83
Speaker
Or more attractive to teenagers. And, you know, when you talk about freak-me-out type feelings, that does sound quite dark. Now, I don't know if there's a privacy or surveillance angle to that, but there's definitely a sense of: that sounds creepy, and I don't trust the company we were just talking about, which was collecting data and has, you know, done so many things wrong.

37:19.89
Speaker
Well, I think it does come back to privacy and surveillance, because if you're trying to put a friendly, inviting face on what is ultimately a data collection machine, that is creepy. You know, a lot of what they're doing is trying to make a data collection and advertising machine seem fun and engaging. Yeah.

37:40.07
Speaker
It is a little bit misrepresentation, and it is a little bit sketchy, and it is a little bit creepy. Yeah, absolutely.

37:55.11
Speaker
And just to take this in a slightly different direction, with the same themes: there was a study that came out in July that caused quite a bit of news coverage. The study was from Pew Research Center, which studied 900 U.S. internet users

38:10.83
Speaker
as they used Google's search and AI summary function. They found that Google users who encountered an AI summary were less likely to click on other websites, and in fact only one percent of users would actually click on something. So that means, when you do a search for something on Google normally, in the traditional way of using Google search and search engines generally, you would get a list of websites that are responses to your query. But now, with Google's AI summaries, or AI Overviews as they call them,

38:48.19
Speaker
it provides a summary or an answer to your search, saying, oh, this might be the thing that you were searching for, but it doesn't necessarily include links. And when it does include links, what this study found is that people don't click on those links. They get everything from Google.

39:06.17
Speaker
And that's very bad for the web. That's very bad for content creators, because basically Google is just keeping users for itself already. Like, a prior study from, I think it was last year, found that even when people were clicking on things from Google search, a lot of those clicks actually stayed within the world that Google already owned.

39:32.51
Speaker
So this whole idea of an internet, a worldwide web, has already been reduced to the properties of Google, or the data collection exercise of Meta, and so on and so forth.

39:45.42
Speaker
And add enshittification to that, and all of a sudden you're going to see the end of the internet as we know it. An example was given by 404 Media, by one of their journalists who analyzed his own story and how it ended up in Google's AI Overview, where the overview did not include a link to his story. That is, he wrote a story about Spotify and something related to AI.

40:13.59
Speaker
A Google search about that issue generated an AI Overview. And it covered exactly the issues that he raised in his original piece, but didn't include a link to his article.

40:28.35
Speaker
And in fact, and this is where the enshittification gets even worse, it included links to other news sites that had aggregated his article.

40:39.11
Speaker
And he got the impression that those news sites had even used AI to generate their aggregation of his article. And so you have this situation where content creators are no longer going to get visitors, and instead AI is going to summarize their work and not even link to it. And there'll be a layer of content aggregators in the middle who are going to be the ones making money, if anyone makes any money, if it's not just Google and the other AI firms. And all that does is, excuse the slightly gross metaphor, but it means that Google is hoovering up the layer of shit that's floating on top of the ocean of information.

41:17.78
Speaker
And all that's going to do is ensure that the ocean that the shit is floating on top of shrinks, because if no one's looking at it or using it, there's no incentive to create it. Like, take PI: it's hard work to put together some of the research that we do or the work that we do. And if no one ever sees it, looks at it, interacts with it, engages with it, because there's some AI slop rubbish version that Google's hoovering up that's actually being seen, then over time the people that make the stuff that that relies on will decrease and decrease, and it will decrease in quality and decrease in quantity.

41:53.90
Speaker
And the AI slop will get worse, which it's already doing. And then Google will get worse, which it's already doing. But it won't have that solid basis of information and fact and, you know, work that it's currently all based on.

42:08.49
Speaker
Which actually, I think, comes back to: at the same time, all of those websites are getting hammered by AI crawlers. Like, we've had this repeated thing at PI where, oh, you know, our servers are being hammered, the website's gone down, we're getting a huge volume of requests. What are they from? Oh, it's another AI crawler.

42:28.63
Speaker
And so you're getting it from both sides. You know, Google doesn't want to send people to your website. They want to avoid doing that. But they do want to use your resources and your server time and your money to ensure that they don't send people to your website. Like, it is immensely frustrating. And what our tech team did for a while, which is quite funny, is start redirecting some of those crawlers that were particularly abusive to our donate page. Yeah.

42:54.51
Speaker
Or actually, there's a specific HTTP error code, 402, that's like, payment required. And they were sending them to that, and they redesigned that page to include our donate link, because we are a charity. And if you are interested in ensuring that our servers can keep running, then by all means, go to pvcy.org forward slash donate.

43:12.38
Speaker
But particularly if you're running one of these abusive web crawlers, which hoover up all of our server time and resources just to ensure that you don't send people to our website. I don't think I'm breaching any financial privacy rules by saying that, to date, no AI crawler has donated to PI, unfortunately.
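For the technically curious, the crawler-to-donate-page trick described above can be sketched in a few lines. This is a hypothetical illustration, not PI's actual configuration: the user-agent substrings and the page bodies are assumptions, and a real deployment would usually live in the web server or CDN rules rather than application code.

```python
# Hypothetical sketch: answer requests whose User-Agent looks like a known
# AI crawler with HTTP 402 "Payment Required" and a pointer to the donate
# page, and serve everyone else normally. The crawler list is illustrative.

AI_CRAWLER_SUBSTRINGS = ["gptbot", "ccbot", "claudebot", "bytespider"]

DONATE_URL = "https://pvcy.org/donate"


def is_ai_crawler(user_agent: str) -> bool:
    """Very rough User-Agent matching; real setups use maintained bot lists."""
    ua = user_agent.lower()
    return any(bot in ua for bot in AI_CRAWLER_SUBSTRINGS)


def respond(user_agent: str) -> tuple[int, str]:
    """Return an (HTTP status, body) pair for an incoming request."""
    if is_ai_crawler(user_agent):
        # 402 is a reserved, rarely used status code, which makes it a
        # tidy fit for "pay for the content you are hoovering up".
        return 402, f"Payment Required. Support this site: {DONATE_URL}"
    return 200, "<html>the normal page</html>"
```

In practice the same effect can be achieved with a couple of server configuration rules matching the User-Agent header and returning a custom 402 page, but the logic really is this simple.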

43:32.99
Speaker
And the reason that example is newsworthy, actually, is that this summer, Cloudflare, the firm that basically helps websites stay alive, which our techies also refer to as a man in the middle, started to do something similar. When it comes to these crawlers, they're trying to create a system where the crawlers can be forced to pay for the consumption of the content,

44:00.16
Speaker
or the hammering of the web servers. And that will be interesting, if it plays out better for Cloudflare and its users than it did for us. Well, there are also copyright issues and other things when it comes to paying to use that content. We're not suggesting, by the way, at any point, that any AI web crawler that does donate to PI in exchange for hammering our servers retains any legal copyright rights or anything.

44:26.90
Speaker
Technically, PI's content is under a Creative Commons license, which is a ShareAlike and, I think, Attribution license. I can't 100% remember the numbers involved, but it is a Creative Commons license, so we have less of a legal copyright argument than other people. But yeah, it is interesting to see how people are trying to work out where the balance is and how to make it reasonable and fair. But I don't think we're going to understand the damage that AI has been doing structurally, to the structure and infrastructure of the internet, for a long time. And if you're running, you know, like,

45:02.36
Speaker
one of the myriad of small, kind of fun websites and forums that the internet was built on the back of, and your servers are being hammered, you don't necessarily have the resources to keep it up, just straightforwardly.

45:15.42
Speaker
And that's incredibly... Yeah.

45:24.76
Speaker
Okay, so let's end with one final story, it being the end of summer and back-to-school time. By the time you're listening to this, in the Northern Hemisphere, many schools will have reopened and kids are going back.

45:40.84
Speaker
And Caitlin, you found this last story. This is a super fun one, and it's actually got a lot longer history than just this one story. But in Florida, schools are about to start a pilot program of deploying drones.

45:58.01
Speaker
It's about $1,000 per month for a school of 500 students. And they're going to deploy the drones for school safety and preventing like an active shooter situation.

46:09.66
Speaker
The drones can fire pellets to knock down suspects and they're very difficult to shoot down. They're in a box on campus. A silent alarm gets triggered, the drones get deployed and then they can start shooting at people.

46:23.58
Speaker
Is this sane? No. A few years ago, there was a company called Axon, which suggested deploying taser drones in schools for a very similar reason. Its entire ethics board resigned, and it ultimately dropped the idea, because it is, I mean, bizarre, unreasonable.

46:41.93
Speaker
It's just bizarre. I mean, according to the CBS News article, the governor of Florida, Ron DeSantis, has approved just over half a million dollars in the 2025-2026 budget to fund a pilot of these drones in three Florida school districts.

46:59.02
Speaker
It blows my mind that these kinds of, like, what are functionally military technologies are being deployed in schools in response to a situation which happens almost nowhere else.

47:12.29
Speaker
Like, this is the thing, I think, that drives me crazy. This doesn't happen anywhere else in the same way, at the scale that it happens in America. But rather than tackling any of the issues which have led to this situation, Florida has decided that the most sensible thing to do is deploy drones that can shoot at people with powder pellets, which can knock people down and can hurt them. That doesn't seem to me like it's going to make children safer in schools. It doesn't seem to me it's going to make children feel safer in schools.

47:43.63
Speaker
It seems to me that it is a sticking plaster on a terrifying and unreasonable problem, which requires much larger scale structural changes rather than these bizarre things.

47:57.61
Speaker
bizarre nonsense technologies. So this is what I find most interesting about this article from CBS, because it ends up quoting a school principal. And the school principal starts off how you started off, Caitlin, where he says, and I quote, school safety used to be a fire drill,

48:16.76
Speaker
fire alarm, or tornado drill. This is Florida, of course, once a year. And he goes on to say, it's now active shooter drills, hostage drills, things of that nature. It's turned into traumatizing events for students and parents. And I'm going to pause the quote for a second, because that captures what you're saying, which is that there used to be a normal.

48:36.00
Speaker
And now, for some reason, there's this new normal. But I'm going to have to finish his quotation as he finished it. He goes on to say, we send our kids to school to be safe.

48:48.24
Speaker
I think this is another step in the right direction to ensure the safety of our kids. End quote. So the principal says, yeah, things used to just be slightly crazy. Now things are very crazy.

49:00.60
Speaker
But because they're like that, it's okay to take this extra crazy step. And it's like, when do we...? Ha. Yeah. You know.

49:11.25
Speaker
It is a serious and scary problem. And, like, America has not found a way to deal with it. Does that mean deploying these drones in schools is the next reasonable step?

49:24.74
Speaker
No. Like, I think that's pretty clear. It's a pilot program. We'll see what happens. I would be shocked if someone didn't get hurt at least once because of these drones. I would be surprised. The hope is that they're never deployed, as is always the hope with these measures: that they never become necessary.

49:42.06
Speaker
But yeah, it's just a horrendous situation. The logical disconnect is just very frustrating. And, I mean, the taser drones were also mad.

49:53.52
Speaker
So I don't want to end this edition with Caitlin being very frustrated and miserable, you know, summer vacations aside and all that. Can you think of any interesting news story that you want to end on, that was fun or curious or exciting or a positive direction?

50:18.79
Speaker
I mean, we're taking action, at least, on facial recognition in the UK, because it turns out that the police have been abusing several different databases in the UK for facial recognition searches when they shouldn't have been, because there is no legal basis for facial recognition, no legal system or legal regime or safeguards or approval process required.

50:42.25
Speaker
So we're taking action on that, which you can find out more about on our website. And we're also, as of, I believe, today, submitting a hundred-page complaint to the ICO in the UK. That's the UK data protection regulator. Yes. Sorry. Yes. About sketchy UK immigration algorithms and algorithmic management.

51:00.39
Speaker
And, you know, we've got an endlessly increasing number of legal battles and complaints around the world that we're currently exploring or participating in. So there's that, I guess. Keep an eye out for good news. Keep an eye on our website, if it's not being hammered by AI crawlers. But this is all part of the same theme we've been talking about for all the stories, which is that bad stuff is happening. You'll read about specific circumstances in specific countries, but what we see is that this is happening across the world. And so we're taking the fight in the various ways we can, and writing reports on our website that will hopefully be crawled but not destroyed.

51:41.69
Speaker
And you can read more about them by visiting pvcy.org. But it's going to be an interesting September, October, November, December. Because of our worldwide listeners, I don't want to say what seasons those are.

51:55.37
Speaker
But yeah, we have a lot of interesting work coming out along the lines of all the stories we covered today. So we look forward to talking about those more in the future.

52:16.37
Speaker
Thanks for listening. You can sign up to be the first to learn more about our work and the upcoming work we talked about at pvcy.org slash pod sign up. And I know we discussed a large number of stories in this edition; we will include links to the relevant articles and information in the description wherever you're listening, or on our website at pvcy.org slash tech pill.

52:41.70
Speaker
Don't forget to rate and subscribe to the podcast on whichever platform you use. Music is courtesy of Sepia. This podcast was produced by Max Burnell for Privacy International.
