The disinformation economy isn’t invincible: meet the two women working to end it

Nandini Jammi and Claire Atkin of Check My Ads want to make publishing lies unprofitable, while also helping advertisers protect their brands

Episode Summary

By now, a lot of people are aware of and concerned about the lies, hatred, and rumors that are so rampant on social media. The social media companies are definitely responsible to a large degree for letting this happen. But they don’t deserve all the blame. There’s another sector of the internet economy that has been promoting and even paying the merchants of hatred and lies—the advertising technology business, better known as “Ad Tech.”

What is ad tech? Essentially, it’s the software and services behind the countless ads that are all but inescapable on the internet, delivered through a complicated system of exchanges and instantaneous bidding. These companies are known for tracking and spying on people’s browsing habits, but in many cases they are also misleading the advertisers themselves about their ads: who sees them and where they appear.

The long and short of it is that companies who advertise are often unaware that their advertising budgets are being used to subsidize bigoted and hateful content.

How this all works is very complicated, but the good news is that there is something we can all do to help stop these shady practices. In this episode, we feature Nandini Jammi and Claire Atkin, the co-founders of Check My Ads, a new organization working to help businesses and individuals put an end to online bigotry.

The archived video stream of our conversation is below; a transcript of the edited audio follows.


Transcript

MATTHEW SHEFFIELD: Thank you for joining me today, ladies.

CLAIRE ATKIN: Hi, good to be here. 

NANDINI JAMMI: Thanks for having us. 

SHEFFIELD: As I said in the intro, this is a complicated subject. And so I want to maybe go back a little bit in the history here of how web advertising works and how it used to work. Because ad tech is complicated, and a lot of people don’t even know it exists.

So let’s maybe start with the background. The way that advertising used to work on the internet was that people who had websites [00:02:00] would sell space on their website to advertisers and they had ad salespeople and things like that. And gradually Google and some other companies came along and said: ‘You don’t have to do that. We can sell that space for you.’

And that’s how the technology behind ad tech began to emerge. But it’s gradually become more and more complicated, relying on a system of instantaneous bidding. Do you guys want to do a little overview of how that works?

JAMMI: So the way that advertising works, and the way that it has worked for the past 15 years or more, is that advertisers no longer place ads on the internet themselves. What happened is that they handed over control first to Google. Google was the first programmatic advertising company on the market, and changed the game, obviously.

And they hand over their ad placements to Google and other ad exchanges, which basically do the work for them. So what used to be manual is now programmatic. And what these companies do is place your ads across the internet for you and claim that it’s more efficient and gets you better results because they’re placing these ads at scale.

Now what happened is that over time, these ad companies started to invite more publishers into their inventory, and that made it very difficult for advertisers to really understand where their ads are going. If you have a site placement list of over 400,000 websites, who’s ever going to check what those websites are and what kind of content they’re peddling?

So what really happened is that over time, large brands started to scale up their advertising, spending millions, and collectively billions, of dollars, and moved their ad spend from other channels toward digital advertising. And they no longer know where their ads are appearing on the internet.

SHEFFIELD: Yeah, that’s right. And the way it works is through this bidding, instantaneous bidding. You want to talk about that Claire, a little bit? 

ATKIN: Sure. It’s just like the stock market. It’s called real-time bidding, or RTB, and it happens instantaneously. What happens is that the advertisers, who are being represented by ad tech companies, will bid on a placement, and it’s always on a per [00:04:00] placement basis. That is, every ad unit you see running around the web after you, every single one, is getting bid on. And then the person who pays the most wins.
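To make that mechanic concrete, here is a minimal, purely illustrative Python sketch of a real-time bidding auction for a single impression. The bidder names and prices are invented; real exchanges attach targeting data to the bid request, enforce timeouts of roughly a tenth of a second, and take fees along the way, but the core is a highest-bid-wins auction run for each individual placement.

# Illustrative sketch of a real-time bidding (RTB) auction for one
# ad impression. All names and numbers here are hypothetical.
from dataclasses import dataclass

@dataclass
class Bid:
    buyer: str    # the DSP bidding on an advertiser's behalf
    price: float  # offered price in dollars CPM (cost per 1,000 impressions)

def run_auction(bids):
    """Return the winning bid for one placement: highest price wins."""
    return max(bids, key=lambda b: b.price) if bids else None

# One bid request goes out for a single placement on a single page view.
bids = [
    Bid(buyer="dsp_a", price=2.10),
    Bid(buyer="dsp_b", price=3.45),
    Bid(buyer="dsp_c", price=1.80),
]
winner = run_auction(bids)
print(f"{winner.buyer} wins the impression at ${winner.price:.2f} CPM")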

SHEFFIELD: So, the way it works, as you guys said, a lot of companies that are trying to advertise, they have no idea where their ads are showing up.

And you’ve got multiple companies now out there that are operating these things called exchanges. And basically they can be a way to mask publishers and things like that. And so that is something that is a problem for a lot of brands because their ads are ending up on websites that promote hatred or white nationalism or religious bigotry, things like that.

But if we could Dini, maybe talk a little about what you were doing before you guys started Check My Ads. 

JAMMI: Sure. In 2016, I was a growth marketer for a small tech company, and right after the election, I visited Breitbart for the first time just to see what it was all about. And what I saw as a marketer wasn’t just the headlines but the fact that it was plastered in ads from some of the biggest brands in the world. And this was jarring to me, because I knew that these brands wouldn’t want their ads placed there if they knew about it. And as a marketer who had run ad campaigns myself, I knew it was likely that they just weren’t checking the site placements module within their Google dashboards.

So knowing that they probably didn’t know where their ads were appearing, I began working on a campaign called Sleeping Giants, and together with my partner at the time, we began to alert companies that their ads were appearing on Breitbart, with a screenshot of their own ad next to some of these really incendiary and hateful headlines.

These advertisers were just shocked. They never knew that their ads could be appearing on that type of content. We would hear back from advertisers within minutes and hours. And people could see that this was really getting results.

Advertisers were blocking Breitbart instantly, so people joined us and started taking screenshots with us, and we were able to turn this [00:06:00] little action into a crowdsourced campaign. We grew to over 400,000 followers, and we were able to confirm that over 4,000 advertisers blocked Breitbart from their media buy.

During this time, over 30 ad exchanges also dropped Breitbart. When an ad exchange drops a publisher, that publisher loses access to thousands of ads and millions of bids and impressions. So that’s what happened. And then over time, we just kept running this campaign, and I started to ask myself, at what point do we shut this thing down? How much longer do we have to keep alerting advertisers for them? Why can’t they get a handle on where their ads are appearing?

Because I wasn’t just seeing ads on Breitbart. I was also seeing ads on similar websites, like The Gateway Pundit. So what Claire and I did, first of all, was decide that we needed to understand what’s going on in this ad tech industry, because this is something the industry has the resources to take care of.

They’re talking about it. They’re holding conferences on brand safety, but they’re not fixing the problem on a very basic level. So we dug into the ad tech industry and then we started a newsletter called Branded so we could share what we were learning. Because what we were learning was really shocking. And we believed that marketers didn’t know this information and we needed to bring it out to them. So we launched Branded, our newsletter, and then after about six months or so, we decided to go from being activists and outside agitators to actually working with companies themselves, to help them get ahold of their ad buy.

I’ll let Claire jump in and talk about our nonprofit and how we got to that. 

ATKIN: Yeah. So we’ve been working for a year and a half with advertisers, mostly Fortune 500 advertisers, because they’re at the intersection of having huge ad spends and caring a lot about their reputation. So we work with them to literally “check their ads.” The CTA is right there in the name of our company.

And what we found over and over again is that we’re having the same conversation. ‘How do we check our ads for bigoted content or xenophobic content [00:08:00] without being political? Or how do I have this conversation with my team? I’m in comms, but brand should be involved. Marketing should be involved. The advertising department itself needs to be involved. How do I bring everyone together?’

So we worked with a ton of companies. And we spoke to over 200 teams about how to check their ads, what was at stake, and what brand safety really was. And then we found that we kept repeating ourselves. And so we decided that we needed to do this work a lot more publicly and we needed to get everyone involved. This is not just an advertising problem. It’s not a brand safety problem. This is a community safety problem.

JAMMI: Over the past year or so, I kept tweeting. I kept seeing ads on disinformation and hate sites, and I just couldn’t stop myself. And I realized that the Sleeping Giants movement was so powerful. It’s actually a global movement. We started the one here in the U.S., but there have been really successful Sleeping Giants movements in Canada, in France, and in Brazil. That’s a really big one right now.

This is a global issue. So us working with a handful of clients at a time isn’t going to solve the problem at large; it’s not going to help us get to the bottom of this.

One of the reasons we wanted to start a nonprofit was to be able to bring the public and advertisers into the fold. There is currently no trade organization, or any organization at all, that’s focused on advertisers and consumers. In fact, consumers have no voice in the way that advertising works today. And what we wanted to do is be their voice and be advocates for them in the industry. So what we’re doing is coming to the table with our own chair, and we’re banging on the door of the industry and asking them to start being accountable.

SHEFFIELD: Yeah, and you sent me a PDF that talks about part of the problem and how it works. And I’m going to put it up on the screen here in a second. Just talking about how ad tech works and why it works with disinformation. So basically just to review, it’s a complicated system. So can you guys go through, just walk through it for us? 

ATKIN: [00:10:00] So I have to say, because there’s going to be ad tech folks on this call. This is oversimplified in a huge way. 

SHEFFIELD: Yeah, for sure. 

ATKIN: You’ve got the advertisers on one side and consumers on the other. The advertisers want to reach the eyes of the consumers. So how do they get there? First, they usually employ a media agency, and then the media agency says, don’t worry, we’ll handle your campaign.

And they work with either a trade desk or a media network to do direct buys. With the trade desk, we’re talking about real-time bidding: the agency gives all the creative to the trade desk, and then the trade desk sends it out to the real-time bidding exchanges. So then we’ve got a slew, and I don’t want to over-complicate it, but we’ve got a slew of ad tech companies in the middle here. Some are called DSPs, or demand-side platforms. They represent advertisers within the real-time bidding exchange. And then on the other side, we’ve got supply-side platforms, or SSPs. Now, supply here means the publishers. So those are the ones who represent publishers to the exchanges.

It goes through this real-time bidding process, and then the ads end up, on a per-placement basis, on all of the publishers. We included some publishers here, but they’re all examples. There are hundreds of thousands of them. And some of them are disinformation. So every time you see orange here on this map, there’s a chance that it’s disinformation, and that’s where the ads end up causing brand safety crises.
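Here is a simplified, purely illustrative Python sketch of the chain described above. Every company name is hypothetical, and real campaigns pass through more intermediaries and fees at each hop; the point is only the order in which an ad, and the money paying for it, moves from advertiser to publisher.

# Simplified sketch of the programmatic supply chain described above.
# Every name here is hypothetical; the point is only the order of the hops.
supply_chain = [
    ("advertiser",   "BrandCo"),            # wants to reach consumers
    ("media agency", "AgencyCo"),           # runs the campaign
    ("DSP",          "DemandSidePlatform"), # bids on the advertiser's behalf
    ("exchange",     "RTB Exchange"),       # runs the real-time auction
    ("SSP",          "SupplySidePlatform"), # represents publishers' inventory
    ("publisher",    "example-news.com"),   # the page where the ad renders
]

for role, name in supply_chain:
    print(f"{role:>13}: {name}")

# The advertiser typically sees only aggregate reports from the top of this
# chain, not the final list of publisher pages at the bottom.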

SHEFFIELD: Yeah. And then I’m going to just scroll up a bit here in the file. But basically, this is how disinformation works on the publisher side. You guys in the PDF here, you call ad tech an ATM for the disinformation economy.

And there’s three steps: (1) pump out fake news, (2) monetize and multiply with ad revenue and (3) overwhelm the public and the media ecosystem. So on the first step here, there’s the phrase that I think a lot of people are familiar with, that [00:12:00] a lie travels around the world while the truth is still getting its pants put on or something, some variant of that.

And that’s actually really true, especially on the internet, because there are a lot of people out there who don’t know very much about politics, political actors, or governmental policies, so they’re pretty much willing to believe anything that confirms their biases. And social media companies, that’s where they come into play here, in between number one and two, because you have a lot of these websites out there that will just make up stuff, or they’ll exaggerate things or misconstrue things and not bother to confirm whether they’re true.

And so these websites, like Breitbart, like Gateway Pundit, like The Post Millennial, like a lot of other ones, they exist to either push an extreme political agenda or simply make money. Like they don’t care what they’re doing with it. And then step two, though, is that ad tech is basically paying for this.

And they’re doing it in a way that advertisers have no idea that this is happening to their brands. And so you guys have since worked on trying to get people to understand how their dollars are doing this. And what kind of response have you gotten from people?

JAMMI: It depends what kind of people, right? 

SHEFFIELD: From, sorry. 

(Laughter) 

SHEFFIELD: I’m sorry, yes! We’ll talk about on the other side, from the advertisers first, we’ll go there first. 

JAMMI: They’re into it. They love it. One of the overwhelming feelings advertisers and marketers have on a personal level is a sense of embarrassment that they don’t know as much about this industry as they should.

And the truth is that nobody understands how the ad tech industry works, because that’s how it was designed. It was designed to be full of jargon, and numbers, and KPIs that the average person is just never really going to understand. And we’ve talked to ad tech professionals who’ve done this their whole lives, and they still don’t know what half these companies do.

What we’re doing is empowering them. [00:14:00] When we ask questions in Branded, and on Twitter, and on social media, and really dig in, it empowers marketers to start digging into their own data and asking questions on their own.

And in fact, we’ve had advertisers who read our work and then go and reach out to their ad tech vendors. And they start asking questions: ‘Hey, can you give us access to our log level data? This data belongs to us, because it’s our money that you made these placements with.’

And they’re realizing right alongside us that the ad tech industry doesn’t want them to have that data. What we’re doing is creating a really powerful movement for conscientious marketers and advertisers to join us and to really start to push back on an industry that has seized control of advertising and media budgets, and refuses to give it back.

ATKIN: The advertisers that we’ve worked with who have checked their ads, I don’t think they’ve ever not been appalled at what they’ve found. We had one team, they checked their ads and they were like: ‘Oh boy, we spent a lot of money on the Gateway Pundit and a few other really brand-unsafe outlets that traffic in racism, bigotry, xenophobia, the (2020 election) big lie, the whole gamut of scapegoating marginalized people.’

And they went to their VP of marketing with the report, and they reported back to us. The VP of marketing was like: ‘Oh my God, our ads are up the asshole of the internet. How did this happen? How do I fix it?’ He was so shocked. And this is what we find again and again. These teams check their ads, and they just look at the site list alone.

They don’t even have to look at the log level data at this point. They just want to see where their ads went and how much money they spent. And they’re horrified. 

SHEFFIELD: And just to go back to your graphic here a little bit. So that’s how it actually works, [00:16:00] but from most advertisers’ standpoint, they don’t see any of those things. All they know is that they paid somebody some money and were told their products would be displayed. And what they end up with is that they are subsidizing hate websites.

JAMMI: Quite often what they see is just the high level KPIs. So marketers care a lot — 

SHEFFIELD: And what is a KPI for those who don’t know? 

JAMMI: Key performance indicators. So they’re looking at the top level sort of numbers that they need to report up to their boss and say: ‘We did a good campaign.’

So they’ll look at impressions served. So that’s, every time an ad is served and that total number there’s clicks and the aggregate conversion. So like the high level number of how many conversions that a campaign yielded. 

And when we did a case study last year for a company called Headphones.com, a small business based in Canada that sells high-end headphones, we found that they were using a retargeting company called Criteo. Everyone has been served an ad by Criteo. It’s a very major ad platform. And what they do is follow you around the internet. So you go to a website and you get cookied, and then they follow you around so you don’t forget what you saw.

So the CEO of that company, his ads were ending up on disinformation websites. So we did an audit for him, and I’ll just skip over the boring stuff: he was spending $1,200 a day, which is a good amount of money for a small business, and 95% of that spend was garbage.

And we were only able to get him that data after he pulled teeth with the team at Criteo to get the detailed dashboard he wanted, because all they were showing him were those top-level stats. And he was happy with those numbers at first, right?

Like it looked like a pretty good campaign. He was making money. His ads are being served, so everything looks [00:18:00] good on a high level. But when you dig in: ‘Wow, my ads are being served on disinformation. My ads are being served on garbage spammy click-bait websites. My ads are being served on websites that serve audiences that aren’t even in our shipping area.’

So once we were able to block all that stuff out, his spend went from $1,200 a day to $40 a day, and there was no change in performance. 
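A purely illustrative Python sketch of that kind of placement-level audit follows. The site names and the per-site dollar split are invented (only the $1,200 and $40 daily totals come from the interview); the idea is simply that once spend is broken out site by site, the wasted placements can be blocked.

# Hypothetical daily spend by placement, with low-quality sites blocked.
# All site names and per-site amounts below are invented for illustration.
daily_spend = {
    "legit-review-site.example":   25.00,
    "local-news.example":          15.00,
    "spammy-clickbait.example":   480.00,
    "disinfo-outlet.example":     410.00,
    "out-of-market-site.example": 270.00,
}

blocklist = {
    "spammy-clickbait.example",
    "disinfo-outlet.example",
    "out-of-market-site.example",
}

before = sum(daily_spend.values())
after = sum(cost for site, cost in daily_spend.items() if site not in blocklist)

print(f"Spend before blocking: ${before:,.2f}/day")
print(f"Spend after blocking:  ${after:,.2f}/day")
# The audit described in the interview found a similar pattern: most of the
# budget was going to placements the advertiser never would have chosen.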

SHEFFIELD: Yeah, exactly. And one of the other layers to ad tech, adjacent to the disinformation, is that there are a bunch of companies out there that specialize in fake clicks and fake views to pump up advertising numbers. Sometimes they work for publishers, sometimes they work for exchanges, it depends. But basically they operate what people call click farms, where they literally have a bunch of computers sitting there simulating a person browsing the internet and then clicking on ads.

And so the revelation that this was happening sent a shockwave throughout the ad industry. But I don’t think the general public ever really found out about it. What would you say?

JAMMI: I think it may have sent a shockwave maybe to advertisers, but everyone in the ad tech industry knows that.

SHEFFIELD: Oh yeah, yeah. No, the advertisers is who I’m talking about. Yeah, the ad techs, yeah, they didn’t care. 

(Laughter) 

ATKIN: Yeah. I don’t think the public concerns itself with it; it’s such a broad brush. Most people don’t care about digital advertising. A lot of us use ad blockers, although the percentage is still definitely not the majority, but in general, advertising is just an annoyance of the internet.

We don’t think about all of the different ads that are on the news articles we look at, because we’ve gotten used to them. We just have blinders on, and what Nandini and I are trying to say to everyone is that we need to start paying attention to the ads that are on publishers that are causing community harm. [00:20:00]

These publishers make more money the more hate bait they put out. And so we need to start paying attention to these digital ads as a tool of the disinformation economy, as a funder of the disinformation economy.

SHEFFIELD: Yeah. And these publishers of disinformation and hate, they hate you guys quite a bit. And they’re accusing you of censorship. So is what you’re doing censorship? What would you say to them?

JAMMI: I think this is the ultimate expression of the free market, right? 

SHEFFIELD: Which supposedly they support. 

JAMMI: Which supposedly they support. I fully support the right of Breitbart to exist and to write whatever they want and say whatever they want. I also support the right of advertisers to not associate with it. And advertisers are beholden to a whole host of stakeholders: their customers, their employees, their investors, and so on. So they have a right to build the kind of business that they want. And when it comes to the concept of censorship, I just don’t understand what they’re talking about.

They’re all free to continue publishing. They’re all free to continue feeling their feelings and expressing their opinions, but they don’t have a right to dupe advertisers. And I think advertisers right now are being duped, because they don’t even have the knowledge and the information that they need to say: ‘Oh, my ads are appearing on this site.’ Like they don’t even know. 

So they have the right to know where their ads are appearing. They have a right to know how their money is being spent, and they have a right to pull out if they so choose. 

ATKIN: And from an industry perspective, the ad exchanges themselves, the people who collect publishers in order to sell them to advertisers, on every single one of their websites they say: ‘We only have premium quality publishers. We adhere to brand safety standards. You’ll only get the very best publishers if you work with us.’ And [00:22:00] so what they say is, we would never let anyone in who is a part of what is called the brand safety floor. The brand safety floor is like the bar on the floor for whether or not an advertiser would want to be there.

That means that hate speech, disinformation, the promotion of drugs, the promotion of violence, no advertiser wants to be there. And then they turn around and they work with these companies that promote hate, violence, and bigotry. And what we’ve done for the last two years is just point that out. Just to say: ‘Hey, according to your own standards, according to your own marketing materials, this should not be in your inventory.’

Advertisers should not have to tiptoe around these publishers that are in your own inventory, because you say that they’re not even there. And so this really doesn’t add up.

SHEFFIELD: Yeah, no, that’s a good point. These disinformation publishers are trying to make a false conflation of free speech versus subsidized speech.

So in other words, they’re trying to tell people they have the right to be paid to lie, to create bigotry, to spread falsehoods about the election. That’s what they’re trying to say, but nobody has the right to free money from someone else. Just to go back to what you were saying earlier, Dini, this is the free market: the advertisers don’t want to pay these websites. In many cases, they don’t even know that they exist. And then when they find out, as you said, that advertiser called it the asshole of the internet. And meanwhile, and this is not within the scope of Check My Ads, but I would add that these same organizations and publishers that are constantly claiming to be concerned about censorship say nothing about actual government censorship that is happening right now in America’s school boards. So there are, unfortunately, all kinds of teachers who have been [00:24:00] fired, principals who have been fired, for daring to teach about racism, for daring to use books that hurt people’s feelings.

Actually, just this week, there were two school board members in Fredericksburg, Virginia, who said they wanted to burn books that were in their school district’s libraries. That’s the censorship that is happening in this world. And for you guys to pretend that you’re about protecting free speech? No, you’re not. You’re about subsidizing your own speech.

JAMMI: Yeah. And actually tying that back to the scope of Check My Ads is the idea that these issues have been seeded in the media, the mainstream media, by disinformation outlets. Critical race theory is not being taught in elementary schools. This whole thing is an organized strategy to sow more division. There’s no real basis in reality. And what I see, and what Claire and I see when we’re researching disinformation outlets is, what they do is they create one story and then they link. Then they create another story. They link to each other, and then they bring in a third disinformation outlet, and they all just link to each other and create this huge frenzy. And as you so eloquently said before, the truth hasn’t even put on its pants yet. So it’s really hard to keep up with this stuff.

And then by the time you’ve caught up, it’s already become a major media story. And this is the kind of manipulation, media manipulation that we need to document that advertisers need to understand is not really an expression of free speech. It’s a propaganda effort. 

One of the things that we do, and that we are researching, is disinformation rings, fake news rings, and how they ratchet up and ricochet across their echo chamber and make it seem like something is much more of an issue than it actually is.

SHEFFIELD: Mm-hmm. Yeah. I think there was some understanding of that in the general public, after the 2016 election when people were looking at it in the context of Russia or Macedonia or things like that. But these people in America who were doing this stuff, they never went away. Like they are still out there. And they’re still doing this stuff.

Why don’t we talk a little bit [00:26:00] about Check My Ads specifically about what the components are that you guys are putting together and how people can get involved. 

ATKIN: So at Check My Ads, our goal is to cut off the funding for disinformation by sunlighting what is going on within the ad tech stack. All we’re doing is showing advertisers and the public what is going on, and then the advertisers and the public can make decisions based on that information, just to be more informed. So we do two huge things. We research the hell out of what’s happening. And then we publish using Branded, which is our newsletter.

We publish stories every couple of weeks and those stories get results. Every time we show what is happening within the ad tech stack, we show the relationship between ad tech and disinformation outlets, and we’re not talking about just a general relationship. We’re talking about business contracts.

We’re talking about shared bank account numbers. These relationships are serious. They’re contractual. Every time we mention one, someone blocks something.

JAMMI: Or drops something. 

ATKIN: Or drops it.

SHEFFIELD: And then in terms of people, how can they get involved?

JAMMI: Oh yeah. So there’s a couple of ways you can get involved. No matter what, please sign up for Branded at checkmyads.org/branded. That’s where we put out our newsletter every two weeks with a new investigation or something that we found. And we are able to very effectively demonetize disinformation through our research.

The other way that you can get involved is by joining us as a Checkmate. So we have a membership level for our supporters and our fans. The memberships start at $10 a month, and this money goes directly into research. And we have assembled a team of just absolutely brilliant researchers who are helping us to dive deeper into the ad tech supply chain.

And certainly as [00:28:00] we chase down the bad guys, they’re burrowing deeper and deeper into the ad tech supply chain. And that’s why this research is so important and necessary.

SHEFFIELD: And I guess for people who may not be as inclined, let’s say, as interested in understanding how all these networks work, I’m going to put up on the screen an article that came out last month. Because this is complicated material to some degree, there’s a thought that it doesn’t really impact you necessarily, but the reality is that it does.

So like in 2019, almost all of the top Christian American Facebook pages were run by foreign troll farms. So basically, people who were going onto Facebook to get some religious messages in their feed were just being manipulated by a foreign operation existing solely for the purpose of making money, one that obviously wasn’t really interested in promoting Christianity or anything like that.

This is just one of the ways that it can cause people to be misinformed or manipulated. And the same thing is true in politics: a lot of political pages out there, like one page called ‘Being American,’ were run by people in Macedonia or places like that. What are some of the other examples that are particularly egregious in this regard that you can think of?

ATKIN: So we have a lot to say about this kind of thing. First of all, generic Facebook pages about patriotism or religion are the kind of pages that we see peddling disinformation a lot, or linking to disinformation.

Now, there has been a lot of discussion about Facebook in the news for the last few years, but especially the last month or so, and that’s not where disinformation outlets make money. Disinformation outlets make a lot of money thanks to Facebook, but it’s not on the Facebook platform. So what happens when [00:30:00] you’re browsing Facebook and you see a disinformation headline is that you either scroll past or you click it.

Now, if they’ve baited you enough (it’s like clickbait, but we call it hate bait), you’re going to click just to understand what’s going on in the story. And that will take you to an external site, a site off of Facebook, hatespeech.org, I’m making that up. And that’s where they get the payout from the ads.

So it is their incentive to dupe you into clicking as much as possible from these Facebook groups. And that’s the most pernicious part of the disinformation economy, because that’s the part that fuels their engines. 

SHEFFIELD: Yeah. So do you have any other examples or like specific ones you’re thinking about, ones that you’ve encountered—

JAMMI: I mean the Facebook top 10?

SHEFFIELD: Yeah. (laughs) 

Well, talk about that, yeah, just for people who don’t know.

JAMMI: For those who don’t know, there’s a Twitter account called Facebook’s Top 10, that every day auto-posts the top 10 performing links and posts on Facebook. And it’s usually the same people.

A lot of familiar names on that. There’s Breitbart, there’s Ben Shapiro, there’s The Daily Wire, there’s Dan Bongino, and a handful of others. And we all know that there’s a pattern here. The Daily Wire is one that we wrote about recently, because this is the type of site that has created a funnel of disinformation for its audience.

So they reel you in at a high level, like it doesn’t seem so bad in the beginning. But as you engage more and more with their content, at first they’re just asking: ‘What is transgenderism? Is it a thing?’

And then it slowly goes deeper and deeper into: ‘Is this person really a woman?’, putting the words woman and transgender into scare quotes. And then once you’re on their email list, they send you content that’s essentially designed to further radicalize you. I think Candace Owens recently did a session on what makes a woman, and it was just using pictures of actual transgender people who are minding their own business. That’s the kind of stuff that creates real engagement, that creates outrage. It creates a [00:32:00] false sense that something’s wrong in the world and we need to address it. And that’s the reason that content does so well. It’s simply not public interest reporting. It is propaganda and it is disinformation.

SHEFFIELD: Yeah. All right we got a viewer question. This is from one of our Flux members on Facebook. She asks: “Does Check My Ads have a way to combat disingenuous advertisers on search engines, like Google, for example?” You want to talk about that a little bit or what you guys are doing on that or planning to do?

ATKIN: That’s such a good question, thank you so much for it. But we are super hyper-focused on how publishers make money, not on the advertising itself. That’s a whole other kettle of fish, and we would love someone to take that up.

SHEFFIELD: But to Ty’s [the viewer] point, though, I would say, yeah, Google definitely is a big part of this problem. And one of the ways that they contribute to the problem is their browser, Google Chrome. Google Chrome is spyware, that’s straight up what it is. And I always tell people not to use it.

It points to a problem in a lot of tech journalism, I feel like, that people who write about technology are not advocating for privacy for their audience. Like I always see people in articles or videos saying: ‘Yeah, and then I loaded up 15 Chrome tabs here, and it was just fine.’ Please stop telling people to use things that spy on them; that’s just the minimum of what you could do.

JAMMI: Check My Ads Institute is going to be around for a long time; this is what we launched as. In our Twitter bio, we say that our current mission is dismantling the disinformation economy, and that in and of itself is a massive job.

We need to rally advertisers. We need to rally the public. We need to organize them. It’s a huge educational and awareness campaign, and that is going to take time and resources, but that is not all that we hope to do with Check My Ads Institute over [00:34:00] time. We certainly agree with you on the dangers of surveillance advertising, and the more that we’ve studied the advertising ecosystem as it works today, the more we’ve realized that we cannot have a functioning society under surveillance advertising. We need to dismantle that too. We have our eye on that, and as we grow our organization, we plan to put our hands in that pot as well.

And that is part of the roadmap. Ultimately, that means we will be holding the hands of advertisers as we move away from surveillance-based advertising, and away from what really is an addiction of marketers to data, to consumer data, and to this sort of unchallenged concept of stalking and monitoring and logging everything that our customers do in order to know as much as we can about them and their intimate lives. And we’ll help them start to think about alternative ways to do marketing. That’s just something that has been lost in our industry.

Today, the average marketer, someone who’s just come out of college and entered the industry, doesn’t know a world without programmatic advertising, or a world without surveillance-based advertising.

And so that’s something we’re going to have to build up again; that’s a muscle we need to build. We need to bring marketers into the room together to have those discussions and to start thinking about it. I talked about KPIs before, the clicks, and the views, and the impressions, and the aggregate conversions. What are other ways that we can do our jobs without these invasive vanity metrics?

How can we transform the digital marketing discipline in a way that is respectful to consumers? 

SHEFFIELD: Yeah. And it’s important to note also here that a lot of this is called behavioral advertising, and it basically does involve surveillance, but the reality is that it’s not even that accurate in many cases.

So for some reason, ads that I see on Twitter think that I work in cancer research. I have never worked in cancer research or any sort of medical research, but I guess because I tweet about medical disinformation and debunk stuff, they seem to [00:36:00] think I work in cancer research, and I don’t. And that’s just one example, but I think everybody has lots of these examples. But it does go back to these privacy invasions.

I remember there was a congressional hearing, last year I believe it was, where Mark Zuckerberg, the CEO of Facebook, was asked whether Facebook was spying on people’s phone conversations, or using their phones to spy on them, to serve them advertising. And the member of Congress had no concept of ad tech and the surveillance economy. He didn’t seem to have any idea that that’s how you get an ad for something you happened to mention. It wasn’t that Zuckerberg was spying on you; it’s that data brokers were looking at your credit card purchases and your browsing history. That’s how you got this stuff.

And the more people understand about how this works, the better they can support solutions for more privacy, and understand that this is not an inevitability of the internet. There was a time on the internet when there wasn’t surveillance advertising, when there weren’t cookies following you around everywhere and people selling your data to the highest bidder.

These have real-world consequences, especially if you live in an authoritarian country, where they can use advertising tech against you. I forget which country it was, but there was some country recently that was passing laws cracking down on people who are gay or lesbian. And if they want to find out who is gay or lesbian, they can do that to some degree. And that’s dangerous. This is dangerous stuff.

JAMMI: What we need to remember, and I don’t think marketers have realized this, is that we are actually at war with our customers. That’s what surveillance advertising is.

We’re effectively mercenaries, unwitting mercenaries to something much, much bigger. It sounds like a conspiracy theory, but it’s a lot more diabolical, because the data that we [00:38:00] collect just to sell things can be used by authoritarian states to do a whole lot worse.

That really is the urgency behind our message. That is how we came to where we are today with the disinformation economy: certain kinds of inflammatory content just rise to the top because, again, they confirm biases, they create outrage, and they’re designed for outrage. And I think it is the biggest fight of our lives to take on, and the marketing industry is ground zero. This is where the fight is to cut off disinformation at the source. And that’s why we’re here.

SHEFFIELD: All right. That’s a great summary there. So if people are interested in joining your effort there, you can go to checkmyads.org/membership. So to the audience, I recommend doing that. 

So what are some other things people can do to change their browsing habits or things that they use? What would you recommend for them to do? What are some things?

JAMMI: I will say that what I learned from Sleeping Giants is that every voice matters in this fight. What most people don’t know is that, even though our campaign was sometimes described as a mob of people, a lot of the time advertisers who blocked their ads from Breitbart did so after just one person contacted them. That made people feel important, and it was a really powerful motivator for our community. So it doesn’t really matter who you are. What matters is that the advertiser really wants to do right by you, and that’s the real takeaway.

SHEFFIELD: They want you to like them. They want you to like them.

JAMMI: They want you to like them. Exactly.

And I neglected to say this before, but what we hope to do with our members, with our Checkmates, is really bring these folks into the fold of our work. So one of the things that we plan to do is, after each Branded, hold a private call with our Checkmates to give them the behind-the-scenes of how we came up with the story, how [00:40:00] we went through the research, and what our process was.

And over time, what we’d like to do is rebuild that Sleeping Giants campaign, but this time at scale. Whereas before we went to the advertiser directly, this time we’re going to go directly to the ad exchange. We’re going to go into the weeds, and this stuff isn’t impossible to learn. It’s not rocket science. And what we hope to do is build an army of marketers, of volunteers, of Checkmates who really want to join us in this fight, to dig in with us and make a difference on their own. We want to give them the tools, the trainings, the knowledge that they need to join us, because anyone can do this work.

SHEFFIELD: Yeah. So Dini, you talked a little bit about your background in all this stuff. Claire, why don’t you tell us a little bit about how you got to know Dini and what’s your experience with all this internet marketing stuff? 

ATKIN: Sure. Yeah, we’re both marketers. I was running a consultancy just to help teams with their marketing. So I was helping build marketing departments for B2B tech companies, and I was deeply disturbed by the media landscape of the 2016 American election. So in 2017, I went to study international election observation at the Global Campus for Human Rights in Venice, just to understand what the pros were doing to observe media during elections and how I could maybe help.

And what I learned there is that, at the time in 2017, election observers were not being given the data required to monitor what was going on, per country, per election. The tech companies just were not giving them the data that they needed. And that was deeply disturbing to me.

So I had one foot in election observation and one foot in the tech industry, and I was deliberating on what to do next when I met Nandini. She came to my hometown, and we ended up meeting in person and [00:42:00] hanging out all weekend. And then we went to a conference together a few months later and we just riffed on everything. Both of us were in the same industry. We saw the same problems. And we were both ready to do something about it.

SHEFFIELD: Okay. Cool. Yeah. And just to go back to the idea of the advertising models and things like that, so you want to help advertisers move to a healthier model. And what about publishers? What have you guys thought about how publishers can contribute to a healthier advertising economy? 

ATKIN: Yeah, we care a lot about journalism and we know that publishers are in a bind. It’s been 20 years of hell for the news industry, and we know that in the last decade, 30,000 jobs have been lost in news media.

And this is a devastating blow. Publishers need to be smart. 

And they have to be strategic. We don’t have a set of recommendations for publishers beyond: try to be as direct as possible with the advertisers. We know publishers, companies that own four or five newspapers in America, who pivoted to digital and decided to do everything digital on real-time bidding. And now that they’ve lost so much revenue, they’re really disturbed by that decision. So they’re wondering what to do next.

It’s a tough position for them, and we think that advertisers need to take the first step. 

SHEFFIELD: Did you have any thoughts on that, Dini? 

JAMMI: I did. I view the work that we’re doing to dismantle this economy as also an opening for new ecosystems to emerge. So there are folks out there who are thinking really hard about how to develop a much healthier publisher ecosystem.

And that’s really not going to be viable until we address what’s happening now because the advertising ecosystem really is the backbone of the internet economy, [00:44:00] as we know it today. And it’s brought us all the problems that we are dealing with today. And we need to create that opening.

And then again, this is not something that me and Claire are doing alone, we are working in collaboration with folks, and our goal is to open the door for other folks to come in and bring forth solutions. 

SHEFFIELD: Yeah. 

ATKIN: Publishers, now more than ever, which is a phrase I’ve promised never to say, really need to adhere to journalistic standards, because that’s what we’re talking about.

This is not a left-right discussion that we have with our advertiser clients. We talk about what is disinformation, and what are journalistic standards, and how to be on news media that adhere to journalistic standards. So the very best thing you can do as a publisher is continue to uphold those standards internally and talk about it. Be transparent. 

SHEFFIELD: And tell your audience about it. And tell them why it matters. 

And let me just, did you guys see those stories that came out a little while ago about these fake local news websites? Did you read those? Yeah? 

So basically for those that didn’t see that, there are a bunch of organizations that are linked all together behind the scenes and they are basically creating fake publications. Like they’ll pretend to be based in Dallas or pretend to be in Topeka, Kansas, or something like that. But in reality, they’re all run out of one area and one operation. And in many cases, the articles are actually automatically generated. They’re not even real articles. And they are designed to push propaganda operations and then they use ad tech to subsidize themselves.

JAMMI: Directly financed by Google. 

SHEFFIELD: Yeah. Google pays for that. And so there’s just so many ways that what you’re doing is relevant even to people who may not realize it. And so I would definitely encourage people to check out what you guys are doing. Now, just as we wrap up here, what kind of role do you think governments should be [00:46:00] playing in opening up the black box of advertising, and letting the public and advertisers know what’s going on?

ATKIN: The ad industry has made an effort to build transparency within the supply chain. They use text files that are easily read by machines, like ads.txt or sellers.json. These are standards that the ad industry has collectively invented in order to track where the ads are actually going, and who has a relationship with whom.

They’ve developed these standards, and then no one within the ad tech industry, no association, has said, we will enforce them. And that’s what Check My Ads is here to do: to hold the industry accountable for the enforcement of these standards that they have created.

So we think that the government could do a lot more to build transparency within the supply chain and demand reports from both advertisers and ad tech about who they’re financing. 
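For readers who have never seen one of these files: ads.txt is just a plain text file a publisher hosts at its own domain (for example, at example.com/ads.txt) listing which ad systems are authorized to sell its inventory. Here is a small, purely illustrative Python sketch that parses a hypothetical ads.txt; the entries are invented, but the field layout follows the published IAB format.

# Minimal sketch of reading an ads.txt file. The sample entries below are
# hypothetical; each line follows the IAB layout:
#   <ad system domain>, <seller account ID>, <DIRECT|RESELLER>[, <cert authority ID>]
sample_ads_txt = """
# ads.txt for example-news.com (hypothetical)
adexchange.example, pub-1234567890, DIRECT, abc123
resellernetwork.example, 98765, RESELLER
"""

def parse_ads_txt(text):
    """Return the authorized-seller entries declared in an ads.txt file."""
    entries = []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blank lines
        if not line:
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 3:
            entries.append({
                "ad_system": fields[0],
                "seller_account_id": fields[1],
                "relationship": fields[2].upper(),
            })
    return entries

for entry in parse_ads_txt(sample_ads_txt):
    print(entry)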

SHEFFIELD: Yeah, and I think, in the United States and a bunch of other countries that are not in the European Union, we don’t have anything like the GDPR, which is the General Data Protection Regulation. That law does a lot of things for people in the EU, but among other things, it gives you, as a person who uses the internet and various services, the right to see how companies are using your data. And there is nothing like that in the United States.

And not really anything like that in Canada and a bunch of other countries. Now in California, where I live, we have some protections that are good. But pushing for things like that, and pushing for disclosure statements about who paid for an ad, these are policies that people really need to be talking to their representatives about.

ATKIN: Yep. You’re [00:48:00] absolutely right. And I think government representatives should be aware that this is not just an advertising industry problem. This is a public security problem. As I said, it’s a national security problem. As we have said over and over again in Branded, this is—

SHEFFIELD: Why don’t you talk about that in the national security sense? 

ATKIN: You’re talking about privacy of personal data, right?

If I were running a propaganda outfit anywhere in the world, I would be able to use the marketing tools that Americans use every day, that American companies use every day, to target us, to target Americans.

At Branded, we’ve discovered Russian propaganda outlets that are using the American ad tech industry to target Americans, to trick them into thinking that they’re reading American publications about American topics, but actually, it’s Russian propaganda.

To me, that’s a national security issue. And so what we’re talking about here is not just marketers need to do better. We’re talking about America needs to take this seriously. 

SHEFFIELD: Yeah. And as an example of that, just today I put out on my Twitter that somebody had done an extensive report about how Russian bot farms have been boosting this guy who is a supporter of Texas secession, boosting him on Twitter, and on YouTube, and elsewhere.

And these are things that people who come into contact with this guy’s content should know about, but the government regulation and the laws haven’t kept up with the technology.

And so basically it’s this wild west, where there is no accountability, where there is no disclosure, where there’s no transparency. And people need to step up and start demanding better. 

ATKIN: Yes. And the first way that they can do that is to become a Checkmate.

SHEFFIELD: All right. Then let me put that up one more time. So it’s checkmyads.org/membership, [00:50:00] and that’s, I think, a good way to end it here. So I’m just going to put that up on the screen for everybody. Nandini’s Twitter handle is Nandoodles, and you can find her over there. And then Claire’s is Catthekin, K-I-N. And then of course, you can also go to checkmyads.org and see what you can do to get involved.

I appreciate you ladies for joining me today. Thanks for being here. 

ATKIN: Thanks so much, Matt. 

JAMMI: Thanks, Matt! 

SHEFFIELD: All right, we’ll be in touch. See you guys. 

All right. So that’s our look at the disinformation economy and what people who want to fight disinformation can do to contribute to putting an end to it. And there’s a lot that can be done. The good thing is, this is not some invincible machine that is grinding inexorably forward. And Check My Ads is definitely a good effort. I encourage everybody to check it out.

And I thank everyone who is one of our supporters right now, you guys are great, and I really appreciate you. I’m Matthew Sheffield and I will see you next time. 

About This Podcast

Lots of people want to change the world. But how does change happen? History is filled with stories of people and institutions that spent big and devoted many resources to effect change but had little to show for it. By contrast, many societal developments have happened without forethought from anyone. And of course, change can be negative as well as positive.

In each episode of this weekly program, Theory of Change host Matthew Sheffield delves deep with a solo guest to discuss larger trends in politics, religion, media, and technology.