Sen. Roger Wicker hit a familiar note when he announced on Thursday that the Commerce Committee was issuing subpoenas to force the testimony of Facebook Chief Executive Mark Zuckerberg and other tech leaders.
Tech platforms like Facebook, the Mississippi Republican said, "disproportionately suppress and censor conservative views online."
When top tech bosses were summoned to Capitol Hill in July for a hearing on the industry's immense power, Republican Congressman Jim Jordan made an even blunter accusation.
"I'll just cut to the chase, Big Tech is out to get conservatives," Jordan said. "That's not a hunch. That's not a suspicion. That's a fact."
But the facts to support that case have been hard to find. NPR called up half a dozen technology experts, including data scientists who have special access to Facebook's internal metrics. The consensus: There is no statistical evidence to support the argument that Facebook does not give conservative views a fair shake.
Let's step back for a moment.
When Republicans claim Facebook is "biased," they often collapse two distinct complaints into one. First, that the social network deliberately scrubs right-leaning content from its site. There is no proof to back this up. Second, Republicans suggest that conservative news and perspectives are being throttled by Facebook, that the social network is preventing the content from finding a large audience. That claim is not only unproven, but publicly available data on Facebook shows the exact opposite to be true: conservative news regularly ranks among some of the most popular content on the site.
Now, there are some complex layers to this, but former Facebook employees and data experts say the conservative bias argument would be easier to talk about — and easier to debunk — if Facebook were more transparent.
The social network keeps secret some of the most basic data points, like what news stories are the most viewed on Facebook on any given day, leaving data scientists, journalists and the general public in the dark about what people are actually seeing on their News Feeds.
There are other sources of data, but they offer just a tiny window into the sprawling universe of nearly 3 billion users. Facebook is quick to point out that the public metrics available are of limited use, yet it does so without offering a real solution, which would be opening up some of its more comprehensive analytics for public scrutiny.
Until it does, there's little to counter rumors about what thrives and dies on Facebook and how the platform is shaping political discourse.
"It's kind of a purgatory of their own making," said Kathy Qian, a data scientist who co-founded Code for Democracy.
What the available data reveals about possible bias
Perhaps the most often-cited data point on what is popular on Facebook is a tracking tool called CrowdTangle, a startup that Facebook acquired in 2016.
New York Times journalist Kevin Roose has created a Twitter account where he posts the 10 most-engaged-with posts based on CrowdTangle data. These lists are dominated by conservative commentators like Ben Shapiro and Dan Bongino, along with Fox News. They resemble a "parallel media universe that left-of-center Facebook users may never encounter," Roose writes.
Yet these lists are like looking at Facebook through a soda straw, say researchers like MIT's Jennifer Allen, who used to work at Facebook and now studies how people consume news on social media. CrowdTangle, Allen says, does not tell the whole story.
That's because CrowdTangle only captures engagement — likes, shares, comments and other reactions — from public pages. But just because a post provokes lots of reactions does not mean it reaches many users. The data does not show how many people clicked on a link, or what the overall reach of the post was. And much of what people see on Facebook is from their friends, not public pages.
"You see these crazy numbers on CrowdTangle, but you don't see how many people are engaging with this compared with the rest of the platform," Allen said.
Another point researchers raise: All engagement is not created equal.
Users could "hate-like" a post, or click like as a way of bookmarking, or leave another reaction expressing disgust, not support. Take, for example, the laughing-face emoji.
"It could mean, 'I agree with this' or 'This is so hilarious untrue,'" said data scientist Qian. "It's just hard to know what people actually mean by those reactions."
It's also hard to tell whether people or automated bots are generating all the likes, comments and shares. Former Facebook research scientist Solomon Messing conducted a 2018 study of Twitter that found bots were likely responsible for 66% of link shares on that platform. The tactic is employed on Facebook, too.
"What Facebook calls 'inauthentic behavior' and other borderline scam-like activity are unfortunately common and you can buy fake engagement easily on a number of websites," Messing said.
Brendan Nyhan, a political scientist at Dartmouth College, is also wary about drawing any big conclusions from CrowdTangle.
"You can't judge anything about American movies by looking at the top ten box office hits of all time," Nyhan said. "That's not a great way of understanding what people are actually watching. There's the same risk here."
'Concerned about being seen as on the side of liberals'
Experts agree that a much better measure would be a by-the-numbers rundown of what posts are reaching the most people. So why doesn't Facebook reveal that data?
In a Twitter thread back in July, John Hegeman, the head of Facebook's News Feed, offered one sample of such a list, saying it is "not as partisan" as lists compiled with CrowdTangle data suggest.
But when asked why Facebook doesn't share that broader data with the public, Hegeman did not reply.
It could be, some experts say, that Facebook fears that data will be used as ammunition against the company at a time when Congress and the Trump administration are threatening to rein in the power of Big Tech.
"They are incredibly concerned about being seen as on the side of liberals. That is against the profit motive of their business," Dartmouth's Nyhan said of Facebook executives. "I don't see any reason to see that they have a secret, hidden liberal agenda, but they are just so unwilling to be transparent."
Facebook has been more forthcoming with some academic researchers looking at how social media affects elections and democracy. In April 2019, it announced a partnership that would give 60 scholars access to more data, including the background and political affiliation of people who are engaging content.
One of those researchers is University of Pennsylvania data scientist Duncan Watts.
"Mostly it's mainstream content," he said of the most viewed and clicked-on posts. "If anything, there is a bias in favor of conservative content."
While Facebook posts from national television networks and major newspapers get the most clicks, partisan outlets like the Daily Wire and Breitbart routinely show up in top spots, too.
"That should be so marginal that it has no relevance at all," Watts said of the right-wing content. "The fact that it is showing up at all is troubling."
'More false and misleading content on the right'
Accusations from Trump and other Republicans in Washington that Facebook is a biased referee of its content tend to flare up when the social network takes action against conservative-leaning posts that violate its policies.
Researchers say there is a reason why most of the high-profile examples of content warnings and removals target conservative content.
"That is a result of there just being more false and misleading content on the right," said researcher Allen. "There are bad actors on the left, but the ecosystem on the right is just much more mature and popular."
Facebook's algorithms could also be helping more people see right-wing content that's meant to evoke passionate reactions, she added.
Because of the sheer amount of envelope-pushing conservative content, some of it veering into the realm of conspiracy theories, the moderation from Facebook is also greater.
Or as Nyhan put it: "When reality is asymmetric, enforcement may be asymmetric. That doesn't necessarily indicate a bias."
The attacks on Facebook over perceived prejudice against conservatives have helped fuel the push in Congress and the White House to reform Section 230 of the Communications Decency Act of 1996, which shields platforms from lawsuits over what users post and gives tech companies the freedom to police their sites as they see fit.
Joe Osborne, a Facebook spokesman, said in a statement that the social network's content moderation policies are applied fairly across the board.
"While many Republicans think we should do one thing, many Democrats think we should do the exact opposite. We've faced criticism from Republicans for being biased against conservatives and Democrats for not taking more steps to restrict the exact same content. Our job is to create one consistent set of rules that applies equally to everyone."
Osborne confirmed that Facebook is exploring ways to make more data available in the platform's public tools, but he declined to elaborate.
Watts, the University of Pennsylvania data scientist who studies social media, said Facebook is sensitive to Republican criticism, but no matter what decision the company makes, the attacks will continue.
"Facebook could end up responding in a way to accommodate the right, but the right will never be appeased," Watts said. "So it's this constant pressure of 'you have to give us more, you have to give us more,'" he said. "And it creates a situation where there's no way to win arguments based on evidence, because they can just say, 'Well, I don't trust you.'"
AILSA CHANG, HOST:
Something we've heard from Republicans for quite some time now is they don't trust Facebook. Just today President Trump called for the repeal of legal protections for the tech industry. That was after Facebook removed one of the president's posts, which made a false claim about the coronavirus. And here's Republican Congressman Jim Jordan back in July.
(SOUNDBITE OF ARCHIVED RECORDING)
JIM JORDAN: I'll just cut to the chase. Big Tech's out to get conservatives. That's not a suspicion. That's not a hunch. That's a fact.
CHANG: Well, we wanted to follow the facts on this argument, so we turned to NPR tech reporter Bobby Allyn.
BOBBY ALLYN, BYLINE: Hey, Ailsa.
CHANG: All right. Just before we begin, we should note that Facebook is a financial supporter of NPR. Now, Bobby, what specifically are Republican lawmakers accusing Facebook of doing when it comes to conservative viewpoints?
ALLYN: Yeah. So when conservatives talk about Facebook not giving them a fair shake, you know, they're really talking about two separate things - first, that Facebook silences conservative voices by taking down right-wing views. And on that, there's just no evidence. And secondly, there's this idea that conservative news is suppressed on Facebook. And to evaluate that, we just need more data. Facebook keeps information like the top 10 most popular news items totally secret. And so that creates this big vacuum, and into that vacuum goes, you know, lots of speculation and conjecture. I talked to Kathy Qian. She's a data scientist with Code for Democracy.
KATHY QIAN: Facebook can't just release more data about what people see privately, right? So it's kind of like a purgatory of their own making.
ALLYN: Right - a purgatory of their own making because Facebook refuses to open its hood and show the public what people are seeing on the site. But we do have a little bit of data, Ailsa.
CHANG: OK, so what does that data tell us?
ALLYN: Yeah. There's this tool called CrowdTangle. It's owned by Facebook, and it measures engagement. So, you know, every time you like or give a sad face or leave a comment, it counts all that stuff up. And far and away the most engaged-with content comes from conservatives - so commentators like Ben Shapiro and right-wing outlets like Breitbart. So that tells us something - right? - that, you know, conservative stories are pulling people in unlike anything else. But just because posts have lots of likes doesn't mean it's the most popular thing on Facebook. Here's data scientist Qian again. She says, you know those emoji reactions you can do on Facebook? She said it's kind of hard to understand what they mean sometimes.
QIAN: The laughing face - like, what does that mean? It can mean, like, you know, I agree with this, or, like, this is so hilariously untrue. It's just really hard to tell what people actually mean by those reactions.
ALLYN: Yeah, I'm not so much of a hate-liker on Facebook, but I do know a few.
CHANG: So do I. All right, so we don't know exactly what's popular on Facebook, but we do know that conservative outlets do draw a lot of responses online. So how about Facebook pulling President Trump's post like it did today? Does that suggest Facebook is targeting the president?
ALLYN: Yeah. So whenever Facebook acts on Trump's posts, it's going to get a lot of attention, though that doesn't necessarily mean that Facebook is out to get the president. You know, Facebook very well may be pulling down more conservative posts than liberal posts. But, you know, I talked to a lot of former Facebook employees, and they told me that's just because there is more extreme right-wing content floating around on Facebook. The conservative media world is very well-developed and has such a passionate following, and some of this stuff, as we know, stretches the truth or just contains straight-up falsehoods. Brendan Nyhan is a political scientist at Dartmouth who studies Facebook. He thinks it's the opposite of a conservative bias. He says in order to appease conservatives, Facebook is, you know, bending over backwards sometimes.
BRENDAN NYHAN: They are incredibly concerned about being seen as on the side of liberals. That is against the profit motives of their business. So I don't see any reason to think they have a secret, hidden liberal agenda.
ALLYN: Right. So it all comes back to this Facebook black box. And until Facebook gives us more data, the debate is just going to be very frustrating.
CHANG: That is NPR's Bobby Allyn.
Thank you, Bobby.
ALLYN: Thanks, Ailsa. Transcript provided by NPR, Copyright NPR.