
House Intel Committee ditches script, grills big tech on polarizing users and toxic civic discourse

June 18, 2020


Today, representatives from big tech companies Google, Facebook, and Twitter testified in a virtual hearing before the House Intelligence Committee on “Emerging Trends in Online Foreign Influence Operations: Social Media, COVID-19, and Election Security.”

Yet the hearing wasn’t so much about foreign influence operations, nor did it discuss election security in great detail, and COVID-19 was all but omitted.

This was a hearing about big tech companies polarizing civic discourse, where hard-hitting questions were met with avoidance and softball lobs were smacked out of the park with regurgitated corporate jargon.

At the outset of the virtual hearing, Nathaniel Gleicher, Head of Security Policy at Facebook, while doing his best Mark Zuckerberg impersonation, completely dodged Rep Jackie Speier’s question of whether or not Facebook considers itself a media platform.

“You may be a technology company, but your technology company is being used as a media platform, do you not recognize that?” — Jackie Speier

Speier: Facebook does not consider itself a media platform. Are you still espousing that kind of position?

Gleicher: Congresswoman, we’re first and foremost a technology company.

Speier: You may be a technology company, but your technology company is being used as a media platform, do you not recognize that?

Gleicher: Congresswoman, we’re a place for ideas across the spectrum. We know that there are people who use our platforms to engage, and in fact that is the goal of the platforms — to encourage and enable people to discuss the key issues of the day, and to talk to family and friends.

Publicly, Facebook has long claimed to be a platform, but a 2018 report by The Guardian revealed that in court, Facebook’s lawyers repeatedly argued that the company was a publisher, and thus entitled to different legal protections than a platform.

The platform-versus-publisher question has been an ongoing debate for years, and it greatly affects how big tech companies approach content moderation, including user-generated content and comments.

Last month, the Wall Street Journal broke the news that Facebook “internally studied how it polarizes users, then largely shelved the research,” suggesting that the social media giant thrives in an environment of polarized discourse.

The apparent polarization that Facebook’s algorithms fan was the subject of Rep Jim Himes’ questioning.

“Will Facebook be willing to make not just the attributes of the algorithm publicly available, but the effects of the algorithm?” — Jim Himes

Himes expressed concern that Facebook’s algorithms encouraged division and asked Gleicher whether or not Facebook would be willing to make the effects of its algorithms publicly available, with an emphasis on how they affect human behavior.

Nathaniel Gleicher

Gleicher deflected, pivoting into a spiel about transparency.

Himes: Will Facebook be willing to make not just the attributes of the algorithm publicly available, but the effects of the algorithm? How open is Facebook to sharing with the public what the actual algorithm looks like, and more importantly, the behavioral outcomes of the algorithm?

Gleicher: Congressman, transparency is important here. I think one of the challenges is of course the algorithms we’re talking about, the decision making process we’re talking about, is incredibly complex.

Showing that information in a way that is consumable and meaningful is extremely important because it’s very easy to jump to conclusions.

One of the things we’re focused on is how to provide more context to users, so they can make those assessments, whether it’s on content that’s rated false by our fact checkers, state-controlled media entities, and others.

We’re exploring other ways to be more transparent, and I’d be happy to talk more about that.

While Gleicher dodged the algorithm question in Zuckerberg-like fashion, there is ample evidence that biased algorithms have the power to turn the tide of elections.

Last year, Dr. Robert Epstein, former editor-in-chief of Psychology Today, told lawmakers that Google’s algorithms were biased and that biased search results had the potential to sway millions of votes during an election year.

“Biased search results can easily produce shifts in the opinions and voting preferences in undecided voters by up to 80% in some demographic groups because people blindly trust high-ranking search results over lower ones,” Dr. Epstein said in 2019.

Turning again toward big tech platforms creating a toxic environment for civic discourse, Rep Denny Heck told Gleicher in today’s hearing that “civic discourse in America has degraded” and that the big tech companies represented “have amplified that degraded civic discourse, and as a corollary to that — that you have all profited off of it.”

“Civic discourse in America has degraded […] Do you not accept any responsibility for this?” — Denny Heck

Heck’s question concerned Facebook’s responsibility for user behavior as it pertains to civic discourse.

Heck: Do you not accept any responsibility for this, and if you don’t, for the love of God tell me your logic for not accepting any responsibility?

Gleicher: Congressman, I think we have critical responsibilities. Yes, to ensure that debate on our platforms is authentic, also to ensure that it is as open and positive and collaborative as possible.

Part of what you’re identifying, Congressman, is how humans interact in public discussion. It’s why we’ve taken very serious looks. It’s why we have thought about what we promote, how we promote, what we recommend to address exactly these challenges.

I do also think that the rise of social media platforms, the rise of the internet has led to voices being heard at volumes that have never happened before, and the most difficult challenge here is how to peel these two apart.

I would never suggest that we could solve this problem alone. I think part of this is how humans engage, and the platforms have an opportunity and a responsibility to do everything we can to encourage and enable the best discussion, but I never suggest that we can solve this problem, Congressman.

Richard Salgado, Director of Law Enforcement and Information Security at Google, and Nick Pickles, Director of Public Policy Strategy at Twitter, were also witnesses during today’s virtual hearing, but it was Facebook’s Gleicher who fielded the hardest-hitting questions about policies surrounding civic discourse, the focus of this article.
