Government and Policy

‘Censor social media content & harvest data from banned accounts’: House Intel witnesses testify on combating misinformation

Witnesses call for more big tech censorship to kick ‘conspiracy theories’ to fringe platforms and to perform digital autopsies on removed accounts

Expert witnesses tell the House Intelligence Committee that the best way to combat “misinformation” online is more content restriction on social media, paired with posthumous data harvesting from banned accounts.

Today, the House Permanent Select Committee on Intelligence held a rare open hearing on the subject of “Misinformation and Conspiracy Theories Online” where expert witnesses testified on best practices for censoring content.

Towards the end of the hearing, Chairman Adam Schiff asked a peculiar question whose answer hints at a future where there’s even more censorship and greater amounts of data harvested from social media accounts.

Cindy Otis

“As researchers, as analysts, one of the most important things for us is getting the data on what content, what accounts, what pages, and all of that have been removed” — Cindy Otis

Schiff asked Alethea Group Vice President of Analysis Cindy Otis what data social media companies are not sharing that they should be sharing in order to help analysts do their work.

Otis responded that getting access to the data from content that had already been removed would be “extraordinarily helpful” for conducting digital autopsies to find out the strategies, methods, and tactics of social media movements.

“As researchers, as analysts, one of the most important things for us is getting the data on what content, what accounts, what pages, and all of that have been removed,” Otis testified.

“On Facebook, for example, you get an announcement every week or every couple of weeks about the content that’s been removed. We get a couple of screen shots maybe. We get maybe an account, maybe a page name — that sort of thing — but it’s after the content has been removed.”

“That sort of data would be extraordinarily helpful as we look at things like current threat actors shifting their operations, what new tactics are they employing, how’s this manifesting on the platform” — Cindy Otis

Otis added, “Unless we were particularly tracking that threat or were part of that analysis to begin with, we’re not able to go back and identify the tactics and procedures that were used by threat actors to do this campaign in the first place.

“And so that sort of data would be extraordinarily helpful as we look at things like current threat actors shifting their operations, what new tactics are they employing, how this is manifesting on the platform.”

While Otis called for posthumously harvesting data from banned accounts in what amounts to digital autopsies, Melanie Smith, Head of Analysis at Graphika Inc., testified that big tech platforms should continue to restrict content so that movements like Qanon would be forced onto alternative platforms with smaller audiences.

Melanie Smith

“The best possible solution, here, when we restrict content on mainstream social media is that Qanon will retreat to the fringes, and therefore not be able to be exposed to new audiences and new communities that could be impacted” — Melanie Smith

She argued that on the so-called alternative platforms, there would be fewer opportunities for the cross-pollination of ideas.

“The best possible solution, here, when we restrict content on mainstream social media is that Qanon will retreat to the fringes, and therefore not be able to be exposed to new audiences and new communities that could be impacted,” Smith testified.

But it didn’t stop with the big tech companies. Smith told the committee that there should also be more pressure on alternative platforms to restrict content after it’s already been beaten back to “the fringes.”

“We need to be talking to more alternative platforms about restricting content and making a concerted effort in that space,” Smith testified.

“I also think there could be changes to platform engineering to restrict the exposure of new audiences to algorithmic reinforcement of some of these ideas,” she added.

If you combine the strategies of both Smith and Otis, what you get is more censorship, and then once a user or group is banned, the data is harvested posthumously to discover their tactics.

“We need to be talking to more alternative platforms about restricting content and making a concerted effort in that space” — Melanie Smith

As big tech companies purge thousands of accounts for spreading so-called conspiracy theories, a trove of personal data would become available to analysts if they had their way and the platforms were obligated to hand that data over.

The data could then be used to track where users go next, and the potential for abuses of privacy is enormous, no matter how well-intended the idea may sound.

Why would the chairman of the House Intelligence Committee ask data analysts what they would need if he wasn’t already thinking about a way to obtain that data?

If analyzing data harvested from banned accounts would be so “extraordinarily helpful,” would there be an incentive to ban even more accounts, so more data could be collected?

Where would it end?

Tim Hinchliffe

The Sociable editor Tim Hinchliffe covers tech and society, with perspectives on public and private policies proposed by governments, unelected globalists, think tanks, big tech companies, defense departments, and intelligence agencies. Previously, Tim was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. These days, he is only responsible for articles he writes and publishes in his own name. tim@sociable.co

