‘Censor social media content & harvest data from banned accounts’: House Intel witnesses testify on combating misinformation

October 15, 2020

Expert witnesses tell the House Intelligence Committee that the best way to combat “misinformation” online is more content restriction on social media and posthumous data harvesting from banned accounts.

Today, the House Permanent Select Committee on Intelligence held a rare open hearing on the subject of “Misinformation and Conspiracy Theories Online” where expert witnesses testified on best practices for censoring content.

Towards the end of the hearing, Chairman Adam Schiff asked a peculiar question whose answer hints at a future with even more censorship and even more data harvested from social media accounts.

Cindy Otis

“As researchers, as analysts, one of the most important things for us is getting the data on what content, what accounts, what pages, and all of that have been removed” — Cindy Otis

Schiff asked Alethea Group Vice President of Analysis Cindy Otis what data social media companies are not sharing that they should be sharing in order to help analysts do their work.

Otis responded that getting access to the data from content that had already been removed would be “extraordinarily helpful” for conducting digital autopsies to find out the strategies, methods, and tactics of social media movements.

“As researchers, as analysts, one of the most important things for us is getting the data on what content, what accounts, what pages, and all of that have been removed,” Otis testified.

“On Facebook, for example, you get an announcement every week or every couple of weeks about the content that’s been removed. We get a couple of screen shots maybe. We get maybe an account, maybe a page name — that sort of thing — but it’s after the content has been removed.”

“That sort of data would be extraordinarily helpful as we look at things like current threat actors shifting their operations, what new tactics are they employing, how’s this manifesting on the platform” — Cindy Otis

Otis added, “Unless we were particularly tracking that threat or were part of that analysis to begin with, we’re not able to go back and identify the tactics and procedures that were used by threat actors to do this campaign in the first place.

“And so that sort of data would be extraordinarily helpful as we look at things like current threat actors shifting their operations, what new tactics are they employing, how this is manifesting on the platform.”

While Otis called for posthumously harvesting data from banned accounts to perform these digital autopsies, Melanie Smith, Head of Analysis at Graphika Inc., testified that big tech platforms should continue to restrict content so that movements like Qanon would be forced onto alternative platforms with smaller audiences.

Melanie Smith

“The best possible solution, here, when we restrict content on mainstream social media is that Qanon will retreat to the fringes, and therefore not be able to be exposed to new audiences and new communities that could be impacted” — Melanie Smith

She argued that on the so-called alternative platforms, there would be fewer opportunities for the cross-pollination of ideas.

“The best possible solution, here, when we restrict content on mainstream social media is that Qanon will retreat to the fringes, and therefore not be able to be exposed to new audiences and new communities that could be impacted,” Smith testified.

But it didn’t stop with the big tech companies. Smith told the committee that there should also be more pressure on alternative platforms to restrict content after it’s already been beaten back to “the fringes.”

“We need to be talking to more alternative platforms about restricting content and making a concerted effort in that space,” Smith testified.

“I also think there could be changes to platform engineering to restrict the exposure of new audiences to algorithmic reinforcement of some of these ideas,” she added.

Combine the strategies of Smith and Otis and what you get is more censorship, followed by posthumous harvesting of data from banned users and groups to discover their tactics.

“We need to be talking to more alternative platforms about restricting content and making a concerted effort in that space” — Melanie Smith

As big tech companies purge thousands of accounts for spreading so-called conspiracy theories, there is a great deal of personal data that analysts could access if they had their way and the platforms were obligated to hand it over.

The data could then be used to track where users go next, and the potential for abuses of privacy is enormous, no matter how well-intended the idea may sound.

Why would the chairman of the House Intelligence Committee ask data analysts what they would need if he wasn't already thinking about a way to obtain that data?

If analyzing data harvested from banned accounts would be so “extraordinarily helpful,” would there be an incentive to ban even more accounts, so more data could be collected?

Where would it end?
