Thanks to AI, seeing is no longer believing. This technology has given birth to deepfakes: false videos that look real, and that can be produced by anyone using a simple app or website.
This week, we will be looking at the increasing prevalence of deepfake technology and how it stands to wreak havoc in the media and in our personal lives.
Listen to this podcast below and on Spotify, Anchor, Apple Podcasts, Breaker, Google Podcasts, Overcast, and Radio Public.
This rise is worrying for a number of reasons, but above all, it is the simplicity with which deepfakes can be created that is potentially the most alarming factor. In the words of this week’s expert guest: “It is putting an incredibly sophisticated tool into the hands of the unsophisticated, and they can disseminate and leverage that in any way they see fit.”
Read More: Is there nothing that can’t be faked with vocal, facial manipulation?
My guest this week is Adam Dodge — a deepfakes expert who has trained hundreds of people across government, law enforcement, PR, non-profits, healthcare, and more. His work has been covered in major publications such as The Washington Post and Mashable.
And for our Neuron to Something feature, where we look at scientific research in the fields of psychology and technology, we have some hilarious case studies cited in a research paper on PornHub traffic during events such as the Hawaiian false missile alert and the 2018 Super Bowl. So stay tuned if you want a laugh.