Paul Shales is a computer programmer by day, handling run-of-the-mill operations for a bank. By night, though, he creates deepfakes that portray Kim Kardashian freestyle rapping, President Donald Trump as a tantrum-throwing pageant contestant on “Toddlers & Tiaras,” and Elon Musk as a weird, giggly baby.
Paul’s shenanigans are just for fun, not meant to scare or malign anyone. But the internet is awash with videos that portray people doing things they never even thought of doing: real faces, real people, rendered in near-photorealistic footage, caught up in entirely unreal events.
These “deepfakes” are videos fabricated using a particular kind of AI. They had their roots in a thriving online market for superimposing celebrity faces on porn actors’ bodies, but they now pose a bigger threat. People are increasingly voicing concerns over their impact on an already fraught political climate, and those concerns have prompted the US Congress and the British government to look into ways of regulating them.
Take the video purporting to show Kim Kardashian West mocking her fans for violating copyright and declaring her support for a shadowy organization known as “Spectre.” If you go looking for it now, hopefully you’ll come up blank.
YouTube took the deepfake down on Monday, which offers some hope in the fight against such videos, but the method is unlikely to help much as deepfakes target an ever-greater number of people every day. The video, posted by anti-advertising activists Brandalism, was removed only after a copyright claim by publisher Condé Nast, since the original footage had been posted some days earlier by the publisher’s Vogue magazine.
Henry Ajder, head of communications and research analysis at Deeptrace, an organization currently devising a system to detect deepfakes on the web, has raised concern over the increasing number of deepfakes being uploaded to YouTube. The Kardashian copyright claim sets a new precedent for when such videos can be removed, but the matter is more complex than it appears. Who decides whether a manipulated video constitutes “fair use”? Taking down videos like these leaves tech companies vulnerable to accusations of suppressing freedom of expression.
“It certainly shows how the existing legal infrastructure could help,” Ajder told Digital Trends. “But it seems to be available for the privileged few.”
But does this mean that deepfakes are generally subject to copyright claims? If so, the Kardashian takedown suggests a simple mechanism for removing misleading fake videos. YouTube, however, has not confirmed that it is rolling out any new policy to deal with deepfakes.
While this seems like a new weapon in the arsenal, there is still a long way to go before it amounts to a real defense. A huge company like Condé Nast can effortlessly file a copyright claim on YouTube; the rest of us are not so fortunate. If someone made a deepfake of you by recording you themselves and then manipulating the footage, there would be no copyright claim to make, because they own the recording.
Even worse, what if someone takes your pictures off a social media platform and makes a video of you doing or saying something you never did? It’s already happening all over the world. Women have seen the worst of it so far, including those whose faces have been pasted onto the bodies of adult-film performers. Once such a video has gone viral, what can one do about it? According to Ajder,
“The legal recourse to take down deepfakes of individuals are sparse. We don’t have the infrastructure in place to deal with these problems.”
And not every platform is following YouTube’s lead. Another Brandalism deepfake, which depicts Facebook CEO Mark Zuckerberg praising Spectre, has garnered over 100,000 views on Instagram. The original footage came from a Zuckerberg interview on CBS News, and CBS has asked Instagram to remove the video on the grounds of an “unauthorized use of the CBSN trademark.” Yet the Zuckerberg deepfake remains online on Brandalism’s page, and the Kardashian video is still up on both Twitter and Instagram.
While YouTube did take the Kardashian video down, it remains on Facebook with a note tagging it as fake. A malicious video focused on slighting one person seems much more of a threat than the Kardashian clip, which some people found plain funny. YouTube will shut down misinformation when pushed, but Facebook would prefer not to be an arbiter of truth.
What chance do people like us stand?