How deep fakes can have an adverse impact on the way we consume online content

Deep Fakes | Wednesday, March 28, 2018 - 16:11

‘Deep fakes’ have shown us just what technology can do with image manipulation. The free software, which anyone can install, morphs one image into another. Not only that: much like Snapchat filters, it can embed one person’s face onto someone else’s.

This has been the greatest innovation in motion pictures and still images since Photoshop, as it has made the job much easier. With the spread of fake news and media manipulation, ‘deep fakes’ add that much more fuel to the fire. The way we consume content is becoming ever more fragmented. It all comes down to trust, and if a media outlet can manipulate that, there isn’t much trust left.

“The reality is that the number of people working on the forensics side, like me, are relatively small compared to the number of people working on the other side. We are greatly outnumbered and out-resourced. Google is not developing forensic techniques. Facebook is not developing forensic techniques. It's a bunch of academics. We’re outgunned,” Hany Farid, professor of computer science at Dartmouth College, told The Outline.

What first emerged as objectionable content quickly went on to be used by media outlets for funny content via image manipulation. A lot of misinformation was also shared across the internet, claiming that certain manipulated videos were real.

‘Deep fakes’ have transformed the meaning of originality in the name of giving consumers what they want. They want more for less, and settle for free software to manipulate images and videos for their pleasure.

Probably the worst indicator of where these videos are headed is political fakes.

A video clip of Delhi Chief Minister Arvind Kejriwal was released showing him urging people to vote for the Congress. It spread like wildfire, and the CM had to post a clarification telling people it was fake.

“From my old videos, they’ve cut and spliced it to show that I am appealing to vote for Congress. There is no such thing,” he was quoted as saying.

There are significant problems associated with ‘deep fakes’ across content of all kinds. When we try to find the truth behind political content, we can be easily tricked by the visual medium. Since we trust words more when they are spoken, ‘deep fakes’ can manipulate the voices, faces and gestures of leaders and company heads, so that the message can be changed at will. The real problem arises when these videos are uploaded and then deleted by spam accounts.

“Those kind of things are happening already. There is no sophistication in it, you take some video, give it a false narrative and push it out. That is when they will have to advance to more sophisticated kind of technologies,” Pratik Sinha, CEO of Altnews, which works to expose fake news and scammers, told FactorDaily.

Given the rise of fake social media followers and Russian interference in American elections via social media, fake videos can have a drastic effect in a region. People can easily edit and circulate fake videos that incite an entire population, for political motives.

This also causes problems when it comes to political leaders taking accountability for their words. They can simply claim it is fake news, or deep fake manipulation. No claim can be completely verified unless the technology leaves a trail that lets us trace the edits back to the original video. We can do that to some extent with manipulated images, but as the technology advances rapidly, it is going to get tougher.

Although major publications and content houses have deleted the deep fakes they found, such videos are still floating around on the dark web. This is where a majority of the content exists, accessible only through the Tor browser. The browser protects the identity of the person sharing these videos, thereby masking the source completely.

This means that there can be millions of videos in circulation, affecting the way we think and consume content online. These videos could influence our decisions towards certain brands, people and policies that are presented to us.

In India itself, the problem could quickly get out of hand if mainstream media picks up these videos without knowing their source, since the person posting them can hide their identity completely from media companies.

The waters get murky because the technology is good enough to fool even discerning viewers. By the time experts weigh in on the matter, the video is lost in the barrage of online content available to viewers.

Content consumption hubs like Reddit, Facebook and Twitter can only do so much with manual reviews to track and stem the spread of deep fakes. They have systems and professionals in place, but the job will only get harder as the number of videos increases and the technology becomes more advanced.

When international conflicts arise from miscommunication and language barriers, both sides may use deep fake videos to fuel their propaganda. The technology can also be used to manipulate the way we think in our own social context. If we see many videos of a certain community committing crimes, we may be less inclined to hold positive views about them. This opens up many challenges in terms of context.

That’s the missing factor here – “context”. When content doesn’t have the right context and consumers can’t figure it out for themselves, it leads them down a slippery slope. This is when audiences start to believe everything that’s out there. To fight that, we must be skeptical about the content we consume and understand the context behind the videos being shared. At the end of the day, the responsibility rests on our shoulders to be discerning consumers of content.