Deepfake Video Trust
The cluster debates the impact of easily generated AI fake videos on trust in video evidence and media authenticity, often comparing it to Photoshop's effect on images and predicting widespread skepticism or disinformation risks.
Sample Comments
I kind of suspect that fake videos will not be much more of an issue than photoshopped images. I don't recall with certainty, but I believe there was some fear surrounding the ease with which photographs could be manipulated in the digital age. Some combination of a general growing awareness of the ease with which photos could be altered, along with continued advancements in the ability to determine the authenticity of photographs, has rendered faking of photographs relatively harmless. In fact, I
I think your faith in the public's discernment is hopelessly naive. The evidence is all around us that people just don't care. You don't even need to photoshop images anymore, just write down a few specially-crafted words and people run with it. Altered images go largely undetected by the vast majority of people, and video will be even worse. Video just carries an air of truth, far more than any other medium.
How can we trust photos now? Fiction or reality, it's becoming harder to differentiate
At first I was also fearful of people using this tech to make fake "evidence" of things... But now that I think about it, the most damage will come from making people even more incredulous... of everything. It will become even harder to use evidence to prove a point. Uninformed people will just say "yeah sure, it's probably faked". On the other hand... who could blame them?
I think people overestimate the concern of fake videos. Consider photos for comparison. There have been fake photos of well known people for decades online, many of which are indistinguishable from reality. It doesn't lead to much confusion or issues in our everyday life. We just assume every noteworthy image is fake unless it comes from, or is cleared by, a credible source. The same will apply to video.
I don’t understand the hand-wringing about this. We have had this problem for decades with all forms of media other than video. Text, photos, and audio are all easily faked and can be done so to a degree indistinguishable by 99.99% of people. Why weren’t we all terrified that bad actors would create fake images or audio files that the stupid electorate would find so persuasive that it would overthrow governments and swing elections? What makes video different? And in particular, why won’t video
Re 1 - aren't people mostly doing bullshit jobs already? Re 2 - fakery in videos and images has been a part since their inception. So, don't worry, I guess? The only thing is the scale of bs that could be put out, and whether the wider population lose trust in what they see on screens. Which would be a good thing, imo.
The larger issue is that if it's this easy to create fake photorealism, video evidence is no longer valid. Not from the government. Not from amateur videographers. Not even from a video you took yourself -- think of it from everyone else's perspective. But people believe their eyes. We are witnessing the capstone of the national security state's propaganda strategy for the next few decades... when in doubt, construct a false reality.
Is it that much worse than the effect that photoshop had? We have been able to fake pictures/images for a long time now. What changed is that people no longer blindly trust pictures. I assume something similar will happen with video.
People already produce all kinds of fake news and doctored photos and false flags and all kinds of things. This has been going on since we developed language and photography, I suspect. People already have trouble telling propaganda from fact. That has been going on since forever. At the end of the day I don't see this being a game changer. If anything, video and photos are now weaker evidence for/against something as the potential falseness becomes well known. Congressman X: "no