I'm not worried about rich snobby celebrities or athletes, but what if someone made a very convincing video of a "rival/enemy" president saying something that leads to massive chaos? What about bullies using it to make their victims look like some sort of sick person? What if it's used for political propaganda?
The more I hear about the present and the future, the more I wish I'd been born 50-500 years earlier.
I think you'll find that people could do similar things in the past. They might have looked different, but the results would be the same. In the dark ages if a respected member of the community condemned you as a heretic you were going to have a real rough time no matter what the truth of the matter was.
Most societies have some method by which people can lie to completely mess up the lives of others. Hell, 20 years ago you could just accuse someone of being a paedophile and things would probably not go well for them.
This is just the latest in a long line of ways that humans can be dicks to each other. Thankfully, it's relatively easily counteracted in the same way as all the others - apply a little skepticism and common sense. Unfortunately, humans are on the whole resoundingly bad at using either of these things.
Assuming he's not the one who was paid to do it in the first place.
I guess, but that gets into the question of whether it's practically possible to create a fake that cannot be identified as such. Technically it's possible: any video is just a set of binary data, so one could artificially construct the exact same set of values that a real video would have. Practically it may be possible too, but it seems at best incredibly hard to do it to a level that a forensic analyst couldn't figure out.
You've got an actor that isn't Tom Cruise with TC's face mapped onto him, so there's any number of potential non-face physical features that could give it away. There's the blending between the edge of TC's face and the actor's face, and any occlusions like hands waving in front of it, all of which need to be 100% perfect when stepped through frame by frame. The lighting has to be spot on, the reference material that they built the TC face from needs to be time-appropriate for when the video is claimed to have been shot, the video noise across the face has to match the rest of the scene, and I'm sure that a professional could come up with more ways to catch this out. It's not so much about the skill of the creator as how much time it would take.
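To make the noise-consistency point concrete, here's a minimal sketch of comparing the high-frequency noise inside the face region against the rest of a frame. It assumes numpy and scipy are available and that you already have a face bounding box from some detector; real forensic tooling is far more involved, this is just the idea.

```python
import numpy as np
from scipy.ndimage import laplace  # high-pass filter; residual variance approximates sensor noise

def face_noise_mismatch(frame: np.ndarray, face_box: tuple) -> float:
    """Ratio of high-frequency noise inside the face box to the rest of the frame.

    frame    -- single greyscale frame as a 2-D float array
    face_box -- (top, left, bottom, right), e.g. from a face detector
    A ratio near 1.0 means the face noise is consistent with the scene;
    a value far from 1.0 is a red flag worth a closer look.
    """
    top, left, bottom, right = face_box
    residual = laplace(frame.astype(np.float64))
    mask = np.zeros(frame.shape, dtype=bool)
    mask[top:bottom, left:right] = True
    return residual[mask].var() / residual[~mask].var()

# Toy demo: a noisy "scene" with a suspiciously clean patch where a face was composited in.
rng = np.random.default_rng(0)
frame = rng.normal(128, 10, size=(480, 640))                    # scene with sensor noise
frame[100:260, 200:360] = rng.normal(128, 2, size=(160, 160))   # smoother pasted-in region
print(round(face_noise_mismatch(frame, (100, 200, 260, 360)), 3))  # well below 1.0
```

A good faker would of course re-inject matching noise, which is exactly the point above: every one of these cues is another thing they have to get perfect, in every frame.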
I'm sure given enough time anything is possible, but it feels a bit like cracking passwords. Yeah, any password is technically crackable by brute force, but we consider them secure because it takes long enough that it's not practical in real life without other workarounds. I don't think the current iteration of deepfakes is ever going to reach the level of being completely indistinguishable from real video on a technical level - there are too many limitations with adding a face to a different body, no matter how high the fidelity. But the same tech can be extended to take whole-body video of a human as its training data, at which point you're essentially creating a full 3D model that simply mimics the behaviour of the actor. Make it all high enough fidelity and you potentially have something indistinguishable, but I suspect that by the time that technology is widely available, the idea of video as an indisputable source of accurate information will have long since died.
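To put rough numbers on the password analogy above, here's a quick back-of-the-envelope sketch. The guess rate is an assumed round figure for illustration, not a benchmark of any real cracking rig.

```python
def brute_force_years(charset_size: int, length: int, guesses_per_sec: float) -> float:
    """Worst-case years to exhaust every password of the given length."""
    return charset_size ** length / guesses_per_sec / (60 * 60 * 24 * 365)

# 12 characters drawn from the 94 printable ASCII symbols, at an assumed
# 10 billion guesses per second:
print(f"{brute_force_years(94, 12, 1e10):,.0f} years")  # on the order of 1.5 million years
```

Technically crackable, practically secure - which is the same bar a "perfect" deepfake has to clear against a determined analyst.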