News organizations, camera makers, and tech companies have created a free web tool called Verify for checking the authenticity of images. It's being adopted by Nikon, Sony, and Canon.
Camera Companies Fight AI-Generated Images With ‘Verify’ Watermark Tech
I guess this is better than nothing, but what happens if you take a photo of a generated photo? There are setups where it's impossible to tell the result is a photo of a photo, and then the camera would digitally sign the fake photo as real.
It’s not just a sig on the image, but on the metadata as well. Harder to fake time + place if they implement it thoroughly. (I.e., they would have to trust only GPS, verify it against an internal clock, and not allow setting the time and location manually, I suppose.)
…including the date and time a photo was taken as well as its location and the photographer…
Not including GPS and time makes this worse, but including it makes it useless, because you can't ever verify a photo shared across social media, since the EXIF tags will be stripped.
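To make the stripping problem concrete, here's a minimal sketch of the idea under discussion: the camera signs the image bytes together with the capture metadata, so a verifier must see both unchanged. The key name, metadata fields, and use of HMAC are illustrative assumptions; a real scheme like C2PA uses public-key certificate chains, not a shared secret.

```python
import hashlib
import hmac
import json

# Hypothetical stand-in for the camera's signing key (real systems use
# a per-device private key and public-key signatures, not a shared HMAC key).
SECRET_KEY = b"camera-private-key"

def sign(image_bytes: bytes, metadata: dict) -> str:
    # Signature covers the pixels AND the metadata (time, GPS, photographer),
    # so neither can be altered independently after capture.
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, metadata: dict, signature: str) -> bool:
    return hmac.compare_digest(sign(image_bytes, metadata), signature)

image = b"\xff\xd8...jpeg data..."  # placeholder image bytes
meta = {"time": "2023-12-01T09:30:00Z", "gps": [35.68, 139.69]}
sig = sign(image, meta)

print(verify(image, meta, sig))  # intact image + metadata: verifies
print(verify(image, {}, sig))    # EXIF stripped in transit: verification fails
```

This is exactly the trade-off above: binding the metadata into the signature makes time and place harder to forge, but any platform that re-encodes the image or strips its tags destroys the ability to verify.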
That raises the bar on faking but doesn't rule it out; it just lends more credibility to those who can fake it.
Add a depth sensor?