Camera Companies Fight AI-Generated Images With ‘Verify’ Watermark Tech
Great, DRM on my personal photos. Next they’re going to charge a subscription to view my own goddamn vacation pictures
Fuck this timeline. I want to get off Mr. Bones’ Wild Ride.
I guess this is better than nothing, but what happens if you take a photo of a generated image? With the right setup, the result is indistinguishable from a direct shot of a real scene, and then the camera will digitally sign the fake photo as real.
It’s not just a sig on the image, but on metadata as well. Harder to fake time + place if they implement it thoroughly. (I.e., they would have to make it only trust GPS and verify against an internal clock, I suppose, and not allow updating time and location manually.)
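The scheme the parent describes, with the signature covering the metadata and not just the pixels, can be sketched roughly like this. This is a toy sketch: a real camera would use an asymmetric key pair in a secure element (so anyone can verify without the secret), not an HMAC shared secret, and the function names and metadata fields here are made up for illustration.

```python
import hashlib
import hmac
import json

# Stand-in for a private key held in the camera's secure hardware.
CAMERA_KEY = b"secret-key-in-secure-element"

def sign_capture(image_bytes: bytes, metadata: dict) -> str:
    """Sign the image *and* its metadata together, so neither the pixels
    nor the time/GPS/author fields can be altered (or stripped) without
    invalidating the signature."""
    payload = image_bytes + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()

def verify_capture(image_bytes: bytes, metadata: dict, signature: str) -> bool:
    expected = sign_capture(image_bytes, metadata)
    return hmac.compare_digest(expected, signature)

# Hypothetical capture: raw bytes plus the metadata the article mentions.
image = b"...raw sensor data..."
meta = {"time": "2023-11-01T12:00:00Z", "gps": [35.68, 139.69], "author": "cam-001"}
sig = sign_capture(image, meta)

print(verify_capture(image, meta, sig))           # unmodified: verifies
meta_tampered = {**meta, "gps": [0.0, 0.0]}
print(verify_capture(image, meta_tampered, sig))  # edited GPS: fails
```

Note the flip side, raised downthread: because the metadata is inside the signed payload, anything that strips or rewrites it (as social media uploads routinely do) also destroys verifiability.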
…including the date and time a photo was taken as well as its location and the photographer…
That raises the bar on faking but doesn’t rule it out; it just lends more credibility to whoever can still fake it.
Not including GPS and time makes this worse, but including it makes it useless in practice, because you can’t ever verify a photo sent across social media once the EXIF tags get stripped.
Add a depth sensor?
Can we have certificate signed media now, please?
Sooo how long until there’s a plugin for A1111?