Thoughts on Content Credentials?
Quote from lil ie on March 24, 2024, 5:36 pm
If you've been living under a rock like me and are just hearing about it for the first time, here's a summary:
The Content Authenticity Initiative is an association founded by Adobe, Twitter, and the New York Times (yikes) in 2019 to increase authenticity in media, or something of the sort. They rolled out Content Credentials, an open source (yay) technology that lets you document the creation and edit history of media in metadata that is functionally read-only. Like EXIF, but extended to image editing, and you can't just edit it to whatever you want. That way photos can be sourced easily, and if AI is used it's transparently visible. Theoretically it applies to other media too, but it's mostly discussed in the context of images.
Around 200 companies are now partners of the association, including Canon, Sony, Nikon and Leica, who let you add Content Credentials at the moment of capture in their latest and upcoming cameras instead of just at export on your computer, which makes the whole thing work better.
Do you think this will actually help with authenticity? Do you want it in your camera? I sure do. The tech is open source, so Adobe shouldn't make a cent from it; I'm a fan. Do you want this in your editing software? I'm concerned that the darktable team won't add it, which would be a bummer.
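For anyone wondering how metadata can be "functionally read-only", here's a toy Python sketch of the general idea: a signed manifest that binds the edit history to the exact image bytes, so tampering with either one invalidates the record. This is just an illustration of the concept, not the actual C2PA/Content Credentials format, and the shared HMAC key is a stand-in for the certificate-based signatures the real system uses:

```python
import hashlib
import hmac
import json

# Hypothetical demo key; real Content Credentials are signed with
# X.509 certificates issued to the camera maker or software vendor.
SIGNING_KEY = b"demo-signing-key"

def attach_credentials(image_bytes, history):
    # Bind the edit history to the pixel data via a hash of the image,
    # then sign the whole manifest so neither part can be altered.
    manifest = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "history": history,
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_credentials(image_bytes, manifest):
    claimed = dict(manifest)
    signature = claimed.pop("signature")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    # Both the signature and the image hash must check out.
    return (hmac.compare_digest(signature, expected)
            and claimed["image_sha256"] == hashlib.sha256(image_bytes).hexdigest())
```

So if someone crops the image or quietly adds "AI inpainting" to the history after the fact, verification fails; that's the property that makes the record trustworthy rather than just another editable EXIF field.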
Quote from KankRat on March 25, 2024, 12:49 pm
Sounds interesting.
I've been limiting what I post on the web. Contemplating deleting my whole Flickr page.
Quote from James Warner on March 27, 2024, 1:53 am
I haven't heard of this before, so this is just my initial reaction to what you said. It sounds like a good idea, and making it an open standard would be important for adoption.
I think it's even more important with AI imagery now. Both to detect whether something was actually a photo taken on a physical camera, but also maybe it could help keep your photos from being used to train AI. I know there's been a lot of backlash over where some of these generative models have sourced their training data.
All that said, and I really do think some way to protect people's creative property is a good thing, I've never personally been that protective of my own photos. I never watermark them. Some of my best stuff is just floating on Flickr, available for high-resolution download. Of course, I don't want people to use them for their own stuff, and certainly not commercially, but if that ever came up I would battle it then. I've talked to a lot of photographers who feel very differently, so I know I may be alone, but it's just never really worried me before. Maybe because my photos aren't very good LOL
Quote from KankRat on March 30, 2024, 12:33 pm
Quote from James Warner on March 27, 2024, 1:53 am
I haven't heard of this before, so this is just my initial reaction to what you said. It sounds like a good idea, and making it an open standard would be important for adoption.
I think it's even more important with AI imagery now. Both to detect whether something was actually a photo taken on a physical camera, but also maybe it could help keep your photos from being used to train AI. I know there's been a lot of backlash over where some of these generative models have sourced their training data.
All that said, and I really do think some way to protect people's creative property is a good thing, I've never personally been that protective of my own photos. I never watermark them. Some of my best stuff is just floating on Flickr, available for high-resolution download. Of course, I don't want people to use them for their own stuff, and certainly not commercially, but if that ever came up I would battle it then. I've talked to a lot of photographers who feel very differently, so I know I may be alone, but it's just never really worried me before. Maybe because my photos aren't very good LOL
I like to refer to Facebook, Instagram and X as "anti-social media", which is EXACTLY what they are. Those platforms do not deserve anyone's photography. They deserve bad cooking recipes, videos of kids pulling pranks, and people who sound like they never had high school civics or economics espousing opinions on how to save the country. It's incredible how obtuse and gullible people have become as a result.
Flickr is nice, but I don't think many people even see my photos, which is fine. Actually preferred. I am not sure if there is any value in posting them there. With AI out there now, I am not sure I want to become a trainer. The easiest solution is to stop posting.
This AI thing is scary. A friend of mine bought the software. He was showing me how it works. He grabbed one of my photos. This one: American Red Squirrel by Mark Kasick, on Flickr (https://flic.kr/p/2nL3oR2)
He fed in a really long description: my camera, the lens, the EXIF data, and a really detailed explanation of the photo with the exposure data, the color of the background, super sharp subject, soft background.
What came back was astounding. It had an "AI" look to it, but it was damned impressive. He owned 100% of the rights to it. I deleted the image, but I'll see if I can get him to resend it to me.
I like the idea of something more malicious. I don't completely understand it, but "data poisoning" sounds like more fun.
https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/