Forces gather to protect the integrity of images

Every picture tells a story, and technologists are working together to make it a little easier to know what that story is.

There’s new information on the Content Authenticity Initiative. If you’ll remember, Adobe, The New York Times, and Twitter announced the formation of the Content Authenticity Initiative at Adobe Max in 2019. They’re tackling a complex problem: building a pipeline from content capture to content consumption, with documentation along the way that attests to the authenticity of the content. Since then, more members have joined and related initiatives have come together.

Adobe has already started enabling CAI in Photoshop and Behance. The addition lets images carry data that reveals what processes have been applied in post. (Source: Adobe)

In August 2020, the group revealed its first steps: a hardware and software solution that records authentication data at the moment an image is captured. In a case study video published on Adobe’s CAI website, a photographer describes a CAI-enabled camera that lets her take a picture with embedded information that stays with the image through editing in Photoshop and remains attached through export. That information is available to anyone interested in the image. DPReview published a story on the workflow.
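To make that capture-to-export idea concrete, here is a minimal, simplified sketch of a provenance record that is bound to the image at capture and extended at each edit. This is not the actual CAI/C2PA manifest format; the field names, the workflow functions, and the HMAC shared-secret signature (standing in for a camera’s certificate-based signature) are illustrative assumptions only.

```python
import hashlib
import hmac
import json

# Hypothetical signing key standing in for a camera's hardware-backed credential.
# A real CAI/C2PA implementation uses asymmetric certificates, not a shared secret.
CAPTURE_KEY = b"demo-capture-key"

def sign(payload: bytes) -> str:
    """Stand-in signature: HMAC-SHA256 over the serialized record."""
    return hmac.new(CAPTURE_KEY, payload, hashlib.sha256).hexdigest()

def capture_manifest(image_bytes: bytes, camera: str, author: str) -> dict:
    """Create a capture-time provenance record tied to the image content."""
    record = {
        "action": "captured",
        "camera": camera,
        "author": author,
        "content_hash": hashlib.sha256(image_bytes).hexdigest(),
    }
    record["signature"] = sign(json.dumps(record, sort_keys=True).encode())
    return {"history": [record]}

def record_edit(manifest: dict, edited_bytes: bytes, tool: str, actions: list) -> dict:
    """Append an edit entry recording the new content hash and the tool used."""
    entry = {
        "action": "edited",
        "tool": tool,
        "edits": actions,
        "content_hash": hashlib.sha256(edited_bytes).hexdigest(),
    }
    entry["signature"] = sign(json.dumps(entry, sort_keys=True).encode())
    manifest["history"].append(entry)
    return manifest

def verify(manifest: dict, image_bytes: bytes) -> bool:
    """Check every entry is intact and the latest hash matches the file in hand."""
    for entry in manifest["history"]:
        body = {k: v for k, v in entry.items() if k != "signature"}
        expected = sign(json.dumps(body, sort_keys=True).encode())
        if not hmac.compare_digest(expected, entry["signature"]):
            return False
    latest = manifest["history"][-1]["content_hash"]
    return latest == hashlib.sha256(image_bytes).hexdigest()

# Walk through the capture -> edit -> verify flow with placeholder image bytes.
original = b"raw image bytes"
manifest = capture_manifest(original, camera="CAI-enabled camera", author="Photographer")
edited = b"raw image bytes, cropped and toned"
manifest = record_edit(manifest, edited, tool="Photoshop", actions=["crop", "tone curve"])
print(verify(manifest, edited))    # True: history is intact and matches the exported file
print(verify(manifest, original))  # False: these bytes are not the latest attested version
```

The point of the sketch is only the shape of the workflow: each step adds a signed entry referencing the content as it then exists, so anyone downstream can ask what happened to the image and whether the file they hold matches the last attested version.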

As the work has gone on, a new, related organization has emerged with additional partners including Microsoft, Truepic, Arm, Intel, and the BBC: the Coalition for Content Provenance and Authenticity (C2PA), which has been established under the auspices of the Linux Foundation. This group will take on the development of open specifications that define provenance standards.

There’s probably a rule somewhere that states that you can tell how difficult a problem might be by the complexity of the standards body’s name. In this case, that name reflects the participation of multiple companies who have been doing their own work on this front. Truepic has developed Controlled Capture Technology for camera devices. Interestingly, they’re using cryptographic techniques to verify images, which are “immutably” stored on Truepic’s servers. They’re working with Qualcomm on the mobile phone front. Microsoft and the BBC have been working on Project Origin, which focuses on the publisher’s side of the transaction.
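At a high level, controlled capture pairs a hash taken at the moment of capture with a record that cannot be rewritten afterward. Here is a tiny sketch of that idea, with an in-memory dictionary standing in for the provider’s servers; the function names and the store are assumptions for illustration, not Truepic’s actual API.

```python
import hashlib

# Hypothetical write-once store standing in for a provider's "immutable" capture records.
immutable_store = {}

def register_capture(capture_id: str, image_bytes: bytes) -> None:
    """Record the image's hash once, at capture time; refuse later rewrites."""
    if capture_id in immutable_store:
        raise ValueError("capture record already exists and cannot be changed")
    immutable_store[capture_id] = hashlib.sha256(image_bytes).hexdigest()

def verify_capture(capture_id: str, image_bytes: bytes) -> bool:
    """Later, anyone can check a file against the hash recorded at capture."""
    return immutable_store.get(capture_id) == hashlib.sha256(image_bytes).hexdigest()

photo = b"pixels straight off the sensor"
register_capture("capture-0001", photo)
print(verify_capture("capture-0001", photo))         # True: file matches the capture record
print(verify_capture("capture-0001", photo + b"!"))  # False: the bytes have been altered
```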

And there are more to come. There have to be, because the participants have pledged not only to create a powerful antidote to deepfaked content but also to make the solution practical and easy for creative people to set in motion. In a blog post, Adobe’s Andy Parsons says, “the founding members of the C2PA and those that join the Coalition after today share a commitment to creating standards that bolster the public’s understanding of what is real and what is not. We aim to enable cryptographically verifiable facts about content to engender trust across all the surfaces through which we consume content.”