We’ve been watching all the chatter on social media lately about deepfakes and the (sometimes humorous) trouble AI image generators such as Midjourney have with rendering human features like hands and ears realistically. Freaky fingers are now the first thing a media-literate person looks for in an image that seems too good to be true. People are even thinking of clever new ways to leverage that weakness to their advantage:
The ring-finger finger-ring in that Tweet is real, by the way—it’s a 2016 creation of German artist Nadja Buttendorf. There’s good news for those who are thinking of trying this trick out, too; she also has you covered on the uncanny ear front!
But seriously: Deepfakes pose increasingly real and serious challenges for journalists, human rights advocates, public figures, and anyone who wants to know whether they can believe what they see. Pending legislation in the Senate highlights deepfakes and digital provenance as issues of major concern for national security. Lawmakers are calling for a national task force to set standards for establishing digital authenticity where it matters most, and to direct research funding toward the development of new tools for trust.
At Medex Forensics, we’re heavily invested in this process. Our patent-pending technology is already being used by national, state, and local police, as well as NGOs and civilian watchdog groups like WITNESS and the New York Times Visual Investigations Team, to quickly and reliably identify video that isn’t what it pretends to be. We are also proud to join trusted clients and colleagues from across the tech community as a contributor member of the Coalition for Content Provenance and Authenticity (C2PA). We’ll share a larger press release on this initiative in a follow-on blog post.