Sam Gregory of the nonprofit Witness, which helps people use video and technology to protect human rights, encourages viewers to rely on context and intuition in situations like this one.
“My starting point with all images like [Perry’s] is to not trust the online detectors as there are too many variables around whether they give an accurate result,” he explained over email.
He said that when he ran both Perry images through a widely used detector, the flower dress came back as “likely human” and the corset as “likely AI generated.” He also discourages people from looking for visual clues in these kinds of images, saying that can “lead down a rabbit hole of unproductive forensic skepticism.”
With a high-profile event like the Met Gala, Gregory says, it’s best to use “classic media literacy and verification approaches.” In this case, that could mean looking for more proof of Perry’s attendance, from a variety of sources.
“Although some media literacy strategies like checking the source might lead us astray — perhaps we do trust Katy Perry to share real images of herself — if we use another strategy and look for other images from the same event from reliable sources, we’d quickly see this isn’t real,” he explained.
He adds that it reminds him of the AI-created images of a fire at the Pentagon that went viral last year. In both cases, he says, the first question people should ask themselves is not whether they can spot the AI glitches in the photo, but “Why aren’t there other photos and videos of this event in a highly populated area?”