Digital Forensics

Fake Spotting: The Challenge of Authenticating Photos in a Generative AI World

16 October 2023

Imagine being able to create brand-new content with just a few clicks. This is the power of generative artificial intelligence (AI), a cutting-edge technology that can generate text, images, and videos based on existing data. While still developing, generative AI's potential to revolutionize industries such as marketing, entertainment, and product development is truly incredible. With generative AI, the possibilities are endless.

The potential for generative AI to positively impact the world cannot be overstated. However, it is crucial to be aware of the risks of misuse. In this article, I home in on the challenges posed by fake photos generated with artificial intelligence.

Common Procedures Before the AI Era

Before generative AI, video and photo forensics experts used various methods to determine if a photo was fake. Some of the most common procedures included:

Analyzing Metadata

A photo's metadata can contain information about the camera used to take the picture, as well as the date and time the photo was taken. Forensic experts can use this information to identify inconsistencies that may indicate a photo is fake. For example, if the metadata indicates the photo was taken with a camera model that was not yet available on the date the photo is purported to have been taken, that is a clear sign of forgery.
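
The release-date check described above can be sketched in a few lines. This is a minimal illustration, not a forensic tool; the camera models and release dates in the lookup table are examples, and a real examination would pull the make, model, and timestamp from the photo's EXIF metadata.

```python
from datetime import date

# Illustrative release dates for a few camera models (examples only).
CAMERA_RELEASE_DATES = {
    "Canon EOS R5": date(2020, 7, 30),
    "Nikon D850": date(2017, 9, 7),
}

def metadata_inconsistent(camera_model: str, claimed_date: date) -> bool:
    """Flag a photo whose claimed capture date predates the camera's release.

    Returns True when the metadata is self-contradictory; returns False
    when it is consistent or the camera model is unknown to the table.
    """
    release = CAMERA_RELEASE_DATES.get(camera_model)
    if release is None:
        return False  # unknown model: no conclusion either way
    return claimed_date < release

# A photo "taken" in 2019 with a camera released in mid-2020 is a red flag.
print(metadata_inconsistent("Canon EOS R5", date(2019, 5, 1)))   # True
print(metadata_inconsistent("Canon EOS R5", date(2021, 5, 1)))   # False
```

A passing check proves nothing on its own, since metadata is itself easy to edit; an inconsistency, however, is strong evidence of tampering.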

Analyzing Lighting and Shadows

Forensic experts can look for inconsistencies in the lighting and shadows in a photo to identify signs of manipulation. If a shadow is going in the wrong direction, or if two objects are casting shadows in different directions, this could be a sign that the photo has been edited. Forensic experts use various tools and techniques to analyze the lighting and shadows in a photo, such as measuring the angles of shadows and comparing the brightness of different areas of the image.

Analyzing Textures

Forensic experts can also look at the textures in a photo to identify signs of manipulation. If someone's skin looks too smooth or plastic-like, this could be a sign that the image has been edited. Forensic experts can examine the photo's individual pixels and compare the textures of different objects in the photo.

Analyzing Reflections

Reflections in a photo can help forensic experts identify signs of manipulation. For example, if a reflection in a mirror differs from the object being reflected, this could be a sign that the photo has been edited. Forensic experts can use various tools and techniques to analyze the reflections in a photo, such as measuring the angles of reflections and comparing the brightness of different areas of the photo.

Specialized Video Forensic Software

There are specialized video forensic software programs that can be used to analyze photos for signs of forgery. These programs can look for inconsistencies in the lighting, shadows, textures, reflections, and other signs of forgery. For example, some software programs can be used to detect the presence of cloning, airbrushing, and liquefying.

While these methods remain relevant and useful for uncovering fakes created by generative AI, the technical challenge and the expertise required to spot fakes have increased substantially. Even the most experienced video forensic examiners are challenged by fake photos created with generative AI, and as the technology develops, distinguishing between real and fake photos will become even more difficult.

Stand Out Signs of a Faked Photo

As of this writing, generative AI still struggles to create photos that can fool an experienced forensic examiner. There are various signs of a faked photo that an examiner would review, including:

Inconsistencies in Lighting and Shadows 

Generative AI models sometimes have difficulty creating realistic lighting and shadows. For example, a fake photo may have shadows that go in the wrong direction or that are too dark or too light.

Inconsistencies in Textures 

Generative AI models can also have difficulty creating realistic textures. For example, a fake photo may have skin that looks too smooth or plastic or hair that looks too perfect.

Inconsistencies in Reflections 

Generative AI models can have difficulty creating realistic reflections. For example, a fake photo may have a reflection in a mirror that is different from the object being reflected.

Examination Using Specialized Video and Photo Forensics Software

Fortunately, specialized video and photo forensics software in the hands of a qualified photo and video forensic expert is powerful, and its capabilities keep growing as the need to authenticate photo evidence rises. For example, an examiner armed with these tools can perform the following examinations:

File Analysis

By searching databases of known images, the original unaltered image can sometimes be located, for example to show that it originated on a social media platform before being used to create a fake. In some instances, this type of analysis can also identify the originating device, such as the camera that took the original picture.
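
Matching a photo against a database of known images is commonly done with perceptual hashes, which stay stable under re-encoding while differing for unrelated pictures. The sketch below implements the classic "average hash" on an already-downscaled 8x8 grayscale grid; this is a simplified illustration, and real tools resize and gray-convert the image first and use more robust hashes.

```python
def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid.

    `pixels` is a list of 64 brightness values (0-255); in practice the
    image would first be resized to 8x8 and converted to grayscale.
    """
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance suggests the same source image."""
    return bin(h1 ^ h2).count("1")

original  = [40] * 32 + [200] * 32   # toy image: dark top, bright bottom
tweaked   = [45] * 32 + [195] * 32   # same image, slightly re-encoded
unrelated = [40, 200] * 32           # entirely different structure

d_same = hamming_distance(average_hash(original), average_hash(tweaked))
d_diff = hamming_distance(average_hash(original), average_hash(unrelated))
print(d_same, d_diff)  # the re-encoded copy stays close; the unrelated image does not
```

Because the hash depends only on coarse brightness structure, a fake built from a known photo can often be traced back to its source even after cropping or recompression.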

Compression and Reconstruction 

With forensic software, an examiner can determine whether a photo contains multiple compression ratios in the same image, and whether the original compression differs from that of the photo under review. Either finding would indicate potential tampering, for example if more than one photo was collaged to create the fake. This analysis can also uncover artifacts related to resizing, color processing, rotation, or other modifications to an image.
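
The intuition behind this kind of analysis can be shown with a toy model. Here, lossy compression is stood in for by simple value quantization (real JPEG analysis works on DCT coefficients): a region that was already compressed at the same setting barely changes when compressed again, while a spliced-in region from a differently compressed source leaves a larger residual. This is a conceptual sketch only, not how forensic software actually operates.

```python
def quantize(values, step):
    """Toy stand-in for lossy compression: snap each value to a grid."""
    return [step * round(v / step) for v in values]

def recompression_residual(values, step):
    """Per-pixel change after one more round of 'compression'.

    Regions already compressed at `step` change little; regions pasted in
    from a differently compressed source change more.
    """
    return [abs(v - q) for v, q in zip(values, quantize(values, step))]

# Background previously compressed at step 8, spliced patch at step 10.
background = quantize(list(range(0, 128, 2)), 8)   # 64 "pixels"
patch      = quantize(list(range(128, 192)), 10)   # 64 "pixels"
composite  = background + patch

residual = recompression_residual(composite, 8)
bg_energy = sum(residual[:64])
patch_energy = sum(residual[64:])
print(bg_energy, patch_energy)  # the spliced region stands out
```

Error level analysis in real forensic tools follows the same principle: recompress the whole image and look for regions whose residual differs from the rest.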

Camera Identification

If a fake is made from a photo taken with a digital camera, it may be possible to link it to that camera through the visual artifacts the camera creates. These artifacts are often undetectable to the human eye, but they can tie a tampered photo back to the device that took the original picture. For example, if someone claims they did not take a photo, but a comparison of the tampered photo against exemplar photos positively identifies their camera as the source, their assertion is disproven.
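
The best-known artifact used for this is the sensor's fixed noise pattern (photo-response non-uniformity): averaging the noise residuals of several exemplar photos estimates the camera's "fingerprint," which is then correlated against the questioned photo's residual. The sketch below demonstrates the idea on simulated 1-D "photos" with a simple moving-average denoiser; real PRNU analysis uses 2-D images and far more sophisticated filtering.

```python
import random

def noise_residual(image, window=3):
    """Noise estimate: image minus a simple moving-average denoise."""
    n, half = len(image), window // 2
    denoised = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        denoised.append(sum(image[lo:hi]) / (hi - lo))
    return [p - d for p, d in zip(image, denoised)]

def correlation(a, b):
    """Normalized correlation between two residuals (-1..1)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    da = [x - ma for x in a]
    db = [x - mb for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = (sum(x * x for x in da) * sum(y * y for y in db)) ** 0.5
    return num / den if den else 0.0

rng = random.Random(0)
# Simulated sensor fingerprint: a fixed per-pixel noise pattern.
fingerprint = [rng.gauss(0, 2) for _ in range(256)]

def shoot(scene, fp):
    """Simulate a photo: scene content plus the sensor's fixed noise."""
    return [s + f for s, f in zip(scene, fp)]

# Estimate the fingerprint by averaging residuals from exemplar photos.
exemplars = [shoot([rng.uniform(0, 255)] * 256, fingerprint) for _ in range(8)]
residuals = [noise_residual(e) for e in exemplars]
estimate = [sum(r[i] for r in residuals) / len(residuals) for i in range(256)]

query = shoot([100.0] * 256, fingerprint)                       # same camera
other = shoot([100.0] * 256, [rng.gauss(0, 2) for _ in range(256)])  # different camera

print(correlation(estimate, noise_residual(query)))  # high: camera matches
print(correlation(estimate, noise_residual(other)))  # near zero: no match
```

A high correlation links the photo to the camera; a near-zero correlation is what one would expect from a different device, or from an image synthesized by a generative model that never passed through a sensor at all.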

Geometric Analysis

One of the most challenging parts of forging an image is keeping the lighting, shadows, and perspective consistent with what a camera would capture in reality. Forensic software can be used to analyze the visual scene captured by the photo to determine whether the shadows are cast realistically, whether the highlighting on an object or person makes sense given the location of a light source, or whether the angle from which the photo was taken matches a realistic perspective.
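
A simplified version of the shadow check can be expressed geometrically: with a single point light source, each object's shadow must point directly away from the light, so an observed shadow direction that disagrees with the implied one flags manipulation. The 2-D ground-plane sketch below is an illustration of the principle, not a description of any particular software's method.

```python
import math

def shadow_direction(light, base):
    """Unit vector along which an object at `base` casts its shadow,
    assuming a single point light source at `light` (2-D ground plane)."""
    dx, dy = base[0] - light[0], base[1] - light[1]
    norm = math.hypot(dx, dy)
    return (dx / norm, dy / norm)

def consistent(light, bases, observed_dirs, tol_deg=10.0):
    """Check each observed shadow direction against the one the light implies."""
    for base, obs in zip(bases, observed_dirs):
        exp = shadow_direction(light, base)
        dot = max(-1.0, min(1.0, exp[0] * obs[0] + exp[1] * obs[1]))
        if math.degrees(math.acos(dot)) > tol_deg:
            return False
    return True

light = (0.0, 0.0)
bases = [(10.0, 0.0), (0.0, 10.0)]
# Shadows pointing directly away from the light are consistent...
print(consistent(light, bases, [(1.0, 0.0), (0.0, 1.0)]))    # True
# ...but one pointing back toward the light flags manipulation.
print(consistent(light, bases, [(1.0, 0.0), (0.0, -1.0)]))   # False
```

Real scenes complicate this with extended light sources and uneven ground, which is why this analysis belongs in the hands of a trained examiner.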

Suggestions for Attorneys and Claims Professionals

While their jobs are challenging enough already, attorneys and claims adjusters unfortunately need to be more vigilant than ever before. A faked photo could be anything from a screenshot of a damning text-message conversation to an image of an alleged injury or assault. Complicating this issue, the tools used to create generative AI photos are available to everyone and require little technical sophistication to use.

In general, it is wise to maintain a posture of incredulity concerning photos submitted as evidence. Here are some suggestions for attorneys and claims professionals: 

  • Be skeptical of photos submitted as evidence, especially if the device the photos were allegedly taken on is gone and cannot be used as a source of verification.  
  • Request the device that took the photos, not just the photos themselves. If a photo does warrant examination by a photo and video forensic expert, having the device it was allegedly taken with aids the examination process. 
  • When looking at a photo, even if you cannot point to anything in particular but the image feels off, it may be worthwhile to have it examined.  

While generative AI creates new challenges, the sophistication of the forensic methods and tools used to examine photos has also increased. As a community, the legal and insurance world has dealt with forged documents, manually faked photos, and other forms of misinformation before. Awareness is the first step in preventing or remediating the impact of faked photos, and that starts with recognizing that one showing up in your case or claim is a real and distinct possibility. As they say, a picture is worth a thousand words. At least it used to be.


About the Author
Lars Daniel
Lars Daniel, EnCE, CCO, CCPA, CIPTS, CWA, CTNS, CTA
Practice Leader
Digital Forensics

Mr. Lars Daniel is the Practice Leader of the Digital Forensics Division. Mr. Daniel has qualified as an expert witness and testified in both state and federal courts, qualifying as a digital forensics expert, computer forensics expert, cell phone forensics expert, video forensics expert, and photo forensics expert. He has testified for both the defense and prosecution in criminal cases and the plaintiff and defense in civil cases.
