Last week Brett Shavers had a good post on his blog about placing the suspect behind the camera (Link). Phill Moore mentioned this post in his excellent weekly roundup This Week in 4N6 and also suggested the tool Camera Ballistics. Coincidentally, I got a chance this week to test the latest release of Camera Ballistics, a chance I certainly didn't want to pass up.
Disclaimer:
I was able to test this product under a commercially purchased license.
The license was not provided by Compelson Labs for this review, and I am not affiliated with Compelson Labs or their products in any way.
Camera Ballistics 2
Camera Ballistics is a unique tool that can identify photos taken with a specific camera. It does not identify the pictures by analyzing metadata such as EXIF data; instead, it analyzes the characteristics of the camera's sensor. Each sensor is unique: by analyzing several photos taken with a camera, the software can generate a unique fingerprint of the device. You can then use this fingerprint to identify other pictures taken with the same camera.
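Compelson does not document its exact algorithm, but the research papers listed at the end of this post describe how sensor pattern noise (PRNU) identification generally works. As a rough illustration only, and not the tool's actual implementation, the core idea is to isolate the high-frequency residual that remains after denoising an image:

```python
# A minimal sketch of sensor-noise (PRNU-style) residual extraction, based
# on the Lukas/Fridrich/Goljan papers listed at the end of this post.
# This is NOT Compelson's (undocumented) implementation.
import numpy as np
from PIL import Image
from scipy.ndimage import gaussian_filter

def noise_residual(path):
    """Return the high-frequency residual of an image: original minus a
    denoised version. With scene content largely removed, the residual is
    dominated by the sensor's fixed pattern noise."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    denoised = gaussian_filter(img, sigma=1.0)  # stand-in for the paper's wavelet denoiser
    return img - denoised
```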
The software can be valuable when you want to prove that a suspect took certain images with his or her camera. If you are in possession of the suspect's camera, you can create a unique fingerprint of it and compare it against images to prove that this camera took them. If you are not in possession of the camera, or the camera is broken, you can still run the software using images you know were taken with it (e.g. holiday photos).
The process to identify pictures consists of a simple two-step approach: 1. Learn, 2. Analyze.
Step 1: Learn
The first step is to create a fingerprint from reference photos. The best way to supply these is to take some new pictures of flat, featureless subjects such as empty walls or clouds; the images shouldn't contain a lot of detail. If you are unable to take these pictures because you don't have access to the original camera, try to find existing images taken with the camera that contain as little detail as possible.
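Continuing the earlier sketch (and again only as an illustration of the published PRNU approach, not of Camera Ballistics' internals), the learn step would amount to averaging the residuals of several flat reference shots so that remaining scene content cancels out and the sensor pattern survives:

```python
# Hypothetical "learn" step, reusing noise_residual() from the sketch above.
# The file names are made up; all images must come from one camera and share
# the same resolution for the element-wise average to make sense.
import numpy as np

def learn_fingerprint(paths):
    residuals = [noise_residual(p) for p in paths]
    return np.mean(residuals, axis=0)  # average over all reference shots

fingerprint = learn_fingerprint(["wall_1.jpg", "wall_2.jpg", "sky_1.jpg"])
```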
If you dive into the program without understanding the learning process, you might run into some issues. One thing that could be improved in future versions is how the software handles images that are unsuitable for learning. During my testing, the software sometimes crashed on images that contained too much “noise”, while other times it would produce an error message. Another user explained to me that this is normal and only occurs when you use unsuitable images. In this context, noise means that the images contain too much detail; as explained before, it's important that the images used for learning don't contain a lot of detail.
As soon as the fingerprint has been created, you are greeted with the fingerprint properties. These include the normalised mean square error (NMSE), an estimation of the overall deviation between predicted and measured values. The lower the value, the better. If you get a high value here, you might want to add a few extra images to the analysis.
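The manual doesn't state which normalisation the tool uses, so the following is only one common definition of NMSE, shown for illustration:

```python
# One common definition of normalised mean square error:
# NMSE = mean((predicted - measured)^2) / mean(measured^2).
# Camera Ballistics may well use a different normalisation.
import numpy as np

def nmse(predicted, measured):
    predicted = np.asarray(predicted, dtype=np.float64)
    measured = np.asarray(measured, dtype=np.float64)
    return float(np.mean((predicted - measured) ** 2) / np.mean(measured ** 2))
```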
Step 2: Analyze
In the second step, analyze, you point the program at a directory containing the images you want to analyze and select a fingerprint to compare against. It isn't possible to compare against multiple fingerprints at once; if you want to check multiple cameras, you have to run the analysis multiple times.
After each analysis, you are presented with a summary of the results.
The program places the results into 5 categories (a rough scoring sketch follows the list):
- Very high probability – fingerprint match of at least 99.99%
- High probability – fingerprint match of at least 99.9%
- Medium probability – fingerprint match of at least 99%
- Low probability – fingerprint match of at least 95%
- No match / Fingerprint not found
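The matching itself is a black box, but the papers cited below suggest it boils down to correlating an image's residual with the learned fingerprint and thresholding the score. Here is a rough sketch of that idea; the bin edges are invented placeholders, not the tool's internal thresholds or the probability percentages above:

```python
# Sketch of the analyze step following the cited PRNU papers: correlate a
# residual with the fingerprint, then bin the score. Bin edges are made up.
import numpy as np

def match_score(residual, fingerprint):
    """Normalised cross-correlation between a residual and a fingerprint."""
    r = residual - residual.mean()
    f = fingerprint - fingerprint.mean()
    return float(np.sum(r * f) / (np.linalg.norm(r) * np.linalg.norm(f) + 1e-12))

def categorize(score, edges=(0.05, 0.10, 0.20, 0.40)):  # placeholder edges
    labels = ("No match", "Low", "Medium", "High", "Very high")
    return labels[sum(score > e for e in edges)]
```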
The match probability is shown on top of every analyzed image.
You are also able to filter the results based on these categories.
Step 2b: Report
You can generate a report of your results. The report contains all analyzed images and their results, and each image also gets a summary of its EXIF data. It's a pity the report does not contain any additional information regarding the match. Without any supporting information, you have to trust the software completely. Personally, I prefer to understand the inner workings of a tool before I accept its conclusions. With photo manipulation, for example, an automated tool can detect manipulation using error level analysis, but that is something you can replicate yourself in order to verify its findings (see the sketch below).
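For comparison, this is roughly what a do-it-yourself error level analysis looks like; nothing comparable is currently possible for Camera Ballistics' matches. File names are hypothetical:

```python
# A do-it-yourself error level analysis (ELA) sketch with Pillow: re-save
# the JPEG at a known quality and amplify the difference. Edited regions
# often re-compress differently from the rest of the image and stand out.
import io
from PIL import Image, ImageChops, ImageEnhance

def ela(path, quality=90, scale=15):
    original = Image.open(path).convert("RGB")
    buf = io.BytesIO()
    original.save(buf, "JPEG", quality=quality)  # controlled re-compression
    diff = ImageChops.difference(original, Image.open(buf))
    return ImageEnhance.Brightness(diff).enhance(scale)  # brighten subtle differences

ela("photo.jpg").save("photo_ela.png")  # bright areas warrant a closer look
```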
The way this program automates everything and presents it in a simple, accessible way makes for a very convenient user experience. The lack of any detailed information about the match in the report, however, is a shortcoming.
Camera Ballistics 2 Tested
To put Camera Ballistics to the test, I gathered 42 photos I took over the last 13 years with 3 different cameras that are still in my possession. I conducted two tests.
Case 1: I don’t have access to the cameras.
I generated the fingerprints from existing images from my library of photos.
Case 2: I still have access to the original cameras.
I generated the fingerprints from new images specially created for fingerprinting (only the DSC-H3 and FZ38).
Test images
To give the software something to compare the fingerprints against, I created folders containing images from 3 “suspects”, each with their own set of photos:
- Emilio Largo
- Aristotle Kristatos
- Brad Whitaker
To make sure Camera Ballistics didn't identify the photos by image dimensions or EXIF data, I resized all images down to 3264 x 2448 px and removed all EXIF data (a sketch of this preparation step follows below). I also included some “trap” images for the Sony DSC-H3 (Suspect 2) and FZ38 (Suspect 3): sample images from the internet taken with the same camera models. If the software performs as advertised, it should not match these images.
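For reproducibility, this is a sketch of the kind of preparation I mean, illustrated with Pillow. Directory names are made up, and this shows the idea rather than my exact workflow:

```python
# Resize every test image to identical dimensions and drop all metadata so
# any match can only come from sensor noise. Re-saving through Pillow
# without passing exif= writes no EXIF block.
from pathlib import Path
from PIL import Image

SRC, DST = Path("suspect_photos"), Path("prepared")  # illustrative paths
DST.mkdir(exist_ok=True)
for src in SRC.glob("*.jpg"):
    img = Image.open(src).convert("RGB").resize((3264, 2448), Image.LANCZOS)
    img.save(DST / src.name, "JPEG", quality=95)  # metadata-free copy
```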
Case 1: I don’t have access to the cameras.
In this case, I assume that the camera no longer works or is not in the investigator's possession. I generated a fingerprint using suitable images from my archive.
These are the images I used to generate the fingerprints:
Suspect 1: Emilio Largo
Suspect 2: Aristotle Kristatos
Suspect 3: Brad Whitaker
Results Case 1
Next, I ran the fingerprint against the test images.
Suspect 1
- NMSE: 0.139
- Match: 5/15 (33%)
- False positive: 0 (0%)
Suspect 2
- NMSE: 0.124
- Match: 12/14 (86%)
- False positive: 0 (0%)
Suspect 3
- NMSE: 0.201
- Match: 10/13 (77%)
- False positive: 0 (0%)
I was pleasantly surprised to see the software perform so well. While it didn't detect all the images taken with the cameras, it did not trigger any false positives. It didn't perform as well on the images of Suspect 1 as on those of the other suspects. The camera used by Suspect 1 is a rather old camera without optical zoom, so its images might lack the amount of detail this program needs to work properly.
For detailed per-image results, please see the tables at the end of this post.
Case 2: I still have access to the original cameras.
In this case, I assume that the investigator has access to the camera. I took some pictures suitable for fingerprint generation.
These are the images I used to generate the fingerprints:
Suspect 1: Emilio Largo
Suspect 2: Aristotle Kristatos
Suspect 3: Brad Whitaker
Results Case 2
Next, I ran the analysis again with the new fingerprints.
Suspect 1
- NMSE: 0.071
- Match: 5/15 (33%)
- False positive: 0 (0%)
Suspect 2
- NMSE: 0.108
- Match: 12/14 (86%)
- False positive: 0 (0%)
Suspect 3
- NMSE: 0.124
- Match: 11/13 (85%)
- False positive: 0 (0%)
I was surprised to see largely the same results. I had expected them to be more accurate, since I created optimal images for the software, and this expectation is reflected in the lower NMSE values. But it seems the images I provided in Case 1 were good enough for the software to do its job: only one additional image was identified, in the image set of Suspect 3. In this small-scale experiment, it doesn't seem to matter much whether you use suitable images from an archive. Interestingly enough, the new fingerprint of Suspect 1 resulted in different matches than the old fingerprint from Case 1.
Additional tests
The different matches between Case 1 and Case 2 in the image set of Suspect 1 made me curious whether combining new and old images would result in more matches. To test this, I combined the images used for Case 1 and Case 2 to generate a new fingerprint.
Results combined test:
Suspect 1
- NMSE: 0.036
- Match: 0/15 (0%)
- False positive: 0 (0%)
Suspect 2
- NMSE: 0.052
- Match: 13/14 (93%)
- False positive: 0 (0%)
Suspect 3
- NMSE: 0.084
- Match: 11/13 (85%)
- False positive: 0 (0%)
While the combined fingerprints do result in a lower NMSE and one additional match in the case of Suspect 2, they also result in no matches at all for Suspect 1. Again, I think this has more to do with the camera in question, which is rather old.
I also did some additional testing with a larger image pool of over 5000 images. In these tests, the software correctly identified about 80% of the pictures taken with the camera, and in no case did it render a false positive, which pleasantly surprised me. Analysis speed was about 0.07 seconds per image.
Verdict
Camera Ballistics is an interesting tool. When I first heard about the software, I have to admit I was skeptical. The developer claims that the software can identify anomalies in images and use that information to generate a unique fingerprint of the device's sensor.
During my tests I was able to verify this claim: the software correctly identified about 80% of all the images taken with a specific camera. More impressive was the fact that it did not generate a single false positive.
The tool is extremely user-friendly and offers a simple, easy-to-use interface. This is also one of its main downsides. The lack of any detailed information about the matches makes it impossible for me as an investigator to verify the results. While the online user manual explains the workings of the software in more detail (About Camera Ballistics technology), it's still up to the investigator to “blindly” trust the outcome of the analysis. I would therefore primarily use this tool for triage or as a way to verify my own findings; your findings should always be backed by multiple tests.
At this moment (17-08-2017, v2.0.0.9325) I feel the tool has proven itself accurate enough to be of great value in investigations where you need to reliably identify images taken with a certain camera. I do, however, hope that the developers expand the reports with detailed information about the generated fingerprint and how it relates to the matched images.
Research papers
If you find this software as interesting as I do and want to read more about digital camera identification from sensor noise, you might find the following research papers interesting:
Digital Camera Identification from Sensor Pattern Noise by Jan Lukáš, Jessica Fridrich, and Miroslav Goljan
http://www.ws.binghamton.edu/fridrich/Research/double.pdf
Sensor Noise Camera Identification: Countering Counter-Forensics by Miroslav Goljan, Jessica Fridrich, and Mo Chen
http://ws2.binghamton.edu/fridrich/Research/EI7541-29.pdf
Improvements on sensor noise based source camera identification by Y. Sutcu, S. Bayram, H. T. Sencar and N. Memon.
http://isis.poly.edu/~forensics/pubs/icme2007.pdf
Source camera identification based on sensor dust characteristics by A. Emir Dirik, Husrev T. Sencar and Nasir Memon.
https://isis.poly.edu/~forensics/pubs/safe2007.pdf
Detailed results case #1
| Image | S1 | S2 | S3 |
|---------|---------|---------|---------|
| AK1 | | High | |
| AK2 | | Medium | |
| AK3 | | V. High | |
| AK4 | | V. High | |
| AK5 | | | |
| AK6 | | Low | |
| AK7 | | V. High | |
| AK8 | | High | |
| AK9 | | V. High | |
| AK10 | | V. High | |
| AK11 | | Medium | |
| AK12 | | V. High | |
| AK13 | | V. High | |
| AK14 | | | |
| BW1 | | | V. High |
| BW2 | | | High |
| BW3 | | | |
| BW4 | | | V. High |
| BW5 | | | |
| BW6 | | | V. High |
| BW7 | | | High |
| BW8 | | | |
| BW9 | | | V. High |
| BW10 | | | V. High |
| BW11 | | | V. High |
| BW12 | | | Low |
| BW13 | | | V. High |
| EL2 | | | |
| EL3 | Low | | |
| EL4 | | | |
| EL5 | | | |
| EL6 | Low | | |
| EL7 | | | |
| EL8 | | | |
| EL9 | | | |
| EL10 | | | |
| EL11 | High | | |
| EL12 | Low | | |
| EL13 | Low | | |
| EL14 | | | |
| EL15 | | | |
| EL16 | | | |
| S2Fake1 | | | |
| S2Fake2 | | | |
| S2Fake3 | | | |
| S2Fake4 | | | |
| S2Fake5 | | | |
| S2Fake6 | | | |
| S3Fake1 | | | |
| S3Fake2 | | | |
| S3Fake3 | | | |
| S3Fake4 | | | |
| S3Fake5 | | | |
| S3Fake6 | | | |
Detailed results case #2
| Image | S1 | S2 | S3 |
|---------|---------|---------|---------|
| AK1 | | V. High | |
| AK2 | | Medium | |
| AK3 | | V. High | |
| AK4 | | V. High | |
| AK5 | | | |
| AK6 | | V. High | |
| AK7 | | V. High | |
| AK8 | | Medium | |
| AK9 | | V. High | |
| AK10 | | V. High | |
| AK11 | | V. High | |
| AK12 | | V. High | |
| AK13 | | V. High | |
| AK14 | | | |
| BW1 | | | V. High |
| BW2 | | | V. High |
| BW3 | | | Low |
| BW4 | | | V. High |
| BW5 | | | |
| BW6 | | | V. High |
| BW7 | | | V. High |
| BW8 | | | |
| BW9 | | | V. High |
| BW10 | | | V. High |
| BW11 | | | V. High |
| BW12 | | | Low |
| BW13 | | | V. High |
| EL2 | | | |
| EL3 | | | |
| EL4 | | | |
| EL5 | | | |
| EL6 | | | |
| EL7 | | | |
| EL8 | | | |
| EL9 | Low | | |
| EL10 | Low | | |
| EL11 | Medium | | |
| EL12 | Low | | |
| EL13 | Low | | |
| EL14 | | | |
| EL15 | | | |
| EL16 | | | |
| S2Fake1 | | | |
| S2Fake2 | | | |
| S2Fake3 | | | |
| S2Fake4 | | | |
| S2Fake5 | | | |
| S2Fake6 | | | |
| S3Fake1 | | | |
| S3Fake2 | | | |
| S3Fake3 | | | |
| S3Fake4 | | | |
| S3Fake5 | | | |
| S3Fake6 | | | |
Tested version: 2.0.0.9325