Last update: Thu, 28 Mar 2024, 2pm

Legal News

Guilty as Charged?

by PAUL VELLA of Evidence Matters Ltd

Evidence Matters Ltd. has been providing Expert Witness services in defence cases for a decade now, and it never ceases to amaze me how odd and inaccurate indictments can be in cases involving computer evidence, especially those involving indecent images of children.

This often leads me to ask solicitors whether they know what their client is actually charged with, especially if the client is about to plead. The first time we encountered a serious problem with an indictment was in 2003, when we discovered that a defendant had been charged with indecent images that had been recovered from an unrelated defendant’s computer. I don’t know how the police computer forensics lab managed to mix up the data, but our defendant was forced to sit through an interview in which he was shown indecent images of children. Naturally, the interviewing officer didn’t believe the defendant when he kept responding with “I’ve never seen that before in my life”!

Not all mix-ups are quite as dramatic of course, but just today as I started writing this article, one of my examiners told me that in the case he is working on, the defendant has been charged with forty-one indecent (video) images of children, when in fact there are only four. The police accepted this was due to a typographical error that hadn’t been spotted by either the police or the CPS.

Earlier this year we had a similar incident where ninety-four indecent (SAP Level 3) images were charged as a result of a typographical error on the part of the police examiner – there were in fact only four (SAP Level 3) indecent images of children on the computer. A significant difference for the sentencing judge to consider.

Often, the problem is not as simple as a typographical error, but rather a technical one, where the police examiner doesn’t understand the technical issues. I dealt with a case this year where the police had recovered more than five thousand Level 5 indecent images of children. The problem, however, was that the forensics software had recovered each frame of a deleted video file individually – at a typical twenty-five frames per second, less than four minutes of video is enough to yield five thousand stills. Had the file not been deleted, the forensics software would have recovered just one video file and the indictment would have looked very different. More importantly, the defendant would probably have pleaded much earlier and saved a lot of unnecessary court time. This was a case of a police examiner simply not understanding the technical nature of the video file.

Recently, we have been working on a case involving extreme pornographic images – several thousand images have been identified by the police, yet more than a third are neither extreme nor, in some cases, even pornographic. I am at a loss as to what criteria have been used to select these images, but this is nothing new. I once dealt with a case where the defendant had been charged with two Level 1 images and fifty at Level 5. The fifty Level 5 images, whilst unpleasant, did not involve children and were therefore not illegal at the time. Someone clearly needed to understand that the Protection of Children Act required children to be in the photographs.

The problem is getting worse. Setting aside the technical issues of where the images are and how they got there, the biggest problem seems to be that there is little or no objective checking going on. The CPS, it often seems, takes the police report at its word as to whether something is indecent, without verifying it independently.
 
These days the police often use a very good tool called C4P (Categorizer for Pictures). This tool, provided by a Canadian police force, automatically categorises indecent images based on previous categorisations by police officers, saving time and money. A central database of file ‘signatures’ is maintained and distributed: when the software finds an image that has already been categorised as indecent by a user in another part of the country, it automatically puts the image into that category. The problem is that this relies on every single user of the software being accurate and applying the same standard of categorisation – the old adage of ‘garbage in, garbage out’ applies here. Because a resized or re-saved copy of a picture is a different file with a different signature, we have seen countless cases where the same picture has been categorised at different levels within the same case, simply because no-one from the prosecution has double-checked the findings.
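To illustrate the mechanism – and its weakness – here is a minimal sketch of signature-based categorisation in Python. C4P’s actual internals are not public, so the hash algorithm (MD5), the database layout and the example entries below are assumptions for illustration only.

    # Minimal sketch of signature-based categorisation, in the spirit of C4P.
    # The hash choice, database layout and entries are illustrative assumptions,
    # not C4P's actual internals.
    import hashlib
    from pathlib import Path
    from typing import Optional

    # Hypothetical slice of the distributed central database:
    # file signature -> category previously assigned by some examiner.
    KNOWN_SIGNATURES = {
        "5f4dcc3b5aa765d61d8327deb882cf99": "Level 1",
        "e99a18c428cb38d5f260853678922e03": "Level 4",
    }

    def signature(path: Path) -> str:
        """Hash the raw file bytes - the 'signature' matched against the database."""
        return hashlib.md5(path.read_bytes()).hexdigest()

    def categorise(path: Path) -> Optional[str]:
        """Return whatever category another user assigned to this exact file, if any."""
        return KNOWN_SIGNATURES.get(signature(path))

    # The weakness: resizing or re-saving a picture changes its bytes, so the
    # copy has a different signature. It is then either missed entirely or
    # matched against a separate entry that another user may have categorised
    # differently - hence the same picture appearing at two levels in one case.

The point is that the match is made on the file’s bytes, not on what the picture shows: two visually identical copies are, to the software, unrelated entries, and nothing in the process reconciles the categories they were given.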

Just last week, upon our request for a copy of the computer evidence, a police force admitted that they had ‘lost’ the extreme pornographic images from the computer (originally discovered during a speculative search) and the matter was discontinued. The defendant was on the Sex Offenders Register, and a guilty plea over the ‘non-existent’ images would very likely have affected his liberty.
 
We see these basic errors with alarming regularity, which is why at Evidence Matters Ltd. we make sure we never examine the data in isolation, but look at the case in its entirety – even if that means re-examining thousands of images individually and re-categorising them correctly.