Recently, the Home Office issued a communication urging police to help make their communities safer by increasing their use of facial recognition to track down offenders. Specifically, the letter asked forces to double their use of this AI technology for retrospective investigative searches by May 2024.
While facial recognition technology has polarized opinion, it has also been the subject of much misinformation. The fact is, facial recognition software has been around for a long time. Today it is used in many everyday applications that have nothing to do with policing, such as unlocking smartphones and recognizing the person standing in front of a smart doorbell. The technology has also proven itself in a policing context, with forces like South Wales Police taking the lead.
Facial recognition technology has three key applications in police work: live, operator-initiated, and retrospective.
In a live environment, such as a large sporting event, facial recognition technology is used in real time in conjunction with surveillance cameras to identify people on various watchlists, whether missing or vulnerable people or suspects who remain at large. Operator-initiated facial recognition, by contrast, usually involves an officer taking a photograph and checking it against a watchlist.
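In all three modes, the underlying step is broadly the same: a probe face is reduced to a numerical embedding and compared against a gallery of known faces, with the highest-scoring candidates passed to a human reviewer. The sketch below illustrates that idea in simplified form; it is not NICE Investigate's or any force's actual implementation, and the embedding model, similarity threshold and data are all assumed placeholders.

```python
import numpy as np

# Illustrative sketch only. It assumes face embeddings (fixed-length vectors)
# have already been extracted from images by a face-recognition model of your
# choice; the gallery, probe and threshold below are hypothetical.

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def watchlist_search(probe: np.ndarray,
                     gallery: dict[str, np.ndarray],
                     threshold: float = 0.6) -> list[tuple[str, float]]:
    """Rank gallery identities by similarity to the probe face.

    Returns (identity, score) pairs at or above the threshold, best first.
    Scores are candidate matches for human review, not identifications.
    """
    scored = [(identity, cosine_similarity(probe, emb))
              for identity, emb in gallery.items()]
    return sorted([s for s in scored if s[1] >= threshold],
                  key=lambda s: s[1], reverse=True)

# Hypothetical usage: random vectors stand in for real embeddings, and the
# threshold is disabled so the top five candidates are always shown.
rng = np.random.default_rng(0)
gallery = {f"record_{i}": rng.normal(size=512) for i in range(1000)}
probe = rng.normal(size=512)
print(watchlist_search(probe, gallery, threshold=-1.0)[:5])
```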
Retrospective facial recognition has the biggest potential to impact day-to-day policing operations. In fact, one force reported an annual saving of more than 11,000 officer hours (equivalent to £368,036) by using facial recognition technology to retrospectively search for suspects. In addition to saving time and money, the use of the technology implicitly made the community safer.
But these benefits would not be possible without the exponential growth in digital evidence: high-quality photographs and video taken on mobile phones, CCTV, and body-worn, dashcam, doorbell and drone footage. The challenge is collecting and analyzing all of this material in a timely manner; AI cannot effectively search digital evidence that is not readily accessible and searchable in the first place. This is where digital evidence management solutions, like NICE Investigate, and retrospective facial recognition can work together.
NICE Investigate is a cloud-based solution for collecting, analyzing and sharing digital evidence. It has AI capabilities of its own, but essentially it brings digital evidence together in one place.
Recently, one of the forces using NICE Investigate shared an example of how the solution and facial recognition technology worked together in the investigation of a laptop theft from a library. Normally, the investigator would have needed to travel to the location to obtain the CCTV video, only to find on arrival that the footage might not even be easily playable.
But in this instance, the library was able to upload the footage to the force's NICE Investigate public portal, where it was automatically converted to a playable format. In just over three hours the images were processed and searched, and probable matches were returned, leading to the arrest of a suspect.
Now this was a minor incident, but consider how impactful this combined technology (DEMS and AI) could have been for the Metropolitan Police as part of their response to the riots that took place in London in 2011. In the aftermath, more than 600 officers and staff, along with 100 volunteers, were needed to view and produce evidence packs from over 200,000 hours of CCTV footage recovered from shops, local authorities, public transport and mobile phones. Such was the scale of the task that officers spent 10 hours per day viewing the material.
It is true that some people still have reservations about the use of facial recognition technology, particularly in a live context. Ongoing education and positive examples of the technology applied responsibly will be vital to winning hearts and minds.
An important voice in the debate is the UK’s Surveillance Camera Commissioner, who in a blog post in May 2023 said: “I am convinced that modern facial recognition, and other AI-driven biometric surveillance technologies in the pipeline, are potentially too useful an advance in the fight against crime and terrorism for us to turn our noses up at. And while the many legal issues have yet to be defined let alone tested, some victims and their loved ones will not forgive us for eschewing legitimate tools that could have changed the outcome of events which devastated their worlds.”
While we wait and watch to see how the debate plays out, one thing is clear: the growing ubiquity of digital evidence management as a core policing technology means that forces will be ready to rise to the challenge of embracing new AI technologies like facial recognition for retrospective investigations, when the time comes.