Earlier this year, in the immediate wake of the storming of the United States Capitol, only a handful of arrests were made. Yet in the weeks that followed, more than 400 additional people were charged, and hundreds more are still under investigation.
According to Samuel Hodge Jr., we can thank facial recognition software for that.
This technology is one of the latest examples of how forensics has evolved in the 21st century. But that evolution has brought with it considerable ethical and legal implications.
Hodge, professor of legal studies at the Fox School, studies new forensic technologies and the role that they play in law and court cases. His article, "Big Brother is Watching: Law Enforcement's Use of Digital Technology in the Twenty-First Century," was recently published in the University of Cincinnati Law Review. In it, he outlines many of the technologies now being used by law enforcement, including facial recognition software, automatic license plate readers, street surveillance cameras, police body cameras and drones.
“There are so many legal and ethical issues in play here,” says Hodge, who also teaches a forensics class in Temple’s Beasley School of Law. “It’s a fine line because a lot of these technologies are capable of doing some good, but one of the big issues that we see is an invasion of privacy. We enjoy certain privacy rights, but that line can become blurred with some of the forensic work currently being done.”
We caught up with Hodge to gain his insight into the key takeaways of his new article and the use of forensics.
Q: One of the topics you discuss in your new article is facial recognition software, which has been in the news recently following the storming of the U.S. Capitol. What exactly is it, and how does it work?
A: Essentially, the government has millions of pictures of us. They obtain these images from driver’s licenses, passports, arrest records, social media images and more. The police have also started taking photographs of tattoos, scars and birthmarks, and they have another database for just that. In total, 50% of adults in the U.S. have their pictures warehoused in one or more facial-recognition databases that law enforcement agencies can search.
But this is also where privacy issues arise. We all have certain privacy rights. If you're just walking down the street and law enforcement captures your image via a street camera, is it legal or ethical for that image to then be stored in a database? For that reason, we're seeing more legislation being enacted to limit the technology. Eleven states have now restricted its use, and some cities like San Francisco and Oakland have passed ordinances that prohibit it entirely. However, Washington, D.C., is not one of those cities.
Q: Do you think the use of that software will hold up in court when trying those who stormed the Capitol?
A: It’s tough to say. Historically, facial identification obtained through a software application cannot be used in court as substantive evidence because the technology cannot conclusively match a picture to an identity. There also continue to be accuracy issues, so prosecutors will likely need more than that. The defense will almost assuredly argue the possibility of misidentification.
Q: In your new article, one of the other fascinating things you discuss is automatic license plate readers (ALPR). Can you elaborate on those a bit?
A: This is one of the most widely used law enforcement surveillance tools. The readers can be mounted on police vehicles or on stationary objects like poles, traffic lights and overpasses. They can image about 2,000 license plates per minute and pick up vehicles traveling as fast as 120 miles per hour.
The primary legal issue is that the data is stored on a computer often linked with regional sharing systems, thereby creating vast databases of driver location information. It’s also shared, and there are few restrictions on how the technology can be used. So essentially, this allows anyone with access to the data to snoop into an individual’s daily activities, habits, or present and past relationships. There have been reports where police officers have used it to stalk an ex-girlfriend, so this can be potentially really, really dangerous. It’s an example of where Big Brother is indeed watching us. It’s one of many forensic issues that we need to develop stronger guidelines for because there are just too many concerns.
Q: You had another article recently published in the Richmond Journal of Law and Technology entitled "Police Body Cameras – A Lesson In Objectivity And Accountability Or A Tool Without A Scientific Basis?" We have seen body cameras in the news, most significantly in the wake of the tragic death of George Floyd last spring. From a legal standpoint, what issues does their use present?
A: Several years ago, President Obama started to push for the use of these cameras, and we have seen police departments across the country start to adopt them. The thought was that if the body cameras recorded what happened, the public could see it, and it would make police act more responsibly and limit police brutality. However, it hasn’t necessarily worked that way.
One problem is that many police departments have body-worn cameras but haven’t adopted appropriate guidelines for their use. For one, officers don’t always turn them on: many don’t like being forced to wear body cameras and believe that they should not have to justify their work. In other cases, the police department will review the video and then decide not to release it to the public. So the cameras are essentially moot in those circumstances.
We also have seen instances where police will mistakenly go into the homes of innocent people with the cameras on, and then they end up recording these individuals in private settings. So there is a multitude of issues involved.
This is why it’s so important for there to be standard guidelines and rules when dealing with these types of forensic tools, and that’s something that all law enforcement agencies need to push. On the surface, these body-worn cameras seem like a great tool, but that’s not the case when they’re being improperly employed or, in some cases, not being used at all.