The Hot New AI Tool in Law Enforcement Is a Workaround for Places Where Facial Recognition Is Banned
At the end of 2024, fifteen US states had laws banning some version of facial recognition.
Usually, these laws were written on the basis that the technology is a nightmare-level privacy invasion that's also too shoddy to be relied upon. Now, a new company aims to solve that problem — though maybe not in the way you'd imagine (or like).
Per a report in MIT Technology Review, a new AI tool called Track is being used not to improve facial recognition technology, nor as a way to make it less invasive of your personal civil liberties, but as a workaround to current laws against facial recognition (which remain few and far between, at least compared to the number of places where the technology is allowed to operate). It's a classic tale of technology as "disruption": identify a legal loophole, then exploit it.
Track is a "nonbiometric" system that emerged out of Veritone, a Skynet-esque company that specializes in video analytics.
According to MIT Technology Review's story, Veritone already has 400 customers using Track in places where facial recognition is banned, or in instances where someone's face is covered. What's more: Last summer, Veritone issued a press release announcing that the US Attorney's office had expanded the remit of its Authorization to Operate, the mandate that gives a company like Veritone the ability to carry out surveillance operations.
Why? Because Track can (supposedly) triangulate people's identities from footage using a series of identifying factors, which include monitored subjects' shoes, clothing, body shape, gender, hair, and various accessories — basically, everything but your face. The footage Track is capable of scanning includes everything from closed-circuit security tapes and body-cams to drone footage, Ring cameras, and crowd/public footage (sourced from the various social media networks where it's been uploaded).
In a view MIT Technology Review obtained of Track in operation, users can select from a dropdown menu listing a series of attributes by which they want to identify subjects: Accessory, Body, Face, Footwear, Gender, Hair, Lower, Upper. Each of those menus has a sub-menu. On "Accessory," the sub-menu lists: Any Bag, Backpack, Box, Briefcase, Glasses, Handbag, Hat, Scarf, Shoulder Bag, and so on. The "Upper" attribute breaks down into Color, Sleeve, Type (of upper-body clothing), and those types break down into more sub-categories.
Once the user selects the attributes they're looking for, Track returns a set of candidate matches — images pulled from the footage being reviewed. From there, it helps users keep narrowing down the footage until they've triangulated their surveillance target's path.
If this sounds like current facial recognition software — in other words, like it's a relatively fallible Orwellian enterprise, bound to waste quite a bit of money, netting all the wrong people along the way — well, the folks at Veritone see it another way.
Veritone's CEO called Track their "Jason Bourne tool," while also praising its ability to exonerate those identified by it. It's an incredibly dark, canny way to get around limitations on law enforcement's ability to use facial recognition tracking systems: simply provide something very much like them that isn't precisely built on biometric data. By slipping through that loophole, Veritone equips police departments and federal law enforcement agencies with the unencumbered opportunity to conduct surveillance that's been legislated against in all but the precise letter of the law. And surveillance, it's worth noting, that might be even more harmful or detrimental than facial recognition itself.
It's entirely possible that people who wear certain kinds of clothing or look a certain way can be caught up by Track. And this is in a world where we already know people have been falsely accused of theft, falsely arrested, or falsely jailed, all thanks to facial recognition technology.
Or as American Civil Liberties Union lawyer Nathan Wessler told MIT Technology Review: "It creates a categorically new scale and nature of privacy invasion and potential for abuse that was literally not possible any time before in human history."
Looks like they're gonna have to find another name for the big map.
More on Facial Recognition: Years After Promising to Stop Facial Recognition Work, Meta Has a Devious New Plan
The post The Hot New AI Tool in Law Enforcement Is a Workaround for Places Where Facial Recognition Is Banned appeared first on Futurism.