WWW.TECHNOLOGYREVIEW.COM
How a new type of AI is helping police skirt facial recognition bans
Police and federal agencies have found a controversial new way to skirt the growing patchwork of laws that curb how they use facial recognition: an AI model that can track people using attributes like body size, gender, hair color and style, clothing, and accessories.

The tool, called Track and built by the video analytics company Veritone, is used by 400 customers, including state and local police departments and universities all over the US. It is also expanding federally: US attorneys at the Department of Justice began using Track for criminal investigations last August. Veritone’s broader suite of AI tools, which includes bona fide facial recognition, is also used by the Department of Homeland Security—which houses immigration agencies—and the Department of Defense, according to the company.

“The whole vision behind Track in the first place,” says Veritone CEO Ryan Steelberg, was “if we’re not allowed to track people’s faces, how do we assist in trying to potentially identify criminals or malicious behavior or activity?” In addition to tracking individuals where facial recognition isn’t legally allowed, Steelberg says, it allows for tracking when faces are obscured or not visible.

The product has drawn criticism from the American Civil Liberties Union, which—after learning of the tool through MIT Technology Review—said it was the first instance it had seen of a nonbiometric tracking system used at scale in the US. Veritone gave us a demonstration of Track in which it analyzed people in footage from different environments, ranging from the January 6 riots to subway stations. You can use it to find people by specifying body size, gender, hair color and style, shoes, clothing, and various accessories. The tool can then assemble timelines, tracking a person across different locations and video feeds. It can be accessed through Amazon and Microsoft cloud platforms.
In an interview, Steelberg said that the number of attributes Track uses to identify people will continue to grow. When asked if Track differentiates on the basis of skin tone, a company spokesperson said it’s one of the attributes the algorithm uses to tell people apart, but that the software does not currently allow users to search for people by skin color.

Track currently operates only on recorded video, but Steelberg claims the company is less than a year from being able to run it on live video feeds. Agencies using Track can add footage from police body cameras, drones, public videos on YouTube, or so-called citizen upload footage (from Ring cameras or cell phones, for example) in response to police requests. “We like to call this our Jason Bourne app,” Steelberg says. He expects the technology to come under scrutiny in court cases but says, “I hope we’re exonerating people as much as we’re helping police find the bad guys.”

The public sector currently accounts for only 6% of Veritone’s business (most of its clients are media and entertainment companies), but the company says that’s its fastest-growing market, with clients in places including California, Washington, Colorado, New Jersey, and Illinois.

That rapid expansion has started to cause alarm in certain quarters. Jay Stanley, a senior policy analyst at the ACLU, wrote in 2019 that artificial intelligence would someday expedite the tedious task of combing through surveillance footage, enabling automated analysis regardless of whether a crime has occurred. Since then, many police-tech companies have been building video analytics systems that can, for example, detect when a person enters a certain area. However, Stanley says, Track is the first product he’s seen make broad tracking of particular people technologically feasible at scale.

“This is a potentially authoritarian technology,” he says. “One that gives great powers to the police and the government that will make it easier for them, no doubt, to solve certain crimes, but will also make it easier for them to overuse this technology, and to potentially abuse it.”

Chances of such abusive surveillance, Stanley says, are particularly high right now in the federal agencies where Veritone has customers. The Department of Homeland Security said last month that it will monitor the social media activities of immigrants and use evidence it finds there to deny visas and green cards, and Immigration and Customs Enforcement has detained activists following pro-Palestinian statements or appearances at protests.

In an interview, Jon Gacek, general manager of Veritone’s public-sector business, said that Track is a “culling tool” meant to speed up the task of identifying important parts of videos, not a general surveillance tool. Veritone did not specify which groups within the Department of Homeland Security or other federal agencies use Track. The Departments of Defense, Justice, and Homeland Security did not respond to requests for comment.

For police departments, the tool dramatically expands the amount of video that can be used in investigations. Whereas facial recognition requires footage in which faces are clearly visible, Track doesn’t have that limitation. Nathan Wessler, an attorney for the ACLU, says this means police might comb through videos they had no interest in before.

“It creates a categorically new scale and nature of privacy invasion and potential for abuse that was literally not possible any time before in human history,” Wessler says. “You’re now talking about not speeding up what a cop could do, but creating a capability that no cop ever had before.”

Track’s expansion comes as laws limiting the use of facial recognition have spread, sparked by wrongful arrests in which officers have been overly confident in the judgments of algorithms.
Numerous studies have shown that such algorithms are less accurate with nonwhite faces. Laws in Montana and Maine sharply limit when police can use facial recognition—it’s not allowed in real time with live video—while San Francisco and Oakland, California, have near-complete bans on the technology. Track provides an alternative.

Though such laws often reference “biometric data,” Wessler says this phrase is far from clearly defined. It generally refers to immutable characteristics like faces, gait, and fingerprints rather than things that change, like clothing. But certain attributes, such as body size, blur this distinction.

Consider also, Wessler says, someone in winter who frequently wears the same boots, coat, and backpack. “Their profile is going to be the same day after day,” Wessler says. “The potential to track somebody over time based on how they’re moving across a whole bunch of different saved video feeds is pretty equivalent to face recognition.”

In other words, Track might provide a way of following someone that raises many of the same concerns as facial recognition, but isn’t subject to laws restricting the use of facial recognition because it does not technically involve biometric data. Steelberg said there are several ongoing cases that include video evidence from Track, but that he couldn’t name the cases or comment further. So for now, it’s unclear whether the tool is being adopted in jurisdictions where facial recognition is banned.