Un-Alarmed

AI Tries (and Fails) to Detect Weapons in Schools

Chris McLaughlin, a former vice president with Evolv Technology, tests the company's body scanner at Union Station in Los Angeles on Aug. 16, 2017. Photo: Mike Balsamo/AP

On Halloween day last year, a 17-year-old student walked straight through an artificial intelligence weapons detection system at Proctor High School in Utica, New York. No alert went off.

The 17-year-old then approached a fellow student, pulled a hunting-style knife out of his backpack, and repeatedly stabbed the other student in the hands and back.

The Utica City School District had installed the $4 million weapons detection system across 13 of its schools earlier that summer, mostly with public funds. The scanners, from Massachusetts-based Evolv Technology, look like metal detectors but scan for the “signatures” of “all the guns, all the bombs, and all the large tactical knives” in the world, Evolv’s CEO Peter George has repeatedly claimed.

In Utica, the 17-year-old’s weapon wasn’t the first knife, or gun, to bypass the system. Earlier that month, at a parents’ night, a law enforcement officer had walked through the system twice with his service revolver and was puzzled to find it was never detected. School authorities reached out to Evolv and were subsequently told to increase the sensitivity settings to the highest level.

The detector did finally go off: It identified a 7-year-old student’s lunch box as a bomb. On Halloween, however, it remained silent.

“They’ve tried to backtrack by saying, ‘Oh no, it doesn’t pick up all knives,’” said Brian Nolan, who had been appointed acting superintendent of the Utica City School District 10 days before the stabbing. “They don’t tell you — will it pick up a machete or a Swiss army knife? We’ve got like really nothing back from Evolv.”

Ultimately, the Utica City School District removed the Evolv scanners from its high schools and replaced them, costing the district another $250,000. In the elementary and middle schools, which retained Evolv scanners, three knives have been recovered from students — but not because the scanners picked them up, according to Nolan.

Stories about Evolv systems missing weapons have popped up nationwide. Last month, a knife fight erupted between students at Mifflin High School in Ohio. It’s not clear how the knives entered the building, but it was less than three months after the school district spent $3 million installing Evolv scanners.

As school shootings proliferate across the country — there were 46 school shootings in 2022, more than in any year since at least 1999 — educators are increasingly turning to dodgy vendors who market misleading and ineffective technology. Utica City is one of dozens of school districts nationwide that have spent millions on gun detection technology with little to no track record of preventing or stopping violence.

Evolv’s scanners keep popping up in schools across the country. In a video produced by the Charlotte-Mecklenburg district in North Carolina about its new $16.5 million system, students spoke about how the technology reassured them. “I know that I’m not going to be threatened with any firearms, any knives, any sort of metallic weapon at all,” one said.

“Private companies are preying on school districts’ worst fears and proposing the use of technology that’s not going to work and may cause many more problems than it seeks to solve.”

Over 65 school districts have bought or tested artificial intelligence gun detection from a variety of companies since 2018, spending a total of over $45 million, much of it coming from public coffers, according to an investigation by The Intercept.

“Private companies are preying on school districts’ worst fears and proposing the use of technology that’s not going to work,” said Stefanie Coyle, deputy director of the Education Policy Center at the New York Civil Liberties Union, or NYCLU, “and may cause many more problems than it seeks to solve.”

In December, it came out that Evolv, a publicly traded company since 2021, had doctored the results of its software testing. In 2022, the National Center for Spectator Sports Safety and Security, a government body, completed a confidential report showing that in field tests the scanners had failed to detect knives and a handgun. When Evolv released a public version of the report, according to IPVM, a surveillance industry research publication, and underlying documents reviewed by The Intercept, the failures had been excised from the results. Though Evolv touted the report as “fully independent,” there was no disclosure that the company itself had paid for the research. (Evolv has said the public version of the report had information removed for security reasons.)

Five law firms recently announced investigations of Evolv Technology — a partner of Motorola Solutions whose investors include Bill Gates — looking into possible violations of securities law, including claims that Evolv misrepresented its technology and its capabilities.

“When you start peeling back the onion on what the technology actually does and doesn’t do, it’s much different than the reality these companies present,” said Donald Maye at IPVM. “And that is absolutely the case with Evolv.”

Evolv told The Intercept it would not comment on any specific situations involving its customers and declined to comment further. (Motorola Solutions did not respond to a request for comment.)

The overpromising of artificial intelligence products is an industrywide problem. The Federal Trade Commission recently released a blog post warning companies, “Keep your AI claims in check.” Among the questions was, “Are you exaggerating what your AI product can do?”

An employee of Evolv Technology demonstrates the Evolv Express weapons detection system, which is showing red lights to flag a weapon worn on his hip, on May 25, 2022, in New York.

Photo: Mary Altaffer/AP

Artificial intelligence gun detection vendors advertise themselves as the solution to the mass school shootings that plague the U.S. While various companies employ differing methods, the Evolv machines use cameras and sensors to scan people as they walk by; AI software then compares the sensor readings with object signatures the system has created. When a weapon is present, the system is supposed to recognize the weapon’s signature and sound an alarm.
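In rough terms, the approach described above amounts to signature matching: features extracted from a sensor reading are compared against a library of known weapon signatures, and the system alarms when a match crosses a threshold. The sketch below illustrates only that general idea; the feature vectors, threshold, and function names are hypothetical assumptions, not Evolv's actual implementation. It also shows the failure mode reported in Utica, where a reading that does not resemble any stored signature closely enough produces no alert.

```python
import numpy as np
from typing import Optional

# Hypothetical signature library: feature vectors standing in for the proprietary
# "object signatures" the article describes. Illustration only.
WEAPON_SIGNATURES = {
    "handgun": np.array([0.9, 0.2, 0.7]),
    "tactical_knife": np.array([0.4, 0.8, 0.3]),
}

ALARM_THRESHOLD = 0.85  # raising the "sensitivity" effectively lowers this bar


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def scan(reading: np.ndarray) -> Optional[str]:
    """Return the best-matching weapon label if any signature clears the threshold."""
    best_label, best_score = None, 0.0
    for label, signature in WEAPON_SIGNATURES.items():
        score = cosine_similarity(reading, signature)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= ALARM_THRESHOLD else None


# A reading that matches no stored signature closely enough triggers no alarm.
print(scan(np.array([0.1, 0.5, 0.9])))  # None
```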

At an investor conference in June 2022, Evolv CEO George was asked if the company would have stopped the tragic school shooting in Uvalde, Texas, where 19 students and two teachers were killed. “The answer is when somebody goes through our system and they have a concealed weapon or an open carry weapon, we’re gonna find it, period,” he responded. “We won’t miss it.”

In January, the scanners caught a student trying to enter a high school with a handgun in Guilford, North Carolina. Subsequently, an Evolv spokesperson told WFMY News that their systems had uncovered 100,000 weapons in 2022. In a presentation for investors in the fourth quarter of 2022, George said the detection scanners, on average, stopped 400 guns per day.

There is little peer-reviewed research, however, showing that AI gun detection is effective at preventing shootings. And in the case of Uvalde, the shooter began firing his gun before even entering the school building — and therefore before having passed through a detector.

“The odds of that happening — someone walks in with a displayed gun — are really, really small. It just doesn’t make sense that that’s what you’re investing in.”

“The odds of that happening — someone walks in with a displayed gun — are really, really small,” said Andrew Guthrie Ferguson, a professor of law at American University’s law school and an expert on surveillance. “It just doesn’t make sense that that’s what you’re investing in.”

Even in airports with maximum security protocols, Evolv’s technology has proved to have gaping holes. When an official at Denver International Airport expressed interest in Evolv scanners, he reached out to a colleague at Oakland International Airport, which uses the machines.

“It is not an explosives detection machine per se,” wrote Douglas Mansel, the aviation security manager in Oakland, in an internal email obtained through a public records request and shared with The Intercept. “So if an employee (or law enforcement during a test) walks through with a brick of C4” — an explosive — “in their hands, the Evolv will not alarm.” (The Oakland Airport told The Intercept it does not comment on its security program.)

In a BBC interview in 2020, Evolv said the density of metal is one key indicator of a weapon’s presence. But the company firmly denies that their scanners are akin to metal detectors. “We’re a weapons detector, not a metal detector,” George said on a conference call in June 2021. (A large competitor of Evolv is CEIA, which manufactures metal detectors without AI, used in airports and schools.)

Yet in many cases, Evolv hasn’t picked up weapons. And researchers have also highlighted how metallic objects, such as laptops, repeatedly set the system off. “They go through great lengths to claim they are not a metal detector,” said Maye of IPVM. “To the extent to which AI is being used, it’s open to interpretation to the consumer.”

Despite claims by George that the system can scan up to 1,000 students in 15 minutes, in the Hemet Unified School District in California, false alarms slowed ingress to school buildings. The solution, according to Evolv, was to simply encourage educators to let students proceed.

“They only need to clear the threat(s) and not figure out what alarmed the system,” wrote Amy Ferguson, customer manager at Evolv, in an internal email to the school system obtained through a public records request and shared with The Intercept. “I recommended not doing a loop back unless necessary. … Many students were looping back 2 or 3 times.” (The Hemet Unified School District did not respond to a request for comment.)

Across the country, in Dorchester County Public Schools in Maryland, the system had 250 false alarms for every real hit in the period from September 2021 to June 2022, according to internal records obtained by IPVM. The school district spent $1.4 million on the Evolv software, which it bought from Motorola.

“It plays an important role in our efforts to keep our School District safe,” the district told The Intercept. “And we plan to expand its use within the District.”

Evolv isn’t the only company making bold claims about its sophisticated weapons detection system. ZeroEyes, a Philadelphia-based AI company, states in contracts that “our proactive solution saves lives.” Founded by Navy SEALs in 2018, the firm uses video analytics and object detection to pick up guns.

ZeroEyes’s website lists the timeline for the Sandy Hook shooting, arguing its technology could have materially reduced the response time. When a gun is visible on camera, an alert gets sent to a “24/7/365 ZeroEyes Operations Center Team,” with people monitoring the feed, who in turn confirm the gun and alert the school and police. It claims to do all of this in three to five seconds.

The human team is key to the group’s system, something critics say betrays the weakness of the underlying AI. “This is one of the fundamental challenges these companies have. Like if they could fully automate it reliably, they wouldn’t need to have a human-in-the-loop,” said Maye. “The human-in-the-loop is because AI isn’t good enough to do it itself.”

“We have never suggested that AI alone is enough,” Olga Shmuklyer, spokesperson for ZeroEyes, told The Intercept. “We would never trust AI alone to determine whether a gun threat is real or fake, nor should anybody else.”
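The pipeline ZeroEyes describes works in stages: a detection model flags a frame, a human analyst at the operations center confirms or rejects the flag, and only a confirmed detection triggers an alert to the school and police. A minimal sketch of that kind of human-in-the-loop flow follows; the names, thresholds, and stub logic are assumptions for illustration, not the company's code.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Detection:
    camera_id: str
    confidence: float  # model's confidence that a gun is visible in the frame


def model_detect(frame: dict) -> Optional[Detection]:
    """Stand-in for the object-detection model (hypothetical stub)."""
    if frame.get("gun_like_object"):
        return Detection(camera_id=frame["camera_id"], confidence=frame["score"])
    return None


def human_review(detection: Detection) -> bool:
    """Stand-in for the operations-center analyst who confirms or rejects a flag."""
    return detection.confidence > 0.9  # a person reviews the frame in the real workflow


def handle_frame(frame: dict, notify: Callable[[str], None]) -> None:
    detection = model_detect(frame)
    if detection is None:
        return
    # The human-in-the-loop step: the model's flag alone does not trigger a response.
    if human_review(detection):
        notify(f"Confirmed gun on camera {detection.camera_id}; alerting school and police")


# Example: a flagged frame that a reviewer would confirm.
handle_frame({"camera_id": "hallway-2", "gun_like_object": True, "score": 0.97}, print)
```

Whether the model could ever be trusted without that confirmation step is exactly what Maye and the company dispute above.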

In addition to Philadelphia, the company also has an operations center in Honolulu, Hawaii, “to cater to different time zones.”

ZeroEyes seems determined to overcome its critics and is so far faring well. The company raised $20 million in 2021. According to a LinkedIn post by co-founder Rob Huberty, the team’s mantra is “F*** you, watch me.”

“We are problem solvers, and this is a difficult problem,” said Shmuklyer, the spokesperson. “Without the mentality proposed in that post, we wouldn’t have a solution to offer to school districts around the country.”

During the pandemic, school shootings rose in tandem with a spike in gun violence in general. The sort of panic that ensued can lead to impulsive and ineffective action, according to safety experts.

“We are seeing some school boards and administrators making knee-jerk reactions by purchasing AI weapons detection systems,” said Kenneth Trump, president of National School Safety and Security Services. “Unfortunately, the purchase of the systems appears to be done with little-to-no professional assessment of overall security threats and needs.”

Schools in Colorado and Texas bought weapons detection software from a now-convicted fraudster. Barry Oberholzer developed SWORD in 2018 under the startup X.Labs, registered as Royal Holdings Technologies; he claimed it was the first mobile phone case to provide gun detection software.

“I can identify you and identify if you are carrying a gun in 1.5 seconds,” Oberholzer told WSFA 12 News in Alabama in February 2019. “You don’t even have to click. You just need to point the device at the person.”

Later that year, it was reported that Oberholzer was on the run from over two dozen fraud and forgery charges in South Africa. (Todd Dunphy, a board member of and investor in X.Labs, denied the charges on Oberholzer’s behalf and produced an unverified letter from South African authorities clearing him.)

His SWORD product was endorsed by former high-level U.S. officials.

Former FBI agent James Gagliano, who was listed as an adviser to X.Labs, praised the product as “next generation public safety threat-detection.” Charles Marino, a retired Secret Service special agent, was listed as the company’s national security adviser.

Marino said he invested in the company but has not been involved for years and did not work on the SWORD project. “He swindled everybody,” Marino told The Intercept, referring to the conspiracy conviction. “Look, you kiss a lot of frogs in this world.”

Gagliano said in an email that he severed ties with Oberholzer after hearing of the fraud charges. “I was as stunned as anyone,” he said. “Have had no contact with him since I learned of his indictment in the Summer of 2021. I was excited about the technology he was seeking to introduce to law enforcement.”

In June 2020, X.Labs announced the rebranding of SWORD to X1, a standing device and “full-featured weapons detection system” in partnership with another firm.

Last month, Oberholzer and his business partner Jaromy Pittario pleaded guilty in federal court to conspiracy to defraud investors and creditors. The Department of Justice accused Oberholzer of posing as Gen. David Petraeus, the former CIA director, while pitching the product to venture capital firms.

“Instead of attracting investors honestly, Oberholzer lied continuously to make his company more appealing to investors,” U.S. Attorney for the Southern District of New York Damian Williams said in a statement.

None of it deterred the company. Its scanners, despite problems, remain in schools — and X.Labs continues to cultivate new business. “All of the devices that are purchased by clients are in their possession and can be used as they see fit,” Dunphy said. “The company, like last year, is run by the board and is working with parties to complete the last phase of development for the purpose of slowing down mass shootings globally.”

Oberholzer is no longer involved with X.Labs, said Dunphy, the board member, who responded to emails addressed to Oberholzer.

“Mr Oberholzer is a professional helicopter pilot and his comings and goings has nothing to do with X.labs,” Dunphy said, “as he resigned from the company in February 2021.”

There is a reason districts in New York, such as Utica, have been a target of gun detection vendors. Most of this technology is being funded by taxpayer money and, in the Empire State, there is a lot to spend.

Under Boards of Cooperative Educational Services aid, school purchases are reimbursed at rates based on a district’s poverty level. The Utica City School District, which has a high poverty level, was reimbursed 93 cents on the dollar on the Evolv sale, according to acting superintendent Nolan.

The Boards of Cooperative Educational Services told The Intercept, “As a coalition of the state’s 37 Boards of Cooperative Educational Services, BOCES of NYS has neither authority nor oversight regarding the budgets, purchases, or reimbursement rates of any school district.” The regional Oneida-Herkimer-Madison Counties BOCES office — which covers the Utica school district — did not respond for comment.

While the district gets most of its money back after the disastrous purchase of the Evolv scanners, “New York state taxpayers are still on the hook for the system,” Nolan said.

The Smart Schools Bond Act, passed in 2014, also set aside $2 billion in funding to “finance improved educational technology and infrastructure,” drawing the attention of vendors nationwide.

“Folks in the school security industry got wind that New York State was sitting on this big pot of money that school districts had access to,” said Coyle of the NYCLU. “And that kind of opened the floodgates for companies to try to convince school districts to use that state funding to buy products they don’t need, they don’t know how to use, and are potentially harmful.”

New York isn’t the only state ready to spend a fortune. A 2019 Texas bill allocated $100 million in grants for schools seeking to purchase new equipment.

Federal Covid-19 relief dollars can also be directed to things like school security systems through the Elementary and Secondary School Emergency Relief Fund. Companies including ZeroEyes and similar firms advertise how schools can receive a grant for the “development and implementation of procedures and systems to improve the preparedness and response efforts of a school district.”

“We are targeting sales to all states,” Shmuklyer, of ZeroEyes, said. “A lack of funds should not be the reason why a school cannot be proactive in addressing the mass shooting problem.”

Experts argue schools are just a cheap training ground for technology vendors to test and improve their object detection software so that they can eventually sell it elsewhere.

“Part of the reason why these companies are offering schools the technologies at a relatively cheap price point is that they’re using the schools as their grounds for training,” said Ferguson, the American University professor. “And so those schools or students become data points in a large data set that’s actually improving the technology so they can sell it to other people in other places.”

“They keep saying how the artificial intelligence system they use gets refined after more usage, because they collect more data, more information. But what’s it going to take, 20 years?”

Acting superintendent Nolan himself was told by Evolv the system would get smarter over time with more use. “They keep saying how the artificial intelligence system they use gets refined after more usage, because they collect more data, more information,” he said. “But what’s it going to take, 20 years?”

The lack of regulation leads to a lack of transparency on the use of the data itself. “There’s no protections in place,” said Daniel Schwarz, privacy and technology strategist at the NYCLU. “And it raises all these issues around what happens with the data. … Oftentimes, what we’ve caught out is that they actually worsen racial disparities and biases.”

ShotSpotter (renamed SoundThinking) equipment overlooks the intersection of South Stony Island Avenue and East 63rd Street in Chicago on Aug. 10, 2021.

Photo: Charles Rex Arbogast/AP

Additionally, ShotSpotter — now renamed SoundThinking — a system of microphones that claims to use “sensors, algorithms and artificial intelligence” to detect the sound of gunfire, has received intense criticism for being overwhelmingly deployed in communities of color. The systems’ frequent false alarms have led to more aggressive policing, as well as the distortion of gunfire statistics.

An analysis by the MacArthur Justice Center found that 89 percent of ShotSpotter alerts in Chicago from 2019 to 2021 turned up no gun-related crime. “Every unfounded ShotSpotter deployment creates an extremely dangerous situation for residents in the area,” according to the report.

There has been extensive reporting on police departments and other agencies’ use of ShotSpotter nationwide — but not schools. Public records show Brockton Public Schools, in Massachusetts, for instance, bought access to the technology for three years in a row. The school system said in a statement that the public document showing its purchase of ShotSpotter was in error and referred instead to a purchase by the police department; the school spokesperson said Brockton schools received a separate donation of ShotSpotter, but never activated it. (The school system did not say who donated the system, and the police department did not respond to a request for comment.)

“Contrary to claims that the ShotSpotter product leads to over-policing, ShotSpotter alerts allow police to investigate a gunfire incident in a more precise area,” Sara Lattman, a SoundThinking spokesperson, said in a statement to The Intercept. “Additionally, ShotSpotter has maintained a low false positive rate, just 0.5%, across all customers in the last three years.”

For many advocates against gun violence, particularly in schools, gun control measures like an assault weapons ban would go a long way in curtailing the deadly effects of attacks. With Congress failing to enact such policies, experts argue that schools should refrain from turning to shoddy technology to support their students.

“We advise schools to focus on human factors: people, policies, procedures, planning, training, and communications,” said Trump, the National School Safety and Security Services head. “Avoid security theater.”

Vendors, though, continue to emphasize the risk of gun violence and rely on the steady drumbeat of attacks to generate fear in potential clients — and to make sales.

“While recent high visibility attacks at publicly and privately-owned venues and schools have increased market awareness of mass shootings,” said Evolv’s recent annual disclosure report, “if such attacks were to decline or enterprises or governments perceived the general level of attacks has declined, our ability to attract new customers and expand our sales to existing customers could be materially and adversely affected.”

The company even helps schools market the technology to their own communities. In an email from Evolv to the Charlotte-Mecklenburg school district, a bulleted list of talking points makes suggestions for how the school system might respond to public queries about the scanners. One of the talking points said, “Security approaches included multiple layers,” adding that “this approach recognizes the reality that no single layer or single technology is 100% effective.”

When reached for comment by The Intercept, Eddie Perez, a spokesperson for the Charlotte-Mecklenburg school district, quoted the talking point verbatim in an emailed response.

That hedged view is out of step with how people in the district itself speak about the system: as an absolute assurance of gun-free safety. Students in the video produced by the school district said, “You get a certain reassurance that there are no dangerous weapons on campus.”

Correction: May 11, 2023
This story has been updated to use the correct spelling of ZeroEyes spokesperson Olga Shmuklyer’s name. It has also been updated to reflect a clarification received after publication from Brockton Public Schools in Massachusetts that the ShotSpotter system donated to the schools was not received from the police.
