Memorials outside Marjory Stoneman Douglas High School in Parkland, Fla., in February 2018 after a shooting on campus left 17 people dead. (Matt McClain/The Washington Post)

Kimberly Krawczyk says she would do anything to keep her students safe. A year ago Thursday, the Parkland, Fla., high school math teacher barricaded students behind her classroom door during one of the deadliest mass shootings in U.S. history.

But one of the unconventional responses that Broward County Public Schools said could stop another tragedy has left her deeply unnerved: an experimental artificial-intelligence system that would surveil her students closer than ever before.

The school system in South Florida, one of the largest in the country, said last month it would install a camera-software combination called Avigilon that would allow security officials to track students based on their appearance. With one click, a guard could pull up video of everywhere else a student has been recorded on campus.

The 145-camera system, which administrators said will be installed around the perimeters of the schools deemed “at highest risk,” will also automatically alert a school-monitoring officer when it senses events “that seem out of the ordinary” and people “in places they are not supposed to be.”

The supercharged surveillance network has raised major questions for some students, parents and teachers, such as Krawczyk, who voiced concerns about its accuracy, invasiveness and effectiveness. Her biggest doubt: that the technology could ever understand a school campus like a human can.

“How is this computer going to make a decision on what’s the right and wrong thing in a school with over 3,000 kids?” said Krawczyk, a 15-year teacher who was on the third floor of what’s known as the freshman building at Marjory Stoneman Douglas High School when the shooting began. “We have cameras now every two feet, but you can’t get a machine to do everything a human can do. You can’t automate the school. What are we turning these schools into?”

The specter of student violence is pushing school leaders across the country to turn their campuses into surveillance testing grounds, in the hope it will help them detect dangerous people they'd otherwise miss. The supporters and designers of Avigilon, the AI service bought for $1 billion last year by tech giant Motorola Solutions, say its security algorithms could spot risky behavior with superhuman speed and precision, potentially preventing another attack.

But the advanced monitoring technologies mean the daily lives of American schoolchildren will be subjected to close scrutiny by systems that automatically flag certain students as suspicious, potentially spurring a response from security or police forces, based on the work of algorithms hidden from public view.

The camera software has no proven track record for preventing school violence, some technology and civil liberties experts argue. And the testing of their algorithms for bias and accuracy — how confident the systems are in identifying possible threats — has largely been conducted by the companies themselves.

If the Avigilon contract wins final approval from county leaders in the coming weeks, the school district will spend more than $600,000 in federal and local funds activating the AI-powered system around the high school campuses “with the highest security incidents,” contracting records show. The camera system will run independently alongside another 10,000 cameras already recording across the county’s schools.

Many aspects of the program, however, remain a mystery, and it’s unclear how exactly the surveillance system’s data and performance will be regulated, measured or tested for potential flaws. The school district rejected a Washington Post request to see records relating to the project, including officials’ communications with the company, citing a broad Florida statute exempting any information related to surveillance systems from public-records law.

Avigilon’s technology is no panacea. Its “appearance search,” a two-year-old feature that would allow a school official to find or highlight people based on what they’re wearing, has an accuracy rate that varies widely depending on factors such as lighting and time of year, said Mahesh Saptharishi, the chief technology officer at Motorola Solutions. The system would be less accurate, for instance, in wintertime, when students arrive at school in heavy coats.

Its “unusual motion detection” feature is advertised by the company as a way to automatically sense when students are running toward a brawl or away from an attack. But some students wondered just how much the computer could comprehend about the chaos of a typical high school, where frenzied movements and sudden gatherings are an everyday event. One teacher asked whether the system would know the difference between a boyfriend and girlfriend kissing each other and two people about to start a fight.

Saptharishi said the technology is a tool for security staff, not the final decision-maker itself, and that its performance in Broward schools and other early adopters could help further refine the results.

“Today, I don’t know of any quantitative results that clearly show these tools are bar-none effective or bar-none ineffective,” Saptharishi said. But he said the company has researched the systems closely and continues to train them, including with data taken from some participating schools. “We believe they have a net positive human impact,” he said.

No school-security measure has grown more than the use of surveillance cameras, according to survey data from the National Center for Education Statistics, expanding from nearly 20 percent of all public schools in 1999, the year of the Columbine High School shooting in Colorado, to more than 80 percent in 2015.

But it’s unclear what effect the cameras have had on mass violence. The number of school shootings every year has remained flat or grown slightly over that period; there were 25 shootings last year, in what was the worst year for mass school violence in at least two decades, a Post analysis found.

Avigilon’s technology does not use facial-recognition software, which directly matches a person’s identity to images in a database. Schools and community centers across the country are installing such facial-recognition software in hopes of flagging or blocking entry to unauthorized visitors.

But in some ways, Avigilon is more powerful, because its “appearance search” capability allows people to be tracked as long as their body is visible to the camera — no facial identification required.

The artificial-intelligence software has been trained on millions of images to comprehend the basic look and movement of people, and its builders say it can now recognize students from afar by their appearance — taking into account the shape of their body, their hairstyle, their facial attributes and the look and color of their clothes.

It then uses those processed images to search through a vast array of other camera footage for other places that person appeared, building a precise timeline of their movement within seconds, a feat that’s nearly impossible for a human alone.
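At a high level, systems like this typically work by reducing each camera detection to a numerical feature vector (an "embedding") and then ranking stored detections by similarity to a query. The sketch below illustrates that general idea only; the class names, threshold, and similarity measure are illustrative assumptions, not Avigilon's actual design.

```python
# Hypothetical sketch of an embedding-based "appearance search."
# Each detection carries a feature vector describing a person's look;
# detections similar to a query vector are assembled into a timeline.
from dataclasses import dataclass
from math import sqrt

@dataclass
class Detection:
    camera_id: str
    timestamp: float        # seconds since midnight, for simplicity
    embedding: list[float]  # appearance feature vector (assumed given)

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def appearance_search(query: list[float], detections: list[Detection],
                      threshold: float = 0.9) -> list[Detection]:
    """Return detections resembling the query, ordered into a timeline."""
    matches = [d for d in detections
               if cosine_similarity(query, d.embedding) >= threshold]
    return sorted(matches, key=lambda d: d.timestamp)

# Toy example: three stored detections, two of which resemble the query.
store = [
    Detection("cam_hall_2", 28_800, [0.9, 0.1, 0.4]),
    Detection("cam_gym_1", 30_000, [0.1, 0.9, 0.2]),
    Detection("cam_yard_3", 31_500, [0.88, 0.12, 0.41]),
]
timeline = appearance_search([0.9, 0.1, 0.4], store)
print([d.camera_id for d in timeline])  # cameras where a similar person appeared
```

Real systems would compute the embeddings with a trained neural network and search millions of detections with an indexed vector store, but the ranking-by-similarity step is conceptually the same.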

The company markets the technology as helping to revolutionize public surveillance, transforming video from static evidence reviewed after a crime into a tool that can think and react in the moment. The systems are advertised as hyper-observant, constantly watching and invulnerable to distraction, a necessity for scanning thousands of video streams 24 hours a day.

Schools “today are monitored by someone sitting in a communications center, looking at a video wall, when the attention span of the average human looking at a single camera and being able to detect events that are useful is about 20 minutes,” Saptharishi said. But “when something bad is happening … you need to be able in a matter of seconds to figure out where that person is right now.”

Motorola, which also makes body cameras and two-way radios, said when it bought Avigilon last year that its AI-powered surveillance systems could help capture a booming market for the military and police.

Avigilon representatives would not say how many schools they’re working in now, but an online list of clients includes hospitals, stadiums, restaurants and schools across Georgia, Missouri and Tennessee. The company said that education is one of Avigilon’s top markets worldwide.

Elizabeth Laird, a former education official in Louisiana and the District of Columbia and a current senior fellow at the Center for Democracy & Technology, a think tank, said systems such as Avigilon have faced little public testing for their validity or long-term impact. As they multiply across campuses, she fears they could cast a chilling effect over a place where students are taught to think independently, express themselves and learn from their mistakes.

School officials, she added, often lack the experience or know-how to understand all the data these systems can gather — and the potential pitfalls if they get something wrong. Administrators pressured to do something, anything, to increase school security may regard this kind of technology as a cure-all, even when its implications aren’t entirely understood, she said.

“We’re seeing that the uses of AI and technology like this are coming with unintended consequences, things the education sector has not experienced before, that may endanger the students it intends to protect,” she said. Students could be mischaracterized as dangerous based on how they were dressed or where they were walking, she said. And security officials could be overwhelmed with false alarms, making it harder for them to focus on actual threats.

Michael Dorn, the executive director of Safe Havens International, a consulting firm that Broward County Public Schools hired to assess security, said video-analysis systems like Avigilon are becoming increasingly popular among schools seeking to maximize their surveillance capabilities, and security staff use them to watch for students hopping fences, climbing onto roofs or loitering around campus.

But that capacity for widespread monitoring has some students concerned that it could be overused to police students’ time, movement and activity.

“My fear is this will become targeted,” said Kenneth Preston, a Broward high school senior who has criticized the district’s spending. “Maybe Johnny isn’t performing exceedingly well, so let’s track him to see why. And you don’t even have to sit by those cameras to watch him. It’s a system that can be abused and will be abused.”

The Feb. 14, 2018, shooting in Parkland that left 17 students and school staff dead highlighted devastating flaws in local emergency response. A report commissioned by the state and released last month found that security-system failures had badly hamstrung law-enforcement and medical officials’ shooting reaction and rescue attempts. Deputies did not have real-time video access, school officials weren’t trained to play back footage, and first responders mistakenly believed they were watching live video of the shooter when it was on a 20-minute delay.

The report criticized a “significant misunderstanding and overapplication of several privacy laws” governing student health and education records that it said prevented school officials from more quickly assessing the potential threat. It also urged Florida lawmakers to consider changes to school-privacy laws and invest more thoroughly in surveillance and information-sharing technologies that could potentially fortify the schools against assault.

Lori Alhadeff, who was elected to the Broward school board after her 14-year-old daughter, Alyssa, was killed during the shooting, said that the school board has not been given a demonstration of Avigilon’s technology but that she believes the system is worth deploying on any campus where its monitoring capabilities could mean the difference between life and death.

“Parents are in the mind-set now, with all these different school shootings, that they want to send their kids to school and know they’ll come back home alive,” Alhadeff said. They’ll do “whatever it takes to make that happen.”

The desire for technological solutions to the human problem of school violence has emboldened a flood of tech and surveillance start-ups competing for public customers. Schools and other education-related buyers are the fifth-biggest market for surveillance systems across the world but the top market in the United States, with $2.7 billion in revenue in 2017, said Jon Cropley, a senior analyst at the market research firm IHS Markit.

Surveillance-camera algorithms are also increasingly being deployed in unconventional and sometimes unproven ways. The start-up Athena Security, for instance, offers a surveillance-camera software that it says can automatically detect when someone pulls out a gun or knife. The system, trained on a data set of weapon images, is active in a Pennsylvania high school.

Company chief Lisa Falzone said that the system has undergone no independent testing but that internal trials have shown it can be 99 percent accurate. Critics of similar technologies, however, have worried that an inaccuracy — alerting when someone pulls out a cellphone instead of a gun — could have fatal consequences.

Dorn, the school-security consultant, sees video-analyzing software as an increasingly necessary component of all school surveillance, but added that it has not been proved to stop mass violence and should be regarded as one tool among many.

There is still no replacement, he said, for the human touch: better training for teachers and deeper relationships between students and staff. But he said he worries that schools’ growing interest in advanced surveillance techniques could lead to students and parents being deceived about the limits of what the systems can do.

“I’ve been doing this for 36 years, and I’ve never seen as much bad stuff out there in terms of training concepts and gadgets — unsound stuff, where they’re just promising stuff folks want to hear,” Dorn said. “There’s a lot of fear, a lot of anxiety and a lot of money to be made.”