Dustin is posing as an active shooter armed with an assault rifle. If he thinks he's undetected and looking to prey on the unsuspecting, he'd be completely wrong. "We've tested a couple of different model architectures, and we use that over existing security cameras, using different types of GPUs to be able to digest those video feeds, run analytics over them looking for a weapon, and then send the alert out," said Mike Lahiff, CEO of ZeroEyes. The alert goes out in a flash to law enforcement and administrators, with video of Dustin's movements and location. "Instantly, I would get on my police radio and notify first responders that I have a possible threat on location."
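ZeroEyes has not published its implementation, but the flow Lahiff describes (sample frames from existing camera feeds, run a detection model over them, push an alert with the camera's location) can be sketched roughly as follows. Every name here is hypothetical, and the stub detector stands in for the GPU-backed vision model a real deployment would run.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

# Hypothetical sketch of the pipeline described above: frames from
# existing camera feeds are scored by a detection model, and positive
# hits trigger alerts carrying the camera's location.

@dataclass
class Frame:
    camera_id: str
    location: str
    pixels: bytes  # raw image data in a real system

@dataclass
class Alert:
    camera_id: str
    location: str
    message: str

def run_pipeline(frames: Iterable[Frame],
                 detector: Callable[[Frame], float],
                 threshold: float = 0.9) -> List[Alert]:
    """Score each frame; emit an alert when confidence passes the threshold."""
    alerts = []
    for frame in frames:
        confidence = detector(frame)
        if confidence >= threshold:
            alerts.append(Alert(
                camera_id=frame.camera_id,
                location=frame.location,
                message=f"possible weapon ({confidence:.0%} confidence)",
            ))
    return alerts

# Stub detector: a real system would run a trained vision model on GPUs.
def stub_detector(frame: Frame) -> float:
    return 0.95 if b"rifle" in frame.pixels else 0.05

feeds = [
    Frame("cam-3", "east hallway", b"rifle"),
    Frame("cam-7", "library", b"books"),
]
alerts = run_pipeline(feeds, stub_detector)
for a in alerts:
    print(a.location, "-", a.message)
```

The threshold trades false alarms against missed detections; vendors tune it per deployment rather than using a single fixed value.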
ProPublica is a nonprofit newsroom that investigates abuses of power. Ariella Russcol specializes in drama at the Frank Sinatra School of the Arts in Queens, New York, and the senior's performance on this April afternoon didn't disappoint. While the library is normally the quietest room in the school, her ear-piercing screams sounded more like a horror movie than study hall. But they weren't enough to set off a small microphone in the ceiling that was supposed to detect aggression.
This story was co-published with ProPublica. A few days later, at the Staples Pathways Academy in Westport, Connecticut, junior Sami D'Anna inadvertently triggered the same device with a less spooky sound -- a coughing fit from a lingering chest cold.
A school district in western New York is launching a first-of-its-kind facial recognition system, generating new privacy concerns about the powerful but controversial technology. The Lockport City School District is beginning implementation of the Aegis facial recognition system this week, officials said, with the technology expected to be fully up and running in time for the new school year in September. "Much to our dismay, school shootings continue to occur in our country. In many cases, these shootings involve students connected to the schools where these horrific incidents occur," superintendent Michelle Bradley said in a message to parents. "The Lockport City School District continues to make school security a priority."
San Francisco supervisors approved a ban on police use of facial recognition technology, making San Francisco the first U.S. city with such a restriction. Facial recognition has enrolled in school. On Monday, a New York school district became one of the first in the U.S. to roll out facial recognition technology on campus, using its students' faces as an added layer of security. The system of cameras can also be used to identify guns or flagged persons, such as expelled students and sex offenders, according to the school district. The Lockport City School District will pilot its Aegis system over the summer and will expand the technology to each of its eight schools before classes resume in the fall.
More and more frequently, schools across the United States are turning to artificial intelligence-backed solutions to stop tragic acts of student violence. Companies like Bark Technologies, Gaggle.net, and Securly, Inc., are using a combination of artificial intelligence (AI) and machine learning (ML) along with trained human safety experts to scan student emails, texts, documents, and in some cases, social media activity. They're looking for warning signs of cyberbullying, sexting, drug and alcohol use, and depression, and flagging students who may pose a violent risk not only to themselves but to classmates as well. Any potential problems discovered trigger alerts to school administrators, parents, and law enforcement officials, depending on the severity. Bark ran a pilot of its program with 25 schools in fall 2017.
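None of these vendors have published their models, but the severity-based routing described above can be sketched with simple keyword matching standing in for the AI/ML layer: a flagged message goes to administrators, to parents, or also to law enforcement depending on how severe the matched category is. All category names, phrases, and severity levels below are invented for illustration.

```python
# Hypothetical sketch of severity-based alert routing: scan a message
# for flagged categories, then pick recipients by the most severe match.
# Real systems use trained models plus human review, not keyword lists.

CATEGORIES = {
    # category: (example trigger phrases, severity level)
    "bullying":  (["worthless", "everyone hates you"], 1),
    "self_harm": (["hurt myself"], 2),
    "violence":  (["bring a gun"], 3),
}

RECIPIENTS = {
    1: ["administrators"],
    2: ["administrators", "parents"],
    3: ["administrators", "parents", "law enforcement"],
}

def scan_message(text: str):
    """Return (category, recipients) for the most severe match, else None."""
    hit = None
    lowered = text.lower()
    for category, (phrases, severity) in CATEGORIES.items():
        if any(p in lowered for p in phrases):
            if hit is None or severity > hit[1]:
                hit = (category, severity)
    if hit is None:
        return None
    return hit[0], RECIPIENTS[hit[1]]

result = scan_message("I'm going to bring a gun tomorrow")
print(result)
```

Routing by the most severe match, rather than alerting everyone on every hit, mirrors the tiered escalation the article describes.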
On Feb. 14, 2018, a gunman killed 17 people at Marjory Stoneman Douglas High School in Parkland, Florida. The incident was the deadliest high school shooting in U.S. history, and in the year since, various tech companies across the nation have ramped up efforts to use artificial intelligence to prevent similar tragedies -- and they claim the systems are flagging many violent incidents before they happen. A new story by USA Today details several of the companies offering services that use AI to prevent school shootings. Bark's AI monitors students' text messages, emails, and social media accounts for signs of cyberbullying, drug use, depression, and other possible safety concerns, sending automatic alerts to officials in more than 1,100 school districts when it notes something suspicious.
Kimberly Krawczyk says she would do anything to keep her students safe. But one of the unconventional responses that the local Broward County school district says could stop another tragedy has left her deeply unnerved: an experimental artificial-intelligence system that would surveil her students more closely than ever before. The South Florida school system, one of the largest in the country, said last month it would install a camera-software system called Avigilon that would allow security officials to track students based on their appearance: With one click, a guard could pull up video of everywhere else a student has been recorded on campus. The 145-camera system, which administrators said will be installed around the perimeters of the schools deemed "at highest risk," will also automatically alert a school-monitoring officer when it senses events "that seem out of the ordinary" and people "in places they are not supposed to be." The supercharged surveillance network has raised major questions for some students, parents and teachers, like Krawczyk, who voiced concerns about its accuracy, invasiveness and effectiveness.
Students from Marjory Stoneman Douglas High School walk through the Florida state Capitol in Tallahassee. Schools are increasingly turning to artificial intelligence-backed solutions to stop tragic acts of student violence such as the shooting at Marjory Stoneman Douglas High School in Parkland, Florida, a year ago. Bark Technologies, Gaggle.net, and Securly Inc. are three companies that employ AI and machine learning to scan student emails, texts, documents, and in some cases, social media activity. They look for warning signs of cyberbullying, sexting, drug and alcohol use, and depression, and flag students who may pose a violent risk not only to themselves but to classmates. When potential problems are found, school administrators, parents and -- in the most extreme cases -- law enforcement officials are alerted, depending on the severity.
According to a story in The Japan Times, the school will feed the AI information about 9,000 suspected bullying cases reported by Otsu's elementary and junior high schools between 2012 and 2018. This information will include details on the students involved -- their ages, genders, absenteeism records, and academic achievements -- as well as when and where any bullying incidents took place. "Through an AI theoretical analysis of past data, we will be able to properly respond to cases without just relying on teachers' past experiences," Otsu Mayor Naomi Koshi said, according to The Japan Times. The hope is that the AI will allow school officials to identify the bullying cases that are likely to escalate in seriousness so that they can intervene and defuse the situation before it's too late. "Bullying may start from low-level friction in relationships, but can get worse day by day," an Otsu education board official said, according to The Japan Times.
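Otsu has not described its method in technical detail. One minimal, hypothetical way to use labeled historical cases like these is to estimate how often each feature value has co-occurred with escalation in past cases, then average those rates to score a new case. The records, feature names, and method below are all invented for illustration; a real system would use far richer features and a proper statistical model.

```python
from collections import defaultdict

# Hypothetical sketch: learn per-feature escalation rates from labeled
# past cases, then score a new case by averaging the rates of its
# feature values. The data below is synthetic.

past_cases = [
    # (case features, did the case escalate?)
    ({"absentee": "high", "repeat": "yes"}, True),
    ({"absentee": "high", "repeat": "no"},  True),
    ({"absentee": "low",  "repeat": "yes"}, True),
    ({"absentee": "low",  "repeat": "no"},  False),
    ({"absentee": "low",  "repeat": "no"},  False),
]

def escalation_rates(cases):
    """For each (feature, value) pair, the fraction of past cases that escalated."""
    counts = defaultdict(lambda: [0, 0])  # (feature, value) -> [escalated, total]
    for features, escalated in cases:
        for item in features.items():
            counts[item][0] += int(escalated)
            counts[item][1] += 1
    return {k: esc / total for k, (esc, total) in counts.items()}

def score(case, rates):
    """Average escalation rate across the case's known feature values."""
    vals = [rates[item] for item in case.items() if item in rates]
    return sum(vals) / len(vals)

rates = escalation_rates(past_cases)
risky = score({"absentee": "high", "repeat": "yes"}, rates)
mild = score({"absentee": "low", "repeat": "no"}, rates)
print(f"risky case: {risky:.2f}, mild case: {mild:.2f}")
```

Higher-scoring cases would be surfaced to school officials first, matching the article's stated goal of directing intervention toward cases likely to escalate.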