In the wake of all-too-common school shootings, school and district leaders are confronted with decisions about how to prevent, or respond to, violent incidents. Some are turning to facial recognition in schools as a way to track visitors and keep students safe.
Technology is a double-edged sword, and it’s no different when applied to school security. Some argue that advanced emotion-detecting AI technologies and facial recognition in schools infringe on privacy and can’t always identify people correctly or aid in prevention, while others see the technologies as yet another tool to keep students and educators safe.
Schools in Florida’s Broward County plan to use an experimental surveillance system to boost safety and security in a district now known for the Feb. 14, 2018, shooting at Marjory Stoneman Douglas High School, in which 17 people were killed.
The AI-powered system from Avigilon is a combination of cameras and software and can be used to track people based on their appearance, according to news reports.
In July, RealNetworks announced that it would make SAFR for K-12, its AI- and machine learning-based facial recognition solution, available for free to K-12 schools.
The SAFR system uses existing IP-based cameras and readily available hardware to recognize staff, students, and visitors in real time. Company representatives say the system encrypts all facial data and images to ensure privacy, and all facial data and images remain exclusively within the school’s domain as part of, or complementary to, an existing school ID system.
St. Therese Catholic Academy in Seattle is well into its first year of a pilot using SAFR.
Principal Matt DeBoer says the implementation was driven by the idea that a close-knit community can help prevent tragedy. St. Therese is using SAFR to become better acquainted with parent and community volunteers, along with having greater awareness of who is in the school building each day.
“It’s given us access and awareness in a way we didn’t have before, and it’s made the community stronger,” DeBoer says. While SAFR is offered free to schools, St. Therese had to secure funding to update its infrastructure to support the system.
The SAFR system allows for individual designations, and anyone who has not been scanned into the system automatically defaults to a “stranger” status. A disgruntled former employee, or members of a family who left the school on poor terms, can be classified as a “concern” or a “threat” if they’re already in the system.
St. Therese’s system is only used for adults, and while the school’s students are obviously on camera, the SAFR system itself doesn’t recognize them or scan them.
DeBoer says scanning into the system wasn’t a requirement, but no staff members opted out. He also says that while there is research questioning facial recognition’s reliability when it comes to recognizing people of color, the diverse staff at St. Therese have found the system reliable.
Staff members asked questions about how their information would be handled, but they weren’t particularly concerned about misuse of confidential information.
“We have a lot of confidential information we’re responsible for, and this information doesn’t go anywhere outside of St. Therese’s SAFR database. It starts here and ends here,” DeBoer says.
“This increased awareness lets us all breathe a little easier and lets us focus on what we should be helping our students do: helping them learn and be the best people they can be,” DeBoer adds.
Concerns about privacy while using facial recognition in schools
Facial recognition cameras are not without controversy, though, as many groups say the systems are inaccurate and can misidentify minorities.
In June, the ACLU called facial recognition systems “invasive and error-prone technology” in a blog post focusing on the potential impact such a system could have on a school district.
A Washington Post article notes that it isn’t clear how facial recognition in schools could have prevented previous school shootings and violent incidents, but that hasn’t stopped schools from investing in the technology. The article also references an MIT study on the likelihood of various facial recognition systems misidentifying minorities.
The increased focus on using cameras and AI facial recognition for security comes alongside the Federal Commission on School Safety’s resource guide, which offers recommendations in three broad categories.
Prevent: Preventing school violence through various avenues, including character education and creation of a positive school climate; mental health; threat assessment; school discipline; and law enforcement.
Protect and mitigate: Protecting students and teachers and mitigating the effects of violence through actions such as training; tapping into knowledge held by military and law enforcement veterans; and assessing building and campus security.
Respond and recover: Responding to and recovering from attacks, and using training, such as active shooter training, to guide responses.
Many organizations praised the commission’s first steps in bringing awareness to school safety issues, but pointed out that most districts across the U.S. do not have the financial resources or manpower to meet even the basic recommendations in the report.