A recent op-ed in The Hill casts a harsh light on the spread of “digital authoritarianism” in America’s schools. Public schools are increasingly adopting artificial intelligence to monitor students and shape curricula, a trend that could stifle education, invade privacy, and ultimately reshape adult American attitudes toward pervasive surveillance.
Civil rights attorney Clarence Okoh writes that “controversial, data-driven technologies are showing up in public schools nationwide at alarming rates.” These technologies include AI-enabled systems such as “facial recognition, predictive policing, geolocation tracking, student device monitoring and even aerial drones.”
A report compiled by the Center for Democracy & Technology found that over 88 percent of schools use student device monitoring, 33 percent use facial recognition, and 38 percent share student data with law enforcement. These surveillance technologies also enable schools to punish students more frequently.
A study conducted by Johns Hopkins University found that students at high schools with prominent security measures have lower math scores, are less likely to attend college, and are suspended more often than students in schools with less surveillance. The study claims to control for students’ social and economic backgrounds. Okoh highlights a Florida case in which the sheriff’s office has purportedly used a secret predictive policing program against vulnerable schoolchildren. At some point, punishment based on “predicted” behavior could easily become a self-fulfilling prophecy.
A group known as the People Against the Surveillance of Children and Overpolicing (PASCO) has found through litigation and open records requests that the Pasco, Florida, sheriff’s office maintains a secret youth database containing the names of up to 18,000 children each academic year. According to PASCO, the sheriff’s office built this database using an algorithm that assessed confidential student records, everything from grades and attendance to histories of child abuse, to gauge a student’s risk of falling into “a life of crime.”
Many programs directly target minority students. Wisconsin’s dropout prevention algorithm, for example, treats race as a risk factor. Such modeling, combined with surveillance software in schools, could have a disproportionately harsh impact on minority students, leading to higher suspension rates.
State legislators need to drill down into these reports and verify these claims. One avenue to explore is the role of parents in this process. Are they informed of these practices? Have they been sufficiently heard on them? We have sympathy for why some schools would feel the need to resort to such tactics. But if these facts pan out, then artificial intelligence has opened a new front in the war on privacy, one that could shape the entire course of children’s lives.
It could affect adult society as well. An elementary school student may not understand what it means to be surveilled 24/7, and could simply grow accustomed to it over time. This could produce a generation of Americans inured to ever-present monitoring. If digital authoritarianism becomes the norm in school, it will soon become the norm in society.
PPSA looks forward to further developments in this story.