Surveillance – 社区黑料, America's Education News Source

Amazon-owned Ring and Flock Broke Up. Privacy Experts Ask: Should Schools, Too? /article/the-worlds-biggest-e-commerce-co-split-with-flock-should-schools-do-the-same/ Sat, 21 Feb 2026 11:30:00 +0000

School (in)Security is our biweekly briefing on the latest school safety news, vetted by Mark Keierleber. Subscribe here.

Milo went missing. 

Yet it wasn't the lost puppy that gave people the jitters; it was the promise behind the story: that a communitywide web of home security systems could transform a neighborhood into a "Search Party."

Eamonn Fitzmaurice/社区黑料 (Source: Ryan Murphy/Getty Images)

The Super Bowl commercial set off public backlash against two leading surveillance companies: Amazon, which owns Ring doorbell cameras, and Flock Safety, which makes license plate reader cameras. Within days, the e-commerce giant announced it was ditching a planned partnership with Atlanta-based Flock.

Privacy advocates said the breakup represented a rare, high-profile retreat from the expansion of surveillance-driven policing, and that school leaders should take note.

"The fact that Amazon is reconsidering their relationship with Flock should be a very large and glaring sign that schools should also perhaps reconsider that relationship," said Kristin Woelfel, policy counsel for equity in civic technology at the nonprofit Center for Democracy and Technology.

In an investigation last week, 社区黑料 revealed that police nationwide routinely tapped into school district Flock cameras to assist President Donald Trump's mass immigration crackdown, which has also led to public outcry and protest over the U.S. Department of Homeland Security's unprecedented surveillance tactics.

You can also hear me discuss my latest reporting on San Francisco's KALW public radio.


In the news

The latest in Trump's immigration crackdown: A Georgia elementary school teacher was killed this week while driving to work when a man being chased by federal immigration agents rammed into her vehicle. | 

  • Conservative advocacy group Defending Education has built a database of some 700 school districts nationally that have adopted policies restricting federal immigration agents’ access to campuses. | 
  • U.S. Department of Homeland Security spokeswoman Tricia McLaughlin, who repeatedly denied that federal agents were targeting schools, is stepping down. | 
Meta CEO Mark Zuckerberg leaves Los Angeles Superior Court this week. (Photo by Wally Skalij/Getty Images)

Instagram and other Meta-owned social media apps have navigated youth safety "in a reasonable way," company CEO Mark Zuckerberg testified Wednesday in a courtroom filled with parents who have accused the company and other tech giants of hooking their children on the platforms and decimating their mental health. | 

'Worried that I was going to die': Georgia high schoolers opened up this week about the horrors of getting shot during the 2024 Apalachee High School shooting that led to the deaths of two teachers and two students. Students' testimonies came during a criminal trial accusing the alleged shooter's father of recklessness and failure to prevent the tragedy. | 

Sign up for the School (in)Security newsletter.

Get the most critical news and information about students' rights, safety and well-being delivered straight to your inbox.

Should schools call child protective services on students who are chronically absent? Debate has ensued. | 

  • A Georgia father has been arrested on allegations that each of his two sons has missed nearly 400 days of school. One is an elementary school student, while the other is in middle school. | 

In a significant departure from past years, the Education Department's civil rights division didn't close any sexual harassment and assault cases involving K-12 schools in 2025, after the Trump administration slashed the agency and purged its caseload. | 


ICYMI @The74


Emotional Support

社区黑料 is proud to announce we've hired Simon and Max, who joined reporter Lauren Wagner a few weeks ago at our growing Nebraska bureau.

]]>
Amazon's Ring Cuts Ties with Surveillance Camera Co. Used by ICE. Will Schools? /article/amazons-ring-cuts-ties-with-surveillance-camera-co-used-by-ice-will-schools/ Fri, 20 Feb 2026 11:30:00 +0000 Updated Feb. 24, clarification appended Feb. 20

Milo went missing. 

Yet it wasn't the lost puppy that gave people the jitters; it was the promise behind the story: that a communitywide web of home security systems could transform a neighborhood into a "Search Party."

The Super Bowl commercial set off public backlash against two leading surveillance companies: Amazon, which owns Ring doorbell cameras, and Flock Safety, which makes license plate reader cameras. Within days, the e-commerce giant announced it was ditching a planned partnership with Atlanta-based Flock.

Privacy advocates said the breakup represented a rare, high-profile retreat from the expansion of surveillance-driven policing, and that school leaders should take note.


Get stories like this delivered straight to your inbox. Sign up for 社区黑料 Newsletter


"The fact that Amazon is reconsidering their relationship with Flock should be a very large and glaring sign that schools should also perhaps reconsider that relationship," said Kristin Woelfel, policy counsel for equity in civic technology at the nonprofit Center for Democracy and Technology.

In an investigation last week, 社区黑料 revealed that police nationwide routinely tapped into school district Flock cameras to assist President Donald Trump's mass immigration crackdown, which has also led to public outcry and protest over the U.S. Department of Homeland Security's unprecedented surveillance tactics.

Ring's planned integration with Flock Safety would have allowed homeowners to share their camera feeds with the police. The company said the collaboration was never launched but that it still plans to roll out "Search Party" to homeowners, first for "finding dogs."

In statements, the two companies described the , with Ring saying it

Some 100 school districts across the country have contracted with Flock, according to government procurement records. Their cameras are designed to capture license plate numbers, timestamps and other identifying details, which are uploaded to a cloud server. Flock customers, including schools, can decide whether to share their information with other police agencies in the company's national network.

Typical Flock automated license plate reader, mounted to a pole and powered by a solar panel (Wikipedia, CC)

Woelfel's warning lands amid mounting scrutiny of automated license plate readers and their use by federal immigration agents to track down targets. Flock audit logs obtained by 社区黑料, along with interviews, reveal local police departments nationwide are searching school district-run surveillance networks to aid DHS in immigration enforcement cases.

The logs were from Texas school districts that contract with Flock and showed that law enforcement agencies far beyond their borders, including in Florida, Georgia, Indiana and Tennessee, routinely conducted searches on the districts' campus feeds, tagging reasons such as "Immigration (criminal)" and "Immigration (civil/administrative)." Multiple law enforcement officials acknowledged the searches were done at the request of federal immigration agents, with one saying the local assist was given without hesitation.

Ring spokesperson Emma Daniels said the company doesn't contract with school districts directly. The company's "terminated integration with Flock" is specific to a tool that allows local police "to request video footage from Ring users in a specific area during a defined time period" to help in investigations related to "a car theft, a burglary or other local safety concerns."

Flock spokesperson Holly Beilin said her company was not involved in the "Search Party" feature promoted in the Super Bowl ad and its planned Ring collaboration "had nothing to do with any of our school customers." Those customers rely on the automated license plate readers to navigate parent custody logistics and in parking lots where "most incidents of violence at schools take place." In December, districts turned to the cameras to investigate a rash of car break-ins in school parking lots.

Immigration and Customs Enforcement agents have targeted immigrant families during school pick-up and drop-off.

Beilin said she didn't know how frequently school-owned Flock networks were being queried on behalf of ICE, but said the company had rolled out a setting that allows customers to disable immigration-related searches on their devices.

Kristin Woelfel

"If school district police, or, frankly, any police, decides that that is against their policy, they can turn that search filter on," Beilin told 社区黑料. "So any of those searches would be filtered out."
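The opt-out Beilin describes amounts to filtering incoming network-wide queries by their stated reason before they touch a customer's cameras. A minimal sketch of that logic, with all names and structures hypothetical since Flock's actual implementation is not public:

```python
# Hypothetical sketch of the reason-based search filter Beilin describes.
# All names and structures are illustrative, not Flock's real API.

BLOCKED_REASONS = {
    "Immigration (civil/administrative)",
    "Immigration (criminal)",
}

def allow_search(owner_settings: dict, search_reason: str) -> bool:
    """Return True if a network-wide query may include this owner's cameras."""
    if owner_settings.get("block_immigration_searches"):
        return search_reason not in BLOCKED_REASONS
    return True

# A district that enables the filter is excluded from immigration queries:
district = {"block_immigration_searches": True}
print(allow_search(district, "Immigration (criminal)"))  # False
print(allow_search(district, "Stolen vehicle"))          # True
```

Note that in this model the filter is off by default; a district that never touches the setting keeps answering immigration-tagged queries.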

There is no evidence from 社区黑料's analysis that the Texas school districts use the devices for their own immigration-related investigations, but the audit logs raise questions about how broadly school safety data are being fed into the far-reaching surveillance tool.

That school Flock cameras are being accessed by out-of-state police officers for immigration enforcement is "a really serious privacy issue for children and families," Woelfel said.

"You have to think about what effect it's ultimately going to have on the community," she continued. "Even in places without Flock cameras, people are afraid to drop their kids off at school," because of heightened immigration enforcement and the Trump administration's policy change that lifted longstanding restrictions against immigration enforcement in or around schools and other "sensitive locations."

Amazon-owned home security company Ring ended a partnership with surveillance vendor Flock Safety after a Super Bowl commercial led to public backlash. (Photo by Joe Raedle/Getty Images)

'Can't believe we have that here'

For 16-year-old Zachary Schwartz, a high schooler from San Francisco, backlash to the Ring ad validated something he's been telling people for months: Flock's presence in communities nationwide has grown far too vast, and most Americans don't even realize it.

"You hear about tracking systems in other countries, like China, which are more authoritarian," Schwartz said. "And it's like, 'Whoa, I can't believe we have that here.'"

Schwartz said he fell down the Flock rabbit hole after watching a video about the cameras, which sent him digging into their widespread use in his own city. He learned the San Francisco Police Department shared its feeds with law enforcement officers nationwide, including for immigration enforcement. Activists have also elevated concerns about weak cybersecurity safeguards and faulty findings.

Schwartz built a website to drive attention to Flock's presence. He also circulated posters across San Francisco urging residents to learn about the cameras constantly watching them.

"If you're driving on a major roadway, you're being tracked in the city," Schwartz said. "It would be pretty hard to avoid it while going to school if you're going by car or by a bus."

San Francisco high schooler Zachary Schwartz hung up posters across the city alerting residents to Flock Safety automated license plate reader cameras. (Courtesy Zachary Schwartz)

社区黑料 reached out to 30 districts to learn more about how they use Flock and whether they've assessed how their data are shared. Few responded, and almost all declined to comment. Several, including Indiana's Center Grove Community School Corporation, said they ended their contracts with Flock without providing details about why.

One district that did respond was Minnetonka Public Schools, 12 miles southwest of Minneapolis, where the Trump administration鈥檚 mass deployment of immigration agents last month resulted in the fatal shootings of two citizens, closed Minneapolis Public Schools for two days and forced multiple districts in the Twin Cities area to offer remote learning for students too afraid to come to school.

District spokesperson JacQueline Getty said Minnetonka school officials use Flock license plate readers primarily to ensure people who have been banned from campus don't trespass on school property. She didn't elaborate on whether district Flock data are shared directly with outside law enforcement agencies or whether their data have been leveraged to assist federal immigration agents.

"We cooperate with our local law enforcement department when there is a need to do so, such as if our reader pings a stolen vehicle entering our lot," Getty said in an email. "Our primary goal is campus safety, and the district has benefited from identifying people who should not be on district property."

At Indiana University in Bloomington, students rallied in a January protest criticizing the city's use of Flock license plate readers. At the University of Wisconsin-Madison, the campus said it "uses a limited number" of Flock cameras for campus safety but has "enabled specific settings within our system to prevent searches related to immigration enforcement."

'The future that we really want?'

The controversy comes on the heels of Flock's efforts to expand into school security. Security vendor Raptor Technologies announced last year an initiative to integrate Flock cameras into a product designed to enhance safety during afternoon dismissal.

Raptor Technologies, which counts roughly 40% of U.S. school districts as its customers, offers software that screens school visitors.

"By working with both schools and local law enforcement, Flock helps create safe corridors for student travel, whether that's monitoring activity along walking routes, at bus stops or on nearby roads," Flock said in a statement.

In 2024, Raptor suffered a cybersecurity lapse that exposed millions of sensitive records, including districts' active-shooter plans and students' medical records, to the internet.

“Raptor Technologies does not share, sell or disclose any data collected on our platform with third parties or government agencies,” a company spokesperson said in a statement after this article was published.

“We do not provide access to our systems or customer records other than as directed by customers or pursuant to a valid government order,” according to the statement. Although Raptor tools integrate with other companies’ security offerings, the spokesperson said it is up to districts to “determine what data, if any, is shared, the scope of what is shared and whether an integration is enabled.”

Schwartz, the San Francisco high schooler, said students learn about mass surveillance at school by reading books like George Orwell's classic 1984. Yet when government overreach "happens right in front of us," he said, "many people don't see it."

In a place where Bay Area technology companies routinely roll out their latest wares, people are starting to wake up, he said. 

"It also means that we see the future before it happens sometimes," Schwartz said, "and we can decide, 'Oh, is this the future that we really want?'"

Clarification: Flock's license plate reader cameras were not part of the company's since-cancelled integration with Ring. The subhead on this story has been updated to make that distinction clearer.

]]>
ICE Taps into School Security Cameras to Aid Trump's Immigration Crackdown /article/ice-taps-into-school-security-cameras-to-aid-trumps-immigration-crackdown-74-investigation-shows/ Tue, 10 Feb 2026 11:30:00 +0000 This story was co-published with

Police departments across the U.S. are quietly leveraging school district security cameras to assist President Donald Trump’s mass immigration enforcement campaign, an investigation by 社区黑料 reveals. 

Hundreds of thousands of audit logs show police are searching a national database of automated license plate reader data, including from school cameras, for immigration-related investigations.

The audit logs originate from Texas school districts that contract with Flock Safety, an Atlanta-based company that manufactures artificial intelligence-powered license plate readers and other surveillance technology. Flock's cameras are designed to capture license plate numbers, timestamps and other identifying details, which are uploaded to a cloud server. Flock customers, including schools, can decide whether to share their information with other police agencies in the company's national network.

Multiple law enforcement leaders acknowledged they conducted the searches in the audit logs to help the U.S. Department of Homeland Security enforce federal immigration laws, with one saying the local assist was given without hesitation. The Trump administration's aggressive DHS crackdown has had a significant impact on schools.

Educators, parents and students have been swept up, with immigrant families being targeted during school pick-up and drop-off. School parking lots are one place the cameras at the center of these searches can be found, along with other locations in the wider community, such as mounted on utility poles at intersections or along busy commercial streets.

The data raises questions about the degree to which campus surveillance technology intended for student safety is being repurposed to support immigration enforcement, whether school districts understand how broadly their data is being shared with federal agents and if meaningful guardrails exist to prevent misuse. 

"This just really underscores how far-reaching these systems can be," said Phil Neff, research coordinator at the University of Washington Center for Human Rights. Out-of-state law enforcement agencies conducting searches that are unrelated to campus safety but include school district security cameras "really strains any sense of the appropriate use of this technology."

Flock devices have been installed by more than 100 public school systems nationally, government procurement records show, and audit logs from six Texas school districts show campus camera feeds are captured in a national database that police agencies across the country can access. School district Flock cameras are queried far more often by out-of-state police officers than by the districts themselves, according to the records.

School police officers use Flock cameras to investigate "road rage," "speeding on campus," "vandalism" and "criminal mischief," records show. There is no evidence school districts themselves use the devices for immigration-related purposes, or that they're aware other agencies do so.

Typical Flock automated license plate reader, mounted to a pole and powered by a solar panel (Wikipedia, CC)

Previous reporting revealed that police agencies nationwide were tapping into Flock camera feeds to help federal immigration officials track targets. In some cases, local law enforcement agencies enabled direct sharing of their networks with U.S. Border Patrol.

Immigration officials' unprecedented use of surveillance tactics to carry out their controversial mission has drawn public outcry. That school district cameras are part of that dragnet has not been previously reported.

Randi Weingarten, president of the American Federation of Teachers, called the revelation an "egregious end run around the Constitution" that will add to the pressure on Congress to rein in U.S. Immigration and Customs Enforcement. By accessing campus feeds, she said, immigration authorities are violating the rights of students, parents and educators "to be free from unreasonable search and seizure."

The teachers union sued the Trump administration in September after it ended a longstanding policy against conducting immigration enforcement actions in and around schools.

"Schools are sacred spaces, and ICE knows it needs a judicial warrant to access them," Weingarten said in a statement. The teachers union filed its lawsuit, she said, "so schools remain safe and welcoming places, not targets for warrantless surveillance and militarized raids."

High school students in Bloomfield, New Jersey, walk out of class on Feb. 3 to protest heightened federal immigration enforcement actions in the state. (Photo by Kyle Mazza/Anadolu via Getty Images)

'The scale of it is phenomenal'

At the Huffman Independent School District northeast of Houston, records reveal it was the campus police chief's administrative assistant who granted U.S. Border Patrol access to district Flock Safety license plate readers in May.

Police departments nationwide also routinely tapped into the eight Flock cameras installed at the 30,000-student Alvin Independent School District south of Houston. Over a one-month period from December 2025 through early January, more than 3,100 police agencies conducted more than 733,000 searches on the district's cameras, 社区黑料's analysis of public records revealed. Of those, immigration-related reasons were cited 620 times by 30 law enforcement agencies, including ones in Florida, Georgia, Indiana and Tennessee.

Dr. Ronald E. McNair Junior High School in the Alvin Independent School District. (Djmaschek, Wikipedia)

Flock offers a list of standardized reasons that agencies must choose from when running a search. For the Alvin school district's cameras, immigration-related reasons identified by 社区黑料 include "Immigration (civil/administrative)" and "Immigration (criminal)."

The data put into focus the scale of digital surveillance at school districts nationally and "just how dangerous these tools are," said Ed Vogel, a researcher and organizer with The NOTICE Coalition: No Tech Criminalization in Education.

"The scale of it is phenomenal, and it's something that I think is difficult for individual people in their cities, towns and communities to fully appreciate," said Vogel, who's also with the surveillance-monitoring Lucy Parsons Labs in Chicago.

The Flock camera audit logs and other public records about their use by school districts were provided exclusively to 社区黑料 by The NOTICE Coalition, a national network of researchers and advocates seeking to end mass youth surveillance. 社区黑料 also filed public records requests to obtain information on schools' use of Flock cameras and conducted an analysis to reveal the extent of the immigration-related searches. Those findings were shared with the law enforcement agencies and school districts mentioned in this story.
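An analysis like the one described, counting immigration-tagged searches per agency across a month of audit-log rows, can be sketched in a few lines of standard tooling. The field names below ("agency", "reason") are assumptions for illustration; the real log schema isn't published:

```python
from collections import Counter

# Hypothetical audit-log tally: count immigration-tagged searches per agency.
# Field names ("agency", "reason") are assumptions, not Flock's real schema.

IMMIGRATION_TAGS = ("Immigration (civil/administrative)", "Immigration (criminal)")

def tally_immigration_searches(rows):
    """rows: iterable of dicts with 'agency' and 'reason' keys."""
    counts = Counter()
    for row in rows:
        # str.startswith accepts a tuple, which also catches suffixed
        # variants like "Immigration (civil/administrative) - Test".
        if row["reason"].startswith(IMMIGRATION_TAGS):
            counts[row["agency"]] += 1
    return counts

sample = [
    {"agency": "Carrollton PD", "reason": "Immigration (criminal)"},
    {"agency": "Carrollton PD", "reason": "Stolen vehicle"},
    {"agency": "Galveston County", "reason": "Immigration (civil/administrative) - Test"},
]
print(dict(tally_immigration_searches(sample)))
# {'Carrollton PD': 1, 'Galveston County': 1}
```

In practice the rows would be loaded from the exported log files rather than hard-coded, but the counting step is the same.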

Three of the 10 agencies that conducted the most immigration-related searches in the Alvin school district logs participate in the 287(g) program, which deputizes local officers to perform certain immigration enforcement functions and has also become a point of controversy. The program has expanded significantly during Trump's second term.

Alvin school district Police Chief Michael Putnal directed all questions to district spokesperson Renae Rives, who provided public records to 社区黑料 but did not acknowledge multiple requests for comment. 

Amanda Fortenberry, the spokesperson for the Huffman school district, said in an email the district is "reviewing the matters you referenced," but declined to comment further.

Flock Safety, whose cameras span 7,000 networks nationally, didn't respond to 社区黑料's requests for comment, nor did the Department of Homeland Security.

'We will assist them, no questions asked'

Camera settings information obtained by 社区黑料 through public records requests suggests that Alvin school district police officers are unable to search their own devices for immigration-related purposes. But the school system allows such queries routinely from out-of-state police officers, audit logs reveal. 

Flock searches for civil immigration reasons that appeared in the Alvin school logs, such as trying to locate someone who is unlawfully present in the U.S., were more than two times more frequent than those conducted for investigations involving immigrants suspected or convicted of committing a crime. 

Also included among the reasons given for immigration-related searches are "I.C.E.," in reference to Immigration and Customs Enforcement; "ERO proactive crim case research," an apparent reference to ICE's Enforcement and Removal Operations division; and "CBP Investigation," an apparent reference to U.S. Customs and Border Protection.

Lt. Blake Hitchcock

In Carrollton, Georgia, officers routinely use Flock's nationwide lookup to track suspects outside their jurisdiction, Lt. Blake Hitchcock said in an interview. Immigration-related searches by the Carrollton Police Department that appear in the Alvin school district's audit log were conducted to assist federal agents at the request of the Department of Homeland Security, Hitchcock said. He declined to elaborate on specifics.

Federal agents "were working directly" with a Carrollton police officer who had access to the Flock cameras, "and they asked him to run it and they did," Hitchcock said. If federal agents ask his office to help them with an immigration case, Hitchcock said, "we will assist them, no questions asked."

Flock searches are typically broad national queries, and officers do not select individual cameras, he explained. Instead, with each search request, the system automatically checks every camera that Flock customers share with the nationwide database, including those operated by school districts.

Because a school district is part of the national lookup, Hitchcock said, its cameras will be searched any time another participating agency conducts a nationwide inquiry. He said Flock's nationwide search is helpful to track people who "go from jurisdiction to jurisdiction to commit crimes." He pointed to a case in 2020 when Carrollton officers used Flock cameras to rescue a 1-year-old who was kidnapped at gunpoint some 60 miles away.
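The fan-out Hitchcock describes can be modeled simply: a single query touches every camera whose owner has opted into the shared pool, school districts included. A toy illustration, with all data made up:

```python
# Toy model of the nationwide lookup: one query fans out to every camera
# its owner has shared with the national pool. All numbers are hypothetical.

networks = {
    "Alvin ISD": {"shared_nationally": True, "cameras": 8},
    "City PD": {"shared_nationally": True, "cameras": 40},
    "Opted-out district": {"shared_nationally": False, "cameras": 5},
}

def cameras_hit_by_national_search(networks: dict) -> int:
    """Count cameras a single nationwide query would sweep in."""
    return sum(n["cameras"] for n in networks.values() if n["shared_nationally"])

print(cameras_hit_by_national_search(networks))  # 48
```

The point of the model is that the querying officer never selects the school district's cameras; membership in the shared pool alone determines whether they are searched.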

In Galveston, Texas, Constable Justin West confirmed that immigration-related searches that appeared in the Alvin school district's audit logs from his department were tied to the county's participation in the federal 287(g) program.

County deputies with federal immigration enforcement powers "have been working on arresting targeted criminal illegal aliens," West wrote in an email, and use Flock cameras "to determine locations and travel patterns of the illegal aliens being sought."

Galveston deputies' Flock searches that appeared in the Alvin school district audit logs led to several arrests, West said, while several of the investigations remain ongoing. Flock logs show the Galveston County searches were conducted for both criminal and civil immigration investigations.

While the Trump administration maintains its immigration crackdown centers on removing dangerous criminals, the share of those arrested with no criminal record surged to 43% in January, and immigrants with no pending civil immigration actions against them have similarly been detained.

Other agencies that participate in the 287(g) program and were heavily represented in the Alvin ISD logs include the Texas Department of Public Safety and the Florida Fish and Wildlife Conservation Commission, each of which conducted more than 60 immigration-related searches that queried the school district's cameras in the one-month period.

The Texas Department of Public Safety and the Florida Fish and Wildlife Conservation Commission were among four agencies that did not respond to 社区黑料’s inquiries about their searches. The other two were the Lowndes County Sheriff’s Office in Georgia and the Greene County Sheriff’s Office in Ohio. 

In Mesquite, Texas, searches labeled "Immigration (criminal)" were "conducted as part of an investigation to locate a suspect wanted on felony criminal charges," Lt. Curtis Phillip said in an email. While the suspect "was believed to be unlawfully present in the United States," Phillip said his department doesn't use Flock cameras "for the purpose of enforcing federal civil immigration law."

"When a search is conducted across the shared network, the activity may appear in the audit records of all participating system owners, even when the investigation itself is unrelated to schools or school-based activity," Phillip said. "There is no efficient mechanism to exclude specific entities, such as school districts, from those searches."

In Grant County, Indiana, 238 immigration-related searches included in the Alvin ISD audit logs were conducted "by one of our deputies" as part of "a confidential investigation," Jay Kay, chief deputy of the county sheriff's office, said in an email. He didn't elaborate further. The reason given for the search in the Flock audit log was "Immigration (civil/administrative) - Test."

It's not clear whether every search tagged as immigration-related necessarily was. John Samples, captain of the Little Elm, Texas, police department, said a detective selected "immigration" as a search reason while assisting the Department of Homeland Security on a sex crimes investigation and a separate terrorism-related case. That word choice, Samples said, was "not the best course of action" and will be "corrected on our end."

The police department in Texas City, Texas, denied it used the system to enforce federal immigration laws. While the agency monitors "several thousand Flock Cameras across the United States," Captain Brandon Shives said his department's searches in the Alvin ISD logs should not have been categorized as immigration-related and were the result of a "clerical error."

'Your community and beyond'

Flock Safety has repeatedly stated that it does not provide the Department of Homeland Security with direct access to its cameras and that all data-sharing decisions are made by local customers, including school districts. 

"ICE cannot directly access Flock cameras or data," the company said in a recent blog post. "Local public safety agencies sometimes collaborate with federal partners on serious crimes such as human trafficking, child exploitation or multi-jurisdictional violent crime," but decisions about "how data is shared are made by the customer that owns the data, not by Flock."

The company acknowledged in August that it had run pilot programs with DHS to assist federal human trafficking and fentanyl distribution investigations, but said "all ongoing federal pilots have been paused" after the initiative faced scrutiny and legal pushback.

"You know how maybe your grandparents approve every friend request they get on Facebook? It's like that. It's always been like that."

Dave Maass, investigations director, Electronic Frontier Foundation

Public records provided by the Alvin school district, which began purchasing Flock cameras in 2023 and has since spent more than $50,000 on its eight devices, include Flock marketing materials that tout the ability to share data with other police agencies. 

"Not only do we place cameras where you need them," the document notes, "we offer access to available cameras in your community and beyond your jurisdiction."

In fact, nationwide sharing is a staple of Flock's business model, said Dave Maass, director of investigations at the nonprofit Electronic Frontier Foundation. Maass has spent the last decade researching how police use automated license plate readers like Flock's, including at least one case last year in Texas, where the procedure is illegal.

"That's something that's a selling point for them," Maass said, adding that his research has shown that police agencies agree to provide outside officers access to their Flock data with little deliberation.

"You know how maybe your grandparents approve every friend request they get on Facebook?" Maass said. "It's like that. It's always been like that. You'll have an agency that will request access to other places and other places will just not even question it. They'll just hit, 'sure, approve.'"

'A unique level of responsibility to protect their students'

Flock Safety provides audit logs that allow law enforcement customers to see how their automated license plate reader cameras are being used. The reports "support accountability and public trust by making usage patterns visible and reviewable," the company said in a recent blog post.

None of the law enforcement officials contacted by 社区黑料 said they used the audit logs to ensure that people with access to their data queried the information for legitimate and legal purposes. Given the sheer volume of law enforcement searches captured in the Alvin school district's audit logs in just one month, Maass said, such reviews would be practically impossible.

Adam Wandt

Adam Wandt, an attorney and associate professor at New York City's John Jay College of Criminal Justice, said license plate readers can be invaluable tools for solving serious crimes and finding missing persons.

But he also acknowledged the devices present significant privacy concerns and questioned whether the broad sharing of school-controlled camera data violates federal student privacy rules. The revelation that school-owned Flock cameras are being queried for immigration enforcement purposes, he said, "will cause significant discussions to be had in the near future within many school districts" that contract with the company.

"School districts are in a unique position, they have a unique level of responsibility to protect their students in specific ways," including their privacy, Wandt said.

Vogel of the NOTICE Coalition said students and parents should demand transparency from their school districts about whether they employ Flock license plate readers and whether the data from those cameras are being fed to immigration agents. 

"These are just tools, and whoever has control over them gets to define how they're used," Vogel said. "I have a feeling that immigration enforcement was not one of the reasons that was discussed when they said, 'We need to get a contract with Flock Safety.'"

Experts on Kids & Social Media Weigh the Pros and Cons of 'Growing Up in Public'

Wed, 17 Jan 2024

Parents are more concerned than ever about their kids' social media habits, worried about everything from oversharing and cyberbullying to anxiety, depression, sleep and study time.

Recent surveys of young people show that parents' concerns may be justified: More than half of U.S. teens spend at least four hours a day on these apps, and girls spend an average of nearly an hour more on them per day than boys. Many parents are searching for support.

Perhaps more than anyone, Carla Engelbrecht and Devorah Heitner are qualified to offer it. They've spent years puzzling over how families can understand media from the inside out, and how schools both help and hurt kids' ability to cope.


Engelbrecht is a longtime children's media developer. A veteran of Sesame Workshop and PBS Kids Interactive, she spent seven years at Netflix, most recently as its director of product innovation. Engelbrecht was behind the network's interactive Black Mirror episode "Bandersnatch" in 2018, which allowed viewers to choose among five possible endings.

Carla Engelbrecht (second from right) appears onstage with colleagues during a Netflix event on Black Mirror's "Bandersnatch" episode in 2019. Engelbrecht, who was director of product innovation for the streaming service, is now testing a social media platform for children under 13. (Charley Gallay/Getty Images for Netflix)

Engelbrecht is now in public beta testing for Betweened, a new social media platform for kids under 13. She calls it a "course correction" for young people's social media, aiming to teach them to be more mindful, thoughtful and responsible online.

Heitner is an author who specializes in helping parents and educators understand how digital technology, especially social media and interactive gaming, shapes kids' realities. Her books include 2016's Screenwise and her new work, Growing Up in Public.

Speaking to either one would be enlightening, but we decided to facilitate a broader conversation by inviting them to come together (virtually) to share insights and offer a bit of advice for both parents and schools. 

Their conversation with 社区黑料's Greg Toppo was wide-ranging, covering the effects of the pandemic, the pressures kids feel online and the women's experiences communicating with their own children.

Devorah Heitner spoke in 2017 at the Roads to Respect Conference in Los Angeles. Heitner's new book explores the impact of modern technology on childhood, including the effects of increased adult supervision of kids through tracking devices. (Joshua Blanchard/Getty Images for Rape Treatment Center)

The solutions they offer aren't simple. In Heitner's words, parents seeking to learn more about their kids' media usage should pull back their surveillance and "lead with curiosity."

The conversation has been edited for length and clarity.

社区黑料: Devorah, tell us a little bit about your new book.

Devorah Heitner: I wrote Growing Up in Public because I was speaking for years about Screenwise in schools and all these other environments, and people said, "O.K., I get that we want to think about quality over quantity with screen time. But we also want to understand what kids' subjective experience is and not just focus on how many minutes are good or bad."

People lie about that anyway. People are sort of oblivious to their own screen use sometimes and get over-focused on their kids'. A lot of adults are recognizing: If I could have had a Tumblr or a Twitter or Instagram as a kid, I could have really done a lot of damage to my prospects and opportunities by so openly sharing.

What are we doing to our reputations?

As I started digging into that question, I recognized that parents are really part of the surveillance culture with kids. So are schools, with grading apps like or [which keep track of kids' location, among other functions]. I really started understanding in a fuller way how kids are scrutinized. Kids are growing up very searchable, very public, and some of that is awesome. They have a platform, they can be activists. Some of it is problematic.

The title of your book, Growing Up in Public, says so much about kids' lives these days. I saw this term the other day: not FOMO, "Fear of Missing Out," but FOMU, "Fear of Messing Up." Are those competing interests for young people?

Heitner: Well, there’s definitely a fear of messing up and especially being called out. There’s a lot of “gotcha” culture going on, and kids documenting each others’ screw-ups. And as much as you patiently explain, as I have to my own 14-year-old, the concept of mutually assured destruction, if you’re on a group text with somebody for long enough, both of you have probably said a few things you don’t want repeated outside of that context.

I think it’s modeled by adults, but this kind of “gotcha” culture is very insidious and terrifying. And it should be terrifying. 

Carla, tell us a little bit about yourself.

Carla Engelbrecht: I’m a longtime product developer and researcher in the kids’ space. I’ve spent a lot of time making products for kids. I’ve seen for years kids wanting access to Twitter and Facebook and MySpace and , all through the generations of social media. And they always want what is not made for them. They’re aspirational.

Kids are just plopped into this. And just as you wouldn’t give a new driver the keys to the car and just say, “Go!” 鈥 you need to teach them how to drive 鈥 there’s the same concept for me with media use. We need to teach our kids. Parents don’t know what they’re doing, because none of us have really been through this before, and they abstain. They need support in learning how to do this. Where Devorah talks about things from that guidance perspective, I’m looking at: How can we build a product for kids that helps them learn? 

It seems to me like Betweened is a site for parents as much as anybody. 

Engelbrecht: There’s definitely two audiences here. There’s absolutely a path where I could build a product for kids and launch them onto it. But I wouldn’t be addressing all the pain points.

Kids want short-form content. They want to create. They want to connect with their peers. In order to successfully set kids up to do that, parents need tools, too. And so it is really a product for both kids and parents.

Carla mentioned all these different apps coming down the road. Devorah, I'm thinking about you saying to someone recently how you've been working on this book for five years. A lot has changed in five years. We didn't have TikTok five years ago.

Heitner: Screenwise came out in the fall of 2016, which was a memorable time for many reasons: a lot of social forces happening in our world with Trump’s election. 

And then you have the pandemic in 2020. That’s around the time I had sold the book and was trying to interview people. Suddenly, I’m not in schools anymore. I’m on Zoom with kids, which is a whole research problem: How do you get a wider range of kids, not just the super-compliant kids who show up to a Zoom? And the pandemic was an accelerant to a lot of things happening already with kids in tech.

"Parents are really part of the surveillance culture with kids. So are schools."

Devorah Heitner

It was certainly not the beginning of kids being too young and not [the federal Children's Online Privacy Protection Act gives parents control over what information websites can collect from their kids]. But it accelerated, and there was kind of a push toward things like Kids Messenger [on Facebook] and other things that I even experimented with at the time.

The pandemic started when my son was 10. We were like, “Oh, what can we do to help him communicate with friends?” We experimented with Messenger. It was a fail for us, but I also talked to the people at and [two mobile phone companies marketed for children]. There are people, in different ways, trying to come up with solutions because they have understood that both the adult apps and the adult devices, like a smartphone that does all the things, might not be the ideal thing to give a 10-year-old. 

What’s changed since 2016 is there used to be more worry about one-to-one computing in schools. Now, every school pretty much is one-to-one. It’s really the outlier schools that don’t have tech or aren’t giving kids individual tech. Even as late as 2015, 2016, I was helping schools negotiate that with parents. And parents were like, “I don’t know. I’m not sure about screen time. I don’t know if I want my kid getting a Chromebook.”

Try to find a school now that doesn't give kids iPads or Chromebooks or something. That's probably one of the bigger differences. And then just the explosion in server-based gaming like Roblox and Minecraft and the ways kids interact in those digital communities. You see a lot of very complicated, weird ideas among adults who care about children. Like "I'll wait until eighth grade to give a kid a phone. Meanwhile, my third-grader plays Roblox on a server with strangers."

Engelbrecht: Or has access to text messaging through their iPad.

Heitner: Exactly. And they're very smugly waiting till eighth grade and I'm like, "For what? For your kid to make voice calls?" That's the one thing they don't want to do.

Carla, you come from a game design background. People have lots of terrible takes about video games, which I’m sure you’re used to. How has that background informed what you’re doing and what Betweened looks like?

Engelbrecht: A lot of people come to video games and they're just like, "They're evil," or "They're awful," or "They're violent." And you can say the same thing about television. You can also say the same thing if you only eat broccoli. Anything in excess is not good for you – like running a marathon every day. I take a very pragmatic approach to most things we can actually find good in.

When I look at video games, I can’t classify them as evil. I instead look for the good things. And it’s the same with social media. Social media as part of a balanced media diet gives parents a lot of opportunities to connect, gives kids a lot of opportunity to express creativity and develop skills. 

鈥淭here wasn’t social media when I was in college. A bad decision in college couldn’t chase me through my entire life. In that sense, there are risks that feel much larger.鈥

Carla Engelbrecht

I’ll give you an example on the games side of things: Years ago, I did a South by Southwest talk called 鈥淲hat Can Teach Us About Parenting.鈥 Left 4 Dead is not a game that kids should ever play. It鈥檚 a violent, first-person zombie apocalyptic shooter. It鈥檚 also one of the most beautifully designed cooperative games ever. I’m terrible with thumb sticks on video game controllers. I can’t walk in a straight line in a video game. I’m not great at the actual zombie-killing side of things. But I’m really good at running around and picking up health packs and checking in on people who have been damaged by zombies.

So there are different roles that people can play. I can still participate in the game, even though the primary way of playing Left 4 Dead is not what works for me. 

Also, if I’m playing with people, it fosters communication. I have to talk to people and someone needs to say. “Hey, I need help,” and I can come over. That’s what I’m looking for in games and social media: What are those underlying skills that, with a thoughtful perspective, you can leverage for good?

I wanted to switch gears a little bit and talk about something you mentioned earlier, Devorah: casual surveillance. I think about the stories we hear about parents not even just surveilling their kids – tracking their phones or their cars – but just keeping up in a way that we never even dreamed of. I wonder: Where did this come from? And how do you think a site like Betweened is going to help?

Engelbrecht: I wish I knew exactly where it came from, but it certainly seems it’s symptomatic of the same thing: Everything has just kind of crept up on us. It’s like, as phones started to be introduced, we just thought, “Oh, well, I need to charge my phone, so I’ll charge it next to my bed.” And then the next thing you know, you’re checking it first thing when you wake up. It’s this slippery slope without the mindfulness of what it’s doing. Something has to happen to stop you, to make you take a step back and think, “How far have I gone? What boundaries have I crossed or what new boundary do I need to establish?” And to Devorah’s earlier point, the pandemic accelerated a lot of this.

Heitner: Part of it is we do it because we can. Even in relationships. I’ve known my husband since before we each had cell phones, but we didn’t used to check in as often because we didn’t have cell phones. It had to really rise to the level of an emergency before I would call him at work.

"As much as you patiently explain, as I have to my own 14-year-old, the concept of mutually assured destruction, if you're on a group text with somebody for long enough, both of you have probably said a few things you don't want repeated."

Devorah Heitner

Remember the days of 9-to-5 office jobs? He left in the morning and was at his job. I was a grad student then and I would go up to Northwestern and not even really have any reachability by phone. Now we have phones, and the expectation is pretty much down-to-the-minute: If I’m 11 minutes late, I’ll probably text and say, “I’m 11 minutes late.” There’s just so much expectation for contact and communication and knowing where other people are. We don’t use location surveillance for that, but a lot of families do, and a lot of people have watches and will check into each other’s location on watches.

Because it’s there, people do it. And then there’s also just tremendous worry right now about kids. Given that we as a society think it’s a good idea for everyone to have assault weapons, parents are a little nervous. That anxiety creeps into everything.

My older daughter is 31, and I remember getting her first cell phone when she was 12 or 13. I remember the intense peer pressure she felt to have a phone. And I really didn’t like it at all. But I kind of justified it by saying to myself, “This is going to keep her safe.” And I remember thinking to myself, “You’re so full of shit. You’re just really trying to smooth things over.” And I guess I wonder: As parents, do we have an overextended sense of peril about our kids these days?

Heitner: There's a sense of peril. Also, the Internet and online news and targeted algorithms just fuel that worry and outrage. It's a bit of a vicious cycle.

Engelbrecht: In some ways, it’s almost like there are more risks that could stick with you. There wasn’t social media when I was in college. A bad decision in college couldn’t chase me through my entire life. In that sense, there are risks that feel much larger.

I think about my daughter and I don’t want something to chase her for her entire life. That part of it feels very real. And then it feels out of control. I don’t have the tools or know exactly how I can best help her except for having hard conversations and trying to put some bumpers around her. But there’s not a lot of tools to put the bumpers around her.

Devorah, one of the things you have said is that the kind of surveillance a lot of parents are undertaking is really undermining the trust their kids feel, and backfiring because kids won’t open up to them when they really need to. Can you talk a little bit more about that?

Heitner: You just see kids really getting focused on going deeper underground. If their parents are like, “I’m going to get Bark and read every single thing they text,” then you see some kids who are like, “O.K., I need to go deeper underground, I need a VPN or to only text on Snapchat, or I need to do something where I can be more evasive.” And that concerns me, because then there’s no way to make use of the parent when the parent might be useful.

Engelbrecht: I think about how to create space to allow the kid to have a second chance at telling me the truth. For example, if there's an empty bag of gummies and the kid is the only one who could have eaten it but says they didn't, how can I create space to talk about making mistakes versus lying or intentionally hiding the truth? Saying, "I'm going to ask what happened to the gummies again, but first I want you to take a moment to think about your answer – it's OK to change your answer, because I want to understand the truth. We all make mistakes and we can talk about it. But intentionally hiding the truth has consequences."

If I later find out that the child lied, then there’s consequences. The hope is that eventually, a parent can say, “If you end up at a party where there’s alcohol, don’t drive home. Call me for a ride home. If you try to hide that there was alcohol and make poor decisions, then there’s additional consequences.”

鈥淚 don’t want to be in the place where I’m policing her homework. Now that she’s in seventh grade, it鈥檚 time for her to be learning those skills before there’s the consequences of missing your homework in high school or college.鈥

Carla Engelbrecht

It’s important to be able to say, 鈥淚 made a mistake鈥 and talk about what to do from there. Hopefully, that provides an alternative to the arms race of increasingly sneaky strategies that Devorah described.

Heitner: That makes a lot of sense. I was just going to say: The surveillance – schools just push it really hard. Every time I go to a school, they're like, "Are you logged into ?" or "Are you logged into ?" They're just really pushing it so hard.

Are schools culpable in this? Sounds like you'd say, "Yes." I don't know if you'd call it surveillance, though. One of the functions of schools is to keep track of things, right?

Heitner: But what about the location tracking? My kid has to scan a QR code to get into the cafeteria. I skipped lunch every day of high school and ate with my drama club friends in the theater. Was that so bad? They have 3,500 kids QR-coding themselves into study hall. It's pretty locked down. It's pretty Big Brother – or Little Brother, if you read Cory Doctorow.

Engelbrecht: Homework tracking means having full visibility of my daughter when part of what she needs to learn is the executive function skills to actually be able to plan and follow through and do her homework. I don't want to be in the place where I'm policing her homework. Now that she's in seventh grade, it's time for her to be learning those skills before there's the consequences of missing your homework in high school or college.

So to me, it’s kind of that same thing: The information is there. Should it be provided? How do you use it? And, for me it’s: How do we better equip administrators, teachers or parents to stop and think about how to leverage this information? So maybe a kid who’s consistently missing their homework, yes, the parents should have more visibility as part of a support program to get the kid back on track and help them learn the skills. But to Devorah’s point, it doesn’t mean everyone needs to be badging into lunch.

Devorah, your message to parents is: There are all these things happening. There are all these things you have to keep track of. There are lots and lots of risks to kids being on social media, especially teenagers. But you shouldn’t panic. And I wanted to just throw this out to both of you: Instead of panicking, what should parents do? 

Heitner: Carla, you’re talking about creating a new community space for kids that’s more of a learning space, and that’s one alternative. Another alternative, in addition to, or potentially instead of, for parents who don’t have access to that, is just leaning into one or two spaces they really want to mentor their kids in.

Maybe their kid’s really involved in Minecraft. And if they want to join [a free voice, chat, gaming and communications app], the parents are waiting and saying, “O.K. You can join your library Discord with or your school Minecraft club on Discord, but not general Discord.”

Two 9-year-olds play the open world computer game Minecraft. Parenting expert Devorah Heitner urges parents to know more about what their kids are doing online without resorting to surveillance. (Getty Images)

Parents will tell me their kids are playing or they're on YouTube. But I'm like, "What channels?" It's just like if somebody says, "I'm watching TV." Well, what are you watching? Because that really is a big differentiator in terms of the experience.

Engelbrecht: It goes back to your "Fear of Messing Up." I think so much about how it's important for parents to wade in and get involved with their kids. This has been the advice for decades, whatever the newfangled thing was. I was just doing some writing about encouraging parents to actually dance with their kids. It's an opportunity to bond. It actually requires some planning and practice. It's physical activity. I assume most parents are like me, that they're not a great dancer and it's uncomfortable and you don't want to mess up.

But modeling that I'll do something that's out of my comfort zone and connect with you over something that I know you enjoy can be very simple. It doesn't mean a parent has to suddenly learn all aspects of Roblox or Discord, because they can be intimidating. But just find an entry point and connect with the child and participate with them. It just has so many benefits. It's true whether they're into Tonka trucks or Roblox. Parenting means, "Get in there with your kid."

Devorah, you use the phrase, “Lead with curiosity.”

Engelbrecht: Oh, I love that.

Heitner: You want to be curious and have your kid share it with you – their expertise and experience, as well as their discernment: What do they like or not like about this app? How would they change it if they could? Staying curious is an alternative to spying – being curious and asking kids to be curious even about their own experience. Do I actually feel less stressed when I scroll this app? That's maybe a lot of mindfulness to expect of kids, who have a lot going on and a lot coming at them. But it's important for all of us to be curious about how our experience is going.

Engelbrecht: That’s one of the ways I’ve been thinking about it from a product perspective: just how to help build in some scaffolds for mindfulness 鈥 things like when you start an app, actually having a timer that’s like, 鈥淗ow long do you want to spend on it right now?鈥

I set a timer for myself when I use TikTok because I spend a very long time on it. So being able to put that in there as a scaffold, to start being mindful and thoughtful about it. We’re posting content, but we’re actually not posting endless scrolls where you could spend all day.

I don’t want to prioritize the traditional tech metric of “time on task.” To me, success is like, 鈥淵ou can come and use Betweened for 20 minutes and then know you can come back another day and there’s lots of interesting stuff for you.鈥 But it’s not all-consuming, must-do-this-all-the-time. And that’s a different perspective on tech products. It’s not how most products are developed.

Gaggle Drops LGBTQ Keywords from Student Surveillance Tool Following Bias Concerns

Fri, 27 Jan 2023

Digital monitoring company Gaggle says it will no longer flag students who use words like "gay" and "lesbian" in school assignments and chat messages, a significant policy shift that follows accusations its software facilitated discrimination against LGBTQ teens in a quest to keep them safe.

A spokesperson for the company, which describes itself , cited a societal shift toward greater acceptance of LGBTQ youth – rather than criticism of its product – as the impetus for the change, part of a "continuous evaluation and updating process."

The company, which uses artificial intelligence and human content moderators to sift through billions of student communications each year, has long defended its use of LGBTQ-specific keywords to identify students who might hurt themselves or others. In arguing the targeted monitoring is necessary to save lives, executives have pointed to the prevalence of bullying against LGBTQ youth and data indicating they're at greater risk of self-harm than their straight and cisgender classmates.


But in practice, Gaggle's critics argued, the keywords put LGBTQ students at heightened risk of scrutiny by school officials and, on some occasions, the police. Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity – often called outing – as a result of digital activity monitoring, according to survey results released in August by the nonprofit Center for Democracy and Technology. The survey encompassed the impacts of multiple monitoring companies that contract with school districts, such as GoGuardian, Gaggle, Securly and Bark.

Gaggle's decision to remove several LGBTQ-specific keywords, including "queer" and "bisexual," from its dictionary of words that trigger alerts was first reported in . It follows extensive reporting by 社区黑料 into the company's business practices and sometimes negative effects on students who are caught in its surveillance dragnet.

Though Gaggle's software is generally limited to monitoring school-issued accounts, including those from Google and Microsoft, the company has acknowledged it can scan through photos on students' personal cell phones if they plug them into district laptops.

The keyword shift comes at a particularly perilous moment, as Republican lawmakers in multiple states push anti-LGBTQ bills. Legislation has looked to curtail classroom instruction about sexual orientation and gender identity, ban books and classroom curricula featuring LGBTQ themes and prohibit transgender students from receiving gender-affirming health care, participating in school athletics and using restroom facilities that match their gender identities. Such a hostile political climate, along with pandemic-era disruptions, has contributed to an uptick in LGBTQ youth who have seriously considered suicide, a recent youth survey by The Trevor Project revealed.

The U.S. Education Department received 453 discrimination complaints involving students' sexual orientation or gender identity last year, according to data provided to 社区黑料 by its civil rights office. That's a significant increase from previous years, including in 2021, when federal officials received 249 such complaints. The Trump administration took a narrower view of those protections, and complaints dwindled. In 2018, the Education Department received just 57 complaints related to sexual orientation or gender identity discrimination.

The increase in discrimination allegations involving sexual orientation or gender identity is part of a broader surge in civil rights complaints, according to data obtained by The New York Times. The total number of complaints for 2021-22 grew to 19,000, a historic high and more than double the previous year.

In September, 社区黑料 revealed that Gaggle had donated $25,000 to The Trevor Project, the nonprofit that released the recent youth survey and whose advocacy is focused on suicide prevention among LGBTQ youth. The arrangement was framed on Gaggle's website as a collaboration to "improve mental health outcomes for LGBTQ young people."

The revelation was met with swift backlash on social media, with multiple Trevor Project supporters threatening to halt future donations. Within hours, the group announced it had returned the donation, acknowledging concerns about Gaggle "having a role in negatively impacting LGBTQ students."

The Trevor Project didn't respond to requests for comment on Gaggle's decision to pull certain LGBTQ-specific keywords from its systems.

In a statement to 社区黑料, Gaggle spokesperson Paget Hetherington said the company regularly modifies the keywords its software uses to trigger a human review of students' digital communications. Certain LGBTQ-specific words, she said, are no longer relevant to the 24-year-old company's efforts to protect students from abuse and were purged late last year.

"At points in time in the not-too-distant past, those words were weaponized by bullies to harass and target members of the LGBTQ+ community, so as part of an effective methodology to combat that discriminatory harassment and violence, those words were once effective tools to help identify dangerous situations," Hetherington said. "Thankfully, over the past two decades, our society evolved and began a period of widespread acceptance, especially among the K-12 student population that Gaggle serves. With that evolution and acceptance, it has become increasingly rare to see those words used in the negative, harassing context they once were; hence, our decision to take these off our word/phrases list."

Hetherington said Gaggle will continue to monitor students' use of the words "faggot," "lesbo," and others that are "commonly used as slurs." A previous review by 社区黑料 found that Gaggle regularly flagged students for harmless speech, like profanity in fictional articles submitted to a school's literary magazine, and students' private journals.

Anti-LGBTQ activists have , and privacy advocates warn that in the era of "Don't Say Gay" laws and abortion bans, information gleaned from Gaggle and similar services could be weaponized against students.

Gaggle executives have minimized privacy concerns and claim the tool saved more than 1,400 lives last school year. That statistic hasn't been independently verified, and there's a dearth of research to suggest digital monitoring is an effective school-safety tool. A recent survey found a majority of parents and teachers believe the benefits of student monitoring outweigh privacy concerns. The Vice News documentary included the perspective of a high school student who was flagged by Gaggle for writing a paper titled "Essay on the Reasons Why I Want to Kill Myself but Can't/Didn't." Adults wouldn't have known she was struggling without Gaggle, she said.

"I do think that it's helpful in some ways," the student said, "but I also kind of think that it's — I wouldn't say an invasion of privacy — but if obviously something gets flagged and a person who it wasn't intended for reads through that, I think that's kind of uncomfortable."

Student surveillance critic Evan Greer, director of a nonprofit digital rights group, said the tweaks to Gaggle's keyword dictionary are unlikely to have a significant effect on LGBTQ teens, and blasted the company's stated justification for the move as being "out of touch" with the state of anti-LGBTQ harassment in schools. Greer added that LGBTQ youth frequently refer to each other using "reclaimed slurs," reappropriating words that are generally considered derogatory and remain in Gaggle's dictionary.

"This is just like lipstick on a pig — no offense to pigs — but I don't see how this actually in any meaningful way mitigates the potential for this software to nonconsensually out LGBTQ students to administrators," Greer said. "I don't see how it prevents the software from being used to invade the privacy of students in a wide range of other circumstances."

Gaggle and its competitors have faced similar scrutiny in Washington. In April, Democratic Sens. Elizabeth Warren and Ed Markey argued in a report that the tools could be misused to discipline students and warned they could be used disproportionately against students of color and LGBTQ youth.

Jeff Patterson

In response, Gaggle founder and CEO Jeff Patterson said the company cannot test the potential for bias in its system because the software flags student communications anonymously and the company has "no context or background on students," including their race or sexual orientation. The company also said its monitoring services are not meant to be used as a disciplinary tool.

In the survey released last summer by the Center for Democracy and Technology, however, 78% of teachers reported that digital monitoring tools were used to discipline students. Black and Hispanic students were far more likely than white students to report getting into trouble because of online monitoring.

In October, the White House cautioned school districts against the "continuous surveillance" of students if monitoring tools are likely to trample students' rights. It also directed the Education Department to issue guidance to districts on the safe use of artificial intelligence. The guidance is expected to be released early this year.

Evan Greer (Twitter/@evan_greer)

As an increasing number of districts implement Gaggle for bullying prevention efforts, surveillance critic Greer said the company has failed to consider how adults can cause harm.

"There is now a very visible far-right movement attacking LGBTQ kids, and particularly trans kids and teenagers," Greer said. "If anything, queer kids are more in the crosshairs today than they were a year ago or two years ago — and that's why this surveillance is so dangerous."

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741. For LGBTQ mental health support, contact The Trevor Project's toll-free support line at 866-488-7386.

DHS Sec. Mayorkas: Relationships, Not Tech, Central to Creating Safe Schools

Fri, 11 Nov 2022

Homeland Security Secretary Alejandro Mayorkas leads an agency — born in the aftermath of the Sept. 11, 2001, terrorist attacks — perhaps best known for mass surveillance and rigid airport security checkpoints. But to Mayorkas, the key to keeping students safe at school rests with strong relationships.

Time and again, gunmen have exhibited warning signs before opening fire in schools, including fascinations with violence and a history of trauma. As cryptic — and at times explicit — social media posts emerge after attacks, conversations often center on missed opportunities to intervene. It takes a vigilant community, Mayorkas said, to break the cycle.

鈥淲e’re seeing individuals potentially with mental health problems, grievances, and they have manifested their challenges outwardly, they have spoken about violence,鈥 he told 社区黑料. 鈥淲hat we鈥檝e seen is expressions of an interest in violence and an expression of a planning or plotting to conduct an attack. And we need to educate people on identifying those signs, those expressions and also what to do about it to seek help for those individuals.鈥

Amid a surge in mass school shootings, districts nationwide have pumped more than $3 billion into school security. Campus police have become commonplace, active-shooter drills have grown routine and, for students across the U.S., digital surveillance has been normalized. The Department of Homeland Security has endorsed "threat assessment," a process where educators, mental health professionals and the police analyze a student's behaviors and statements to determine if they, as Mayorkas put it, are "descending down a path towards violence."

The environment has created a balancing act for school leaders who are charged with keeping schools safe while protecting students' civil liberties.

The department recently invited 社区黑料 to interview Mayorkas about this complicated landscape ahead of its first-ever National Summit on K-12 School Safety and Security. Mayorkas fielded questions about the sharp uptick in mass school shootings, the botched police response in Uvalde, Texas, and a massive ransomware attack that targeted the Los Angeles Unified School District.

The conversation has been edited for length and clarity.

We’re seeing an uptick in active mass shootings, including those that are targeting schools. What are some of the trends that you’re seeing within these campus attacks and what are some of the key strategies that your agency and other federal agencies are using to combat this increase in violence? 

So, Mark, tragically 2022 saw the greatest number of school shootings in our nation’s history. I think it was just over 250. And we have a multifaceted approach to it, of course, to educate and empower schools to understand how they can be safe environments. 

Every child, every person in this country and frankly around the world, deserves a safe, secure, supportive environment in which to be educated. And so we have our Cybersecurity and Infrastructure Security Agency, CISA as it is known, that has a website that is dedicated to this critical mission set. 




We have the United States Secret Service's National Threat Assessment Center, the NTAC, that provides resources to schools about how they can maintain a safe environment. We have critical grant programs that fund innovative efforts to really build resilience, and to help prevention models, as well as our Center for Prevention Programs and Partnerships, CP3, which is developing a one-stop shop that identifies federal resources for schools to access.

We have a lot of different efforts underway throughout our department and throughout the administration.

Absolutely. So let’s jump into the threat assessment one. You mentioned the Secret Service. They’ve done this study basically finding that mass school shooters almost always have observable traits before the attack. And it’s basically a 鈥淪ee Something, Say Something鈥 kind of mantra. Can you talk a little bit about threat assessments, and identifying people who might present a serious risk, but doing so in a way that doesn’t trample on people’s civil rights?

That’s right. So it’s very important, Mark, the last part of your question. We have a statutorily created and a statutorily created . It’s very important that we keep those fundamental rights well protected and do not in any way infringe upon them. 

Indeed, if we take a look at recent events, the assailants in Uvalde, Texas, in Buffalo and in Highland Park exhibited signs that were observable to individuals around them. And the key is to empower people, to educate people about how to identify those characteristics when somebody's descending down a path that has a connectivity to violence, and really intervene. And to intervene not in a way that delivers accountability, but rather assistance, support.

We’re seeing individuals potentially with mental health problems, grievances, and they have manifested their challenges outwardly, they have spoken about violence. More generally what we’ve seen is expressions of an interest in violence and an expression of a planning or plotting to conduct an attack. And we need to educate people on identifying those signs, those expressions and also what to do about it to seek help for those individuals.

You mentioned Uvalde and I’m really curious on your thoughts about the law enforcement response to the tragedy. More than 350 officers from local, state and federal agencies descended on the school. And ultimately, officers under your watch were the ones who were able to stop the gunman. But I’m curious about the delay. It took more than an hour for law enforcement officers to ultimately confront the gunman. I’m curious if you have any insight into the factors that led to that delay, and what lessons educators, law enforcement officials and anybody in the security space can take from that police response?

I think there are going to be a lot of lessons learned from the response in Uvalde. That response has been the subject of a number of investigations and some of those investigations are, in fact, ongoing. So I think I’m going to refrain from commenting upon the reported delays in the response. 

That was an unspeakable tragedy and I think there are different responses in different situations. There is a great body of training and active shooter training and how law enforcement should respond. I think the critical part is to take a look at every incident — unfortunately they occur all too often — and to learn from them to refine those best practices, to make sure that we're disseminating those best practices throughout the law enforcement community. And not just the law enforcement community, but the health care community and the like.

One of the things that we've focused on in this administration is an all-of-government and all-of-community response to this threat. So we are engaged with the Department of Education, we're engaged with the Department of Health and Human Services, we're looking at local community groups, parent associations, school systems, local health, mental health networks and providers. This really requires an all-of-community response to the fact that individuals are expressing their infirmities, their challenges, through acts of violence and through acts of violence targeted at children.

One of the interesting things about the response to the shooting has been a lot of concern about law enforcement officers in schools. The federal government has put a lot of money over the last several decades into putting police officers in schools. I'm curious what your response is to advocates who've been calling for police-free schools?

This is a very difficult issue and it’s an issue that we do encounter not only in the school system but also in other contexts as well. This is a conversation I’ve had with faith leaders about how to make places of congregation, of learning, of worship, welcoming, open and the like, and also safe and secure, to not be foreboding. 

I don’t think it’s a one-size-fits-all. I think we have to take a look at the safety imperative. I am not opposed to having security guards in schools, I myself. But how they are deployed, how they are integrated into the fabric of the school community, I think is vitally important. 

We’re going to talk a little bit about cybersecurity and dark corners of the internet. In Uvalde the school district used a company called Social Sentinel, basically to monitor social media and try to identify potentially threatening social media posts. School districts across the country use a large range of different surveillance tools to basically monitor how kids interact on the internet and to try to identify violence before it happens. But the White House recently came out with what they’re calling a Bill of Rights for AI, and it basically says to schools, 鈥榣imit the continuous surveillance of students if it has a potential to infringe on their civil rights.鈥 I’m curious on your thoughts on this idea of monitoring students鈥 behaviors on social media and other internet platforms to identify threats of violence?

The key is to create with one’s children an open line of communication so that one can learn what type of online activity one’s child is engaging in. So an open, communicative environment is absolutely critical, as is digital literacy so children can understand what is credible and what is not credible. 

We can employ privacy settings 鈥 parents, not the government 鈥 the parents can employ privacy settings and understand what their children are doing and communicate about it. It’s really important that children who are online are educated with respect to their own behavior and the behavior of others. I think that is what is key, that open, communicative environment, an environment of digital literacy and an environment where if children see something, they understand what it is they are seeing and know how to respond to it. And also, for parents, friends, relatives, school teachers and the like to pick up on the signs when a child is descending down a path towards violence.

If we’re talking to parents here for a second, what do you think are some of the most critical signs that folks should be looking out for?

It gets very difficult and I would really defer to mental health professionals and the like but let me give you a few examples. If we are dealing with an individual who expresses an intent to commit violence, who expresses a fascination with violence and begins to withdraw from societal communications with friends and the like, I think it is time to communicate, to ask questions, to engage with that child to learn more.

Many communities in the last few months haven't actually experienced shootings — but have been told that they have. A bunch of schools across the country are being subjected to swatting calls.

Swatting is a very dangerous phenomenon that we're seeing an increase in. It's a prank call that causes emergency personnel to deploy when, in fact, they're not needed. That's a criminal activity and it really puts innocent people at risk.

I’m curious when you can tell us about the surge right now. It appears that many of these are connected. Can you give us any insight into what’s going on and why schools are suddenly experiencing a surge in these kinds of calls?

One of the things that’s of concern when it comes to swatting, and it鈥檚 also applicable to malicious cyber activity, is the ease of replication. That if a swatting incident occurs in one geography, others may be motivated, unfortunately, to do the very same thing in a very different venue. We seek to prevent it. We work with the state, local, tribal territorial partners, campus law enforcement, to educate students, to educate people about the danger of swatting. It’s not an innocent prank call. It’s the deployment of precious law enforcement resources and could have unintended consequences. Education and prevention are key here.

Speaking of cybersecurity, the Los Angeles school district, America's second-largest school district, was just the victim of a ransomware attack. They ultimately did not pay the ransom and as a result had some of their data posted on the dark web. I'm curious what you can tell us about the threat actors who were holding LAUSD ransom and in general the threat actors who are targeting schools?

We’ve seen a tremendous rise in ransomware over the last several years by criminal actors. They target not only schools, they target hospitals, law enforcement organizations, businesses, the range of victims is quite wide. We caution, we recommend that victim entities not pay the ransom. We are very well aware of the precarious situation in which they find themselves when they’re held hostage to a ransomware actor. But we have only increased our defenses, really only enhanced our defenses, and also strengthened law enforcement鈥檚 response to it.

Now, if I’m a school leader and I’m the victim of a ransomware attack or some sort of cyber threat, what kind of assistance can I receive from the federal government? What role do you play in helping school districts respond to this?

Our Cybersecurity and Infrastructure Security Agency, CISA, is very well equipped to assist a ransomware victim, as is the Federal Bureau of Investigation, the United States Secret Service. We have a whole suite of capable agencies that can assist in identifying the intrusion, assisting in expelling the intruder, helping in patching the vulnerability that the intruder exploited and, of course, hopefully holding the intruder accountable. And the FBI has done an extraordinary job in investigating and identifying bad actors.

A recent Pew poll found that about a third of parents are very or extremely worried about a school shooting occurring at their child’s school. You’re a parent. I’m curious, have you had these similar concerns from a parental perspective? And to what degree do you think that parents should be concerned about a shooting unfolding at their school?

It’s a tragic state of affairs when parents are concerned about sending their children to school because of a potential attack that impairs the safety and security of their children. 

It is important for schools to train their personnel and their students on how to respond in the case of an active shooter. When I was a child in Los Angeles, California, where I spent much of my youth, we were trained on responding to fires, to earthquakes, even to a bomb. School shootings were not in the panoply of threats to which we were trained to respond. Now, tragically, they are, and schools need to train and parents need to communicate in an informed way with their children — not in a way to create hysteria — but in a way to create vigilance and alertness.

Online platforms like forums have been used over the last several years to radicalize young people, whether that be to become mass school shooters, or to go down a path of white nationalism. I’m curious if you can elaborate a little bit on the landscape of these online forums and ways that we can combat that without stepping on the First Amendment? 

So the threats, the diversity of the threats is much broader than what you identified, of course. And this is where I spoke earlier about the need to communicate with children, with youth, who are impressionable, to be able to create a safe environment where they feel comfortable communicating with what they’re seeing. For parents to be vigilant in terms of privacy settings, to really develop digital literacy amongst our youth so that they can understand what is credible, what is not credible, what is threatening, and what is innocent.

We really have to do that, and we’re working in partnership with industry, with the private sector, with think tanks about how to best build that digital literacy. This also requires an all-of-community response, it is not for the government exclusively to engage in this. 

We are working with online gaming companies to really build a safe environment to really instruct children about the perils of the online environment, to really guard against cyberbullying as well as extremism that seeks to draw people to violence.

What is it about the gaming community? It’s interesting that you’re specifically reaching out to people in that space. Why? 

Well, we’re reaching out much more broadly. We engage with social media companies, we engage with thought leaders that are important voices. The gaming community reaches so many children, they鈥檙e a critical partner in developing a safe and secure ecosystem so people can understand the benefits of, as well as the perils of, the online environment. 

Our increased connectivity is a tremendous tool for achieving prosperity. It also brings risks.

Thank you so much for taking the time to field these questions and talk about this really important topic. Is there anything else that I haven’t asked, that you think is important? 

I want to return to a point of sadness and a point of vigilance. The point of sadness is, of course, we’re speaking about school safety and the fact that it is such a phenomenon right now. 

On the other hand, the community 鈥 and the federal government is a member of that community, but the community is much broader 鈥 is very, very alert to this phenomenon, and very vigilant in addressing it in a really productive and constructive way.

Can Educators and Police Predict the Next School Shooter?

Wed, 02 Nov 2022

Every school shooting can be stopped — but educators and police must identify youth with an affinity for violence and spring to action before a single shot is fired.

That's the message that federal law enforcement officials touted Tuesday during a first-ever national school safety summit hosted by the Cybersecurity and Infrastructure Security Agency, a division of the Department of Homeland Security. While a demographic profile of school shooters doesn't exist, soon-to-be gunmen exhibit signs that can be identified prior to attacks — such as a fixation on violence or a history of depression.

Officials endorsed "threat assessment," an approach pioneered by the Secret Service that's become a common but controversial strategy in schools to predict future perpetrators and prevent targeted campus violence. The Secret Service is part of Homeland Security.

"We've seen the tragedies that have happened when that information, on behavior that objectively elicits concern, was not acted on," said Lina Alathari, chief of the Secret Service's National Threat Assessment Center. "But we also need to make sure we're setting a lower threshold for what we want to intervene with — such as being bullied, depression, suicidality — because we've also seen those in the background of these students that resorted to violence."

Lina Alathari

Following the mass school shooting in May at a Uvalde, Texas, elementary school, districts statewide are set to receive state funding for campus security, including for the creation of threat assessment teams.

Yet the deployment of such teams, which generally include school administrators, mental health officials and police officers, has civil rights groups on edge. Critics warn the approach could misidentify struggling students as future gunmen and unnecessarily push them into the juvenile justice system. While school shootings remain statistically rare, student behaviors that are factors in threat assessments — like alcohol use and a history of mental health issues — are exceedingly common.

Such concerns were largely downplayed at this week's summit, a three-day virtual event where law enforcement officials, educators and other experts gathered to offer recommendations in responding to a range of campus security risks, including mass school shootings, cyber attacks and online extremism.

Steven Driscoll, the threat assessment center's assistant chief, stressed that the approach is not "based on profiles or identifying types of students" but rather a focus on identifying threatening behaviors and intervening early.

"Schools need training not only on the behavioral threat assessment process best practices but also on things like implicit biases which have historically permeated a variety of school-based programs," Driscoll said.

In a letter to the Education Department last year, a coalition of 50 student civil rights groups warned that the adoption of threat assessment in schools is "likely pushing many children of color and children with disabilities out of school, into the school-to-prison pipeline."

"These 'threat assessments' are likely to target large numbers of children who aren't actual threats — including disproportionate numbers of children of color and children with disabilities — and cause them significant and lasting harm, while doing little or nothing to increase safety in schools," according to the letter, which was signed by groups including the National Center for Youth Law, the National Disability Rights Network and the Council of Parent Attorneys and Advocates. "In addition, they may refer children to services that do not exist."

Last year, a report analyzed 67 school violence plots that were thwarted between 2006 and 2018, finding that plotters in each case were met with criminal charges or arrests. Yet the "primary objective" of threat assessments is not to administer discipline, the report notes, but to "identify students in crisis or distress and provide robust interventions, before their behavior escalates to the point of criminality."

Amy Lowder, the director of student safety and well-being at a suburban Charlotte, North Carolina, school district, acknowledged during the summit that threat assessments conducted improperly can have detrimental effects on youth, including unnecessary student expulsions and juvenile justice referrals. That's why it's important, she said, for threat assessment teams to take "a whole-child approach in gathering the necessary information" about students causing concerns.

Meanwhile, Greg Johnson, a high school principal from West Liberty, Ohio, said that school leaders must balance students' civil rights against their need to ensure campuses are secure. Johnson was principal of West Liberty High School in 2017 when a student shot a classmate.

"You've got that balance because you want to support student rights and individual rights but you also want to keep people safe and that's a huge responsibility," Johnson said. "That's a huge responsibility to keep your students safe."

In an interview with 社区黑料 that opened the summit, Homeland Security Secretary Alejandro Mayorkas noted that there have been more than 250 school shootings this year, more than any other year on record.

Given the reality that school shooters often leak their plans to friends or online, summit panelists also endorsed a need to monitor students on the internet — a practice that has raised a separate set of civil rights and digital privacy concerns. That's why it's important for districts to employ experts in digital analyses, said Colton Easton, the project and training manager at Safer Schools Together, a Canada-based, for-profit company that offers threat-assessment training and a team of threat analysts to assist districts in investigations.

"Maybe a student made a threat involving a gun and we see that gun posted on TikTok, we would consider that behaviors consistent with the threat and law enforcement could obtain a search warrant and remove access to the means," Easton said. "Today, digital leakage is that golden ticket for school safety and threat assessment teams."

The School (in)Security Newsletter Spotlights Student Safety and Civil Rights

Thu, 22 Sep 2022

Following the tragic shooting in Uvalde, Texas, this spring, the back-to-school season has been particularly fraught for parents, whose fears for their children's safety on campus have surged.

That’s why we’ve launched the School (in)Security newsletter to highlight to heightened parental anxieties in starkly divergent ways: from for students to militarized campus police with collapsible rifles strapped to their chests.听

Mark Keierleber

Edited by Investigative Reporter Mark Keierleber, the newsletter is a twice-monthly hub for the most critical news and information about the rights, safety and well-being of students in K-12 schools nationally.

Keeping kids safe at school while safeguarding their individual rights is complex terrain. Keierleber, who has covered the school security industry, online student surveillance and student civil rights issues for years, will elevate the best journalism, research and advocacy devoted to exploring the tension. From digital surveillance to new anti-LGBTQ rules, this is a moment when student thought and expression is often on a collision course with adult interests in their schools and communities. We want to be at the center of that conversation.

Sign-up for the School (in)Security newsletter.

Get the most critical news and information about students' rights, safety and well-being delivered straight to your inbox.

By providing your email address, you consent to receiving newsletters from 社区黑料. We will not sell your information to third parties. Read our Privacy Policy.

Sign up here to subscribe. Have a tip? Click here to send an email to Mark.

With 'Don't Say Gay' Laws & Abortion Bans, Student Surveillance Raises New Risks

Thu, 08 Sep 2022

While growing up along the Gulf Coast in Mississippi, Kenyatta Thomas relied on the internet and other teenagers to learn about sex.

Thomas and their peers watched videos during high school gym class that stressed the importance of abstinence — and the horrors that can come from sex before marriage. But for Thomas, who is bisexual and nonbinary, the lessons didn't explain who they were as a person.

"It was very confusing trying to navigate understanding who I am and my identity," said Thomas, now a student at Arizona State University. It was on the internet that Thomas learned about a whole community of young people with similar experiences. Blog posts on Tumblr helped them make sense of their place in the world and what it meant to be bisexual. "I was able to find the words to understand who I am — words that I wouldn't be able to piece together in a sentence if the internet wasn't there."


But now, as states adopt anti-LGBTQ laws and abortion bans, the digital footprint that Thomas and other students leave may come back to harm them, privacy and civil rights advocates warn, and it could be their school-issued devices that end up exposing them to that legal peril.

For years, schools across the U.S. have used digital surveillance tools that collect a trove of information about youth sexuality — intimate details that are gleaned from students' conversations with friends, diary entries and search histories. Meanwhile, student information collected by surveillance companies is regularly shared with police, according to a recent survey conducted by the nonprofit Center for Democracy and Technology. These two realities concern Elizabeth Laird, the center's director of equity in civic technology. Following the Supreme Court's repeal of Roe v. Wade in June, she said, information about youth sexuality could be weaponized.

"Right now — without doing anything — schools may be getting alerts about students who are searching the internet for resources related to reproductive health," Laird said. "If you are in a state that has a law that criminalizes abortion, right now this tool could be used to enforce those laws."

Teens across the country are already working to fill the void for themselves and their peers in the current climate. Thomas, the ASU student and an outspoken reproductive justice activist, said that while students are generally aware that school devices and accounts are monitored, the repeal of Roe has led some to take extra privacy precautions.

Kenyatta Thomas, an Arizona State University student and activist, participates in an abortion-rights protest. (Photo courtesy Kenyatta Thomas)

“I have switched to using Signal to talk to friends and colleagues in this space,” they said, referring to the encrypted messaging app. “The fear — even though it’s been common knowledge for basically my generation’s entire life that everything you do is being surveilled — it definitely has been amplified tenfold.”

Police have long used social media and other online platforms to investigate people for breaking abortion rules, including a case in Nebraska where police obtained a teen’s private Facebook messages through a search warrant before charging the then-17-year-old and her mother with violating the state’s ban on abortions after 20 weeks of pregnancy.

LGBTQ students face similar risks as lawmakers in Florida and elsewhere impose rules that prohibit classroom discussions about sexuality and gender. This year alone, lawmakers have proposed 300 anti-LGBTQ bills and about a dozen have become law. They include the so-called “Don’t Say Gay” laws in Florida and Alabama that ban classroom discussions about gender and sexuality and require school officials to tell the parents of children who share that they may be gay or transgender.

In a survey, a fifth of LGBTQ students told the Center for Democracy and Technology that they or another student they knew had their sexual orientation or gender identity disclosed without their consent due to online student monitoring. They were more likely than straight and cisgender students to report getting into trouble for their web browsing activity and to be contacted by the police about having committed a crime. 

LGBTQ youth are nearly twice as likely as their straight and cisgender classmates to search for health information online, according to the nonprofit LGBT Tech. But as anti-LGBTQ laws proliferate, student surveillance companies should reconsider collecting data about youth sexuality, Christopher Wood, the group’s co-founder and executive director, told 社区黑料.

“Right now, we are not in a landscape or an environment where that is safe for a company to be doing,” Wood said. “If there is a remote possibility that the information that they are trying to provide to help a student could potentially lead them into more harm, then they need to be looking at that very carefully and considering whether that is the appropriate direction for a company to be taking.”

Digital student monitoring tools have a negative disparate impact on LGBTQ youth, according to a recent student survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

‘Extraordinarily concerned’

For decades, the federal Children’s Internet Protection Act has required school technology to block access to images that are obscene, child pornography or deemed “harmful to minors,” and schools have used web-filtering software to prevent students from accessing sexually explicit content. But in some cases, the filtering has been used to block pro-LGBTQ websites that aren’t explicit, including those that offer crisis counseling.

Many student monitoring tools, which saw significant growth during the pandemic, go far beyond web filtering and employ artificial intelligence to track students across the web to identify issues like depression and violent impulses. The tools can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs.

They’ve also come under heightened scrutiny. In a report this year, Democratic Sens. Elizabeth Warren and Ed Markey warned that schools’ widespread adoption of the tools could trample students’ civil rights. By flagging words related to sexual orientation, the report notes, LGBTQ youth could be subjected to disproportionate rates of discipline and be unintentionally outed to their parents.

In a letter in July, Warren and Markey cautioned that the tools could pose new risks following the repeal of Roe and asked four leading student surveillance companies — GoGuardian, Gaggle, Securly and Bark — whether they flag students for using keywords related to reproductive health, such as “pregnant” and “abortion.”

“We are extraordinarily concerned that your software could result in punishment or criminalization of students seeking contraception, abortion or other reproductive health care,” Markey and Warren wrote. “With reproductive rights under attack nationwide, it would represent a betrayal of your company’s mission to support students if you fail to provide appropriate protections for students’ privacy related to reproductive health information.”

Student activity monitoring tools are more often used to discipline students than protect them from violence and mental health crises, according to a recent teacher survey by the nonprofit Center for Democracy and Technology. (Photo courtesy Center for Democracy and Technology)

The scrutiny is part of a larger concern over digital privacy in the post-Roe world. In August, the Federal Trade Commission sued a data broker, accusing the company of selling location data from hundreds of millions of cell phones that could be used to track people’s movements. Such precise location data, the agency said, “may be used to track consumers to sensitive locations, including places of religious worship, places that may be used to infer an LGBTQ+ identification, domestic abuse shelters, medical facilities and welfare and homeless shelters.”

School surveillance companies have acknowledged their tools track student references to sex but have sought to downplay the risks they pose to students. Bark spokesperson Adina Kalish said the company began immediately purging all data related to reproductive health after a leaked Supreme Court draft opinion suggested Roe’s repeal was imminent, even as it maintains a 30-day retention period for most other data.

“By immediately and permanently deleting data which contains a student’s reproductive health data or searches for reproductive health information, such data is not in our possession and therefore not produce-able under a court order, subpoena, etc.,” Bark CEO Brian Bason wrote in a letter, which the company shared with 社区黑料.

GoGuardian spokesperson Jeff Gordon said its tools “cannot be used by educators or schools to flag reproductive health-related search terms” and its web filter cannot “flag reproductive health-related searches.” Securly didn’t respond to requests for comment. Last year its web-filtering tool categorized health resources for LGBTQ teens as pornography.

Gaggle founder and CEO Jeff Patterson wrote to the senators that his company does not “collect health data of any kind including reproductive health information,” specifying that the monitoring tool does not flag students who use the terms “pregnant, abortion, birth control, contraception or Planned Parenthood.”

Yet tracking conversations about sex is a primary part of Gaggle’s business — such content triggers more flags than references to suicide, violence or drug use, according to nearly 1,300 incident reports the company generated for Minneapolis Public Schools during a six-month period in 2020. The reports, obtained by 社区黑料, showed that 38% were prompted by content that was pornographic or sexual in nature, including references to “sexual activity involving a student.” Students were regularly flagged for using keywords like “virginity,” “rape,” and, simply, “sex.”

Patterson, the Gaggle CEO, has acknowledged that a student’s private diary entry about being raped wasn’t off limits. In touting the tool’s capabilities, he told 社区黑料 his company uncovered the girl’s diary entry, where she discussed how the assault led to self-esteem issues and guilt. Nobody knew she was struggling until Gaggle notified school officials about what they’d learned from her diary, Patterson said.

“They were able to intervene and get this girl help for things that she couldn’t have dealt with on her own,” Patterson said.

Any information that surveillance companies collect about students’ sexual behaviors could be used against them by police during investigations, privacy experts warned. And it’s unclear, Laird said, how long the police can retain any data gleaned from the tools.

‘Don’t Say Gay’

Internet search engines are “particularly potent” tools to track the behaviors of pregnant people, according to a report by the nonprofit Surveillance Technology Oversight Project. In 2017, for example, a woman was charged with second-degree murder of her stillborn fetus after police scoured her browser history and identified a search for an abortion pill.

While GoGuardian and other companies offer web filtering to schools, Gaggle has sought to differentiate itself. In his letter to the senators, Patterson said the company — which sifts through files and chat messages on students’ school-issued Microsoft and Google accounts — is not a web filter and therefore “does not track students’ online searches.” Yet Patterson’s assurance to lawmakers appears misleading. The company acknowledges on its website that it partners with several web-filtering companies, including Linewize, to analyze students’ online searches. By working in tandem, flags triggered by Linewize’s web filtering “can be sent straight to the Gaggle Safety Team,” which determines whether the material “should be forwarded to the school or district.”

In an email, Gaggle spokesperson Paget Hetherington said that in “a very small number of school systems,” the company reviews alerts from web filters before they’re sent to school officials to “alleviate the large number of false positives” and ensure that “only the most critical and imminent issues are being seen by the district.”

Gaggle has also faced scrutiny for including LGBTQ-specific keywords in its algorithm, including “gay” and “lesbian.” Patterson said the heightened surveillance of LGBTQ youth is necessary because they face a disproportionately high suicide rate, and Hetherington shared examples where the keywords were used to spot cyberbullying incidents.

But critics have accused the company of discrimination. Wood of the nonprofit LGBT Tech said that anti-LGBT activists have used surveillance to target their opponents for generations. Prior to the seminal 1969 riots after New York City police raided the Stonewall Inn gay bar, officers routinely surveilled LGBTQ spaces and made arrests for “inferring sexual perversion” and “serving gay people.” From the colonial era and into the 19th century, anti-sodomy laws carried the death penalty and police used the rules to investigate and incarcerate people suspected of same-sex intimate behaviors.

Now, in the era of “Don’t Say Gay” laws, digital surveillance tools could be used to out LGBTQ students and put them in danger, Wood said. Student surveillance companies can claim their decision to include LGBTQ terminology is designed to help students, but historically such data have “been used against us in very detrimental ways.”

Companies, he said, are unable to control how officials use that information in an era “where teachers and administrators and other students are encouraged to out other students or blame them or somehow get them in trouble for their identity.” In Texas, Republican Gov. Greg Abbott issued a directive calling on child protective services to investigate as child abuse any parents who provide gender-affirming health care to their transgender children.

“They can’t control what’s going to happen in Florida or Texas and they can’t control what’s going to happen in an individual home,” where students could be subjected to abuse, Wood said. “Any person in their right mind would be horrified to learn that it was their technology that ended up harming a youth or driving a youth to the point of feeling so isolated that they felt the only way out was suicide.”

When private thoughts become public

Susan, a 14-year-old from Cincinnati, knows firsthand how surveillance companies can target students for discussing their sexuality. In middle school, she was assigned to write a “time capsule” letter to her future self.

Her teacher said that no one — not even him — would read the letter until Susan retrieved it after high school graduation. So Susan, who is now a freshman and asked to remain anonymous, used the private space to question her gender identity.

But her teacher’s assurance wasn’t quite true, she learned. Someone had been reading the letter — and would soon hold it against her.

In an automated May 2021 email, Gaggle notified her that the letter to her future self was “identified as inappropriate” and urged her to “refrain from storing or sharing inappropriate content.” In a “second warning” sent to her inbox, she was told a school administrator was given “access to this violation.” After a third alert, she said, access to her school email account was restricted. She said the experience left her with “a sense of betrayal from my school.” She said she had no idea words like “gay” or “sex” could get flagged by Gaggle’s algorithm.

Susan, a student from Cincinnati, received an email alert from Gaggle notifying her that her classroom assignment, a “time capsule” letter to her future self, had been “identified as inappropriate.” (Courtesy Susan)

“It’s frustrating to know that this program finds the need to have these as keywords, and quite depressing,” she said. “There’s always going to be oppression against the community somewhere, it seems, and it’s quite disheartening.”

School administrators reviewed the time capsule letter and determined it didn’t contain anything inappropriate, her mother Margaret said. While Susan lives in an LGBTQ-affirming household, Thomas, who grew up in Mississippi, warned that’s not the case for everyone.

“That’s not just the surveillance of your activities, that’s the surveillance of your thoughts,” Thomas said of Susan’s experience. “I know that wouldn’t have gone very well for me and I know for a lot of young people that would place them in a lot of danger.”

Such harms could be exacerbated, Margaret said, if authorities use student data to enforce Ohio’s strict abortion ban, which has already become the subject of national debate after a 10-year-old girl traveled to Indiana for an abortion. A 27-year-old man was arrested and accused of raping the child.

Cincinnati Public Schools spokesman Mark Sherwood said in an email that “law enforcement is immediately contacted” if the district receives an alert from Gaggle suggesting that a student poses “an imminent threat of harm to self or others.”

Given the state of abortion rules in Ohio, Susan said she’s concerned that student conversations and classroom assignments that discuss gender and sexuality could wind up in the hands of the police. She lost faith in school-issued technology after her assignment got flagged by Gaggle.

“I just flat out don’t trust adults in positions of power or authority,” Susan said. “You don’t really know for sure what their true motives are or what they could be doing with the tools they have at their disposal.”

Survey Reveals Extent that Cops Surveil Students Online — in School and at Home (Aug. 3, 2022)

When Baltimore students sign into their school-issued laptops, the police log on, too.

Since the pandemic began, Baltimore City Public Schools officials have contracted with GoGuardian, a digital surveillance tool that promises to identify youth at risk of harming themselves or others. When GoGuardian flags students, their online activities are shared automatically with school police, giving cops a conduit into kids’ private lives — including on nights and weekends.




Such partnerships between schools and police appear startlingly widespread across the country, with significant implications for youth, according to a new survey by the nonprofit Center for Democracy and Technology. Nearly all teachers — 89% — reported that digital student monitoring tools like GoGuardian are used in their schools. And nearly half — 44% — said students have been contacted by the police as a result of student monitoring.

The pandemic has led to major growth in the number of schools that rely on activity monitoring software to uncover student references to depression and violent impulses. The tools, offered by a handful of tech companies, can sift through students’ social media posts, follow their digital movements in real time and scan files on school-issued laptops — from classroom assignments to journal entries — in search of warning signs.

Educators say the tools help them identify youth who are struggling and get them the mental health care they need at a time when youth depression and anxiety are spiraling. But the survey suggests an alternate reality: Instead of getting help, many students are being punished for breaking school rules. And in some cases, survey results suggest, students are being subjected to discrimination. 

The report raises serious questions about whether digital surveillance tools are the best way to identify youth in need of mental health care and whether police officers should be on the front lines in responding to such emergencies. 

“If we’re saying this is to keep students safe, but instead we’re using it punitively and we’re using it to invite law enforcement literally into kids’ homes, is this actually achieving its intended goal?” asked Elizabeth Laird, a survey author and the center’s director of equity in civic technology. “Or are we, in the name of keeping students safe, actually endangering them?”

Among teachers who use monitoring tools at their schools, 78% said the software has been used to flag students for discipline and 59% said kids wound up getting punished as a result. Yet just 45% of teachers said the software is used to identify violent threats and 47% said it is used to identify students at risk of harming themselves. 

(Chart: Center for Democracy and Technology)

The findings are a direct contradiction of the stated goal of student activity monitoring, Laird said. School leaders and company executives have long maintained that the tools are not a disciplinary measure but are designed to identify at-risk students before someone gets hurt.

The Supreme Court’s recent repeal of Roe v. Wade, she said, further muddles police officers’ role in student activity monitoring. As states implement anti-abortion laws, data from student activity monitoring tools could help the police identify youth seeking reproductive health care.

“We know that law enforcement gets these alerts,” she said. “If you are in a state where they are looking to investigate these kinds of incidents, you’ve invited them into a student’s house to be able to do that.”

A tale of discrimination

In Baltimore, counselors, principals and school-based police officers receive all alerts generated by GoGuardian during school hours, according to reporting by The Real News Network, a nonprofit media outlet. Outside of school hours, including on weekends and holidays, the responsibility to monitor alerts falls on the police, the outlet reported, and on numerous occasions officers have shown up at students’ homes to conduct wellness checks. In some cases, students have been transported to the hospital for emergency mental health care.

In a statement to 社区黑料, district spokesperson Andre Riley said that GoGuardian helps officials “identify potential risks to the safety of individual students, groups or schools,” and that “proper accountability measures are taken” if students violate the code of conduct or break laws.

“The use of GoGuardian is not simply a prompt for a law enforcement response,” Riley added.

Leading student surveillance companies, including GoGuardian, have maintained that their interactions with police are limited. In April, Democratic Sens. Elizabeth Warren and Ed Markey warned in a report that schools’ reliance on the tools could violate students’ civil rights and exacerbate “the school-to-prison pipeline by increasing law enforcement interactions with students.” Warren and Markey focused their report on four companies: GoGuardian, Gaggle, Securly and Bark.

In its response to the senators, Gaggle executives said the company contacts law enforcement for wellness checks if they are unable to reach school-based emergency contacts and a child appears to be “in immediate danger.” In materials on the company’s website, school officials in Wichita Falls, Texas, Cincinnati, Ohio, and Miami, Florida, acknowledged contacting police in response to Gaggle alerts.

In some cases, school leaders ask Securly to contact the police directly and request they conduct welfare checks on students, the company wrote to lawmakers. Executives at Bark said “there are limited options” beyond police intervention if they identify a student in crisis but cannot reach a school administrator.

“While we have witnessed many lives saved by police in these situations, unfortunately many officers have not received training in how to handle such crises,” the company wrote in its letter. “Irrespective of training there is always a risk that a visit from law enforcement can create other negative outcomes for a student and their family.”

In its privacy policy, GoGuardian states the company may disclose student information “if we believe in good faith that doing so is necessary or appropriate to comply with any law enforcement, legal or regulatory process.”

(Chart: Center for Democracy and Technology)

Meanwhile, survey results suggest that student surveillance tools have a negative disparate impact on Black and Hispanic students, LGBTQ youth and those from low-income households. In a letter on Wednesday to coincide with the survey’s release, a coalition of education and civil rights groups called on the U.S. Department of Education to issue guidance warning schools that their digital surveillance practices could violate federal civil rights laws. Signatories include the American Library Association, the Data Quality Campaign and the American Civil Liberties Union.

“This is becoming a conversation not just about privacy, but about discrimination,” Laird said. “Without a doubt, we see certain groups of students having outsized experiences in being directly targeted.”

In a youth survey, researchers found that student discipline as a result of activity monitoring fell disproportionately along racial lines, with 48% of Black students and 55% of Hispanic students reporting that they or someone they knew got into trouble for something that was flagged by an activity monitoring tool. Just 41% of white students reported having similar experiences. 

Nearly a third of LGBTQ students said they or someone they know experienced nonconsensual disclosure of their sexual orientation or gender identity — often called outing — as a result of activity monitoring. LGBTQ youth were also more likely than straight and cisgender students to report getting into trouble at school and being contacted by the police about having committed a crime.

Some student surveillance companies, like Gaggle, monitor references to words including “gay” and “lesbian,” a practice founder and CEO Jeff Patterson has said was created to protect LGBTQ youth, who face a greater risk of dying by suicide. But survey results suggest the heightened surveillance comes with significant harm to youth, and Laird said that if monitoring tools are designed with certain students in mind, such as LGBTQ youth, that in itself is a form of discrimination.

(Chart: Center for Democracy and Technology)

In its letter to the Education Department’s Office for Civil Rights Wednesday, advocates said the disparities outlined in the survey run counter to federal laws prohibiting race-, sex- and disability-based discrimination.

“Student activity monitoring is subjecting protected classes of students to increased discipline and interactions with law enforcement, invading their privacy, and creating hostile environments for students to express their true thoughts and authentic identities,” the letter states.

The Education Department’s civil rights division, they said, should condemn surveillance practices that violate students’ civil rights and launch “enforcement action against violations that result in discrimination.”

Lawmakers consider youth privacy

The report comes at a moment of increasing alarm about student privacy online. In May, the Federal Trade Commission announced plans to crack down on tech companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn.”

It also comes at a time of intense concern over students’ emotional and physical well-being. While the pandemic has led to a greater focus on youth mental health, the May mass school shooting in Uvalde, Texas, has sparked renewed school safety efforts. In June, President Joe Biden signed a law with modest new gun-control provisions and an influx of federal funding for student mental health care and campus security. The funds could lead to more digital student surveillance.

The results of the online survey, which was conducted in May and June, were likely colored by the Uvalde tragedy, researchers acknowledged. A majority of parents and students have a favorable view of student activity monitoring during school hours to protect kids from harming themselves or others, researchers found. But just 48% of parents and 30% of students support around-the-clock surveillance. 

“Schools are under a lot of pressure to find ways to keep students safe and, like in many aspects of our lives, they are considering the role of technology,” Laird said.

Last week, the Senate advanced legislation designed to improve children’s safety online, including new restrictions on youth-focused targeted advertising. The effort comes a year after leaked internal research showed that the social media app Instagram had a harmful effect on youth mental well-being, especially teenage girls. One bill, the Kids Online Safety Act, would require tech companies to identify and mitigate any potential harms their products may pose to children, including exposure to content that promotes self-harm, eating disorders and substance abuse.

Yet the legislation has faced criticism from privacy advocates, who argue it would mandate digital monitoring similar to that offered by student surveillance companies. Among critics is the Electronic Frontier Foundation, a nonprofit focused on digital privacy and free speech. 

“The answer to our lack of privacy isn’t more tracking,” the foundation wrote. The legislation “is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is ‘not in their best interest,’ as defined by the government, and interpreted by tech platforms.”

Attorney Amelia Vance, the founder and president of Public Interest Privacy Consulting, said she worries the provisions will have a negative impact on at-risk kids, including LGBTQ students. Students from marginalized groups, she said, “will now be more heavily surveilled by basically every site on the internet, and that information will be available to parents” who could discipline teens for researching LGBTQ content. She said the legislation could force tech companies to censor content to avoid potential liability, essentially making them arbiters of community standards.

“When you have conflicting values in the different jurisdictions that the companies operate in, oftentimes you end up with the most conservative interpretations, which right now is anti-LGBT,” she said.

FTC Targets Ed Tech Companies that ‘Illegally Surveil Children’ (May 20, 2022)

The Federal Trade Commission announced ramped-up enforcement of education technology companies that sell student data for targeted advertising and that “illegally surveil children when they go online to learn,” in violation of federal student privacy rules.

“It is against the law for companies to force parents and schools to surrender their children’s privacy rights in order to do schoolwork online or attend class remotely,” the federal agency said in a media release Thursday. “Under the federal Children’s Online Privacy Protection Act (COPPA), companies cannot deny children access to educational technologies when their parents or school refuse to sign up for commercial surveillance.”




Through a policy statement, the commission signaled its intent to “scrutinize compliance” with COPPA, the federal law that limits the data that technology companies can collect on children under 13 without parental consent. The statement, approved through a unanimous bipartisan vote by the five commissioners, reminds education technology companies that they are prohibited from using student data for commercial purposes, including for marketing and advertising, should not retain student data for a period longer than what’s deemed “reasonably necessary,” and must have sufficient security to ensure data remain confidential. Additionally, tech companies must not exclude students who do not disclose more personal information “than is reasonably necessary for the child to participate in that activity.”

The policy statement comes at a critical moment for education technology companies. When the pandemic shuttered schools nationally and forced children into remote learning, their place in the education landscape grew exponentially as educators relied more heavily on their services. But they’ve also faced scrutiny for their data collection practices, particularly in the wake of high-profile breaches. Districts recently notified students that their personal data was compromised in a breach at the company Illuminate Education. The hack exposed the personal information of students in New York City, the nation’s largest school district.

The FTC statement does not introduce any new rules, yet it makes clear that education technology and student privacy are an enforcement priority. Weak enforcement of student privacy rules has been a longstanding problem, said Cody Venzke, senior counsel at the nonprofit Center for Democracy and Technology.

Suggesting that the federal government had gone too easy on ed tech companies in the past, President Joe Biden criticized student surveillance practices on Thursday and signaled his support for greater student privacy protections. 

“When children and parents access online educational products, they shouldn’t be forced to accept tracking and surveillance to do so,” Biden said in a statement. The FTC, he said, “will be cracking down on companies that persist in exploiting our children to make money.”

Among the services and applications that saw significant growth during the pandemic are those that monitor students’ online activities on school-issued devices and technology. Company executives say their digital products are critical to identify youth who are at risk of harming themselves or others, but critics argue the surveillance violates students’ privacy rights.

社区黑料 has reported extensively on the expanding presence of such student surveillance companies, including Gaggle, which sifts through billions of student communications on school-issued Google and Microsoft accounts each year in search of references to violence and self-harm. Company executives say the tools save lives, but critics argue they could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

In one recent story, former content moderators on the front lines of Gaggle’s student monitoring efforts raised significant questions about the company’s efficacy and its effects on students’ civil rights. The former moderators reported insufficient safeguards to protect students’ sensitive data, a work culture that prioritized speed over quality, limited training and frequent exposure to explicit content that left some traumatized.

In a statement, FTC Chair Lina Khan said that “commercial surveillance cannot be a condition of doing schoolwork.”

“Though widespread tracking, surveillance and expansive use of data across contexts have become increasingly common practices across the broader economy,” Khan said, the policy makes clear that federal law “forbids companies from wholesale extending these practices into the context of schools and learning.”

The FTC's comments on surveillance, Venzke said in an email, suggest that the agency will scrutinize the practices of education technology vendors that collect "troves of sensitive information about students' lives, including student activity monitoring software vendors."

"Student activity monitoring companies must ensure they are taking appropriate steps to not only secure the sensitive data they collect on students, but also to ensure that they are collecting only the absolute minimum data that they need to achieve a legitimate educational purpose, and then that they delete the data when it is no longer needed," Venzke said.

A Gaggle spokesperson didn't immediately respond to a request for comment. In a blog post on Thursday, the company noted that it takes "data security very seriously," only uses student information for educational purposes, has a strict data retention policy and has comprehensive security standards. The post said the company does not sell student data or engage in targeted advertising.

Numerous companies have faced fines in recent years for violating the federal privacy law. In 2019, for example, YouTube paid to settle allegations it collected children's data without parental consent and used it for targeted advertising. Another company paid that same year to settle similar allegations.


Despite the commission's harsh critique of surveillance, the enforcement of student privacy rules will likely go beyond companies that monitor students online, said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. She interpreted the FTC announcement to broadly encompass "surveillance capitalism," where personal data are collected and sold for profit. However, she noted that Gaggle and other monitoring companies could have particular problems. In its announcement, the FTC said it is unreasonable for education technology companies to retain student data "for speculative future potential purposes."

"So much of the monitoring information collected and kept, especially when it comes to tracking the mental health of students, it could easily, arguably be speculative," she said. "That could cause confusion from companies about what obligations they have to either collect certain data or not collect certain data or not retain certain data even when the school has asked for it."

The FTC announcement follows a recent investigation into student monitoring companies by Democratic Sens. Elizabeth Warren and Ed Markey, which warned of surveillance companies' potential harms and called on the Federal Communications Commission to clarify the provisions of another federal law, the Children's Internet Protection Act, which requires schools to monitor students' online activities.

In response to the FTC statement, a bipartisan group of senators cautioned that threats to online privacy have reached "a crisis point."

"We applaud the FTC's attention to this urgent problem and its acknowledgment that a child's education should never come at the expense of their privacy," said a statement released by Markey, fellow Democratic Sen. Richard Blumenthal and Republican Sens. Bill Cassidy and Cynthia Lummis. "The FTC's policy statement is an important step in the right direction, but it is not a replacement for legislative action."

Schools Bought Security Cameras to Fight COVID. Did it Work? /article/from-face-mask-detection-to-temperature-checks-districts-bought-ai-surveillance-cameras-to-fight-covid-why-critics-call-them-smoke-and-mirrors/ Wed, 30 Mar 2022 11:01:00 +0000 /?post_type=article&p=587174 This story is part of a series produced in partnership with The Guardian exploring the increasing role of artificial intelligence and surveillance in our everyday lives during the pandemic, including in schools.

When students in suburban Atlanta returned to school for in-person classes amid the pandemic, they were required to cover their faces with cloth masks, as in many places across the U.S. Yet in this 95,000-student district, officials took mask compliance a step further than most.

Through a network of security cameras, officials harnessed artificial intelligence to identify students whose masks drooped below their noses. 




"If they say a picture is worth a thousand words, if I send you a piece of video, it's probably worth a million," said Paul Hildreth, the district's emergency operations coordinator. "You really can't deny, 'Oh yeah, that's me, I took my mask off.'"

The school district in Fulton County had installed the surveillance network, made by Avigilon, years before the pandemic shuttered schools nationwide in 2020. Under a constant fear of mass school shootings, districts in recent years have increasingly deployed controversial surveillance networks like cameras with facial recognition and gun detection.

With the pandemic, security vendors switched directions and began marketing their wares as a solution to stop the latest threat. In Fulton County, the district used Avigilon's "No Face Mask Detection" technology to identify students with their faces exposed.

During remote learning, the pandemic ushered in a new era of digital student surveillance as schools turned to AI-powered services like remote proctoring and activity monitoring in search of threats and mental health warning signs. Back on campus, districts have rolled out tools like badges that track students' every move.

But one of the most significant developments has been in AI-enabled cameras. Twenty years ago, security cameras were present in 19 percent of schools, according to federal data. Today, they are in the vast majority. Powering those cameras with artificial intelligence makes automated surveillance possible, enabling things like temperature checks and the collection of other biometric data.

Districts across the country have said they've bought AI-powered cameras to fight the pandemic. But as pandemic-era protocols like mask mandates end, experts said the technology will remain. Some educators have stated plans to leverage pandemic-era surveillance tech for student discipline, while others hope AI cameras will help them identify youth carrying guns.

The cameras have faced sharp resistance from civil rights advocates who question their effectiveness and argue they trample students' privacy rights.

Noa Young, a 16-year-old junior in Fulton County, said she knew that cameras monitored her school but wasn't aware of their high-tech features like mask detection. She agreed with the district's now-expired mask mandate but felt that educators should have been more transparent about the technology in place.

"I think it's helpful for COVID stuff but it seems a little intrusive," Young said in an interview. "I think it's strange that we were not aware of that."

'Smoke and mirrors'

Outside of Fulton County, educators have used AI cameras to fight COVID on multiple fronts. 

In Rockland, Maine's Regional School Unit 13, officials used federal pandemic relief money to procure a network of cameras for contact tracing. Through advanced surveillance, the cameras, made by Verkada, allow the 1,600-student district to identify students who came in close contact with classmates who tested positive for COVID-19. In its marketing materials, Verkada explains how districts could use federal funds tied to the public health crisis to buy its cameras for contact tracing and crowd control.

At a district in suburban Houston, officials spent nearly $75,000 on AI-enabled cameras from Hikvision, a surveillance company owned in part by the Chinese government, and deployed thermal imaging and facial detection to identify students with elevated temperatures and those without masks.

The cameras can screen as many as 30 people at a time and are therefore "less intrusive" than slower processes, said Ty Morrow, the Brazosport Independent School District's head of security. The checkpoints have helped the district identify students who later tested positive for COVID-19, Morrow said, although critics have argued Hikvision's claim of accurately scanning 30 people at once is not possible.

"That was just one more tool that we had in the toolbox to show parents that we were doing our due diligence to make sure that we weren't allowing kids or staff with COVID into the facilities," he said.

Yet it's this mentality that worries consultant Kenneth Trump, the president of Cleveland-based National School Safety and Security Services. Security hardware for the sake of public perception, the industry expert said, is simply "smoke and mirrors."

"It's creating a façade," he said. "Parents think that all the bells and whistles are going to keep their kids safer and that's not necessarily the case. With cameras, in the vast majority of schools, nobody is monitoring them."

'You don't have to like something'

When the Fulton County district upgraded its surveillance camera network in 2018, officials were wooed by Avigilon's AI-powered "Appearance Search," which allows security officials to sift through a mountain of video footage and identify students based on characteristics like their hairstyle or the color of their shirt. When the pandemic hit, the company's mask detection became an attractive add-on, Hildreth said.

He said the district didn't actively advertise the technology to students, but word likely spread quickly once students got called out for breaking the rules. He doesn't know students' opinions about the cameras, and didn't seem to care.

"I wasn't probably as much interested in their reaction as much as their compliance," Hildreth said. "You don't have to like something that's good for you, but you still need to do it."

A Fulton County district spokesman said the district wasn't aware of any instances where students were disciplined because the cameras caught them without masks.

After the 2018 mass school shooting in Parkland, Florida, Athena pitched its cameras with AI-powered "gun detection" as a promising school safety strategy. Similar to facial recognition, the gun detection system uses artificial intelligence to spot when a weapon enters a camera's field of view. By identifying people with guns before shots are fired, the service is "like Minority Report but in real life," a company spokesperson wrote in an email at the time, referring to the science fiction film that predicts a dystopian future of mass surveillance. During the pandemic, the company rolled out thermal cameras that the spokesperson said could "accurately pre-screen 2,000 people per hour."

The spokesperson declined an interview request but said in an email that Athena is "not a surveillance company" and did not want to be portrayed as "spying on" students.

Among the school security industry's staunchest critics is Sneha Revanur, a 17-year-old high school student from San Jose, California, who founded the youth advocacy group Encode Justice to highlight the dangers artificial intelligence poses to civil liberties.

Revanur said she's concerned by districts' decisions to implement surveillance cameras as a public health strategy and that the technology in schools could result in harsher discipline for students, particularly youth of color.



Verkada offers a cautionary tale about the potential harms of pervasive school surveillance and student data collection. Last year, a hack exposed the live feeds of 150,000 surveillance cameras, including those inside Tesla factories, jails and at Sandy Hook Elementary School in Newtown, Connecticut. The Newtown district, which suffered a mass school shooting in 2012, said hackers may have accessed compromising information about students. The breach deterred some educators from contracting with the California-based company.

After a back-and-forth, the Verkada spokesperson would not grant an interview or respond to a list of written questions.

Revanur called the Verkada hack at Sandy Hook Elementary a "staggering indictment" of educators' rush for "dragnet surveillance systems that treat everyone as a constant suspect" at the expense of student privacy. Constant monitoring, she argued, "creates this culture of fear and paranoia that truly isn't the most proactive response to gun violence and safety concerns."

In Fayette County, Georgia, the district spent about $500,000 to purchase 70 Hikvision cameras with thermal imaging to detect students with fevers. But it has since disabled them over concerns about their efficacy and Hikvision's ties to the Chinese government. In 2019, the U.S. government blacklisted the company, alleging it was implicated in China's "campaign of repression, mass arbitrary detention and high-technology surveillance" against Muslim ethnic minorities.

The school district declined to comment. In a statement, a Hikvision spokesperson said the company "takes all reports regarding human rights very seriously" and has engaged governments globally "to clarify misunderstandings about the company." The company is "committed to upholding the right to privacy," the spokesperson said.

Meanwhile, Regional School Unit 13's decision to use Verkada security cameras as a contact tracing tool could run afoul of limits on facial recognition in Maine schools. The district didn't respond to requests for comment.

Michael Kebede, the ACLU of Maine's policy counsel, cited recent studies on facial recognition's flaws and called on the district to reconsider its approach.

"We fundamentally disagree that using a tool of mass surveillance is a way to promote the health and safety of students," Kebede said in a statement. "It is a civil liberties nightmare for everyone, and it perpetuates the surveillance of already marginalized communities."

Security officials at the Brazosport Independent School District in suburban Houston use AI-enabled security cameras to screen educators for elevated temperatures. District leaders mounted the cameras to carts so they could be used in various locations across campus. (Courtesy Ty Morrow)

White faces

In Fulton County, school officials wound up disabling the face mask detection feature in cafeterias because it was triggered by people eating lunch. Other times, it identified students who pulled their masks down briefly to take a drink of water. 

In suburban Houston, Morrow ran into similar hurdles. When white students wore light-colored masks, for example, the face detection sounded alarms. And if students rode bikes to school, the cameras flagged their elevated temperatures. 

"We've got some false positives but it was not a failure of the technology," Hildreth said. "We just had to take a look and adapt what we were looking at to match our needs."

With those lessons learned, Hildreth said he hopes to soon equip Fulton County campuses with AI-enabled cameras that identify students who bring guns to school. He sees a future where algorithms identify armed students "in the same exact manner" as Avigilon's mask detection.

In a post-pandemic world, Albert Fox Cahn, founder of the nonprofit Surveillance Technology Oversight Project, worries the entire school security industry will take a similar approach. In February, educators in Waterbury, Connecticut, announced a new network of campus surveillance cameras with weapons detection.

"With the pandemic hopefully waning, we'll see a lot of security vendors pivoting back to school shooting rhetoric as justification for the camera systems," he said. Due to the potential for errors, Cahn called the embrace of AI gun detection "really alarming."

Disclosure: This story was produced in partnership with The Guardian. It is part of a reporting series supported by a philanthropic organization that works to build vibrant and inclusive democracies whose governments are accountable to their citizens. All content is editorially independent and overseen by Guardian and 74 editors.

Opinion: School Surveillance of Students Via Laptops May Do More Harm Than Good /article/school-surveillance-of-students-via-laptops-may-do-more-harm-than-good/ Wed, 19 Jan 2022 14:01:00 +0000 /?post_type=article&p=583295 Ever since the start of the pandemic, more and more public school students have been using laptops, tablets or similar devices issued by their schools.

The percentage of teachers who reported their schools had provided their students with such devices doubled from 43% before the pandemic, a September 2021 report shows.




In one sense, it might be tempting to celebrate how schools are doing more to keep their students digitally connected during the pandemic. The problem is, schools are not just providing kids with computers to keep up with their schoolwork. Instead, in a trend that could easily be described as Orwellian, the vast majority of schools are also using those devices to keep tabs on what students are doing in their personal lives.

Indeed, a large share of teachers surveyed reported that their schools had installed artificial intelligence-based surveillance software on these devices to monitor students' online activities and what is stored on them.

This student surveillance is taking place, at taxpayer expense, in cities and school communities throughout the United States.

For instance, in the Minneapolis school district, school officials paid over $355,000 to use tools provided by Gaggle until 2023. Three-quarters of incidents reported, that is, cases where the system flagged students' online activity, took place outside school hours.

In Baltimore, where the public school system uses a surveillance app, police officers are alerted when the system detects students typing keywords related to self-harm.

Safety versus privacy

Vendors claim these tools keep students safe from self-harm or online activities that could lead to trouble. However, researchers and journalists have raised questions about those claims.

Vendors often do not disclose how their algorithms work or the type of data used to train them.

Privacy advocates fear these tools may harm students by chilling their expression and exposing them to unnecessary discipline.

As a researcher who studies security and privacy issues in various contexts, I know that intrusive surveillance techniques cause psychological harm to students.

Artificial intelligence not intelligent enough

Even the most advanced artificial intelligence lacks the ability to fully understand human language and context. This is why student surveillance systems pick up a lot of false positives instead of real problems.

In some cases, these surveillance programs have flagged students discussing music deemed suspicious and even students reading "To Kill a Mockingbird."

Harm to students

When students know they are being monitored, they are less likely to share true thoughts online and are more careful about what they search. This can discourage vulnerable groups, such as students with mental health issues, from getting needed services.

When students know that their every move, and everything they read and write, is watched, they are also less willing to express themselves freely. In general, surveillance has a negative impact on students' development. It also deprives them of the skills and mindset needed to exercise their rights.

More adverse impact on minorities

U.S. schools disproportionately discipline minority students. African American students' chances of being suspended are about three times higher than those of their white peers.

After evaluating flagged content, moderators notify school officials, who take disciplinary actions on a case-by-case basis. Racial disparities in schools' use of these tools could lead to further harm for minority students.

The situation is worsened by the fact that Black and Hispanic students rely more heavily on school-issued devices. This in turn makes minority students more likely to be monitored and exposes them to greater risk of some sort of intervention.

When both minority students and their white peers are monitored, the former group is more likely to be penalized because the training data used in developing artificial intelligence programs often reflects racial bias. Artificial intelligence programs also misinterpret the language varieties spoken by minority groups. This is due to biases in the datasets used to train such programs.

Leading AI models are more likely to flag content written by African Americans than that written by others. They are 2.2 times more likely to flag tweets written in African American slang.

These tools also affect sexual and gender minorities more adversely. Gaggle has reportedly flagged words such as "gay" and "lesbian" because they are associated with pornography, even though the terms are often used to describe one's identity.

Increased security risk

These surveillance systems also increase students' cybersecurity risks. First, to comprehensively monitor students' activities, surveillance vendors compel students to install a set of certificates known as root certificates. As the highest-level security certificate installed in a device, a root certificate determines what the entire system trusts. One drawback is that these certificates can open the door to interception of otherwise encrypted traffic.
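The power of a root certificate can be illustrated with a toy model of certificate trust. This is a pure-Python sketch, not real X.509: issuer names are invented and HMAC digests stand in for cryptographic signatures, but the logic mirrors how verification works: a certificate is accepted only if it chains to a root in the device's trust store, so whoever controls an installed root can vouch for any name.

```python
# Toy model of certificate trust: verification succeeds only for issuers
# present in the device's trust store, which is why an installed root
# certificate controls what the whole system believes.
import hashlib
import hmac
import os

class Cert:
    def __init__(self, subject, issuer, signature):
        self.subject, self.issuer, self.signature = subject, issuer, signature

def sign(issuer_key: bytes, subject: str) -> bytes:
    # Stand-in for a real signature: keyed digest over the subject name.
    return hmac.new(issuer_key, subject.encode(), hashlib.sha256).digest()

def issue(issuer_name: str, issuer_key: bytes, subject: str) -> Cert:
    return Cert(subject, issuer_name, sign(issuer_key, subject))

def verify(cert: Cert, trust_store: dict) -> bool:
    key = trust_store.get(cert.issuer)  # only trusted roots have known keys
    return key is not None and hmac.compare_digest(cert.signature, sign(key, cert.subject))

device_trust = {"OS Built-in Root": os.urandom(32)}

# A cert from an unknown issuer fails verification...
rogue = issue("Unknown CA", os.urandom(32), "mail.example.com")
assert not verify(rogue, device_trust)

# ...until that issuer's root is installed in the trust store. Now the
# issuer can mint a valid-looking cert for ANY domain the device visits.
vendor_key = os.urandom(32)
device_trust["Vendor Monitoring Root"] = vendor_key
minted = issue("Vendor Monitoring Root", vendor_key, "mail.example.com")
assert verify(minted, device_trust)
```

The final assertion is the crux: once the vendor's root is trusted, nothing in the verification step distinguishes a legitimate certificate from one minted to inspect a student's traffic.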

Gaggle, which scans digital files of more than 5 million students each year, installs such certificates. This tactic of installing certificates is similar to ones that authoritarian regimes have used to monitor their citizens and that cybercriminals use to steal victims' data.

Second, surveillance system vendors use insecure systems that hackers can exploit. In March 2021, computer security software company McAfee found vulnerabilities in student monitoring system vendor Netop's Vision Pro Education software. For instance, Netop did not encrypt its network traffic.

The software was used by over 9,000 schools worldwide to monitor millions of students. The flaws could have let attackers take control of monitored devices.

Finally, personal information of students that is stored by the vendors is vulnerable to breaches. In July 2020, criminals stole student data, including names, email addresses, home addresses, phone numbers and passwords, by hacking online proctoring service ProctorU. This data was then leaked online.

Schools would do well to look more closely at the harm being caused by their surveillance of students and to question whether it actually makes students more safe, or less.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue it's 'Not That Smart' /article/gaggle-surveillance-minnesapolis-families-not-smart-ai-monitoring/ Tue, 12 Oct 2021 11:15:00 +0000 /?post_type=article&p=578988 In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens.

For the 13-year-old from Minneapolis who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his gender dysphoria, emotional distress that occurs when someone's gender identity differs from their sex assigned at birth. His billowing depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised things would get better.

Eventually they did. 




Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since "graduated" from weekly therapy sessions and has found a better headspace, but that didn't stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope, intimate details that wound up in the hands of district security.

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song "Your Heart is a Muscle the Size of Your Fist" helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was "a reminder to keep on loving, keep on fighting and hold on for your life." (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, 社区黑料 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for more than 5 million young people across the country who are monitored by the company's digital algorithm and human content moderators.

But technology experts and families with first-hand experience with Gaggle's surveillance dragnet have raised a separate issue: The service is not only invasive, it may also be ineffective.

While the system flagged Logsdon-Wallace for referencing the word "suicide," context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment, that his mental health had improved, was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed.

"I was trying to be vulnerable with this teacher and be like, 'Hey, here's a thing that's important to me because you asked,'" Logsdon-Wallace said. "Now, when I've made it clear that I'm a lot better, the school is contacting my counselor and is freaking out."

Jeff Patterson, Gaggle's founder and CEO, said in a statement his company does not "make a judgement on that level of the context," and while some districts have requested to be notified about references to previous suicide attempts, it's ultimately up to administrators to "decide the proper response, if any."

'A crisis on our hands'

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and the content moderator team, Gaggle tracks students' online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students' emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling.

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by 社区黑料 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments. 

Gaggle executives maintain that the system saves lives, pointing to interventions during the 2020-21 school year. Those figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic's effects on suicide rates remain fuzzy, suicide has been a leading cause of death among teenagers for years. Patterson, who has watched his business grow during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists.

"Before the pandemic, we had a crisis on our hands," he said. "I believe there's a tsunami of youth suicide headed our way that we are not prepared for."

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there's little independent evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace's mother Alexis Logsdon didn't know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled.

"That was an example of somebody describing really good coping mechanisms, you know, 'I have music that is one of my soothing activities that helps me through a really hard mental health time,'" she said. "But that doesn't matter because, obviously, this software is not that smart, it's just like 'Woop, we saw the word.'"

'Random and capricious'

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications, an experience she described as "really scary."

"If it works, it could be extremely beneficial. But if it's random, it's completely useless."
Lucy Dockter, 16, Westport, Connecticut student mistakenly flagged by Gaggle

On one occasion, Gaggle sent her an email notification of "Inappropriate Use" while she was walking to her first high school biology midterm, and her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school's literary journal and, according to her, Gaggle had ultimately flagged profanity in students' fictional article submissions.

"The link at the bottom of this email is for something that was identified as inappropriate," Gaggle warned in its email while pointing to one of the fictional articles. "Please refrain from storing or sharing inappropriate content in your files."

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn't catch everything. Even as she got flagged when students shared documents with her, the articles' authors weren't receiving similar alerts, she said. And neither did Gaggle's AI pick up when she wrote about the discrepancy in an article where she included a four-letter swear word to make a point. In the article, which Dockter wrote with Google Docs, she argued that Gaggle's monitoring system is "random and capricious," and could be dangerous if school officials rely on its findings to protect students.

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

"With such a seemingly random service, that doesn't seem, in the end, to have an impact on improving student health or actually taking action to prevent suicide and threats," she said in an interview. "If it works, it could be extremely beneficial. But if it's random, it's completely useless."


Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times "does not properly indicate the author of a document and assigns a random collaborator."

"We are hoping Google will improve this functionality so we can better protect students," Patterson said.

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn't notify her that Gaggle had identified a threat.

This spring, a classmate used Google Hangouts, the chat feature, to send Emmeleia a death threat, warning she'd shoot her "puny little brain with my grandpa's rifle."

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter's teacher but was never informed about whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts can face legal liability if they fail to act on credible threats.

"I didn't hear a word from Gaggle about it," she said. "If I hadn't brought it to the teacher's attention, I don't think that anything would have been done."

The incident, which occurred in April, fell outside the six-month period for which 社区黑料 obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later but it "does not have any insight into the steps the district took to address this particular matter."

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials "would never discuss with a community member any communication flagged by Gaggle."

鈥淭hat unrelated but concerned parent would not have been provided that information nor should she have been,鈥 she wrote in an email. 鈥淭hat is private.鈥 

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

'The big scary algorithm'

When identifying potential trouble, Gaggle's algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they're delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, less than 5 percent of districts customize the filter, Patterson said. 

That's where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, language that students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

"We're using the big scary algorithm term here when I don't think it applies. This is not Netflix's recommendation engine. This is not Spotify."
Sara Jordan, AI expert and senior researcher, Future of Privacy Forum

On the other hand, she noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context.

"You're going to get 25,000 emails saying that a student dropped an F-bomb in a chat," she said. "What's the utility of that? That seems pretty low."

She said that Gaggle's utility could be impaired because it doesn't adjust to students' behaviors over time, comparing it to Netflix, which recommends television shows based on users' ever-evolving viewing patterns. "Something that doesn't learn isn't going to be accurate," she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle's marketing materials appear to overhype the tool's sophistication to schools, she said.

"We're using the big scary algorithm term here when I don't think it applies," she said. "This is not Netflix's recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location."

"Artificial intelligence without human intelligence ain't that smart."
Jeff Patterson, Gaggle founder and CEO

Patterson said Gaggle's proprietary algorithm is updated regularly "to adjust to student behaviors over time and improve accuracy and speed." The tool monitors "thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work."

Ultimately, the algorithm to identify keywords is used to "narrow down the haystack as much as possible," Patterson said, and Gaggle content moderators review materials to gauge their risk levels.

"Artificial intelligence without human intelligence ain't that smart," he said.

In Minneapolis, officials denied that Gaggle infringes on students' privacy and noted that the tool only operates within school-issued accounts. The district's internet use policy states that students should "expect only limited privacy," and that the misuse of school equipment could result in discipline and "civil or criminal liability." District leaders have also cited compliance with the Clinton-era Children's Internet Protection Act, which became law in 2000 and requires schools to monitor "the online activities of minors."

Patterson suggested that teachers aren't paying close enough attention to keep students safe on their own and "sometimes they forget that they're mandated reporters." On the company's website, Patterson says he launched the company in 1999 to provide teachers with "an easy way to watch over their gaggle of students." Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility and his company's role in meeting it. As technology becomes a key facet of American education, Patterson said that schools "have a moral obligation to protect the kids on their digital playground."

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student "tracking" through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn't be "construed to require the tracking of internet use by any identifiable minor or adult user." Her group has urged the government to clarify the Children's Internet Protection Act's requirements and distinguish monitoring from tracking individual student behaviors.

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they're concerned the tools "may extend beyond" the law's intent "to surveil student activity or reinforce biases." Around-the-clock surveillance, they wrote, demonstrates "a clear invasion of student privacy, particularly when students and families are unable to opt out."

"Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students' mental health due to stigmatization and differential treatment following even a false report," the senators wrote. "Flagging students as 'high-risk' may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color."

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders faced an awkward dilemma. George Floyd's murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful.

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing; algorithms have repeatedly been shown to suffer biases.

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that racial bias could be baked into Gaggle's algorithm. Absent adequate context or nuance, he worried the tool could lead to misunderstandings.

Data obtained by 社区黑料 offer a limited window into Gaggle's potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data.

Gaggle and Minneapolis district leaders acknowledged that students' digital communications are forwarded to police in rare circumstances. The Minneapolis district's internet use policy explains that educators could contact the police if students use technology to break the law, and a document given to teachers about the district's Gaggle contract further highlights the possibility of law enforcement involvement.

Jason Matlock, the Minneapolis district's director of emergency management, safety and security, said that law enforcement is not a "regular partner" when responding to incidents flagged by Gaggle. It doesn't deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by 社区黑料.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

"Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them," Matlock said, though it's unclear if any students have faced legal consequences. "It's the question as to why they're doing it," and to raise the issue with their parents.

Gaggle's keywords could also have a disproportionate impact on LGBTQ children. In three dozen incident reports, Gaggle flagged keywords related to sexual orientation, including "gay" and "lesbian." On at least one occasion, school officials outed an LGBTQ student to their parents.

Logsdon-Wallace, the 13-year-old student, called the incident "disgusting and horribly messed up."

"They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it's going to be false-positive because they are acting as if the word gay is inherently sexual," he said. "When people are just talking about being gay, anything they're writing would be flagged."

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal uses because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in 社区黑料's data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

"That's definitely really messed up, especially when the school is like 'Oh no, no, no, please keep these Chromebooks over the summer,'" an invitation that gave students "the go-ahead to use them" for personal reasons, he said.

"Especially when it's during a pandemic when you can't really go anywhere and the only way to talk to your friends is through the internet."

]]>
An Inside Look at Spy Tech Used on Students During Remote Classes — and Beyond /article/gaggle-spy-tech-minneapolis-students-remote-learning/ Tue, 14 Sep 2021 10:30:00 +0000 /?post_type=article&p=577556 A week after the pandemic forced Minneapolis students to attend classes online, the city school district's top security chief got an urgent email, its subject line in all caps, alerting him to potential trouble. Just 12 seconds later, he got a second ping. And two minutes after that, a third.

In each instance, the emails warning Jason Matlock of "QUESTIONABLE CONTENT" pointed to a single culprit: Kids were watching cartoon porn.

Over the next six months, Matlock got nearly 1,300 similar emails from Gaggle, a surveillance company that monitors students' school-issued Google and Microsoft accounts. Through artificial intelligence and a team of content moderators, Gaggle tracks the online behaviors of millions of students across the U.S. every day. The sheer volume of reports was overwhelming at first, Matlock acknowledged, and many incidents were utterly harmless. About 100 were related to animated pornography and, on one occasion, a member of Gaggle's remote surveillance team flagged a fictional story that referenced "underwear."

Hundreds of others, however, suggested imminent danger.

In emails and chat messages, students discussed violent impulses, eating disorders, abuse at home, bouts of depression and, as one student put it, "ending my life." At a moment of heightened social isolation and elevated concern over students' mental health, references to self-harm stood out, accounting for nearly a third of incident reports over a six-month period. In a document titled "My Educational Autobiography," students at Roosevelt High School on the south side of Minneapolis discussed bullying, drug overdoses and suicide. "Kill me," one student wrote in a document titled "goodbye."

Nearly a year after 社区黑料 submitted public records requests to understand the Minneapolis district's use of Gaggle during the pandemic, a trove of documents offers an unprecedented look into how one school system deploys a controversial security tool that grew rapidly during COVID-19 but carries significant civil rights and privacy implications.

The data, gleaned from those 1,300 incident reports in the first six months of the crisis, highlight how Gaggle's team of content moderators subjects children to relentless digital surveillance long after classes end for the day, including on weekends, holidays, late at night and over the summer. In fact, only about a quarter of incidents were reported to district officials on school days between 8 a.m. and 4 p.m., bringing into sharp relief how the service extends schools' authority far beyond their traditional powers to regulate student speech and behavior, including at home.

Now, as COVID-era restrictions subside and Minneapolis students return to in-person learning this fall, a tool that was pitched as a remote learning necessity isn't going away anytime soon. Minneapolis officials reacted swiftly when the pandemic engulfed the nation and forced students to learn from the confines of their bedrooms, paying more than $355,000 — including nearly $64,000 in federal emergency relief money — to partner with Gaggle until 2023. Faced with a public health emergency, the district circumvented normal procurement rules, a reality that prevented concerned parents from raising objections until it was too late.

A mental health dilemma

With each alert, Matlock and other district officials were given a vivid look into students' most intimate thoughts and online behaviors, raising significant privacy concerns. It's unclear, however, if any of them made kids safer. Independent research on the efficacy of Gaggle and similar services is scant.

When students' mental health comes into play, a complicated equation emerges. In recent years, schools have ramped up efforts to identify and provide interventions to children at risk of harming themselves or others. Gaggle executives see their tool as key to identifying youth who are lamenting hardships or discussing violent plans. On average, Gaggle notifies school officials within 17 minutes after zeroing in on student content related to suicide and self-harm, according to the company, and officials claim they saved more than 1,400 lives during the 2020-21 school year.


"As a parent you have no idea what's going on in your kid's head, but if you don't know you can't help them," said Jeff Patterson, Gaggle's founder and CEO. "And I would always want to err on trying to identify kids who need help."

Critics, however, have questioned Gaggle's effectiveness and worry that rummaging through students' personal files and conversations — and in some cases outing students for exhibiting signs of mental health issues including depression — could backfire.

Using surveillance to identify children in distress could exacerbate feelings of stigma and shame and could ultimately make students less likely to ask for help, said Jennifer Mathis, the director of policy and legal advocacy at the Bazelon Center for Mental Health Law in Washington, D.C.

"Most kids in that situation are not going to share anything anymore and are going to suffer for that," she said. "It suggests that anything you write or say or do in school — or out of school — may be found and held against you and used in ways that you had not envisioned."

Minneapolis parent Holly Kragthorpe-Shirley had a similar concern and questioned whether kids "actually have a safe space to raise some of their issues in a safe way" if they're stifled by surveillance.

In Minneapolis, for instance, Gaggle flagged the keywords "feel depressed" in a document titled "SEL Journal," a reference to social-emotional learning. In another instance, Gaggle flagged "suicidal" in a document titled "mental health problems workbook."

District officials acknowledged that Gaggle had captured student assignments and other personal files, an issue that civil rights groups have long been warning about. The documents obtained by 社区黑料 put hard evidence behind those concerns, said Amelia Vance, the director of youth and education privacy at the Future of Privacy Forum, a Washington-based think tank.


"The hypotheticals we've been talking about for a few years have come to fruition," she said. "It is highly likely to undercut the trust of students not only in their school generally but in their teacher, in their counselor — in the mental health problems workbook."

Patterson brushed off any privacy reservations, including those related to monitoring sensitive materials like journal entries, which he characterized as "cries for help."

"Sometimes when we intervene we might cause some challenges, but more often than not the kids want to be helped," he said. Though Gaggle only monitors student files tied to school accounts, he cited a middle school girl's private journal as a success story. He said the girl wrote in a digital journal that she struggled with self-esteem issues and guilt after being raped.

"No one in her life knew about this incident and because she journaled about it," Gaggle was able to notify school officials about what they'd learned, he said. "They were able to intervene and get this girl help for things that she couldn't have dealt with on her own."

'Needles in haystacks'

Tools like Gaggle have become ubiquitous in classrooms across the country, according to forthcoming research by the D.C.-based Center for Democracy and Technology. In a recent survey, 81 percent of teachers reported having such software in place in their schools. Though most students said they're comfortable being monitored, 58 percent said they don't share their "true thoughts or ideas" as a result and 80 percent said they're more careful about what they search online.

Such data suggest that youth are being primed to accept surveillance as an inevitable reality, said Elizabeth Laird, the center's director of equity in civic technology. In return, she said, they're giving up the ability to explore new ideas and learn from mistakes.

Gaggle, in business since 1999 and recently relocated to Dallas, monitors the digital files of more than 5 million students across the country each year, and the pandemic has been very good for its bottom line. Since the onset of the crisis, the number of students surveilled by the privately held company, which does not report its yearly revenue, has grown markedly. Through artificial intelligence, Gaggle scans students' emails, chat messages and other materials uploaded to students' Google or Microsoft accounts in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. Moderators evaluate flagged material and notify school officials about content they find troubling — a bar that Matlock acknowledged is quite low, as "the system is always going to err on the side of caution" and requires district administrators to evaluate materials' context.

"We're looking for needles in haystacks to basically save kids."
—Jeff Patterson, founder and CEO of Gaggle, which analyzed more than 10 billion online student communications in the 2020-21 school year.

In Minneapolis, Gaggle officials discovered a majority of offenses in files within students' Google Drive, including in documents and spreadsheets. More than half of incidents originated on the Drive. Meanwhile, 22 percent originated in emails and 23 percent came from Google Hangouts, the chat feature.

School officials are alerted to only a tiny fraction of student communications caught up in Gaggle's dragnet. Last school year, Gaggle collected more than 10 billion items nationally, but just 360,000 incidents resulted in notifications to district officials, according to the company. Nationally, 41 percent of incidents during the 2020-21 school year related to suicide and self-harm, according to Gaggle, and a quarter centered on violence.

"We are looking for needles in haystacks to basically save kids," Patterson said.

'A really slippery slope'

It was Google Hangouts that had Matt Shaver on edge. When the pandemic hit, classrooms were replaced by video conferences and casual student interactions in hallways and cafeterias were relegated to Hangouts. For Shaver, who taught at a Minneapolis elementary school during the pandemic, students' Hangouts use became overwhelming.

Students were so busy chatting with each other, he said, that many had lost focus on classroom instruction. So he proposed a blunt solution to district technology officials: Shut it down.

"The thing I wanted was 'Take the temptation away, take the opportunity away for them to use that,'" said Shaver, who has since left teaching and is now policy director at the education reform group EdAllies. "And I actually got pushback from IT saying 'No we're not going to do that, this is a good social aspect that we're trying to replicate.'"

But unlike those hallway interactions, nobody was watching. Matlock, the district's security head, said he was initially in the market for a new anonymous reporting tool, which allows students to flag their friends for behaviors they find troubling. He turned to Gaggle, which operates the anonymous reporting system SpeakUp for Safety, and saw the company's AI-powered digital surveillance tool, which goes well beyond SpeakUp's powers to ferret out potentially alarming student behavior, as a way to "enhance the supports for students online."

"We wanted to get something in place quickly, as we were moving quickly with the lockdown," he said, adding that going through traditional procurement hoops could take months. "Gaggle had a strong national presence and a reputation."

The district signed an initial six-month, $99,603 contract with Gaggle just a week after the virus shuttered schools in Minneapolis. Board of Education Chair Kim Ellison signed a second, three-year contract at an annual rate of $255,750 in September 2020.

The move came with steep consequences. Though SpeakUp was used just three times during the six-month window included in 社区黑料's data, Gaggle's surveillance tool flagged students nearly 1,300 times.

During that time, which coincided with the switch to remote learning, the largest share of incidents — 38 percent — were pornographic or sexual in nature, including references to "sexual activity involving a student," professional videos and explicit, student-produced selfies, which trigger alerts to the National Center for Missing and Exploited Children.

"I'm trying to imagine finding out about this as a high schooler, that every single word I've written on a Google Hangout or whatever is being monitored — we live in a country with laws around unreasonable search and seizure — and surveillance is just a really slippery slope."
—Matt Shaver, former Minneapolis Public Schools teacher

An additional 30 percent were related to suicide and self-harm, including incidents that were triggered by keywords including "cutting," "feeling depressed," "want to die," and "end it all." Another 18 percent were related to violence, including threats, physical altercations, references to weapons and suspected child abuse. Such incidents were triggered by keywords including "Bomb," "Glock," "going to fight," and "beat her." About a fifth of incidents were triggered by profanity.

Concerns over Gaggle's reach during the pandemic weren't limited to Minneapolis. In December 2020, a group of civil rights organizations including the American Civil Liberties Union of Northern California argued that by using Gaggle, the Fresno Unified School District had violated the California Electronic Communications Privacy Act, which requires officials to obtain search warrants before accessing electronic information. Such monitoring, the groups contend, infringes on students' free-speech and privacy rights with little ability to opt out.

Shaver, whose students used Google Hangouts to the point of it becoming a distraction, was alarmed to learn that those communications were being analyzed by artificial intelligence and pored over by a remote team of people he didn't even know.

"I'm trying to imagine finding out about this as a high schooler, that every single word I've written on a Google Hangout or whatever is being monitored," he said. "There is, of course, some lesson in this, obviously like, 'Be careful of what you put online.' But we live in a country with laws around unreasonable search and seizure — and surveillance is just a really slippery slope."

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

The potential to save lives

To Matlock, Gaggle is a lifesaver — literally. When the tool flagged a Minneapolis student's suicide note in the middle of the night, Matlock said he rushed to intervene. In a late-night phone call, the security chief said he warned the unnamed parents, who knew their child was struggling but didn't fully recognize how bad things had become. Because of Gaggle, school officials were able to get the student help. To Matlock, the possibility that he saved a student's life offers a feeling he "can't even measure in words."

"If it saved one kid, if it supported one caregiver, if it supported one family, I'll take it," he said. "That's the bottom line."

Despite heightened concern over youth mental health issues during the pandemic, its effect on youth suicide rates remains fuzzy. Preliminary data from the Minnesota health department show a decline: Between 2019 and 2020, suicides among people 24 years old and younger decreased by more than 20 percent statewide. Nationally, youth mental health emergencies have surged during the pandemic, according to the Centers for Disease Control and Prevention, but preliminary data for people of all ages show a 5.6 percent decline in self-inflicted fatalities in 2020 compared to 2019.

Meanwhile, Gaggle reported that it identified a significant increase in threats related to suicide, self-harm and violence nationwide between March 2020 and March 2021. During that period, Gaggle observed a 31 percent increase in flagged content overall, including a 35 percent increase in materials related to suicide and self-harm. Gaggle officials said the data highlight a mental health crisis among youth during the pandemic. But other factors could be at play. Among them is students' increased reliance on school-issued devices, creating additional opportunities for Gaggle to tag youth behavior. Meanwhile, the number of students monitored by Gaggle nationally grew markedly during the pandemic.

But that hasn't stopped Gaggle from capitalizing on those concerns as it markets a new service: Gaggle Therapy. In school districts that sign up for the service, students who are flagged by Gaggle's digital monitoring tool are matched with counselors for weekly teletherapy sessions. Therapists available through the service are independent contractors for Gaggle, and districts can either pay Gaggle for "blanket coverage," which makes all students eligible, or a "retainer" fee, which allows them to "use the service as you need it." Under the second scenario, Gaggle would have a financial incentive to identify more students in need of teletherapy.

In Minneapolis, Matlock said that school-based social workers and counselors lead intervention efforts when students are identified for materials related to self-harm. "The initial moment may be a shock" when students are confronted by school staff about their online behaviors, he said, but providing them with help "is much better in the long run."

A presentation sent to Minneapolis teachers explains how the district responds after Gaggle flags a "possible student situation" that officials say presents an imminent threat. (Photo obtained by 社区黑料)

As the district rolled out the service, many parents and students were out of the loop. Among them was Nathaniel Genene, a recent graduate who served as the Minneapolis school board's student representative at the time. He said that classmates contacted him after initial news of the Gaggle contract was released.

"I had a couple of friends texting me like 'Nathaniel, is this true?'" he said. "It was kind of interesting because I had no idea it was even a thing."

Yet as students gained a greater awareness that their communications were being monitored, Matlock said they began to test Gaggle's parameters using potential keywords "and then say 'Hi' to us while they put it in there."

As students became conditioned to Gaggle, "the shock is probably a little bit less," said Rochelle Cox, an associate superintendent at the Minneapolis school district. Now, she said students have an outlet to get help without having to explicitly ask. Instead, they can express their concerns online with an understanding that school officials are listening. As a result, school-based mental health professionals are able to provide the care students need, she said.

Mathis, with The Bazelon Center for Mental Health Law, called that argument "ridiculous." Officials should make sure that students know about available mental health services and ensure that they feel comfortable reaching out for help, she said.

"That's very different than deciding that we're going to catch people by having them write into the ether and that's how we're going to find the students who need help," she said. "We can be a lot more direct in communicating than that, and we should be a lot more direct and a lot more positive."

In fact, subjecting students to surveillance could push them further into isolation and condition them to lie when officials reach out to inquire about their digital communications, argued Vance of the Future of Privacy Forum.

"Effective interventions are rarely going to be built on that, you know, 'I saw what you were typing into a Google search last night' or 'writing a journal entry for your English class,'" Vance said. "That doesn't feel like it builds a trusting relationship. It feels creepy."

]]>
'Don't Get Gaggled': Minneapolis School District Spends Big on Student Surveillance Tool, Raising Ire After Terminating Its Police Contract /article/dont-get-gaggled-minneapolis-school-district-spends-big-on-student-surveillance-tool-raising-ire-after-terminating-its-police-contract/ Sun, 18 Oct 2020 17:01:00 +0000 /?post_type=article&p=562914 Minneapolis education leaders have spent hundreds of thousands of dollars this year to surveil children online, even after the district ended its police department contract and launched school safety reforms that officials said would build trust between adults and students.

The district terminated its longstanding relationship with the city's police department after George Floyd died at the hands of a Minneapolis officer in May. But since the pandemic closed campuses in March and required students to attend online classes from home, the district has shelled out more than $355,000 for a digital surveillance tool called Gaggle, according to contracts obtained by 社区黑料 through a public records request.

Gaggle is currently used in hundreds of districts across the U.S., relying on artificial intelligence and a team of moderators to scan billions of student emails, chat messages and files each year in search of references to sex, drugs and violence.

Even as the police-free schools movement has gained momentum in the wake of Floyd's death, with districts nationwide reexamining the role of cops on campus, it has not appeared to slow the recent growth of the nearly $3 billion-a-year school security industry. A Gaggle executive said the service is key to student safety, and the company has seen a sales surge, with more than 100 school districts becoming new customers since schools went virtual in March.

"With school now taking place in our students' living rooms and bedrooms, safety is more important than ever," Jeff Patterson, Gaggle's founder and CEO, said in a media release. "Many educators are concerned that without in-person school, they may not be able to identify students in abusive situations or those suffering from mental illness."

But there's little research to back up the company's claims, and critics argue that Gaggle and similar products could be detrimental to child development and amount to pervasive government surveillance. Civil rights groups and racial justice advocates are especially concerned about online surveillance tools during the pandemic, as students across the country spend the majority of their academic lives in front of screens.

In Minneapolis, the latest revelation further outraged activists who cheered the district's decision to terminate the police contract but grew wary after officials sought to replace campus cops with "public safety support specialists" who have law enforcement backgrounds.

"My concern was that they would replace physical policing with technological policing, which appears to be something like Gaggle," said Marika Pfefferkorn, executive director of the Midwest Center for School Transformation and a proponent of the police-free schools movement. Pfefferkorn pushed the district to split with the cops but said the move was just the first step in curtailing the policing and surveillance of students, particularly those of color. Instead, she said, Gaggle "has the potential to further criminalize students."

No such thing as confidentiality online

An initial six-month district contract with Gaggle, signed by Chief Operations Officer Karen DeVet just a week after the virus shuttered city schools, totaled $99,603 and was in place through the end of September. A second, three-year contract was signed months after Floyd's death and went into effect this month at an annual rate of $255,750. School Board Chair Kim Ellison signed the second contract on Sept. 18. District and school board officials didn't respond to multiple requests for comment.

In the contract, Gaggle notes that it "cannot guarantee security and confidentiality through its services" and "may choose to turn over" student messages to the police. However, the company said it "shall not be responsible for contacting, notifying or alerting" law enforcement and cannot guarantee that "all unsafe communications can or will be detected while monitoring your student communications or website content."

Through the contracts, Gaggle monitors a range of Google services, including email, Docs, a video platform, the chat service Google Hangouts and other Google Classroom tools. Using artificial intelligence, the company scans students' emails, chat messages and other materials for specific words and phrases that may indicate harm. Moderators evaluate flagged content and notify school officials about references to self-harm, depression, drug use and violent threats. Gaggle's algorithm scans student content for trigger words including "bomb," "drunk," "gun" and "kill me." But it also scans for LGBTQ-specific words like "gay" and "lesbian," which are often flagged as potential bullying.
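The first stage of the two-step process described here, automated keyword matching that feeds flagged items to human moderators, can be sketched roughly as follows. The trigger list draws on the examples cited in this reporting, but the function name, matching rules and everything else are illustrative assumptions, not Gaggle's proprietary system:

```python
# Illustrative sketch of keyword-based flagging. The word list and
# matching logic are hypothetical; Gaggle's actual system is not public.
import re

TRIGGER_WORDS = {"bomb", "drunk", "gun", "kill me"}  # examples cited in reporting

def flag_for_review(text: str) -> list[str]:
    """Return trigger phrases found in a student message.

    Anything returned here would go to a human moderator, who decides
    whether to notify school officials. Automated matching alone cannot
    tell a genuine threat from slang, song lyrics or homework.
    """
    lowered = text.lower()
    return sorted(
        word for word in TRIGGER_WORDS
        # \b word boundaries keep "bombastic" from matching "bomb"
        if re.search(r"\b" + re.escape(word) + r"\b", lowered)
    )

print(flag_for_review("The chemistry homework says a calorimeter is not a bomb"))
# prints ['bomb']
```

Even with word boundaries to cut obvious false positives, a context-free match like this flags the innocuous sentence above, which is the accuracy problem critics of such algorithms describe.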

Such keywords could lead Gaggle to disproportionately subject LGBTQ students to school surveillance, Pfefferkorn said.

"Over and over again, we continue to see with algorithms that bias is often baked in," she said.


"Any time you have a service turned on, you see pornography, you can see drugs and alcohol use or use being talked about. You, of course, have anxiety, depression and suicide being talked about." —Bill McCullough, Gaggle's vice president of sales


In a brief message buried on one Minneapolis high school's website, under the headline "Don't Get Gaggled," district staff noted that distance learning presents new challenges in supporting students' mental and emotional health needs and offered a reminder that "there is no such thing as confidentiality online." The webpage links to a video featuring counseling services manager Derek Francis, who notes that the district "will be monitoring chats and postings for inappropriate content and will follow up as is appropriate."

"Make sure you're not saying things online that you would never say to someone's face," Francis warns students. "We don't want you to end up regretting something that you post."

Prior to the pandemic, the Minneapolis district didn't believe Gaggle's services were necessary, said Bill McCullough, the company's vice president of sales. But when the virus closed buildings, "they wanted us to start the service as quickly as possible," he said. After an initial six-month pilot, the district "realized that this service is extremely valuable and moved to a full contract this fall."

"Any time you have a service turned on, you see pornography, you can see drugs and alcohol use or use being talked about," he said. "You, of course, have anxiety, depression and suicide being talked about."

Ben Feist, the chief programs officer at the American Civil Liberties Union of Minnesota, has for years urged the state to adopt student data privacy protections, due in part to surveillance concerns with companies like Gaggle. In an interview, he said the Minneapolis district's partnership with Gaggle is "massively intrusive" at a time when students' use of technology for school has reached "complete saturation."


"As far as I can tell, nobody has really thought this through, at least from any type of privacy lens. It's hugely troubling." —Ben Feist, chief programs officer, ACLU of Minnesota


By terminating the police contract, district leaders have said they're working to dismantle what they called a "white supremacist culture." But Feist said that Gaggle could perpetuate racial disparities in student discipline. The Minneapolis district educates about 35,000 students, roughly 65 percent of whom are youth of color.

"There's every reason to believe that the implementation of this type of surveillance is going to have a disproportionate impact on students of color and bring more people into a surveillance net that could have been avoided," he said. "As far as I can tell, nobody has really thought this through, at least from any type of privacy lens. It's hugely troubling."

Searching for 'sad kids'

Gaggle and similar student surveillance platforms have long marketed themselves as crucial to preventing school violence. After the 2018 mass school shooting in Parkland, Florida, for example, companies bombarded education leaders with sales pitches touting their wares as the key to preventing more carnage. In the pandemic era, Gaggle is marketing itself as a tool for mental health intervention.

"People are using our product to identify, largely, who the sad kids are in the school district," said McCullough, who noted concerns that the pandemic has taken a toll on students' emotional well-being and could lead to a spike in youth suicides. Such a trend has emerged in students' emails and other digital communications, he said, with an uptick in student comments about depression, suicide and domestic abuse. "But thankfully kids are still talking about it and we're able to go and identify those kids who are in crisis" to connect them with mental health services. McCullough declined to detail how his service has been used in Minneapolis, citing student privacy concerns.

Last school year, Gaggle monitored more than 4.5 million students' online activities across the U.S., efforts it claims saved 927 lives, according to a company media release. In total, the company scanned 6.25 billion items within school accounts for content deemed harmful, including 64,000 references to suicide or self-harm, 38,000 references to violence toward others and 18,000 instances of nudity or sexual content.

School surveillance doesn't stop when classes end for the day. Prior to the pandemic, about 40 percent of flagged incidents occurred after school hours, according to company data; since March, that share has grown to 55 percent. And while threats of violence decreased by 43 percent after the pandemic closed campuses, the platform observed an uptick in students sending each other nude selfies.

Several years ago, concerning material was most often found in student emails, McCullough said. But now, messages are most often flagged in Google Docs, which students have used as makeshift chat rooms. In that context, students are often "their most authentic self" and typically share documents "with just a few friends," he said.

Minneapolis and other districts have also paid Gaggle to monitor student communications in the chat tool Google Hangouts, which has taken on a new role in education during the pandemic. With no face-to-face interaction, students are using Hangouts to collaborate on science projects and other assignments, McCullough said.

But critics argue that schools' use of tools like Gaggle could discourage students from expressing themselves. Elizabeth Laird, the senior fellow of student privacy at the Center for Democracy and Technology, questioned whether such surveillance runs counter to schools' mission of providing supportive environments where students can speak freely and learn from mistakes.

"When people are surveilled in this way, it really limits that kind of free expression and can have a chilling effect on what they're comfortable saying and doing," she said. Laird also raised concerns about the accuracy of algorithms, which often struggle to distinguish true threats from slang or humor. Such noisy data could flag some students unnecessarily and miss signs that are genuinely concerning, she said.

School interactions with police are also a concern. In a recent parent survey, the Center for Democracy and Technology found that while most parents support the use of education technology, they're also concerned about protecting their children's digital privacy from vulnerabilities like hacks. Of the 1,200 participants who completed the online survey in May and June, 55 percent of parents, and 61 percent of those who are Black, said they're concerned about student information being shared with law enforcement.

Though Gaggle rarely contacts the police directly, McCullough said, the district hasn't said how it's responding to tips generated from the surveillance service.

The district completed a chaotic candidate search last month and hired 11 "public safety support specialists" to replace the school-based police. The district has refused to disclose the names and qualifications of the 11 people who filled the openings, but documents obtained by 社区黑料 suggest that more than half bring experience in policing, security or corrections, bolstering critics' fears that the district ended the police contract but created an internal security force. According to an August school board agenda, the specialists' training is supposed to encompass "school security 101," de-escalating conflicts, dismantling the "school-to-prison pipeline" and Gaggle.

Pfefferkorn, the local activist, blasted Minneapolis schools for a lack of transparency about the district's student surveillance practices and demanded that officials answer hard questions.

"It's an opportunity for the district to hold a meeting where they share" how they're using Gaggle to monitor students, she said. "Although you're in a contract, contracts have been broken."
