The 74 – America's Education News Source

Gaggle Surveils Millions of Kids in the Name of Safety. Targeted Families Argue It's 'Not That Smart'

October 12, 2021

In the midst of a pandemic and a national uprising, Teeth Logsdon-Wallace was kept awake at night last summer by the constant sounds of helicopters and sirens.

For the 13-year-old from Minneapolis who lives close to where George Floyd was murdered in May 2020, the pandemic-induced isolation and social unrest amplified his gender dysphoria, emotional distress that occurs when someone's gender identity differs from their sex assigned at birth. His billowing depression landed him in the hospital after an attempt to die by suicide. During that dark stretch, he spent his days in an outpatient psychiatric facility, where therapists embraced music therapy. There, he listened to a punk song on loop that promised things would get better.

Eventually they did. 




Logsdon-Wallace, a transgender eighth-grader who chose the name Teeth, has since "graduated" from weekly therapy sessions and has found a better headspace, but that didn't stop school officials from springing into action after he wrote about his mental health. In a school assignment last month, he reflected on his suicide attempt and how the punk rock anthem by the band Ramshackle Glory helped him cope: intimate details that wound up in the hands of district security.

In a classroom assignment last month, Minneapolis student Teeth Logsdon-Wallace explained how the Ramshackle Glory song "Your Heart is a Muscle the Size of Your Fist" helped him cope after an attempt to die by suicide. In the assignment, which was flagged by the student surveillance company Gaggle, Logsdon-Wallace wrote that the song was "a reminder to keep on loving, keep on fighting and hold on for your life." (Photo courtesy Teeth Logsdon-Wallace)

The classroom assignment was one of thousands of Minneapolis student communications that got flagged by Gaggle, a digital surveillance company that saw rapid growth after the pandemic forced schools into remote learning. In an earlier investigation, The 74 analyzed nearly 1,300 public records from Minneapolis Public Schools to expose how Gaggle subjects students to relentless digital surveillance 24 hours a day, seven days a week, raising significant privacy concerns for the more than 5 million young people across the country who are monitored by the company's algorithm and human content moderators.

But technology experts and families with first-hand experience of Gaggle's surveillance dragnet have raised a separate issue: the service is not only invasive, it may also be ineffective.

While the system flagged Logsdon-Wallace for referencing the word "suicide," context was never part of the equation, he said. Two days later, in mid-September, a school counselor called his mom to let her know what officials had learned. The meaning of the classroom assignment, that his mental health had improved, was seemingly lost in the transaction between Gaggle and the school district. He felt betrayed.

"I was trying to be vulnerable with this teacher and be like, 'Hey, here's a thing that's important to me because you asked,'" Logsdon-Wallace said. "Now, when I've made it clear that I'm a lot better, the school is contacting my counselor and is freaking out."

Jeff Patterson, Gaggle's founder and CEO, said in a statement that his company does not "make a judgement on that level of the context," and while some districts have requested to be notified about references to previous suicide attempts, it's ultimately up to administrators to "decide the proper response, if any."

'A crisis on our hands'

Minneapolis Public Schools first contracted with Gaggle in the spring of 2020 as the pandemic forced students nationwide into remote learning. Through AI and its content moderator team, Gaggle tracks students' online behavior every day by analyzing materials on their school-issued Google and Microsoft accounts. The tool scans students' emails, chat messages and other documents, including class assignments and personal files, in search of keywords, images or videos that could indicate self-harm, violence or sexual behavior. The remote moderators evaluate flagged materials and notify school officials about content they find troubling.

In Minneapolis, Gaggle flagged students for keywords related to pornography, suicide and violence, according to six months of incident reports obtained by The 74 through a public records request. The private company also captured their journal entries, fictional stories and classroom assignments.

Gaggle executives maintain that the system saved lives during the 2020-21 school year, though the company's figures have not been independently verified. Minneapolis school officials make similar assertions. Though the pandemic's effects on suicide rates remain unclear, suicide has been a leading cause of death among teenagers for years. Patterson, whose business saw rapid growth during COVID-19, said Gaggle could be part of the solution. Though not part of its contract with Minneapolis schools, the company recently launched a service that connects students flagged by the monitoring tool with teletherapists.

"Before the pandemic, we had a crisis on our hands," he said. "I believe there's a tsunami of youth suicide headed our way that we are not prepared for."

Schools nationwide have increasingly relied on technological tools that purport to keep kids safe, yet there's little evidence to back up their claims.

Minneapolis student Teeth Logsdon-Wallace poses with his dog Gilly. (Photo courtesy Alexis Logsdon)

Like many parents, Logsdon-Wallace's mother Alexis Logsdon didn't know Gaggle existed until she got the call from his school counselor. Luckily, the counselor recognized that Logsdon-Wallace was discussing events from the past and offered a measured response. His mother was still left baffled.

"That was an example of somebody describing really good coping mechanisms, you know, 'I have music that is one of my soothing activities that helps me through a really hard mental health time,'" she said. "But that doesn't matter because, obviously, this software is not that smart. It's just like, 'Woop, we saw the word.'"

'Random and capricious'

Many students have accepted digital surveillance as an inevitable reality at school, according to a new survey by the Center for Democracy and Technology in Washington, D.C. But some youth are fighting back, including Lucy Dockter, a 16-year-old junior from Westport, Connecticut. On multiple occasions over the last several years, Gaggle has flagged her communications, an experience she described as "really scary."

"If it works, it could be extremely beneficial. But if it's random, it's completely useless."
Lucy Dockter, 16, Westport, Connecticut student mistakenly flagged by Gaggle

On one occasion, Gaggle sent her an email notification of "Inappropriate Use" while she was walking to her first high school biology midterm; her heart began to race as she worried about what she had done wrong. Dockter is an editor of her high school's literary journal and, according to her, Gaggle had ultimately flagged profanity in students' fictional article submissions.

"The link at the bottom of this email is for something that was identified as inappropriate," Gaggle warned in its email while pointing to one of the fictional articles. "Please refrain from storing or sharing inappropriate content in your files."

Gaggle emailed a warning to Connecticut student Lucy Dockter for profanity in a literary journal article. (Photo courtesy Lucy Dockter)

But Gaggle doesn't catch everything. Even as she got flagged when students shared documents with her, the articles' authors weren't receiving similar alerts, she said. Nor did Gaggle's AI pick up on the article she wrote about the discrepancy, in which she included a four-letter swear word to make a point. In the article, which Dockter wrote in Google Docs, she argued that Gaggle's monitoring system is "random and capricious" and could be dangerous if school officials rely on its findings to protect students.

Her experiences left the Connecticut teen questioning whether such tracking is even helpful. 

"With such a seemingly random service, that doesn't seem to, in the end, have an impact on improving student health or actually taking action to prevent suicide and threats," she said in an interview. "If it works, it could be extremely beneficial. But if it's random, it's completely useless."


Some schools have asked Gaggle to email students about the use of profanity, but Patterson said the system has an error that he blamed on the tech giant Google, which at times "does not properly indicate the author of a document and assigns a random collaborator."

"We are hoping Google will improve this functionality so we can better protect students," Patterson said.

Back in Minneapolis, attorney Cate Long said she became upset when she learned that Gaggle was monitoring her daughter on her personal laptop, which 10-year-old Emmeleia used for remote learning. She grew angrier when she learned the district didn't notify her that Gaggle had identified a threat.

This spring, a classmate used Google Hangouts, the chat service, to send Emmeleia a death threat, warning she'd shoot her "puny little brain with my grandpa's rifle."

Minneapolis mother Cate Long said a student used Google Hangouts to send a death threat to her 10-year-old daughter Emmeleia. Officials never informed her about whether Gaggle had flagged the threat. (Photo courtesy Cate Long)

When Long learned about the chat, she notified her daughter's teacher but was never told whether Gaggle had picked up on the disturbing message as well. Missing warning signs could be detrimental to both students and school leaders; districts could face legal liability if they fail to act on credible threats.

"I didn't hear a word from Gaggle about it," she said. "If I hadn't brought it to the teacher's attention, I don't think that anything would have been done."

The incident, which occurred in April, fell outside the six-month period for which The 74 obtained records. A Gaggle spokesperson said the company picked up on the threat and notified district officials an hour and a half later, but it "does not have any insight into the steps the district took to address this particular matter."

Julie Schultz Brown, the Minneapolis district spokeswoman, said that officials "would never discuss with a community member any communication flagged by Gaggle."

"That unrelated but concerned parent would not have been provided that information nor should she have been," she wrote in an email. "That is private."

Cate Long poses with her 10-year-old daughter Emmeleia. (Photo courtesy Cate Long)

'The big scary algorithm'

When identifying potential trouble, Gaggle's algorithm relies on keyword matching that compares student communications against a dictionary of thousands of words the company believes could indicate potential issues. The company scans student emails before they're delivered to their intended recipients, said Patterson, the CEO. Files within Google Drive, including Docs and Sheets, are scanned as students write in them, he said. In one instance, the technology led to the arrest of a 35-year-old Michigan man who tried to send pornography to an 11-year-old girl in New York. Gaggle prevented the file from ever reaching its intended recipient.
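The keyword-matching approach Patterson describes can be illustrated with a minimal sketch. This is not Gaggle's actual code; the word list, categories and matching logic below are hypothetical stand-ins for the company's proprietary dictionary of thousands of terms.

```python
# Illustrative sketch of dictionary-based keyword flagging, as described
# in the article. The keywords and categories here are hypothetical;
# Gaggle's actual system is proprietary.
import re

# Hypothetical keyword dictionary mapping terms to risk categories.
KEYWORDS = {
    "suicide": "self-harm",
    "kill": "violence",
    "rifle": "violence",
}

def flag_document(text: str) -> list[tuple[str, str]]:
    """Return (keyword, category) pairs for every dictionary hit.

    Matching is purely lexical: a student writing about a *past*
    suicide attempt, or quoting song lyrics, is flagged exactly the
    same way as one in crisis. Context never enters the equation.
    """
    hits = []
    for word, category in KEYWORDS.items():
        if re.search(rf"\b{re.escape(word)}\b", text, re.IGNORECASE):
            hits.append((word, category))
    return hits

# A reflective classroom essay triggers the same flag as a threat would.
essay = "The song helped me cope after my suicide attempt last year."
print(flag_document(essay))  # [('suicide', 'self-harm')]
```

As the sketch suggests, a purely lexical match fires on an essay about recovery just as readily as on a message written in crisis, the gap in context that Logsdon-Wallace's family describes.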

Though the company allows school districts to alter the keyword dictionary to reflect local contexts, fewer than 5 percent of districts customize the filter, Patterson said.

That's where potential problems could begin, said Sara Jordan, an expert on artificial intelligence and senior researcher at the Future of Privacy Forum in Washington. For example, the language students use to express suicidal ideation could vary between Manhattan and rural Appalachia, she said.

"We're using the big scary algorithm term here when I don't think it applies. This is not Netflix's recommendation engine. This is not Spotify."
Sara Jordan, AI expert and senior researcher, Future of Privacy Forum


On the other hand, she noted that false positives are highly likely, especially when the system flags common swear words and fails to understand context.

"You're going to get 25,000 emails saying that a student dropped an F-bomb in a chat," she said. "What's the utility of that? That seems pretty low."

She said that Gaggle's utility could be impaired because it doesn't adjust to students' behaviors over time, comparing it to Netflix, which recommends television shows based on users' ever-evolving viewing patterns. "Something that doesn't learn isn't going to be accurate," she said. For example, she said the program could be more useful if it learned to ignore the profane but harmless literary journal entries submitted to Dockter, the Connecticut student. Gaggle's marketing materials appear to overhype the tool's sophistication to schools, she said.
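Jordan's critique, that a tool which never learns keeps repeating the same mistakes, can be illustrated by sketching the opposite design: a filter that mutes a source after moderators repeatedly dismiss its flags. This is a hypothetical illustration of her argument, not a description of any feature Gaggle offers.

```python
# Hypothetical sketch of a filter that incorporates moderator feedback,
# the kind of adaptation Jordan says Gaggle lacks.
from collections import defaultdict

class AdaptiveFilter:
    """Suppress alerts from senders whose flags are repeatedly dismissed."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold          # dismissals before muting
        self.dismissals = defaultdict(int)  # sender -> dismissed flags

    def should_alert(self, sender: str) -> bool:
        # A static keyword filter alerts every single time; this one
        # mutes a sender (say, a literary journal inbox) after its
        # flags have been dismissed as harmless often enough.
        return self.dismissals[sender] < self.threshold

    def record_dismissal(self, sender: str) -> None:
        self.dismissals[sender] += 1

f = AdaptiveFilter()
for _ in range(3):
    if f.should_alert("lit-journal"):
        f.record_dismissal("lit-journal")  # moderator marks it harmless
print(f.should_alert("lit-journal"))  # False: source learned as benign
```

Under this design, Dockter's journal submissions would stop generating alerts after a few dismissals, while a static keyword list flags them indefinitely.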

"We're using the big scary algorithm term here when I don't think it applies," she said. "This is not Netflix's recommendation engine. This is not Spotify. This is not American Airlines serving you specific forms of flights based on your previous searches and your location."

"Artificial intelligence without human intelligence ain't that smart."
Jeff Patterson, Gaggle founder and CEO

Patterson said Gaggle's proprietary algorithm is updated regularly "to adjust to student behaviors over time and improve accuracy and speed." The tool monitors "thousands of keywords, including misspellings, slang words, evolving trends and terminologies, all informed by insights gleaned over two decades of doing this work."

Ultimately, the keyword-matching algorithm is used to "narrow down the haystack as much as possible," Patterson said, and Gaggle content moderators review the flagged materials to gauge their risk levels.

"Artificial intelligence without human intelligence ain't that smart," he said.

In Minneapolis, officials denied that Gaggle infringes on students' privacy and noted that the tool operates only within school-issued accounts. The district's internet use policy states that students should "expect only limited privacy," and that the misuse of school equipment could result in discipline and "civil or criminal liability." District leaders have also cited compliance with the Clinton-era Children's Internet Protection Act, which became law in 2000 and requires schools to monitor "the online activities of minors."

Patterson suggested that teachers aren't paying close enough attention to keep students safe on their own and "sometimes they forget that they're mandated reporters." Patterson says he launched the company in 1999 to provide teachers with "an easy way to watch over their gaggle of students." Legally, teachers are mandated to report suspected abuse and neglect, but Patterson broadens their sphere of responsibility, and his company's role in meeting it. As technology becomes a key facet of American education, Patterson said, schools "have a moral obligation to protect the kids on their digital playground."

But Elizabeth Laird, the director of equity in civic technology at the Center for Democracy and Technology, argued the federal law was never intended to mandate student "tracking" through artificial intelligence. In fact, the statute includes a disclaimer stating it shouldn't be "construed to require the tracking of internet use by any identifiable minor or adult user." Her group has urged the government to clarify the Children's Internet Protection Act's requirements and to distinguish monitoring from tracking individual student behaviors.

Sen. Elizabeth Warren, a Democrat from Massachusetts, agrees. In recent letters to Gaggle and other education technology companies, Warren and other Democratic lawmakers said they're concerned the tools "may extend beyond" the law's intent "to surveil student activity or reinforce biases." Around-the-clock surveillance, they wrote, demonstrates "a clear invasion of student privacy, particularly when students and families are unable to opt out."

"Escalations and mischaracterizations of crises may have long-lasting and harmful effects on students' mental health due to stigmatization and differential treatment following even a false report," the senators wrote. "Flagging students as 'high-risk' may put them at risk of biased treatment from physicians and educators in the future. In other extreme cases, these tools can become analogous to predictive policing, which are notoriously biased against communities of color."

A new kind of policing

Shortly after the school district piloted Gaggle for distance learning, education leaders were met with an awkward dilemma. Floyd's murder at the hands of a Minneapolis police officer prompted Minneapolis Public Schools to sever its ties with the police department for school-based officers and replace them with district security officers who lack the authority to make arrests. Gaggle flags district security when it identifies student communications the company believes could be harmful.

Some critics have compared the surveillance tool to a new form of policing that, beyond broad efficacy concerns, could have a disparate impact on students of color, similar to traditional policing, which has long been shown to suffer from biases.

Matt Shaver, who taught at a Minneapolis elementary school during the pandemic but no longer works for the district, said he was concerned that bias could be baked into Gaggle's algorithm. Absent adequate context or nuance, he worried the tool could lead to misunderstandings.

Data obtained by The 74 offer a limited window into Gaggle's potential effects on different student populations. Though the district withheld many details in the nearly 1,300 incident reports, just over 100 identified the campuses where the involved students attended school. An analysis of those reports failed to identify racial discrepancies. Specifically, Gaggle was about as likely to issue incident reports in schools where children of color were the majority as it was at campuses where most children were white. It remains possible that students of color in predominantly white schools may have been disproportionately flagged by Gaggle or faced disproportionate punishment once identified. Broadly speaking, Black students are far more likely to be suspended or arrested at school than their white classmates, according to federal education data.

Gaggle and Minneapolis district leaders acknowledged that students' digital communications are forwarded to police in rare circumstances. The Minneapolis district's internet use policy explains that educators could contact the police if students use technology to break the law, and a document given to teachers about the district's Gaggle contract further highlights the possibility of law enforcement involvement.

Jason Matlock, the Minneapolis district's director of emergency management, safety and security, said that law enforcement is not a "regular partner" when responding to incidents flagged by Gaggle. The district doesn't deploy Gaggle to get kids into trouble, he said, but to get them help. He said the district has interacted with law enforcement about student materials flagged by Gaggle on several occasions, but only in cases related to child pornography. Such cases, he said, often involve students sharing explicit photographs of themselves. During a six-month period from March to September 2020, Gaggle flagged Minneapolis students more than 120 times for incidents related to child pornography, according to records obtained by The 74.

Jason Matlock, the director of emergency management, safety and security at the Minneapolis school district, discusses the decision to partner with Gaggle as students moved to remote learning during the pandemic. (Screenshot)

"Even if a kid has put out an image of themselves, no one is trying to track them down to charge them or to do anything negative to them," Matlock said, though it's unclear whether any students have faced legal consequences. "It's the question as to why they're doing it," he said, and of raising the issue with their parents.

Gaggle's keywords could also have a disproportionate impact on LGBTQ children. In three dozen incident reports, Gaggle flagged keywords related to sexual orientation, including "gay" and "lesbian." On at least one occasion, school officials outed an LGBTQ student to their parents, according to the records.

Logsdon-Wallace, the 13-year-old student, called the incident "disgusting and horribly messed up."

"They have gay flagged to stop people from looking at porn, but one, that is going to be mostly targeting people who are looking for gay porn and two, it's going to be false-positive because they are acting as if the word gay is inherently sexual," he said. "When people are just talking about being gay, anything they're writing would be flagged."

The service could also have a heavier presence in the lives of low-income families, he added, who may end up being more surveilled than their affluent peers. Logsdon-Wallace said he knows students who rely on school devices for personal use because they lack technology of their own. Among the 1,300 Minneapolis incidents contained in The 74's data, only about a quarter were reported to district officials on school days between 8 a.m. and 4 p.m.

"That's definitely really messed up, especially when the school is like, 'Oh no, no, no, please keep these Chromebooks over the summer,'" an invitation that gave students "the go-ahead to use them" for personal reasons, he said.

"Especially when it's during a pandemic when you can't really go anywhere and the only way to talk to your friends is through the internet."
