Meet the Gatekeepers of Students' Private Lives
The 74 – America's Education News Source | Mon, 02 May 2022

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Megan Waskiewicz used to sit at the top of the bleachers, rest her back against the wall and hide her face behind the glow of a laptop monitor. While watching one of her five children play basketball on the court below, she knew she had to be careful. 

The mother from Pittsburgh didn't want other parents in the crowd to know she was also looking at child porn.

Waskiewicz worked as a content moderator for Gaggle, a surveillance company that monitors the online behaviors of some 5 million students across the U.S. on their school-issued Google and Microsoft accounts. Through an algorithm designed to flag references to sex, drugs, and violence and a team of content moderators like Waskiewicz, the company sifts through billions of students' emails, chat messages and homework assignments each year. Their work is supposed to ferret out evidence of potential self-harm, threats or bullying, incidents that would prompt Gaggle to notify school leaders.

As a result, kids' deepest secrets, like nude selfies and suicide notes, regularly flashed onto Waskiewicz's screen. Though she felt "a little bit like a voyeur," she believed Gaggle helped protect kids. But mostly, the low pay, the fight for decent hours, inconsistent instructions and stiff performance quotas left her feeling burned out. Gaggle's moderators face pressure to review 300 incidents per hour, and Waskiewicz knew she could get fired on a moment's notice if she failed to distinguish mundane chatter from potential safety threats in a matter of seconds. She lasted about a year.

"In all honesty I was sort of half-assing it," Waskiewicz admitted in an interview with The 74. "It wasn't enough money and you're really stuck there staring at the computer reading and just click, click, click, click."

Content moderators like Waskiewicz, hundreds of whom are paid just $10 an hour on month-to-month contracts, are on the front lines of a company that claims it saved the lives of 1,400 students last school year and argues that the growing mental health crisis makes its presence in students' private affairs essential. Gaggle founder and CEO Jeff Patterson has warned about "a tsunami of youth suicide headed our way" and said that schools have "a moral obligation to protect the kids on their digital playground."

Eight former content moderators at Gaggle shared their experiences for this story. While several believed their efforts in some cases did shield kids from serious harm, they also raised significant questions about the company's efficacy, its employment practices and its effect on students' civil rights.

None of the moderators who worked on a contractual basis had prior experience in school safety, security or mental health. Their employment histories instead included retail work and customer service; they were drawn to Gaggle while searching for remote jobs that promised flexible hours.

They described an impersonal and cursory hiring process that appeared automated. Former moderators reported submitting applications online and never having interviews with Gaggle managers 鈥 either in-person, on the phone or over Zoom 鈥 before landing jobs.

Once hired, moderators reported insufficient safeguards to protect students' sensitive data, a work culture that prioritized speed over quality, scheduling issues that sent them scrambling to get hours and frequent exposure to explicit content that left some traumatized. Contractors lacked benefits, including mental health care, and one former moderator said he quit after repeated exposure to explicit material that so disturbed him he couldn't sleep, and without "any money to show for what I was putting up with."

Gaggle's content moderation workforce includes as many as 600 contractors at any given time; just two dozen moderators work as employees with access to benefits and on-the-job training that lasts several weeks. Gaggle executives have sought to downplay contractors' role with the company, arguing they use "common sense" to distinguish false flags generated by the algorithm from potential threats and do "not require substantial training."

While the experiences reported by Gaggle's moderator team echo those of content reviewers at social media platforms like Meta-owned Facebook, Patterson said his company relies on "U.S.-based, U.S.-cultured reviewers as opposed to outsourcing that work to India or Mexico or the Philippines." He rebuffed former moderators who said they lacked sufficient time to consider the severity of a particular item.

"Some people are not fast decision-makers. They need to take more time to process things and maybe they're not right for that job," he told The 74. "For some people, it's no problem at all. For others, their brains don't process that quickly."

Executives also sought to minimize the contractors' access to students' personal information; a spokeswoman said they see only "small snippets of text" and lack access to what's known as students' "personally identifiable information." Yet former contractors described reading lengthy chat logs, seeing nude photographs and, in some cases, coming upon students' names. Several former moderators said they struggled to determine whether something should be escalated as harmful due to "gray areas," such as whether a Victoria's Secret lingerie ad would be considered acceptable or not.

"Those people are really just the very, very first pass," Gaggle spokeswoman Paget Hetherington said. "It doesn't really need training, it's just like if there's any possible doubt with that particular word or phrase it gets passed on."

Molly McElligott, a former content moderator and customer service representative, said management was laser-focused on performance metrics, appearing more interested in business growth and profit than protecting kids.

"I went into the experience extremely excited to help children in need," McElligott wrote in an email. Unlike the contractors, McElligott was an employee at Gaggle, where she worked for five months in 2021 before taking a position at the Manhattan District Attorney's Office in New York. "I realized that was not the primary focus of the company."

Gaggle is part of a burgeoning campus security industry that's seen significant business growth in the wake of mass school shootings as leaders scramble to prevent future attacks. Patterson, who founded the company in 1999 by offering school email accounts that could be monitored for inappropriate content, said its focus now is mitigating the youth mental health crisis.

Patterson said the team talks about "lives saved" and child safety incidents at every meeting, and they are open about sharing the company's financial outlook so that employees "can have confidence in the security of their jobs."

Content moderators work at a Facebook office in Austin, Texas. Unlike the social media giant, Gaggle's content moderators work remotely. (Ilana Panich-Linsman / Getty Images)

'We are just expendable'

Now facing federal scrutiny along with three other companies that monitor students online, Gaggle has said it relies on a "highly trained content review team" to analyze student materials and flag safety threats. Yet former contractors, who make up the bulk of Gaggle's content review team, described their training as "a joke" consisting of a slideshow and an online quiz, one that left them ill-equipped to complete a job with such serious consequences for students and schools.

As an employee on the company's safety team, McElligott said she underwent two weeks of training, but the disorganized instruction meant she and other moderators were "more confused than when we started."

Former content moderators have also flocked to employment websites like Indeed.com to warn job seekers about their experiences with the company, often sharing reviews that resembled the former moderators' feedback to The 74.

"If you want to be not cared about, not valued and be completely stressed/traumatized on a daily basis this is totally the job for you," one reviewer wrote on Indeed. "Warning, you will see awful awful things. No they don't provide therapy or any kind of support either.

"That isn't even the worst part," the reviewer continued. "The worst part is that the company does not care that you hold them on your backs. Without safety reps they wouldn't be able to function, but we are just expendable."

As the first layer of Gaggle's human review team, contractors analyze materials flagged by the algorithm and decide whether to escalate students' communications for additional consideration. Designated employees on Gaggle's Safety Team are in charge of calling or emailing school officials to notify them of troubling material identified in students' files, Patterson said.

Gaggle's staunchest critics have questioned the tool's efficacy and describe it as a student privacy nightmare. In March, Democratic Sens. Elizabeth Warren and Ed Markey called on Gaggle and similar companies to protect students' civil rights and privacy. In a report, the senators said the tools could surveil students inappropriately, compound racial disparities in school discipline and waste tax dollars.

The information shared by the former Gaggle moderators with The 74 "struck me as the worst-case scenario," said attorney Amelia Vance, the co-founder and president of Public Interest Privacy Consulting. Content moderators' limited training and vetting, as well as their lack of backgrounds in youth mental health, she said, "is not acceptable."

In its letter to lawmakers, Gaggle described a two-tiered review procedure but didn't disclose that low-wage contractors were the first line of defense. CEO Patterson told The 74 they "didn't have nearly enough time" to respond to lawmakers' questions about their business practices and didn't want to divulge proprietary information. Gaggle uses a third party to conduct criminal background checks on contractors, Patterson said, but he acknowledged they aren't interviewed before getting placed on the job.

"There's a lot of contractors. We can't do a physical interview of everyone and I don't know if that's appropriate," he said. "It might actually introduce another set of biases in terms of who we hire or who we don't hire."

'Other eyes were seeing it'

In a previous investigation, The 74 analyzed a cache of public records to expose how Gaggle's algorithm and content moderators subject students to relentless digital surveillance long after classes end for the day, extending schools' authority far beyond their traditional powers to regulate speech and behavior, including at home. Gaggle's algorithm relies largely on keyword matching and gives content moderators a broad snapshot of students' online activities, including diary entries, classroom assignments and casual conversations between students and their friends.
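Keyword matching of this kind is context-blind, which goes some way toward explaining the flood of false positives former moderators describe, such as essays on To Kill a Mockingbird being flagged. A minimal sketch of the behavior; the watchlist and `flag` function here are invented for illustration and are not Gaggle's actual code:

```python
# Illustrative only: Gaggle's real system is proprietary. This watchlist and
# matching logic are made up to show why keyword flagging ignores context.
import re

WATCHLIST = {"kill", "suicide", "gun"}  # hypothetical terms a scanner might track

def flag(text):
    """Return any watchlist terms appearing in the text, regardless of context."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return words & WATCHLIST

# A literature essay trips the same wire as a genuinely alarming message:
print(flag("In To Kill a Mockingbird, Atticus defends Tom Robinson."))  # {'kill'}
print(flag("See you at lunch"))  # set()
```

Because the matcher sees only tokens, every benign mention of a flagged word lands in a human review queue, which is the workload the moderators quoted in this story describe.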

After the pandemic shuttered schools and shuffled students into remote learning, Gaggle saw a surge both in students' online materials and in school districts interested in its services. The company grew as educators scrambled to keep a watchful eye on students whose chatter with peers moved from school hallways to instant messaging platforms like Google Hangouts. One year into the pandemic, Gaggle reported a spike in references to suicide and self-harm, accounting for more than 40% of all flagged incidents.

Waskiewicz, who began working for Gaggle in January 2020, said that remote learning spurred an immediate shift in students' online behaviors. Under lockdown, students without computers at home began using school devices for personal conversations. Sifting through the everyday exchanges between students and their friends, Waskiewicz said, became a time suck and left her questioning her own principles.

"I felt kind of bad because the kids didn't have the ability to have stuff of their own and I wondered if they realized that it was public," she said. "I just wonder if they realized that other eyes were seeing it other than them and their little friends."

Student activity monitoring software like Gaggle has become ubiquitous in U.S. schools, and 81% of teachers work in schools that use tools to track students鈥 computer activity, according to a recent survey by the nonprofit Center for Democracy and Technology. A majority of teachers said the benefits of using such tools, which can block obscene material and monitor students鈥 screens in real time, outweigh potential risks.

Likewise, students generally recognize that their online activities on school-issued devices are being observed, the survey found, and alter their behaviors as a result. More than half of student respondents said they don't share their true thoughts or ideas online as a result of school surveillance, and 80% said they were more careful about what they search online.

A majority of parents reported that the benefits of keeping tabs on their children's activity exceeded the risks. Yet they may not have a full grasp of how programs like Gaggle work, including the heavy reliance on untrained contractors and weak privacy controls revealed by The 74's reporting, said Elizabeth Laird, the group's director of equity in civic technology.

"I don't know that the way this information is being handled actually would meet parents' expectations," Laird said.

Another former contractor, who reached out to The 74 to share his experiences with the company anonymously, became a Gaggle moderator at the height of the pandemic. As COVID-19 cases grew, he said, he felt unsafe continuing his previous job as a caregiver for people with disabilities, so he applied to Gaggle because it offered remote work.

About a week after he submitted an application, Gaggle gave him a key to kids' private lives, including, most alarming to him, their nude selfies. Exposure to such content was traumatizing, the former moderator said, and while the job took a toll on his mental well-being, it didn't come with health insurance.

"I went to a mental hospital in high school due to some hereditary mental health issues and seeing some of these kids going through similar things really broke my heart," said the former contractor, who shared his experiences on the condition of anonymity, saying he feared possible retaliation by the company. "It broke my heart that they had to go through these revelations about themselves in a context where they can't even go to school and get out of the house a little bit. They have to do everything from home, and they're being constantly monitored."

In this screenshot, Gaggle explains its terms and conditions for contract content moderators. The screenshot, which was provided to The 74 by a former contractor who asked to remain anonymous, has been redacted.

Gaggle employees are offered benefits, including health insurance, and can attend group therapy sessions twice per month, Hetherington said. Patterson acknowledged the job can take a toll on staff moderators, but sought to downplay its effects on contractors and said they're warned about exposure to disturbing content during the application process. He said using contractors allows Gaggle to offer the service at a price school districts can afford.

"Quite honestly, we're dealing with school districts with very limited budgets," Patterson said. "There have to be some tradeoffs."

The anonymous contractor said he wasn't as concerned about his own well-being as he was about the welfare of the students under the company's watch. The company, he said, lacked adequate safeguards to keep students' sensitive information from leaking outside the digital environment that Gaggle built for moderators to review such materials. Contract moderators work remotely with limited supervision or oversight, and he became especially concerned about how the company handled students' nude images, which are reported to school districts. Nudity and sexual content accounted for about 17% of emergency phone calls and email alerts to school officials last school year.

Contractors, he said, could easily save the images for themselves or share them on the dark web. 

Patterson acknowledged the possibility but said he wasn't aware of any data breaches.

"We do things in the interface to try to disable the ability to save those things," Patterson said, but "you know, human beings who want to get around things can."

'Made me feel like the day was worth it'

Vara Heyman was looking for a career change. After working jobs in retail and customer service, she made the pivot to content moderation and a contract position with Gaggle was her first foot in the door. She was left feeling baffled by the impersonal hiring process, especially given the high stakes for students. 

Waskiewicz had a similar experience. In fact, she said the only time she ever interacted with a Gaggle supervisor was when she was instructed to provide her bank account information for direct deposit. The interaction left her questioning whether the company that contracts with more than 1,500 school districts was legitimate or a scam. 

"It was a little weird when they were asking for the banking information, like 'Wait a minute, is this real or what?'" Waskiewicz said. "I Googled them and I think they're pretty big."

Heyman said that sense of disconnect continued after being hired, with communications between contractors and their supervisors limited to a Slack channel. 

Despite the challenges, several former moderators believe their efforts kept kids safe from harm. McElligott, the former Gaggle safety team employee, recalled an occasion when she found a student's suicide note.

"Knowing I was able to help with that made me feel like the day was worth it," she said. "Hearing from the school employees that we were able to alert about self-harm or suicidal tendencies from a student they would never expect to be suffering was also very rewarding. It meant that extra attention should or could be given to the student in a time of need."

Susan Enfield, the superintendent of Highline Public Schools in suburban Seattle, said her district's contract with Gaggle has saved lives. Earlier this year, for example, the company detected a student's suicide note early in the morning, allowing school officials to spring into action. The district uses Gaggle to keep kids safe, she said, but acknowledged it can be a disciplinary tool if students violate the district's code of conduct.

"No tool is perfect, every organization has room to improve, I'm sure you could find plenty of my former employees here in Highline that would give you an earful about working here as well," said Enfield, one of 23 current or former superintendents from across the country who Gaggle cited as references in its letter to Congress.

"There's always going to be pros and cons to any organization, any service," Enfield told The 74, "but our experience has been overwhelmingly positive."

True safety threats were infrequent, former moderators said, and most of the content was mundane, in part because the company's artificial intelligence lacked sophistication. They said the algorithm routinely flagged students' papers on the novels To Kill a Mockingbird and The Catcher in the Rye. They also reported being inundated with spam emailed to students, acting as human spam filters for a task that's long been automated in other contexts.

Conor Scott, who worked as a contract moderator while in college, said that "99% of the time" Gaggle's algorithm flagged pedestrian materials, including pictures of sunsets and students' essays about World War II. Valid safety concerns, including references to violence and self-harm, were rare, Scott said. But he still believed the service had value and felt he was doing "the right thing."

McElligott said that managers' personal opinions added another layer of complexity. Though moderators were "held to strict rules of right and wrong decisions," she said they were ultimately "being judged against our managers' opinions of what is concerning and what is not."

"I was told once that I was being overdramatic when it came to a potential inappropriate relationship between a child and adult," she said. "There was also an item that made me think of potential trafficking or child sexual abuse, as there were clear sexual plans to meet up, and when I alerted it, I was told it was not as serious as I thought."

Patterson acknowledged that gray areas exist and that human discretion is a factor in deciding what materials are ultimately elevated to school leaders. But such materials, he said, are not the most urgent safety issues. He said the algorithm errs on the side of caution and flags harmless content because district leaders are "so concerned about students."

The former moderator who spoke anonymously said he grew alarmed by the sheer volume of mundane student materials captured by Gaggle's surveillance dragnet, and that the pressure to work quickly didn't offer enough time to evaluate long chat logs between students having "heartfelt and sensitive" conversations. On the other hand, run-of-the-mill chatter offered him a little wiggle room.

"When I would see stuff like that I was like 'Oh, thank God, I can just get this out of the way and heighten how many items per hour I'm getting,'" he said. "It's like 'I hope I get more of those because then I can maybe spend a little more time actually paying attention to the ones that need it.'"

Ultimately, he said, he was unprepared for such extensive access to students' private lives. Because Gaggle's algorithm flags keywords like "gay" and "lesbian," for example, it alerted him to students exploring their sexuality online. Hetherington, the Gaggle spokeswoman, said such keywords are included in its dictionary to "ensure that these vulnerable students are not being harassed or suffering additional hardships," but critics have accused the company of subjecting LGBTQ students to disproportionate surveillance.

"I thought it would just be stopping school shootings or reducing cyberbullying but no, I read the chat logs of kids coming out to their friends," the former moderator said. "I felt tremendous power was being put in my hands" to distinguish students' benign conversations from real danger, "and I was given that power immediately for $10 an hour."

Minneapolis student Teeth Logsdon-Wallace, who posed for this photo with his dog Gilly, used a classroom assignment to discuss a previous suicide attempt and explained how his mental health had since improved. He became upset after Gaggle flagged his assignment. (Photo courtesy Alexis Logsdon)

A privacy issue

For years, student privacy advocates and civil rights groups have warned about the potential harms of Gaggle and similar surveillance companies. Fourteen-year-old Teeth Logsdon-Wallace, a Minneapolis high school student, fell under Gaggle's watchful eye during the pandemic. Last September, he used a class assignment to write about a previous suicide attempt and explained how music helped him cope after being hospitalized. Gaggle flagged the assignment to a school counselor, a move the teen called a privacy violation.

He said it's "just really freaky" that moderators can review students' sensitive materials in public places like at basketball games, but he ultimately felt bad for the contractors on Gaggle's content review team.

"Not only is it violating the privacy rights of students, which is bad for our mental health, it's traumatizing these moderators, which is bad for their mental health," he said. Relying on low-wage workers with high turnover, limited training and no background in mental health, he said, can have consequences for students.

"Bad labor conditions don't just affect the workers," he said. "It affects the people they say they are helping."

Gaggle cannot prohibit contractors from reviewing students' private communications in public settings, Heather Durkac, the senior vice president of operations, said in a statement.

"However, the contractors know the nature of the content they will be reviewing," Durkac said. "It is their responsibility and part of their presumed good and reasonable work ethic to not be conducting these content reviews in a public place."

Gaggle's former contractors also weighed students' privacy rights. Heyman said she "went back and forth" on those implications for several days before applying to the job. She ultimately decided that Gaggle was acceptable since its monitoring is limited to school-issued technology.

"If you don't want your stuff looked at, you can use Hotmail, you can use Gmail, you can use Yahoo, you can use whatever else is out there," she said. "As long as they're being told and their parents are being told that their stuff is going to be monitored, I feel like that is OK."

Logsdon-Wallace and his mother said they didn't know Gaggle existed until his classroom assignment got flagged to a school counselor.

Meanwhile, the anonymous contractor said that chat conversations between students picked up by Gaggle's algorithm helped him understand the effects that surveillance can have on young people.

"Sometimes a kid would use a curse word and another kid would be like, 'Dude, shut up, you know they're watching these things,'" he said. "These kids know that they're being looked in on," even if they don't realize their observer is a contractor working from the couch in his living room. "And to be the one that is doing that, that is basically fulfilling what these kids are paranoid about, it just felt awful."

If you are in crisis, please call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255), or contact the Crisis Text Line by texting TALK to 741741.

Disclosure: Campbell Brown is the head of news partnerships at Facebook. Brown co-founded The 74 and sits on its board of directors.
