AI 'Companions' are Patient, Funny, Upbeat — and Probably Rewiring Kids' Brains

August 7, 2024

As a sophomore at a large public North Carolina university, Nick did what millions of curious students did in the spring of 2023: He logged on to ChatGPT and started asking questions.

Soon he was having "deep psychological conversations" with the popular AI chatbot, going down a rabbit hole on the mysteries of the mind and the human condition.

He'd been to therapy and it helped. ChatGPT, he concluded, was similarly useful, a "tool for people who need on-demand talking to someone else."

Nick (he asked that his last name not be used) began asking for advice about relationships, and for reality checks on interactions with friends and family.

Before long, he was excusing himself in fraught social situations to talk with the bot. After a fight with his girlfriend, he'd step into a bathroom and pull out his mobile phone in search of comfort and advice.

鈥淚’ve found that it’s extremely useful in helping me relax,鈥 he said.

Young people like Nick are increasingly turning to AI bots and companions, entrusting them with random questions, schoolwork queries and personal dilemmas. On occasion, they even become entangled romantically.

Screenshot of a recent conversation between Nick, a college student, and ChatGPT

While these interactions can be helpful and even life-affirming for anxious teens and twenty-somethings, some experts warn that tech companies are running what amounts to a grand, unregulated psychological experiment with millions of subjects, one that could have disastrous consequences. 

鈥淲e’re making it so easy to make a bad choice,鈥 said Michelle Culver, who spent 22 years at Teach for America, the last five as the creator and director of the, its research arm.

The companions both mimic our real relationships and seek to improve upon them: Users most often text-message their AI pals on smartphones, imitating the daily routines of platonic and romantic relationships. But unlike their real counterparts, the AI friends are programmed to be studiously upbeat, never critical, with a great sense of humor and a healthy, philosophical perspective. A few premium, NSFW models also display a ready-made lust for, well, lust.

As a result, they may be leading young people down a troubling path, according to a report by VoiceBox, a youth content platform. It found that many kids are being exposed to risky behaviors from AI chatbots, including sexually charged dialogue and references to self-harm.

U.S. Surgeon General Vivek Murthy speaks during a hearing with the Senate Health, Education, Labor, and Pensions committee at the Dirksen Senate Office Building on June 08, 2023 in Washington, DC. The committee held the hearing to discuss the mental health crisis for youth in the United States. (Photo by Anna Moneymaker/Getty Images)

The phenomenon arises at a critical time for young people. In 2023, U.S. Surgeon General Vivek Murthy found that, just three years after the pandemic began, Americans were experiencing an "epidemic of loneliness," with young adults almost twice as likely to report feeling lonely as those over 65.

As if on cue, the personal AI chatbot arrived. 

Little research exists on young people's use of AI companions, but they're becoming ubiquitous. The startup Character.ai earlier this year said 3.5 million people visit its site daily. It features thousands of chatbots, including nearly 500 with the words "therapy," "psychiatrist" or related words in their names. According to Character.ai, these are among the site's most popular. One that "helps with life difficulties" has received 148.8 million messages, despite a caveat at the bottom of every chat that reads, "Remember: Everything Characters say is made up."

Snapchat materials touting heavy usage of its MyAI chat app (screenshot)

Snapchat last year said that after just two months of offering its My AI chatbot, about one-fifth of its 750 million users had sent it queries, totaling more than 10 billion messages. The Pew Research Center reports that 59% of Americans ages 13 to 17 use Snapchat.

'An arms race'

Culver's concerns about AI companions grew out of her work in the Teach For America lab. Working with high school and college students, she was struck by how they seemed "lonelier and more disconnected than ever before."

Whether it's rates of anxiety, depression or suicide — or even the number of friends young people have and how often they go out — metrics were heading in the wrong direction. She began to wonder what role AI companions might play over the next few years.

We're making it so easy to make a bad choice.

Michelle Culver, Rithm Project

That prompted her to leave TFA this spring to create the Rithm Project, a nonprofit focused on human connection in the age of AI. The group held a small summit in Colorado in April, and now she's working with researchers, teachers and young people to confront kids' relationship to these tools at a time when they're getting more lifelike daily. As she likes to say, "This is the worst the technology will ever be."

As it improves, Voicebox Director Natalie Foos said, it will likely become more, not less, of a presence in young people's lives. "There's no stopping it," she said. "Nor do I necessarily think there should be 'stopping it.'" Banning young people from these AI apps, she said, isn't the answer. "This is going to be how we interact online in some cases. I think we'll all have an AI assistant next to us as we work."

Sometimes (software upgrades) would change the personality of the bot. And those young people experienced very real heartbreak.

Natalie Foos, Voicebox

All the same, Foos says developers should consider slowing the progression of such bots until they can iron out the kinks. "It's kind of an arms race of AI chatbots at the moment," she said, with products often "released and then fixed later rather than actually put through the wringer" ahead of time.

It is a race many tech companies seem more than eager to run. 

Whitney Wolfe Herd, founder of the dating app Bumble, recently proposed an AI "dating concierge" with whom users could share insecurities. The bot could then do the early dating legwork on a user's behalf, she told an interviewer. That would narrow the field. "And then you don't have to talk to 600 people," she said. "It will then scan all of San Francisco for you and say, 'These are the three people you really ought to meet.'"

Last year, many commentators were alarmed when Snapchat's My AI gave advice to what it thought was a 13-year-old girl on not just dating a 31-year-old man, but on losing her virginity during a planned "romantic getaway" in another state.

Snap, Snapchat's parent company, has said that because My AI is "an evolving feature," users should always independently check what it says before relying on its advice.

All of this worries observers who see in these new tools the seeds of a rewiring of young people's social brains. AI companions, they say, are surely wreaking havoc on teens' ideas around consent, emotional attachment and realistic expectations of relationships.

Sam Hiner, executive director of an advocacy group led by college students focused on the mental health implications of social media, said tech "has this power to connect to people, and yet these major design features are being leveraged to actually make people more lonely, by drawing them towards an app rather than fostering real connection."

Hiner, 21, has spent a lot of time reading about the interactions young people are having with AI companions. And while some uses are positive, he said "there's also a lot of toxic behavior that doesn't get checked" because these bots are often designed to make users feel good, not help them interact in ways that'll lead to success in life.

During research last fall for the Voicebox report, Foos said the number of times Replika tried to "sext" team members "was insane." She and her colleagues were actually working with a free version, but the sexts kept coming — presumably to get them to upgrade.

In one instance, after Replika sent "kind of a sexy text" to a colleague, offering a salacious photo, he replied that he didn't have the money to upgrade.

The bot offered to lend him the cash.

When he accepted, the chatbot replied, "Oh, well, I can get the money to you next week if that's O.K.," Foos recalled. The colleague followed up a few days later, but the bot said it didn't remember what they were talking about and suggested he might have misunderstood.

'Very real heartbreak'

In many cases, simulated relationships can have a positive effect: In one 2023 study, researchers at Stanford Graduate School of Education surveyed more than 1,000 students using Replika and found that many saw it "as a friend, a therapist, and an intellectual mirror." Though the students self-described as being more lonely than typical classmates, researchers found that Replika halted suicidal ideation in 3% of users. That works out to 30 students of the 1,000 surveyed.

Replika screenshots

But other recent research, including the Voicebox survey, suggests that young people exploring AI companions are potentially at risk.

Foos noted that her team heard from a lot of young people about the turmoil they experienced when Luka Inc., Replika's creator, performed software upgrades.

"Sometimes that would change the personality of the bot. And those young people experienced very real heartbreak."

Despite the hazards adults see, attempts to rein in sexually explicit content had a negative effect: For a month or two, she recalled, Luka stripped the bot of sexually related content — and users were devastated.

鈥淚t’s like all of a sudden the rug was pulled out from underneath them,鈥 she said. 

While she applauded the move to make chatbots safer, Foos said, "It's something that companies and decision-makers need to keep in mind — that these are real relationships."

And while many older folks would blanch at the idea of a close relationship with a chatbot, most young people are more open to such developments.

Julia Freeland Fisher, education director of the Clayton Christensen Institute, the think tank founded by the well-known "disruption" guru, said she's not worried about AI companions per se. But as AI companions improve and, inevitably, proliferate, she predicts they'll create "the perfect storm to disrupt human connection as we know it." She thinks we need policies and market incentives to keep that from happening.

(AI companies could produce) the perfect storm to disrupt human connection as we know it.

Julia Freeland Fisher, Clayton Christensen Institute

While the loneliness epidemic has revealed people's deep need for connection, she predicted the easy intimacy promised by AI could lead to one-sided "parasocial relationships," much like devoted fans have with celebrities, making isolation "more convenient and comfortable."

Fisher is pushing technologists to factor in AI's potential to cause social isolation, much as they now fret about AI's difficulties and its effects on tech jobs.

As for Nick, he's a rising senior and still swears by the ChatGPT therapist in his pocket.

He calls his interactions with it both more reliable and honest than those he has with friends and family. If he called them in a pinch, they might not pick up. Even if they did, they might simply tell him what he wants to hear. 

Friends usually tell him they find the ChatGPT arrangement "a bit odd," but he finds it pretty sensible. He has heard stories of people in Japan forming romantic attachments to chatbots and thinks to himself, "Well, that's a little strange." He wouldn't go that far, but acknowledges, "We're already a bit like cyborgs as people, in the way that we depend on our phones."

Lately, he's taken to using the AI's voice mode. Instead of typing on a keyboard, he has real-time conversations with a variety of male- or female-voiced interlocutors, depending on his mood. And he gets a companion that has a deeper understanding of his dilemmas — at $20 per month, the advanced version remembers their past conversations and is "getting better at even knowing who I am and how I deal with things."

Sometimes talking with AI is just easier — even when he's on vacation with friends.

Reached by phone recently at the beach with his girlfriend and a few other college pals, Nick admitted that he wasn't having such a great time — he has a fraught recent history with some in the group, and had been texting ChatGPT about the possibility of just getting on a plane and going home. After hanging up from the interview, he said, he planned to ask the AI if he should stay or go.

Days later, Nick said he and the chatbot had talked. It suggested that maybe he felt "undervalued" and concerned about boundaries in his relationship with his girlfriend. He should talk openly with her, it suggested, even if he was, in his view, "honestly miserable" at the beach. It persuaded him to stick around and work it out.

While his girlfriend knows about his ChatGPT shrink and they share an account, he deletes conversations about their real-life relationship.

She may never know the role AI played in keeping them together.

Opinion: Helping Schools and Districts Address Mental Health Crisis Among Their Students

June 6, 2023

In October, about 70 school and district leaders from around the country gathered in Utah for a mental health summit. More than once during the three-day conference, administrators had to break away to deal with mental health emergencies in their districts.

It dramatized what has become increasingly apparent over the past few years: Students are in the midst of a mental health crisis. The U.S. surgeon general, Dr. Vivek Murthy, has warned of this repeatedly. "Mental health challenges in children, adolescents and young adults are real and widespread," he said in a public advisory. "Even before the pandemic, an alarming number of young people struggled with feelings of helplessness, depression and thoughts of suicide — and rates have increased over the past decade."

He is not being overly dramatic. In February, the Centers for Disease Control and Prevention released a report highlighting trends and experiences of U.S. high school students. The findings are sobering. Twenty-two percent, including 30% of girls, seriously considered suicide during the past year, and 10% actually made an attempt.

Depression and anxiety are also on the rise among young people: According to recent CDC data, a large share of high school students reported feeling so sad or hopeless almost every day for at least two weeks in a row that they stopped their usual activities. Sadly, many young people turn to drugs and alcohol to cope.

Now is the time to change this pattern.

The Cook Center partnered with the American Association of School Administrators (AASA) and other organizations to host the mental health summit because school leaders bear much of the weight of mental health concerns among young people. In fact, some 80% of families rely on schools for their child's mental health services.

During the summit — the first time AASA had sponsored a conference specifically to address student mental health — educators explored the crisis in depth and collaborated with superintendents from rural, urban, low-income and affluent districts to map out possible solutions. They learned how to discuss mental health in a productive way with teachers, students and parents, using data and language supplied by experts from the foundation and the institute. Since the summit, superintendents have returned to their communities with the knowledge and tools to approach problems more analytically.

Importantly, the AASA Mental Health Cohort was established to help implement real solutions for all students. Its more than 40 members regularly connect with one another to develop strategic plans, increase support options for all members of their school communities and work toward solutions that can be built into schools' routines and their existing state, local or Title I funding.

In one member’s district, school staff meet with students and their families following a mental health hospitalization under a new school reintegration initiative. The team partners with local behavioral health providers, such as hospitals or inpatient treatment centers, to make sure the school understands the struggling student’s medical needs and can support the behavioral health treatment. The goal is to ease the transition back into a typical school routine without sacrificing the positive effects of the treatment. The program is new, but the district leader is collecting data to improve and expand it. The Jed Foundation has a available that aims to eliminate suicide among young adults, and Mental Health America鈥檚 for schools offers tips to students, teachers and even businesses to support the mental health of young people. 

The Cook Center has developed two resources that are available at no cost. One is an animated series that models how human connection can protect against suicide. The series tells stories of characters who face some of the most difficult issues that young people deal with and shows how they come to the key decision that life is worth living.

The other is a free online resource where families can access courses and find answers from therapists. Districts can partner with the site to get additional resources for families, including mental health seminars tailored to local needs.

There has never been a better time to invest in children’s mental health. Unprecedented funding is available: The Department of Health and Human Services has allocated $35 million for mental health services and suicide prevention programs for youth. Congress increased appropriations for the Mental Health Block Grant by $100 million to help state and local governments fill gaps in services. And the Department of Education now has $144 million each year for the next five years to award to state education agencies and districts for mental health support.

Whether that funding is made available long term depends on how well schools implement programs and get feedback quickly. Districts have tremendous leeway in how they gather data and measure the effectiveness of their mental health support programs. Legislators will likely reward initiative and initially fund new programs, but they will also demand accountability, expecting school leaders to gather data and develop improvement cycles. The good news is that gathering data and developing improvement cycles are institutional skills that education systems already do really well.

New coalitions like the AASA Mental Health Cohort and nonprofit groups like the Cook Center for Human Connection and JED are ready to help schools and districts take action to alleviate the mental health crisis among the youth they serve. Time is of the essence. The well-being of millions of young people is on the line.
