Another AI Side Effect: Erosion of Student-Teacher Trust
Mon, 22 Sep 2025

William Liang was sitting in chemistry class one day last spring, listening to a teacher deliver a lecture on "responsible AI use," when he suddenly realized what his teachers are up against.

The talk was about a big take-home essay, and Liang, then a sophomore at a Bay Area high school, recalled that it covered the basics: the rubric for grading, as well as suggestions meant to keep students honest in their use of generative AI – they should treat it as a "thinking partner" and brainstorming tool.

As he listened, Liang glanced around the classroom and saw that several classmates, laptops open, had already leaped ahead several steps, generating entire drafts of their essays.

Liang said his generation doesn't engage in moral hand-wringing about AI. "For us, it's simply a tool that enables us not to have to think for ourselves."

But with AI's awesome power comes a side effect that many would rather not consider: It's killing the trust between teachers and students.

When students can cheaply and easily outsource their work, he said, why value a teacher鈥檚 feedback? And when teachers, relying on sometimes unreliable AI-detection software, believe their students are taking such major shortcuts, the relationship erodes further.

It's an issue that researchers are just beginning to study, with results that suggest an imminent shakeup in student-teacher relationships: AI, they say, is forcing teachers to rethink how they view students, assessments and, to a larger extent, learning itself.

If you ask Liang, now a junior and an experienced opinion writer – he has penned pieces for The Hill, The San Diego Union-Tribune and the conservative Daily Wire – AI has already made school more transactional, stripping many students of their desire to learn in favor of simply completing assignments.

"The incentive system for students is to just get points," he said in an interview.

While much of the attention of the past few years has focused on how teachers can detect AI-generated work and put a stop to it, a few researchers are beginning to look at how AI affects student-teacher relationships.

Researcher Jiahui Luo of the Education University of Hong Kong found that college students in many cases resent the lack of "two-way transparency" around AI. While they're required to declare their AI use and even submit chat records in a few cases, Luo wrote, the same level of transparency "is often not observed from the teachers." That produces a "low-trust environment," where students don't feel safe to freely explore AI.

In 2024, after being asked by colleagues at Drexel University to help resolve an AI cheating case, researcher Tim Gorichanaz, who teaches at the university, analyzed college students' online comments spanning December 2022 to June 2023, shortly after OpenAI unleashed ChatGPT onto the world. He found that many students were beginning to feel the technology was testing the trust they felt from instructors, in many cases eroding it – even if they didn't rely on AI.

While many students said instructors trusted them and would offer them the benefit of the doubt in suspected cases of AI cheating, others were surprised when they were accused nonetheless. That damaged the trust relationship.

For many, it meant they'd have to work on future assignments "defensively," Gorichanaz wrote, anticipating cheating accusations. One student even suggested, "Screen recording is a good idea, since the teacher probably won't have as much trust from now on." Another complained that their instructor now implicitly trusted AI plagiarism detectors "more than she trusts us."

In an interview, Gorichanaz said instructors' trust in AI detectors is a big problem. "That's the tool that we're being told is effective, and yet it's creating this situation of mutual distrust and suspicion, and it makes nobody like each other. It's like, 'This is not a good environment.'"

For Gorichanaz, the biggest problem is that AI detectors simply aren't that reliable – for one thing, they are more likely to flag the papers of English language learners as being written by AI, he said. In one Stanford University study, they "consistently" misclassified non-native English writing samples as AI-generated, while accurately identifying the provenance of writing samples by native English speakers.

"We know that there are these kinds of biases in the AI detectors," Gorichanaz said. That potentially puts "a seed of doubt" in the instructor's mind, when they should simply be using other ways to guide students' writing. "So I think it's worse than just not using them at all."

'It is an enormous wedge in the relationship'

Liz Shulman, an English teacher at Evanston Township High School near Chicago, recently had an experience similar to Liang's: One of her students covertly relied on AI to help write an essay on Romeo and Juliet, but forgot to delete part of the prompt he'd used. Next to the essay's title were the words, "Make it sound like an average ninth-grader."

Asked about it, the student simply shrugged, Shulman recalled in a piece she co-authored with Liang.

In an interview, Shulman said that just three weeks into the new school year, in late August, she had already had to sit down with another student who used AI for an assignment. "I pretty much have to assume that students are going to use it," she said. "It is an enormous wedge in the relationship, which is so important to build, especially this time of the year."

Her take: School has transformed since the long COVID lockdowns of 2020, with students recalibrating their expectations. It's less relational, she said, and "much more transactional."

During lockdowns, she said, Google "infiltrated every classroom in America – it was how we pushed out documents to students." Five years later, if students miss a class because of illness, their "instinct" now is simply to check the school's widely used online management tool "rather than coming to me and saying, 'Hey, I was sick. What did we do?'"

That鈥檚 a bitter pill for an English teacher who aspires to shift students鈥 worldviews and beliefs 鈥 and who relies heavily on in-class discussions.

鈥淭hat’s not something you can push out on a Google doc,鈥 Shulman said. 鈥淭hat takes place in the classroom.鈥

In a sense, she said, AI is contracting where learning can reliably take place: If students can simply turn off their thinking at home and rely on AI tools to complete assignments, that leaves the classroom as the sole place where learning occurs. 

"Because of AI, are we only going to 'do school' while we're in school?" she asked.

'We forget all the stuff we learned before'

Accounts of teachers resigned to students cheating with AI are "concerning" and stand in contrast to what a solid body of research says about the importance of teacher agency, said Brooke Stafford-Brizard, senior vice president for Innovation and Impact at the Carnegie Foundation.

Teachers, she said, "are not just in a classroom delivering instruction – they're part of a community. Really wonderful school and system leaders recognize that, and they involve them. They're engaged in decision making. They have that agency."

One of the main principles of Carnegie's blueprint for improving secondary education is a "culture of trust," suggesting that schools nurture supportive learning and "positive relationships" for students and educators.

"Education is a deeply social process," Stafford-Brizard said. "Teaching and learning are social, and schools are social, and so everyone contributing to those can rely on that science of relational trust, the science of relationships. We can pull from that as intentionally as we pull from the science of reading."

Gorichanaz, the Drexel scholar, said that for all of its newness, generative AI presents educators with what's really an old challenge: how to understand and prevent cheating.

"We have this tendency to think AI changed the entire world, and everything's different and revolutionized and so on," he said. "But it's just another step. We forget all the stuff we learned before."

Specifically, research going back decades identifies four key reasons why students cheat: They don't understand the relevance of an assignment to their lives, they're under time pressure, they're intimidated by its high stakes, or they don't feel equipped to succeed.

Even in the age of AI, said Gorichanaz, teachers can lessen the allure of taking shortcuts by solving for these conditions – figuring out, for instance, how to intrinsically motivate students to study by helping them connect with the material for its own sake. They can also help students see how an assignment will help them succeed in a future career. And they can design courses that prioritize deeper learning and competence.

To alleviate testing pressure, teachers can make assignments more low-stakes and break them up into smaller pieces. They can also give students more opportunities in the classroom to practice the skills and review the knowledge being tested.

And teachers should talk openly about academic honesty and the ethics of cheating.

"I've found in my own teaching that if you approach your assignments in that way, then you don't always have to be the police," he said. Students are "more incentivized, just by the system, to not cheat."

With writing, teachers can ask students to submit smaller "checkpoint" assignments, such as outlines and handwritten notes and drafts that classmates can review and comment on. They can also rely more on oral exams and handwritten blue book assignments.

Shulman, the Chicago-area English teacher, said she and her colleagues are not only moving back to blue books, but doing "a lot more on paper than we ever used to." They're asking students to close their laptops in class and assigning less work to be completed outside of class.

As for Liang, the high school junior, he said his new English teacher expects all assignments to be handwritten. But he also noted that a few teachers have fallen under the spell of ChatGPT themselves, using it for class presentations. As one teacher last spring clicked through a slide show, he said, "It was glaringly obvious, because all kids are AI experts, and they can just instantly sniff it out."

He added, "There was a palpable feeling of distrust in the room."

Students Increasingly Rely on Chatbots, but at What Cost?
Sat, 02 Aug 2025

Students don't have the same incentives to talk to their professors – or even their classmates – anymore. Chatbots like ChatGPT, Gemini and Claude have given them a new path to self-sufficiency. Instead of asking a professor for help on a paper topic, students can go to a chatbot. Instead of forming a study group, students can ask AI for help. These chatbots give them quick responses, on their own timeline.


For students juggling school, work and family responsibilities, that ease can seem like a lifesaver. And maybe turning to a chatbot for homework help here and there isn't such a big deal in isolation. But every time a student decides to ask a question of a chatbot instead of a professor or peer or tutor, that's one fewer opportunity to build or strengthen a relationship, and the human connections students make on campus are among the most important benefits of college.

Julia Freeland-Fisher studies how technology can help or hinder student success at the Clayton Christensen Institute. She said the consequences of turning to chatbots for help can compound.

"Over time, that means students have fewer and fewer people in their corner who can help them in other moments of struggle, who can help them in ways a bot might not be capable of," she said.

As colleges further embed ChatGPT and other chatbots into campus life, Freeland-Fisher warns lost relationships may become a devastating unintended consequence.

Asking for help

Christian Alba said he has never turned in an AI-written assignment. Alba, 20, attends College of the Canyons, a large community college north of Los Angeles, where he is studying business and history. And while he hasn鈥檛 asked ChatGPT to write any papers for him, he has turned to the technology when a blank page and a blinking cursor seemed overwhelming. He has asked for an outline. He has asked for ideas to get him started on an introduction. He has asked for advice about what to prioritize first.

"It's kind of hard to just start something fresh off your mind," Alba said. "I won't lie. It's a helpful tool." Alba has wondered, though, whether turning to ChatGPT with these sorts of questions represents an overreliance on AI. But Alba, like many others in higher education, worries primarily about AI use as it relates to academic integrity, not social capital. And that's a problem.

Jean Rhodes, a psychology professor at the University of Massachusetts Boston, has spent decades studying the way college students seek help on campus and how the relationships formed during those interactions end up benefiting the students long term. Rhodes doesn't begrudge students integrating chatbots into their workflows, as many of their professors have, but she worries that students will get inferior answers to even simple-sounding questions, like, "How do I change my major?"

A chatbot might point a student to the registrar's office, Rhodes said, but had a student asked the question of an advisor, that person may have asked important follow-up questions – why the student wants the change, for example – which could lead to a deeper conversation about the student's goals and roadblocks.

"We understand the broader context of students' lives," Rhodes said. "They're smart but they're not wise, these tools."

Rhodes and one of her former doctoral students, Sarah Schwartz, created a program called Connected Scholars to help students understand why it's valuable to talk to professors and have mentors. The program helped them hone their networking skills and understand what people get out of their networks over the course of their lives – namely, social capital.

Connected Scholars is offered as a semester-long course at UMass Boston, and a forthcoming paper examines outcomes over the last decade, finding that students who take the course are three times more likely to graduate. Over time, Rhodes and her colleagues discovered that the key to the program's success is getting students past an aversion to asking others for help.

Students will make a plethora of excuses to avoid asking for help, Rhodes said, ticking off a list of them: "'I don't want to stand out,' 'I don't want people to realize I don't fit in here,' 'My culture values independence,' 'I shouldn't reach out,' 'I'll get anxious,' 'This person won't respond.' If you can get past that and get them to recognize the value of reaching out, it's pretty amazing what happens."

Connections are key

Seeking human help doesn't only leave students with the resolution to a single problem; it gives them a connection to another person. And that person, down the line, could become a friend, a mentor or a business partner – a "strong tie," as social scientists describe such central figures in a person's network. They could also become a "weak tie" whom a student may not see often but who could, importantly, still offer crucial help one day.

Daniel Chambliss, a retired sociologist from Hamilton College, emphasized the value of relationships in his 2014 book, "How College Works," co-authored with Christopher Takacs. Over the course of their research, the pair found that the key to a successful college experience boiled down to relationships, specifically two or three close friends and one or two trusted adults. Hamilton College goes out of its way to make sure students can form those relationships, structuring work-study to get students into campus offices and around faculty and staff, making room for students of varying athletic abilities on sports teams, and more.

Chambliss worries that AI-driven chatbots make it too easy to avoid interactions that can lead to important relationships. "We're suffering epidemic levels of loneliness in America," he said. "It's a really major problem, historically speaking. It's very unusual, and it's profoundly bad for people."

As students increasingly turn to artificial intelligence for help and even casual conversation, Chambliss predicted it will make people even more isolated: "It's one more place where they won't have a personal relationship."

In fact, a study by researchers at the MIT Media Lab and OpenAI found that the most frequent users of ChatGPT – power users – were more likely to be lonely and isolated from human interaction.

"What scares me about that is that Big Tech would like all of us to be power users," said Freeland-Fisher. "That's in the fabric of the business model of a technology company."

Yesenia Pacheco is preparing to re-enroll in Long Beach City College for her final semester after more than a year off. Last time she was on campus, ChatGPT existed, but it wasn't widely used. Now she knows she's returning to a college where ChatGPT is deeply embedded in students' as well as faculty and staff's lives, but Pacheco expects she'll go back to her old habits – going to her professors' office hours and sticking around after class to ask them questions. She sees the value.

She understands why others might not. Today鈥檚 high schoolers, she has noticed, are not used to talking to adults or building mentor-style relationships. At 24, she knows why they matter.

"A chatbot," she said, "isn't going to give you a letter of recommendation."

This article was republished under license.

From English to Automotive Class, Teachers Assign Projects to Combat AI Cheating
Fri, 13 Jun 2025

Kids aren't as sneaky as they think they are.

They do try, as Holly Distefano has seen in her middle school English language arts classes. When she poses a question to her seventh graders over her school's learning platform and watches the live responses roll in, there are times when too many are suspiciously similar. That's when she knows students are using an artificial intelligence tool to write an answer.

"I really think that they have become so accustomed to it, they lack confidence in their own writing," Distefano, who teaches in Texas, says. "In addition to just so much pressure on them to be successful, to get good grades, really a lot is expected of them."


Distefano is sympathetic – but still expects better from her students.

"I've shown them examples of what AI is – it's not real," she says. "It's like margarine to me."

Educators have been trying to curb the use of AI-assisted cheating since ChatGPT exploded onto the scene. 

It's a formidable challenge. For instance, there's a genre of social media content reserved for tech influencers who rack up thousands of views and likes teaching students how to most effectively use AI programs to generate their essays, including step-by-step instructions on bypassing AI detectors. And the search term for software that purports to "humanize" AI-generated content spiked in the fall, only to fall sharply before hitting the peak of its popularity around the end of April.

While the overall proportion of students who say they've cheated hasn't changed dramatically, students also say AI has made shortcuts easier than ever.

But there may be a solution on the horizon, one that will help ensure students have to put more effort into their schoolwork than entering a prompt into a large language model.

Teachers are transitioning away from question-and-answer assignments or straightforward essays – in favor of projects.

It's not especially high-tech or even particularly ingenious. Yet proponents say it's a strategy that pushes students to focus on problem-solving while instructing them on how to use AI ethically.

Becoming 'AI-Proof'

During this past school year, Distefano says her students' use of AI to cheat on their assignments has reached new heights. She's spent more time coming up with ways to stop or slow their ability to plug questions and assignments into an AI generator, including by giving out hard-copy work.

It used to mainly be a problem with take-home assignments, but Distefano has increasingly seen students use AI during class. Kids have long been astute at getting around whatever firewalls schools put on computers, and their desire to circumvent AI blockers is no different. 

Between schoolwork, sports, clubs and everything else middle schoolers are juggling, Distefano can see why they're tempted by the allure of a shortcut. But she worries about what her students are missing out on when they avoid the struggle that comes with learning to write.

"To get a student to write is challenging, but the more we do it, the better we get," she says. "But if we're bypassing that step, we're never going to get that confidence. The downfall is they're not getting that experience, not getting that feeling of, 'This is something I did.'"

Distefano is not alone in trying to beat back the onslaught of AI cheating. Blue books, which college students use to complete exams by hand, have had a resurgence as professors try to eliminate the risk of AI intervention, reports The Wall Street Journal.

Richard Savage, the superintendent of California Online Public Schools, says AI cheating is not a major issue among his district's students. But Savage says it's a simple matter for teachers to identify when students do turn to AI to complete their homework. If a student does well in class but fails their thrice-yearly "diagnostic exams," that's a clear sign of cheating. It would also be tough for students to fake their way through live, biweekly progress meetings with their teachers, he adds.

Savage says educators in his district will spend the summer working on making their lesson plans "AI-proof."

"AI is always changing, so we're always going to have to modify what we do," he says. "We're all learning this together. The key for me is not to be AI-averse, not to think of AI as the enemy, but to think of it as a tool."

'Trick Them Into Learning'

Doing that requires teachers to work a little differently. 

Leslie Eaves, program director for project-based learning at the Southern Regional Education Board, has been devising solutions for educators like Distefano and Savage. 

Eaves authored the board's report on the subject, released earlier this year. Rather than exile AI, the report recommends that teachers use AI to enhance classroom activities that challenge students to think more deeply and critically about the problems they're presented with.

It also outlines what students need to become what Eaves calls "ethical and effective users" of artificial intelligence.

"The way that happens is through creating more cognitively demanding assignments, constantly thinking in our own practice, 'In what way am I encouraging students to think?'" she says. "We do have to be more creative in our practice, to try and do some new things to incorporate more student discourse, collaborative hands-on assignments, peer review and editing, as a way to trick them into learning because they have to read someone else's work."

As an example, Eaves offers an English class lesson on "The Odyssey": Students could focus on reading and discussion, use pen and paper to sketch out the plot structure, and use AI to create an outline for an essay based on their work, before moving on to peer-editing their papers.

Eaves says that the teachers she's working with to take a project-based approach to their lesson plans aren't panicking about AI but rather seem excited about the possibilities.

And it's not only English teachers who are looking to shift their instruction so that AI is less a tool for cheating and more a tool that helps students solve problems. She recounts that an automotive teacher realized he had to change his teaching strategy because when his students adopted AI, they "stopped thinking."

"So he had to reshuffle his plan so kids were redesigning an engine for use in racing, [figuring out] how to upscale an engine in a race car," Eaves says. "AI gave you a starting point – now what can we do with it?"

When it comes to getting through to students on AI ethics, Savage says the messaging should be a combination of digital citizenship and the practical ways that using AI to cheat will stunt students' opportunities. Students with an eye on college, for example, give up the opportunity to demonstrate their skills and hurt their competitiveness for college admissions and scholarships when they turn over their homework to AI.

Making the shift to more project-based classrooms will be a heavy lift for educators, he says, but districts will have to change, because generative AI is here to stay. 

"The important thing is we don't have the answers. I'm not going to pretend I do," Savage says. "I know what we can do, when we can get there, and then it'll probably change. The answer is having an open mind and being willing to think about the issue and change and adapt."

AI Skeptic Creates Chatbot to Help Teachers Design Courses
Thu, 27 Mar 2025

While many educators spent the past two years fretting that artificial intelligence is killing student writing, upending person-to-person tutoring and generally wreaking havoc on scholastic inquiry, the well-known thinker and ed tech expert Michael Feldstein has been quietly exploring something completely different.

For more than a year, he has led an effort with a group of about 70 educators online to build what's essentially a chatbot with one job: to guide teachers, step-by-step, through the process of designing their own courses – a privilege previously reserved for just a few instructors at elite institutions.


The experimental software, dubbed the AI Learning Design Assistant, or ALDA, has yet to hit the market. But when it does, Feldstein said, it will be free. With any luck, it could mark a new era, offering teachers at all levels an easy way to design their own homegrown coursework, assessments and even curricula at a fraction of the cost demanded by commercial publishers. Feldstein has worked primarily with college instructors, and his work is widely applicable in higher ed. But it's got potential in K-12 education as well.

He's pushing to democratize instructional design, a little-known academic field in which professional designers build courses by working backwards: They interview teachers to help them drill down to what's important, then create courses based on the findings.

When it's ready, he said, ALDA could well shake up the teaching profession, making off-the-shelf AI behave like a personal instructional designer for virtually every teacher who wants one.

And for the record, Feldstein said, there's an acute shortage of such designers, so this particular iteration of AI likely won't put anyone out of a job.

'What is this good for?'

Feldstein is well-known in the ed tech community, having worked over the years at Oracle, Cengage Learning and elsewhere. A one-time assistant director of the State University of New York's Learning Network, he has more recently garnered a wide audience with his blog – required reading for college instructors and ed tech experts.

Over the past few years, Feldstein has likened tools such as ChatGPT and AI image generators like Midjourney to "toys in both good and bad ways." They invite people to play and give players the ability to explore what's basically cutting-edge AI. "It's fun. And, like all good games, you learn by playing," he wrote.

But he cautions that when they're asked to do something specific, they "tend to do weird things" such as return strange results and, on occasion, hallucinate.

As a longtime observer of ed tech, Feldstein has always stepped back and asked: What is this good for?

"AI is interesting because there are many possible answers, and those answers change on a monthly basis as the capabilities change," he said. "That makes the question harder to answer. Nevertheless, we need to answer it."

ALDA's focus, he said, has always been on helping participants think more deeply about what teachers do: The AI probes students to find out what they know, then fills in the gaps.

"As an educator, if I ask you a question, I'm trying to understand if you know something," he said. "So my question is directly related to a learning objective."

By training, teachers naturally modify their questions to help figure out if students have misconceptions. They circle around the topic, offering clues, hints and feedback to help students home in on what they know. But they don't simply give away the answer.

Over the course of the year, he and colleagues have broken down the various aspects of their work, including what they'd outsource if they had an assistant or "junior learning designer" at their side.

Excerpts of a conversation between an AI chatbot and a teacher who is in the process of designing a course. The open-source tool, AI Learning Design Assistant, or ALDA, is being co-developed by educator and blogger Michael Feldstein along with a small group of college instructors. (Courtesy of Michael Feldstein)

The AI starts simply, asking "Who are your students? What is your course about? What are the learning goals? What's your teaching style?" It moves on from there: "What are the learning objectives for this lesson? How do you know when students have achieved those objectives? What are some common misconceptions they have?"

Eventually teachers can begin designing the course and its assessments with a clear focus on goals and, in the end, their own creativity. 

Feldstein holds decidedly modest goals for the project.

鈥淭he idea that we’re going to somehow invent a better AI model than these companies that are spending billions of dollars is crazy,鈥 Feldstein said. But making course design accessible 鈥渋s very doable and very useful.鈥 

He has intentionally brought together a diverse group of instructors that includes both heavy AI users and skeptics. Among them: Paul Wilson, a longtime professor of religion and philosophy at Shaw University in Raleigh, N.C. Though Wilson has taught there for 32 years, he has dabbled in AI over the past few years as it reared its head in classes, assignments and faculty meetings. 

He came away from Feldstein's sessions over the past few months with the outlines of not one but two courses: a world religion survey, which he designed last summer, and a course in pastoral care. The latter, he said, is a "specialty class" for ministers-in-training who are getting their first taste of interacting with congregation members.

鈥淭hey’re doing field work,鈥 he said, 鈥渁nd this particular class is going to cover the functions they would have if they were serving in pastoral ministry.鈥 

The course will cover everything from the business of running a congregation to the teaching and counseling duties of a pastor and the "prophetic" role – preaching and teaching the Bible, shepherding the congregation and offering spiritual guidance.

Wilson said the AI let him tweak the course design in response to test users' suggestions. "By the end, my experience was that I was working with something valuable," he said. He is offering the class this semester.

"I got a very good course design, with all the parameters that I was looking for," he said.

Geneva Dampare, director of strategy and operations at the United Negro College Fund, said the organization invited six instructors from five HBCUs to Feldstein's workshop. Dampare, who has an instructional design background, joined as well.

Many faculty at these institutions, she said, don't see AI as the menace that other instructors do. For them, it's a kind of equalizer at colleges that don't typically offer perks like instructional designers.

But by the end of the process last November, Dampare said, many instructors "could comfortably speak about AI, speak about how they are integrating the ALDA tool into the curriculum development that they're doing for next semester or future semesters."

Could Massachusetts AI Cheating Case Push Schools to Refocus on Learning? (Oct. 31, 2024)

A Massachusetts family is awaiting a judge's ruling in a federal lawsuit that could determine their son's future. To a few observers, it could also push educators to limit the use of generative artificial intelligence in school.

To others, it's simply a case of helicopter parents gone wild.

The case, filed last month, tackles key questions of academic integrity, the college admissions arms race and even the purpose of school in an age when students can outsource onerous tasks like thinking to a chatbot.




While its immediate outcome will largely serve just one family – the student's parents want a grade changed so their son can apply for early admission to elite colleges – the case could ultimately prompt school districts nationwide to develop explicit policies on AI.

If the district, in a prosperous community on Boston's South Shore, is forced to change the student's grade, that could also prompt educators to focus more clearly on the knife's edge of AI's promises and threats, confronting a key question: Does AI invite students to focus on completing assignments rather than actual learning?

"When it comes right down to it, what do we want students to do?" asked John Warner, a well-known writing coach and author. "What do we want them to take away from their education beyond a credential? Because this technology really does threaten the integrity of those credentials. And that's why you see places trying to police it."

'Unprepared in a technology transition'

The facts of the case seem simple enough: The parents of a senior at Hingham High School have sued the school district, saying their son was wrongly penalized as a junior for relying on AI to research and write a history project that he and a partner were assigned in Advanced Placement U.S. History. The teacher used the anti-plagiarism tool Turnitin, which flagged a draft of the essay about NBA Hall of Famer Kareem Abdul-Jabbar's civil rights activism as possibly containing AI-generated material. So she used a "revision history" tool to uncover how many edits the students had made, as well as how long they spent writing. She discovered "many large cut and paste items" in the first draft, suggesting they'd relied on outside sources for much of the text. She ran the draft through two other digital tools that also indicated it had AI-generated content and gave the boys a D on the assignment.

From there, the narrative gets a bit murky. 

On the one hand, the complaint notes, when the student and his partner started the essay last fall, the district didn't have a policy on using AI for such an assignment. Only later did it lay out prohibitions against AI.

The boy's mother, Jennifer Harris, asked a local news outlet last month, "How do you know if you're crossing a line if the line isn't drawn?"

The pair tried to explain that using AI isn't plagiarism, telling teachers there's considerable debate over its use in academic assignments, but that they hadn't tried to pass off others' work as their own.

For its part, the district says Hingham students are trained to know plagiarism and academic dishonesty when they see it. 

District officials declined to be interviewed, but in an affidavit, Social Studies Director Andrew Hoey said English teachers at the school regularly review proper citation and research techniques – and they set expectations for AI use.

Social studies teachers, he said, can justifiably expect that skills taught in English class "will be applied to all Social Studies classes," including AP U.S. History – even if they're not laid out explicitly.

A spokesperson for National History Day, the group that sponsored the assignment, provided 社区黑料 with a link to its rules on AI use, which say students may use AI to brainstorm topic ideas, look for resources, review their writing for grammar and punctuation, and simplify the language of a source to make it more understandable.

They can't use AI to "create elements of your project," such as writing text or creating charts, graphs, images or video.

In March, the school's National Honor Society faculty advisor, Karen Shaw, said the pair's use of AI was "the most egregious" violation of academic honesty she and others had seen in 16 years, according to the lawsuit. The society rejected their applications.

Peter S. Farrell, the family's attorney, said the district "used an elephant gun to slay a mouse," overreacting to what's basically a misunderstanding.

The failing grade on the assignment, as well as the accusation of cheating, kept the boy out of the Honor Society, the lawsuit alleges. Both penalties have limited his chances of getting into top colleges through early decision, as he'd planned this fall.

The student, who goes unnamed in the lawsuit, is "a very, very bright, capable, well-rounded student athlete" with a 4.3 GPA, a "perfect" ACT score and an "almost perfect" SAT score, said Farrell. "If there were a perfect plaintiff, he's it."

They knew that there was no leg to stand on in terms of the severity of that sanction.

Peter S. Farrell, attorney for student

While the boy earned a C+ in the course, he scored a perfect 5 on the AP exam last spring, according to the lawsuit. His exclusion from the Honor Society, Farrell said, "really shouldn't sit right with anybody."

For a public high school to take such a hard-nosed position "simply because they got caught unprepared in a technology transition" doesn't serve anyone's interests, Farrell said. "And it's certainly not good for the students."

Ultimately, the school's own investigation found that over the past two years it had inducted into the Honor Society seven other students who had academic integrity infractions, Farrell said. The student at the center of the lawsuit was allowed to reapply and was inducted on Oct. 15.

"They knew that there was no leg to stand on in terms of the severity of that sanction," Farrell said.

'Districts are trying to take it seriously'

While Hingham didn't adopt a districtwide AI policy until this school year, it's actually ahead of the curve, said Bree Dusseault, the principal and managing director of the Center on Reinventing Public Education, a think tank at Arizona State University. Most districts have been cautious about putting out formal guidance on AI.

Dusseault contributed an affidavit on behalf of the plaintiffs, laying out the fragmented state of AI uptake and guidance. She surveyed more than 1,000 superintendents last year and found that just 5% of districts had policies on AI, with another 31% promising to develop them in the future. Even among CRPE's group of 40 "early adopter" school districts that are exploring AI and encouraging teachers to experiment with it, just 26 had published policies in place.

They're hesitant for a reason, she said: They're trying to figure out what the technology's implications are before putting rules in writing.

"Districts are trying to take it seriously," she said. "They're learning the capacity of the technology, and both the opportunities and the risks it presents for learning." But so often they're surprised by new technological developments and capabilities that they never imagined.

Even if they're hesitant to commit to full-blown policies, Dusseault said, districts should consider more informal guidelines that clearly lay out for students what academic integrity, plagiarism and acceptable use are. Districts that are "totally silent" on AI run the risk of student confusion and misuse. And if a district is penalizing students for AI use, it needs to have clear policy language explaining why.

That said, a few observers believe the case boils down to little more than a cheating student and his helicopter parents.

Benjamin Riley, founder of Cognitive Resonance, an AI-focused education think tank, said the episode seems like an example of clear-cut academic dishonesty. Everyone involved in the civil case, he said, especially the boy's parents and their lawyer, "should be embarrassed. This isn't some groundbreaking lawsuit that will help define the contours of how we use AI in education; it's helicopter parenting run completely amok that may serve as catnip to journalists (and their editors) but does nothing to illuminate anything."

This isn't some groundbreaking lawsuit that will help define the contours of how we use AI in education; it's helicopter parenting run completely amok.

Benjamin Riley, Cognitive Resonance

Alex Kotran, founder of a nonprofit that offers a free AI literacy curriculum, said the honor society director's statement about the boys' alleged academic dishonesty makes him think "there's clearly plenty more than what we're hearing from the student." While schools genuinely do need to understand the challenge of getting AI policies right, he said, "I worry that this is just a student with overbearing parents and a big check to throw lawyers at a problem."

Others see the case as surfacing larger-scale problems: Writing this week, Jane Rosenzweig, director of the Harvard College Writing Center and a newsletter author, said the Massachusetts case is "less about AI and more about a family's belief that one low grade will exclude their child from the future they want for him, which begins with admission to an elite college."

That problem long predated ChatGPT, Rosenzweig wrote. But AI is putting our education system on a collision course "with a technology that enables students to bypass learning in favor of grades."

"I feel for this student," said Warner, the writing coach. "The thought that they need to file a lawsuit because his future is going to be derailed by this should be such an indictment of the system."

The case underscores the need for school districts to rethink how they interact with students in the Age of AI, he said. "This stuff is here. It's embedded in the tools students use to do their work. If you open up Microsoft Word or Google Docs or any of this stuff, it's right there."

What do we want them to take away from their education beyond a credential? Because this technology really does threaten the integrity of those credentials.

John Warner, writing coach

Perhaps as a result, Warner said, students have increasingly come to view school more transactionally, with assignments as a series of products rather than as an opportunity to learn and develop important skills.

鈥淚’ve taught those students,鈥 he said. 鈥淔or the most part, those are a byproduct of disengagement, not believing [school] has anything to offer 鈥 and that the transaction can be satisfied through 鈥榥on-work鈥 rather than work.鈥

His observations align with recent research by Dusseault's colleagues, who found that four graduating classes of high school students, or about 13.5 million students, had been affected by the pandemic, with many "struggling academically, socially, and emotionally" as they enter adulthood.

Ideally, Warner said, AI tools should offer an opportunity to refocus students on process over product. "This is a natural design for somebody who teaches writing," he said, "because I'm obsessed with process."

Warner recalled giving a recent series of talks at a small, alternative liberal arts college in California, where he encountered students who said they had no use for AI chatbots. They preferred to think through difficult problems themselves. "They were just like, 'Aw, man, I don't want to use that stuff. Why do I want to use that stuff? I've got thoughts.'"

'Distrust, Detection & Discipline:' New Data Reveals Teachers' ChatGPT Crackdown (April 2, 2024)

New survey data puts hard numbers behind the steep rise of ChatGPT and other generative AI chatbots in America's classrooms – and reveals a big spike in student discipline as a result.

As artificial intelligence tools become more common in schools, most teachers say their districts have adopted guidance and training for both educators and students, according to a new report by the nonprofit Center for Democracy and Technology. What this guidance often lacks, however, is clear instruction on how teachers should respond if they suspect a student used generative AI to cheat.




"Though there has been positive movement, schools are still grappling with how to effectively implement generative AI in the classroom – making this a critical moment for school officials to put appropriate guardrails in place to ensure that irresponsible use of this technology by teachers and students does not become entrenched," report co-authors Maddy Dwyer and Elizabeth Laird write.

Among the middle and high school teachers who responded to the online survey, which was conducted in November and December, 60% said their schools permit the use of generative AI for schoolwork – double the number who said the same just five months earlier on a similar survey. And while a resounding 80% of educators said they have received formal training about the tools, including on how to incorporate generative AI into assignments, just 28% said they've received instruction on how to respond if they suspect a student has used ChatGPT to cheat.

That doesn't mean, however, that students aren't getting into trouble. Among survey respondents, 64% said they were aware of students who were disciplined or faced some form of consequences – including not receiving credit for an assignment – for using generative AI on a school assignment. That represents a 16 percentage-point increase from August.

The tools have also affected how educators view their students, with more than half saying they've grown distrustful that their students' work is actually theirs.

Fighting fire with fire, a growing share of teachers say they rely on digital detection tools to sniff out students who may have used generative AI to plagiarize. Sixty-eight percent of teachers – and 76% of licensed special education teachers – said they turn to generative AI content detection tools to determine whether students' work is actually their own.

The findings carry significant equity concerns for students with disabilities, researchers concluded, especially in the face of evidence that such detection tools are often ineffective.

High School Cheating Increase from ChatGPT? Research Finds Not So Much (Feb. 6, 2024)

The rise of AI chatbot tools caused panic among high school teachers and administrators nationwide – but researchers say the frequency of students cheating on assignments remained "surprisingly" stagnant.

According to new research from Stanford University, about 60 to 70 percent of high school students surveyed in the fall of 2023 had engaged in cheating behavior – the same share as before the debut of ChatGPT in the fall of 2022.

"I thought that we would see higher numbers in the fall, so it was a little surprising to me," said Denise Pope, a senior lecturer at Stanford's Graduate School of Education who surveyed students across 40 high schools through an organization she co-founded.

Victor Lee, an associate professor at Stanford's Graduate School of Education who helped oversee the research with Pope, said high school students are "underwhelmed" by AI chatbot tools.

"It just sounds very sterile and vanilla to them," Lee said. "They may have heard about it, but the media a lot of kids are using are quite different than the ones adults and working professionals are attuned to."




A survey conducted by the Pew Research Center in the fall of 2023 found nearly one-third of students aged 13 to 17 had never heard of ChatGPT, and another 44 percent had heard only "a little" about it.

Of those who were familiar with ChatGPT, the vast majority – about 81 percent – said they had not used it to help with schoolwork.

"Many teens are using a variety of technology … [but] among those who've heard at least a little about ChatGPT, shares of them still aren't sure how they feel about it," said Colleen McClain, a research associate at the Pew Research Center.

Here are four things to know about the effects AI chatbot tools have had on high school cheating:

1. High school students who weren't cheating before aren't cheating now.

According to earlier research, surveys of more than 70,000 high school students from 2002 to 2015 found about 64 percent of students cheated on a test – a similar outcome to Stanford's findings after the rise of AI chatbot tools.

Pope said what surprises educators and parents the most is how common cheating has been.

"We know from our research that when students do cheat, it's typically for reasons that have very little to do with their access to technology," Pope said.

"When a student is less engaged, when they feel like they don't belong or are not respected or valued in their community, when they're stressed and highly sleep deprived – these are things that tend to correlate with cheating," Pope said.

Lee said this number will "consistently stay there unless schools engage in certain steps to be thoughtful about what climate they're creating that motivates cheating."

This includes tapping into the topics students are already interested in and developing useful skills based on how they naturally enjoy learning.

"A lot of the time, the AI students encounter is via Snapchat because they have a chatbot built into it," Lee said. "And students aren't turning to Google as their primary search, they turn to YouTube … [or] video-based searches rather than text-based."

2. ChatGPT awareness is higher among White, wealthier and older students.

Pew found about 72 percent of white students had at least some knowledge of ChatGPT compared to 56 percent of Black students.

In addition, more than 75 percent of students in households with an annual income of $75,000 or more had some knowledge of ChatGPT compared to 41 percent of students in households with annual incomes under $30,000.

Data courtesy of the Pew Research Center. (Chart: Meghan Gallagher/社区黑料)

McClain pointed to the "digital divide" as an explanation for Pew's survey findings.

"The pattern here is quite striking," McClain said. "It certainly speaks to the fact that not every teen is equally likely to have heard about these tools and used them."

She added that awareness of ChatGPT was higher among older students, particularly those in 11th and 12th grade.

"Even among those who heard at least a little about ChatGPT … [young] teens may still be figuring out how they feel about it," McClain said.

3. High school students have adopted a "good faith" approach to AI chatbot tools.

Pew found only 20 percent of students aged 13 to 17 said it was acceptable to use ChatGPT to write essays, compared to 57 percent who said it was not.

But nearly 70 percent said it was acceptable for researching new topics, compared to 13 percent who said it was not.

Data courtesy of the Pew Research Center. (Chart: Meghan Gallagher/社区黑料)

The Stanford researchers found similar outcomes.

At four high schools surveyed in fall 2023, about 9 to 16 percent of students used AI chatbot tools to write essays, and about 55 to 77 percent used them to generate an idea for a paper, project or assignment.

Data courtesy of Stanford's Graduate School of Education. (Chart: Meghan Gallagher/社区黑料)

"The vast majority don't want AI to do all the work for them, so they're coming into this with sort of a good faith effort," Lee said.

"When I've had conversations with educators, they sort of breathe a sigh of relief and think 'oh, okay, let's think about some of the cool things we could do,' and that's exciting," Lee added.

4. Prohibiting AI chatbot tools won't solve the systemic issues of why students cheat.

For Pope, finding comfort around AI chatbot tools starts with educators and parents including their students in the conversation.

鈥淚f you’re going to come up with a classroom or home policy, you want to have the students present, speaking up and telling you what they think will be the most useful and appropriate uses of AI,鈥 Pope said.

Lee said addressing AI chatbot tool usage in high schools is just the "tip of a much larger iceberg."

"Part of why we get concerned is because students feel pretty disenfranchised from the boring assignments, tedious homework and essays in these weird written formats that they don't feel will provide them any long term need or use," Lee said.

"I don't see us as saying AI is the best thing since sliced bread, but I also don't think of us as saying AI is going to destroy humanity," Lee added.

Survey: AI is Here, but Only California and Oregon Guide Schools on its Use (Nov. 1, 2023)

Artificial intelligence now has a daily presence in many teachers' and students' lives, with chatbots like ChatGPT and Khan Academy's tutor, as well as AI image generators, all freely available.

But nearly a year after most of us came face-to-face with the first of these tools, a new survey finds that few states are offering educators substantial guidance on how best to use AI, let alone how to use it fairly and with appropriate privacy protections.

As of mid-October, just two states, California and Oregon, offered official guidance to schools on using AI, according to the Center on Reinventing Public Education at Arizona State University.

CRPE said 11 more states are developing guidance, but that another 21 states don't plan to give schools guidelines on AI "in the foreseeable future."




Seventeen states didn't respond to CRPE's survey and haven't made official guidance publicly available.


As more schools experiment with AI, good policies and advice – or a lack thereof – will "drive the ways adults make decisions in school," said Bree Dusseault, CRPE's managing director. That will ripple out, dictating whether these new tools will be used properly and equitably.

鈥淲e’re not seeing a lot of movement in states getting ahead of this,鈥 she said. 

The reality in schools is that AI is here. Edtech companies are pitching products and schools are buying them, even if state officials are still trying to figure it all out. 


鈥淚t doesn’t surprise me,鈥 said Satya Nitta, CEO of , a generative AI company developing voice-activated assistants for teachers. 鈥淣ormally the technology is well ahead of regulators and lawmakers. So they’re probably scrambling to figure out what their standard should be.鈥

Nitta said a lot of educators and officials this week are likely looking "very carefully" at Monday's White House executive order on AI "to figure out what next steps are."

The order requires, among other things, that AI developers share safety test results with the U.S. government and develop standards that ensure AI systems are "safe, secure, and trustworthy."

It comes five months after the U.S. Department of Education released a detailed report with recommendations on using AI in education.

Deferring to districts

The fact that 13 states are at least in the process of helping schools figure out AI is significant. Last summer, no states offered such help, CRPE found. Officials in New York, Rhode Island and Wyoming said decisions about many issues related to AI, such as academic integrity and blocking websites or tools, are made on the local level.

Still, researchers said, it's significant that the majority of states still don't plan AI-specific strategies or guidance in the 2023-24 school year.

There are a few promising developments: North Carolina will soon require high school graduates to pass a computer science course. In Virginia, Gov. Glenn Youngkin in September announced an initiative on AI careers. And Pennsylvania Gov. Josh Shapiro in September signed an executive order to create a state governing board to guide use of generative AI, including developing training programs for state employees.


But educators need help understanding artificial intelligence "while also trying to navigate its impact," said Tara Nattrass, managing director of innovation strategy at the International Society for Technology in Education. "States can ensure educators have accurate and relevant guidance related to the opportunities and risks of AI so that they are able to spend less time filtering information and more time focused on their primary mission: teaching and learning."

Beth Blumenstein, Oregon's interim director of digital learning & well-rounded access, said AI is already being used in Oregon schools. And the state Department of Education has received requests from educators asking for support, guidance and professional development.


Generative AI is "a powerful tool that can support education practices and provide services to students that can greatly benefit their learning," she said. "However, it is a highly complex tool that requires new learning, safety considerations, and human oversight."

Three big issues she hears about are cheating, plagiarism and data privacy, including how not to run afoul of Oregon's Student Information Protection Act or the federal Children's Online Privacy Protection Act.

'Now I have to do AI?'

In August, CRPE conducted focus groups with 18 superintendents, principals and senior administrators in five states who said they were cautiously optimistic about AI's potential, but many complained about navigating yet another new disruption.

"We just got through this COVID hybrid remote learning," one leader told researchers. "Now I have to do AI?"

Nitta, Merlyn Mind's CEO, said that syncs with his experience.

"Broadly, school districts are looking for some help, some guidance: 'Should we use ChatGPT? Should we not use it? Should we use AI? Is it private? Are they in violation of regulations?' It's a complex topic. It's full of all kinds of mines and landmines."

And the stakes are high, he said. No educator wants to appear in a newspaper story about her school using an AI chatbot that feeds inappropriate information to students. 

鈥淚 wouldn’t go so far as to say there’s a deer-caught-in-headlights moment here,鈥 Nitta said, 鈥渂ut there’s certainly a lot of concern. And I do believe it’s the responsibility of authorities, of responsible regulators, to step in and say, 鈥楬ere’s how to use AI safely and appropriately.鈥 鈥澛

Before Trump, D.A. Fani Willis Targeted Teachers in Atlanta Cheating Scandal (Aug. 18, 2023)

A decade before she unleashed the sprawling case now entangling former President Donald Trump in Georgia, Fulton County District Attorney Fani Willis used similar methods to target an unlikely group: public school educators in Atlanta.

As an assistant district attorney in 2013, Willis turned heads in one of her first big cases: She helped convene a grand jury that indicted decorated Superintendent Beverly Hall and nearly three dozen other educators for cheating on state standardized tests. In the end, Willis brought a dozen cases to trial, with a jury convicting 11.

This week, Willis invoked the same statute – Georgia's Racketeer Influenced and Corrupt Organizations, or RICO, Act – to indict Trump and 18 others in an alleged plot to overturn the state's 2020 election results.

In doing so, she offered a reminder of her role in a divisive chapter in the city's recent history. While the former president has called Willis, among other things, "a rabid partisan," the cheating prosecutions left fissures in her own community, where many say she stood up for children but others accuse her of turning her back on Black educators.

'Cooking the books'

Hall, the Atlanta superintendent, arrived in the district in 1999, eventually leading what she would call a data-driven turnaround. She told observers that under her tenure, Atlanta schools were "debunking the American algorithm that socio-economics predicts academic success," The Atlanta Journal-Constitution reported.

By 2009, her efforts had earned her one of education's top honors: national Superintendent of the Year. But the same year, the Journal-Constitution published the first of several stories analyzing Atlanta's results on the Georgia Criterion-Referenced Competency Test. The analysis found that scores had risen at rates that were statistically "all but impossible." It also found that district officials disregarded internal irregularities and retaliated against whistleblowers.

Critics would soon compare Hall to "a Mafia boss who demanded fealty from subordinates while perpetrating a massive, self-serving fraud," the city newspaper reported at the time. Willis pursued Hall using the same tools many prosecutors employ against Mafia bosses and drug kingpins. In bringing charges under the state's RICO Act, Willis alleged that Hall and her colleagues used the "legitimate enterprise" of the school system to carry out an illegitimate act: cheating.

Lonnie King, a former head of the local NAACP, told the newspaper that when he looked at the data, "I thought Beverly Hall was cooking the books" as early as 2006.

The newspaper's coverage led Gov. Sonny Perdue to appoint a team of special investigators, who conducted 2,100 interviews and reviewed 800,000 documents. By 2011, they uncovered cheating in 44 of the 56 schools they examined, concluding that 178 educators participated. Investigators eventually found widespread tampering with test papers and concluded that Hall stood at the center of "a culture of corruption."

Special investigator Michael Bowers, a former state attorney general, said in 2013 that interrogating teachers in the scheme had left him in tears.

"The thing I remember most was talking to some of the teachers who had been mistreated, mostly single moms," he said. "And it's heartbreaking. They told of how they had been forced to cheat." One told him, "I had no choice."

'On the backs of babies'

Hall retired in 2011, but on March 29, 2013, a Fulton County grand jury indicted her and more than 30 others in what Willis called a conspiracy comprising administrators, principals, teachers and even a school secretary.

Similar to this week's indictments, the Atlanta defendants faced charges of racketeering, conspiracy and making false statements. Hall also faced theft charges because her rising salary was tied to test scores – in 2009, the year she was named Superintendent of the Year, she received a bonus, prosecutors noted.

Former Atlanta Mayor Andrew Young, who in 2014 asked the judge in Superintendent Beverly Hall鈥檚 criminal trial to be 鈥渕erciful鈥 and drop the case. Hall died of breast cancer in 2015. (Monica Morgan/Getty Images)

If convicted, Hall could have served as many as 45 years in prison, but she soon fell ill and the judge in the case indefinitely postponed her trial. At an April 2014 hearing, Andrew Young, a former Atlanta mayor and United Nations ambassador, rose in the courtroom and asked the judge to be 鈥渕erciful鈥 and drop the case against her.

鈥淟et God judge her,鈥 he said.

Hall died of breast cancer in 2015, at age 68.

Public opinion on the case was sharply divided, with many Black commentators accusing Willis of overreach. But eventually, 34 of Hall鈥檚 subordinates faced criminal charges.

Brittney Cooper

Brittney Cooper, a professor of Women’s and Gender Studies and Africana Studies at Rutgers University, wrote: “Scapegoating Black teachers for failing in a system that is designed for Black children, in particular, not to succeed is the real corruption here.”

Cooper noted that former Washington, D.C., Schools Chancellor Michelle Rhee, who is Korean-American, had also been criticized for creating a “culture of fear about test scores.” An investigation by USA Today revealed findings similar to Atlanta’s, but an inspector general report found no evidence of widespread cheating, and Rhee never faced prosecution.

While most of the Atlanta educators eventually pleaded guilty to avoid jail time, 12 went to trial in 2014. As with the Trump case, this one was complex: Jury selection took more than , and jurors sat through intricate statistical analyses of answer-sheet erasure patterns, among other matters. At a few points in the trial, a dozen or more lawyers offered different versions of events.

A demonstrator holds a sign in support of prosecutor Fani Willis outside of the Lewis R. Slaton Courthouse before this week’s indictment of former U.S. President Donald Trump in Atlanta, Georgia. (Christian Monterrosa/AFP)

In an early case that went to trial in 2013, Willis said supervisor Tamara Cotman worked to protect educators’ jobs by advising principals under investigation not to cooperate with state investigators — a charge Cotman denied — and by vowing to return high test scores at any cost.

“She did it on the backs of babies,” Willis said during closing arguments. The jury acquitted Cotman, who was later convicted of other charges in the larger case.

Former President Donald Trump at the Georgia state GOP convention on June 10, 2023. Fani Willis, the prosecutor who is pursuing the Georgia election case, made a name for herself a decade ago by pursuing similar racketeering charges against Atlanta educators. (Anna Moneymaker/Getty Images)

In court, Willis told the jury of “cheating parties” at which educators got together to erase children’s incorrect answers on test sheets and pencil in correct ones. At a few of the parties, she said, educators “ate fish and grits — I can’t make this up.”

The jury convicted 11 of the 12 of racketeering and other charges.

The Rev. Dr. Raphael G. Warnock, at the time senior pastor of Ebenezer Baptist Church — he now serves as a U.S. Senator — told The New York Times, “There’s no question that this has not been our finest hour. It’s a dark chapter, but it’s just that. It’s a chapter.”

In 2015, commentators Van Jones and Mark Holden wrote that the educators convicted in the case were “the latest victims of overcriminalization,” facing serious jail time because of Willis’s “unprecedented use” of RICO. Three were sentenced to seven years in prison, they noted, while others received one- or two-year sentences if they didn’t accept plea deals.

“These punishments do not fit the crimes,” they wrote.

Sen. Raphael G. Warnock, then senior pastor of the Ebenezer Baptist Church, called the cheating scandal a “dark chapter.” (Curtis Compton/Getty Images)

Since then, several of the defendants have loudly proclaimed their innocence, even as they’ve served prison time or pursued appeals to avoid it. A handful of those cases remain outstanding. In several instances, they and their defenders say they’ve spent their life savings pursuing appeals.

In 2019, Shani Robinson, one of those found guilty, wrote about the ordeal. In an interview, she said, “the thought of being blamed for something that I did not do is horrifying. … I felt like if I was on the right side of justice, that one day I would be vindicated. That was the moment that I decided that I would never take a plea deal.”

But many parents saw it differently.

Shawnna Hayes-Jocelyn had three of her four children in classes at schools affected by the cheating. She said Willis rightly brought RICO charges. 

“You’d better believe she did the right thing, because that was the worst Black-on-Black crime example that could have ever happened around education,” she told 社区黑料. “Because what they did to those children is that they didn’t give those children options and opportunities.”

Shawnna Hayes-Jocelyn

Hayes-Jocelyn said her mind was made up once she read the state report that alleged widespread cheating among educators. 

“When I read that report and saw what was happening in that school system, yeah, people said, ‘Oh, this is RICO. We think about RICO as organized crime.’ I said, ‘This was organized crime.’”

Those familiar with Willis’s work say she’s tenacious. Atlanta NAACP president Gerald Griggs, one of the defense attorneys in the cheating trial, told The Guardian this week that Trump is “going to be very surprised when he’s sitting across from her for months on trial. He’ll find out how great of a lawyer she really is.”

Asked in 2021 if she had regrets about pursuing the school cheating cases, Willis was blunt, telling the Times that by going after teachers, principals and administrators, she was “defending poor Black children.” Public education, she said, offers these children their only chance to get ahead. “So if what I am being criticized for is doing something to protect people that did not have a voice for themselves, I sit in that criticism, and y’all can put it in my obituary.”

]]>
Texas Professors on ChatGPT: ‘Strategize, Don’t Demonize’ to Curb Academic Dishonesty /article/utep-on-chatgpt-strategize-dont-demonize-to-curtail-academic-dishonesty/ Sat, 05 Aug 2023 11:30:00 +0000 /?post_type=article&p=712642 This article was originally published in

A faculty member at the University of Texas at El Paso was grading a composition during the spring 2023 semester and suspected that it was not the student’s work — until she got to that one sentence.

The instructor of the upper-level course with a strong writing component believed the essay was prepared, at least in part, by ChatGPT, an artificial intelligence program that launched last November. With a few prompts, users of the free AI program can produce essays, research papers, computer code and more with relative ease.

To stymie ChatGPT, the lecturer directed her students to base one of their answers on how they related to the assigned readings. The answer from the student in question did not sound like a student’s “voice.” The final confirmation was the inclusion of something like “I don’t have a personal experience because I’m AI.”


Get stories like this delivered straight to your inbox. Sign up for 社区黑料 Newsletter


The student, who earned a zero for his paper, acknowledged his offense and apologized to the instructor, who did not want to be named. The student is among those who tried to cut academic corners with ChatGPT. In most cases, these indiscretions were handled at the classroom level. More serious offenses were submitted to the university鈥檚 Office of Student Conduct and Conflict Resolution, or OSCCR.

鈥淚’d like to see more suggestions or training on how to proactively address ChatGPT with my students rather than solely acting in the role of ‘catching’ and disciplining them,鈥 the faculty member said.

To , UTEP conducted a series of workshops late last spring to inform faculty about the pervasive use of ChatGPT and other forms of AI. UTEP鈥檚 Center for Faculty Leadership and Development organized the presentations to increase awareness and, where possible, to educate faculty on how to use AI effectively in the classroom and as an assessment tool. About 50 university instructors from throughout the university attended the presentations.

Jeffrey Olimpo, director of the faculty leadership center, said the main concern workshop participants shared with him was students鈥 unethical use of ChatGPT and AI in general. His response was that AI is not going away.

鈥淲e came at it from an angle of, 鈥榊ou can鈥檛 put the toothpaste back in the tube,鈥欌 Olimpo said a few weeks after the last workshop.

The event鈥檚 presenters included representatives from OSCCR and the Provost鈥檚 Office. Olimpo recalled that the OSCCR official said that his office already had seen some potential ChatGPT cases.

Strategize, don’t demonize

The Office of Student Conduct and Conflict Resolution conducted 20 investigations into possible cases of academic dishonesty tied to the use of AI during the spring 2023 semester, according to the university. UTEP did not respond to a question about how those cases were resolved and said that OSCCR director Jovita Simón would not comment for this story.

While the university was aware of ChatGPT’s potential downsides, Olimpo said there was no reason to chase it down with torches and pitchforks.

“We try to strategize and not demonize,” he said.

Arthur Ramirez, a second-year UTEP doctoral student in finance, said he began to test ChatGPT soon after it launched to learn if it could help with his research. Initially, he was concerned about its inaccuracies, but he found it helpful with coding, especially with better prompts, and for understanding certain charts. He said the only instruction a professor gave him was to follow the university’s guidelines.

“He said there was no right or wrong way to use ChatGPT,” Ramirez said. “Just don’t abuse it.”

Responding to an El Paso Matters Instagram request for students to share their experiences, one UTEP student said that some of his professors encouraged students to use ChatGPT, while others warned them not to use it for plagiarism.

“I don’t see what the big deal is,” wrote the student, who identified himself as “sergio.iii.”

Sergio.iii called the AI program an effective study and communication tool with the right prompts. He said it helped create outlines for papers, add focus to his PowerPoint presentations and often gave more understandable explanations to complicated topics.

“The students using ChatGPT unethically aren’t even being smart about it,” he wrote via Instagram. “Most people use it in a brain-dead way where they just copy and paste answers straight out of ChatGPT, and they end up with responses that look identical to a dozen other students.” El Paso Matters reached out to the user, but he did not respond.

Leslie Waters, an assistant professor of history, did not offer ChatGPT instructions at the start of the spring 2023 semester. She believed the obscure primary source material from her 20th century European history course would be AI-proof. In one case she gave students copies of letters written by soldiers and their families during World War I and asked them to write essays based on the letters鈥 themes.

Three of her students submitted papers that focused generally on the war, but did not mention the letters or their themes. Additionally, the essays included ChatGPT red flags: grammatically correct sentences that lacked analysis and critical thinking. Each of those students earned low scores. Waters planned to send one of those cases to OSCCR, which she said uses software that can detect AI-generated material.

“It’s not easy (for me) to prove, but it’s extremely easy for me to detect,” she said of ChatGPT work.

Her plan for the fall 2023 semester is to talk to her students about the perils of using ChatGPT, and to encourage them to stay on top of their coursework. It is her experience that students cheat out of desperation. She will give multi-level assignments that force students to submit papers at various stages to keep track of their progress.

Olimpo did not respond to several requests for the recommendations generated by his spring workshops, but he previously proposed that a faculty committee review and possibly update the university’s general course syllabus regarding the use of AI tools.

A June 2023 article in The Chronicle of Higher Education included the results of a faculty survey of how to work with ChatGPT this fall. Two of the more popular ideas were to alter assignments to make AI participation less useful, and to incorporate AI in some work to help students understand its strengths and weaknesses.

As for El Paso Community College, its ChatGPT directive to students is to follow their professors’ instructions for assignments and the academic guidelines in the college’s Student Code of Conduct, said Keri Moe, associate vice president for External Relations, Communication & Development.

“ChatGPT, like any technology available, must be used with academic integrity and in accordance with these guidelines,” Moe said.

Texas Tech University Health Sciences Campus El Paso did not respond to a request for instructions on how its leaders want faculty and students to use ChatGPT.

Academic integrity

While some faculty members want to use AI tools such as Turnitin to catch cheaters, Sarah Elaine Eaton, an associate professor in the Werklund School of Education at the University of Calgary, in Canada, advised them not to overreact.

During a May 16 virtual forum about “Academic Integrity and AI,” Eaton said that instructors should include a statement in their syllabus about the AI they plan to use to help with their assessments and inform students about the limitations of those programs.

“It’s not about trying to use technology in order to catch students,” Eaton said during the presentation. “Nobody wins in an academic-integrity arms race. Deceptive assessment using tools and technologies without students’ knowledge ahead of time is not modeling integrity.”

Greg Beam, an associate professor of practice in UTEP’s Department of Communication, said he taught an asynchronous virtual course this summer and strongly suspected that some students submitted work done by chatbots. He posted a video on Blackboard in which he explained the right and wrong ways to use ChatGPT.

Beam told the students that those who admitted that they used the technology improperly would be allowed to redo the assignment with no penalty. Additionally, he told them that he would contact those who did not come forward to ask them follow-up questions about their submissions to verify that they understood the material.

The professor said about 10% of those students redid the assignment. He suspected a few others, but those submissions lacked the tell-tale red flags. It made him wonder if some students had mastered ChatGPT enough to be undetectable.

“For the most part, at UTEP at least, I don’t think students want to cheat — they want to learn,” Beam said. “And they’re just as concerned about the potential ramifications of these new technologies as the rest of us are.”

This first appeared on El Paso Matters and is republished here under a Creative Commons license.

]]>
Did Prison Inmates Cheat on High School Equivalency Exam? /article/opi-report-no-evidence-to-support-montana-prison-cheating/ Fri, 30 Dec 2022 16:30:00 +0000 /?post_type=article&p=701558 This article was originally published in

An investigation into allegations that a test administrator helped inmates cheat on a high school equivalency exam at the Shelby prison found “no confirming evidence” to support the claims, according to records from the Office of Public Instruction.

“The reporting individual was unable to provide supporting evidence or additional witnesses to validate his claim,” the investigation report said.

However, testing has not resumed at the institution for a separate reason, according to CoreCivic, a private company that runs the state prison in Shelby.




The prison, called the Crossroads Correctional Center, houses 757 male inmates, according to a Department of Corrections data dashboard.

The testing program, HiSET, allows adults without a high school diploma to earn the equivalent of a high school degree. OPI earlier provided records that said testing had been suspended at the prison as of April.

Last week, CoreCivic director of public affairs Ryan Gustin said the Shelby prison remains a HiSET testing site. However, he said a separate technology glitch unrelated to the allegation means testing has not resumed at the facility.

“All of the IT teams are aware of this matter and are quickly working to alleviate that challenge,” Gustin said.

The issue stems from a change in test provider and ensuing technology “hiccups,” he said. Gustin said testing at the Crossroads location takes place using computers, and in the case of a provider change, “you have to get one hand to shake the other hand.”

“We don’t have the option contractually to offer a paper exam,” he said.

The Montana Department of Corrections said other “DOC-run facilities” have not had testing difficulties — and Gustin said other locations are able to use paper testing, unlike at Crossroads.

The new testing provider and CoreCivic are working to resolve the issue as quickly as possible, Gustin said, and it’s in their interest to do so.

The original cheating allegation came from a previous Crossroads inmate who said staff were sharing exam questions with inmates before testing and assisting inmates during the HiSET tests. As a result, Montana OPI suspended all HiSET testing at Crossroads and requested an investigation.

The September investigation report found no evidence of cheating. It said investigators reviewed answer sheets and test reports but could not substantiate the claims.

A Crossroads principal who retired in August agreed she had shared “test material” with test takers, but she said she had shared “released exam questions” and The Official Guide to the HiSET Exam, according to the investigation report. It said a new principal was unaware of any previous infractions.

Investigators requested additional information from the inmate who made the allegations, but the report said he admitted his accusation was “based on his assumptions.” He also argued that because he had moved to a prerelease center, he was “not comfortable to freely share information,” the report said.

The investigation report also said the inmate who made the accusations may have had a different reason for bringing forward a complaint: “It is likely that this unfounded allegation could have arisen from a salary dispute between this individual and Crossroads Correction Center over tutoring hours.”

The report did not elaborate on the salary dispute.

A September letter from a senior national HiSET director, provided by OPI with the investigation report and by CoreCivic, said test results from the site are sound: “Tests were not compromised and HISET scores obtained by individuals testing here are valid.”

Daily Montanan is part of States Newsroom, a network of news bureaus supported by grants and a coalition of donors as a 501c(3) public charity. Daily Montanan maintains editorial independence. Contact Editor Darrell Ehrlick with questions: info@dailymontanan.com. Follow Daily Montanan on and .

]]>
The Essay’s Future: We Talk to 4 Teachers, 2 Experts and 1 AI Chatbot /article/the-future-of-the-high-school-essay-we-talk-to-4-teachers-2-experts-and-1-ai-chatbot/ Mon, 19 Dec 2022 18:01:00 +0000 /?post_type=article&p=701602 ChatGPT, an AI-powered “large language” model, is poised to change the way high school English teachers do their jobs. With the ability to understand and respond to natural language, ChatGPT is a valuable tool for educators looking to provide personalized instruction and feedback to their students.

O.K., you’ve probably figured out by now that ChatGPT wrote that self-congratulatory opening. But it raises a question: If AI can produce a journalistic lede on command, what mischief could it unleash in high school English?

Actually, the chatbot, created by the San Francisco-based R&D company OpenAI, is not intended to make high school English teachers obsolete. Instead, it is designed to assist teachers in their work and help them provide better instruction and support to their students.




O.K., ChatGPT wrote most of that too. But you see the problem here, right?

English teachers, whose job is to get young students to read and think deeply and write clearly, are this winter coming up against a formidable, free-to-use foe that can do it all: With just a short prompt, it can generate song lyrics, short stories, even outlines and analyses of other writings.

One user asked it to explain to his son that “Santa isn’t real and we make up stories out of love.” In five trim paragraphs, it broke the bad news from Santa himself and told the boy, “I want you to know that the love and care that your parents have for you is real. They have created special memories and traditions for you out of love and a desire to make your childhood special.”

One TikToker noted recently that users can upload a podcast, lecture, or YouTube video transcript and ask ChatGPT to take complete notes.

ChatGPT Taking Notes From YouTube

Many educators are alarmed. One high school computer science teacher wrote last week, “I am having an existential crisis.” Many of those who have played with the tool over the past few weeks fear it could tempt millions of students to outsource their assignments and basically give up on learning to listen, think, read, or write.

Others, however, see potential in the new tool. Upon ChatGPT’s release, 社区黑料 queried high school teachers and other educators, as well as thinkers in the tech and AI fields, to help us make sense of this development.

Here are seven ideas, only one of which was written by ChatGPT itself:

1. By its own admission, it messes up.

When we asked ChatGPT, “What’s the most important thing teachers need to know about you?” it offered that it’s “not a tool for teaching or providing educational content, and should not be used as a substitute for a teacher or educational resource.” It also admitted that it’s “not perfect and may generate responses that are inappropriate or incorrect. It is important to use ChatGPT with caution and to always fact-check any information it provides.”

2. It’s going to force teachers to rethink their practice — whether they like it or not.

Josh Thompson, a former Virginia high school English teacher working on these issues for the National Council of Teachers of English, said it’s naïve to think that students won’t find ChatGPT very, very soon and start using it for assignments. “Students have probably already seen that it’s out there,” he said. “So we kind of have to just think, ‘O.K., well, how is this going to affect us?’”

Josh Thompson (Courtesy of Josh Thompson)

In a word, Thompson said, it’s going to upend conventional wisdom about what’s important in the classroom, putting more emphasis on the writing process than the product. Teachers will need to refocus, perhaps even using ChatGPT to help students draft and revise. Students “might turn in this robotic draft, and then we have a conference about it and we talk,” he said.

The tool will force a painful conversation, Thompson and others said, about the utility of teaching the standard five-paragraph essay, which he joked “should be thrown out the window anyway.” While it’s a good template for developing ideas, it’s really just a starting point. Even now, Thompson tells students to think of each of the paragraphs not as complete writing, but as the starting point for sections of a larger essay that only they can write.

3. It’s going to refocus teachers on helping students find their authentic voice.

In that sense, said Sawsan Jaber, a longtime English teacher at East Leyden High School in Franklin Park, Ill., this may be a positive development. “I really think that a key to education in general is we’re missing authenticity.”

Technology like ChatGPT may force teachers to focus less on standard forms and more on student voice and identity. It may also force students to think more deeply about the audience for their writing, which an AI likely will never be able to do effectively.

Sawsan Jaber (Courtesy of Sawsan Jaber)

“I think education in general just needs a facelift,” she said, one that helps teachers focus more closely on students’ needs. Actually, Jaber said, a free tool like ChatGPT might most readily benefit students like hers from low-income households in areas like Franklin Park, near Chicago’s O’Hare Airport. “The world is changing, and instead of fighting it, we have to ask ourselves: ‘Are the skills that we’ve historically taught kids the skills that they still need in order to be successful in the current context?’ And I’m not sure that they are.”

Jaber noted that universities are asking students to do more project-based and “unconventional” work that requires imagination. “So why are we so stuck on getting kids to write the five-paragraph essay and worrying if they’re using an AI generator or something else to really come up with it?”

An AI-generated image by Dall-E, prompted with the text “robot hanging out with cool high school students in front of lockers.” (Dall-E)

4. It could upend more than just classroom practice, calling into question everything from Advanced Placement assignments to college essays.

Shelley Rodrigo, senior director of the Writing Program at the University of Arizona, said the need for writing instruction won’t go away. But what may soon disappear is the “simplistic display of knowledge” schools have valued for decades.

Shelley Rodrigo (Courtesy of Shelley Rodrigo)

鈥淚f it’s, ‘Compare and contrast these two novels,’ O.K., that’s a really generic assignment that AI can pull stuff from the Internet really easily,鈥 she said. But if an assignment asks students to bring their life experience to the discussion of a novel, students can鈥檛 rely on AI for help.

鈥淚f you don’t want generic answers,鈥 she said, 鈥渄on’t ask generic questions.鈥

In looking at coverage of the kinds of writing uploaded from ChatGPT, Rodrigo, also present-elect of NCTE, said it’s easy to see a pattern that others have commented on: Most of it looks like something that would score well on an AP exam. 鈥淧art of me is like, 鈥極.K., so that potentially is a sign that that system is broken.鈥欌

5. Students: Your teachers may already be able to spot AI-assisted writing.

While one of the advantages of relying on ChatGPT may be that it’s not technically plagiarism or even the product of an essay mill, that doesn’t mean it’s 100% foolproof.

Eric Wang (Courtesy of Eric Wang)

Eric Wang, a statistician and vice president of AI at Turnitin.com, the plagiarism-detection firm, noted that engineers there can already detect writing created by large-language “fill-in-the-next-word” processes, which is what most AI models use.

How? It tends to follow predictable patterns. For one thing, it uses fewer sophisticated words than humans do: “Words that are less frequent, maybe a little more esoteric — like the word ‘esoteric,’” he said. “Our use of rare words is more common.”

AI applications tend to use more high-probability words in expected places and “favor those more probable words,” Wang said. “So we can detect it.”

Kids: Your untraceable essay may in fact be untraceable — but it’s not undetectable.
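Wang’s point can be illustrated with a toy heuristic in a few lines of Python: text whose vocabulary leans heavily on very common words is, by this crude signal, more “model-like.” This is only a sketch of the general idea, not Turnitin’s actual detector; the `COMMON_WORDS` list and the two sample sentences are invented for the example.

```python
import re

# A tiny stand-in for a word-frequency list; a real system would use
# token probabilities from a language model, not a hand-picked set.
COMMON_WORDS = {
    "the", "a", "an", "and", "or", "but", "is", "are", "was", "were", "to",
    "of", "in", "on", "for", "with", "that", "this", "it", "as", "at", "by",
    "be", "have", "has", "had", "not", "they", "their", "we", "our", "you",
}

def rare_word_ratio(text: str) -> float:
    """Share of tokens that fall outside the common-word list."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    rare = sum(1 for t in tokens if t not in COMMON_WORDS)
    return rare / len(tokens)

human = "Our use of esoteric, idiosyncratic vocabulary is surprisingly common."
bland = "It is a good thing and it is the best thing for the students."
print(rare_word_ratio(human) > rare_word_ratio(bland))  # prints: True
```

The human-sounding sentence scores a higher rare-word ratio than the bland one, which is the direction of the signal Wang describes: model output favors high-probability words, so an unusually low rare-word share is one (weak) flag among many.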

6. Like most technological breakthroughs, ChatGPT should be understood, not limited or banned — but that takes commitment.

L.M. Sacasas, a writer who publishes a newsletter on technology and culture, likened the response to ChatGPT to the early days of Wikipedia: While many teachers saw that research tool as radioactive, a few tried to help students understand “what it did well, what its limitations were, what might be some good ways of using Wikipedia in their research.”

In 2022, most educators — as well as most students — now see that Wikipedia has its place. A well-constructed page not only helps orient a reader; it’s also “kind of a launching pad to other sources,” Sacasas said. “So you know both what it can do for you and what it can’t. And you treat it accordingly.”

Sacasas hopes teachers use the same logic with ChatGPT.

More broadly, he said, teachers must do a better job helping students see how what they’re learning has value. So far, “I think we haven’t done a very good job of that, so that it’s easier for students to just take the shortcut” and ask software to fill in rather meaningless blanks.

If even competent students are simply going through the motions, he said, “that will encourage students to make the worst use of these tools. And so the real project for us, I’m convinced, is just to instill a sense of the value of learning, the value of engaging texts deeply, the value of aesthetic pleasure that cannot be instrumentalized. That’s very hard work.”

An AI-generated image by Dall-E, prompted with the text “classroom full of robots sitting at desks.” (Dall-E)

7. Underestimate it at your peril.

OpenAI’s Sam Altman earlier this month tried to lower expectations, writing that the tool “is incredibly limited, but good enough at some things to create a misleading impression of greatness.”

How does it feel, Bob Dylan, to see an AI chatbot write a song in your style about Baltimore? (Getty Images)

Ask ChatGPT to write a Bob Dylan-style song about Baltimore, for example, and — well, it’s not very good or very Dylanesque at the moment. The chorus:

Baltimore, Baltimore

My home away from home

The people are friendly

And the crab cakes are to die for.

Altman added, “It’s a mistake to be relying on it for anything important right now.”

Jake Carr (Courtesy of Jake Carr)

The tool’s capabilities in many ways may not be very sophisticated now, said Jake Carr, an English teacher in northern California. “But we’re fooling ourselves if we think something like ChatGPT isn’t only going to get better.”

Carr asked the tool to write a short story about “kids who ride flying narwhals” and got a rudimentary “Golden Books” sort of tale. But then he got an idea: Could it produce an outline of such a story using Joseph Campbell’s “hero’s journey” template?

It could and it did, producing “a pretty darn good outline” that used all of the storytelling elements typically present in popular fiction and screenplays.

He also cut-and-pasted several of his students鈥 essay drafts into the tool and asked it to grade each one based on a rubric he provided.

Revolutionizing the English classroom with AI — how can we use technology to enhance student learning and engagement?

“I tell you what: It’s not bad,” he said. The tool even isolated each essay’s thesis statement.

Carr, who frequently posts TikToks about tech, admitted that ChatGPT is scary for many teachers, but said they should play with it and consider how it forces them to think more deeply about their work. “If we don’t talk about it, if we don’t begin the conversation, it’s going to happen anyways and we just won’t get to be part of the conversation,” he said. “We just have to be forward-thinking and not fear change.”

But perhaps we shouldn’t be too sanguine. Asked to write a haiku about its own potential for mayhem, ChatGPT didn’t mince words:

Artificial intelligence

Powerful and dangerous

Beware, for I am here

]]>
VIDEO: New Research Shows True Toll Cheating Scandal Took on Atlanta’s Most Vulnerable Students /article/video-new-research-shows-true-toll-cheating-scandal-took-on-atlantas-most-vulnerable-students/ Sat, 01 Jan 2000 00:00:00 +0000 When teachers cheat, students ultimately suffer.

Everyone who follows education news remembers the Atlanta cheating scandal, in which several teachers went to jail. But what happened to the students whose test scores were manipulated?

Research now shows that those students’ scores on later tests were noticeably lower, with enduring effects on English exams. (There’s also evidence that cheating led to an uptick in high school dropout rates among students whose tests were fudged.)

The researchers suggest that the students were harmed because they didn’t get the remediation or extra support services they needed, since their true academic standing was masked by the phony scores.

In other words, cheating on high-stakes exams doesn’t just corrupt accountability systems; it also inflicts lasting harm on students who were denied meaningful evaluations.

This was one of several new research papers discussed at an event organized by CALDER — a research group affiliated with the American Institutes for Research. Read more of Matt’s top takeaways from the event here.

]]>