Teachers reported that AI can also help improve students’ writing, as long as it is used to support students’ efforts and not do the work for them: “Teachers report that AI can ‘spark creativity’ and help students overcome writer’s block… At the writing stage, it can help with organization, coherence, syntax, semantics, and grammar. At the revision stage, AI can support editing and rewriting of ideas, as well as help with… punctuation, capitalization and grammar.”
But if there’s a refrain to the report, it’s this: AI is most useful when it complements, not replaces, the efforts of a flesh-and-blood teacher.
Disadvantage: AI poses a serious threat to students’ cognitive development
At the top of Brookings’ list of risks is the negative effect that AI can have on children’s cognitive growth: how they learn new skills and perceive and solve problems.
The report describes a kind of vicious cycle of AI dependency, in which students increasingly offload their own thinking onto the technology, leading to the type of cognitive decline or atrophy most commonly associated with brain aging.
Rebecca Winthrop, one of the report’s authors and a senior fellow at Brookings, warns: “When children use generative AI that tells them what the answer is… they are not thinking for themselves. They are not learning to separate truth from fiction. They are not learning to understand what makes a good argument. They are not learning about different perspectives on the world because they are not actually engaging with the material.”
Cognitive offloading is not new. The report notes that keyboards and computers reduced the need for handwriting and calculators automated basic mathematics. But AI has “accelerated” this type of offloading, especially in schools where learning can feel transactional.
As one student told the researchers: “It’s easy. You don’t have to [use] your brain.”
The report offers a wealth of evidence suggesting that students who use generative AI are already experiencing declines in content knowledge, critical thinking, and even creativity. The consequences could be enormous if these young people reach adulthood without learning to think critically.
Advantage: AI can make teachers’ jobs a little easier
The report says another benefit of AI is that it allows teachers to automate some tasks: “generate emails to parents… translate materials, create worksheets, rubrics, quizzes and lesson plans,” and more.
The report cites multiple research studies that found significant time savings for teachers, including a US study that found teachers using AI save an average of nearly six hours a week, or roughly six weeks over the course of an entire school year.
Advantage/Disadvantage: AI can be a driver of equity or inequity
One of the strongest arguments in favor of the educational use of AI, according to the Brookings report, is its ability to reach children who have been excluded from classrooms. The researchers cite Afghanistan, where the Taliban has denied girls and women access to formal post-primary education.
According to the report, a program for Afghan girls “has employed AI to digitize the Afghan curriculum, create lessons based on this curriculum and disseminate content in Dari, Pashto and English through WhatsApp lessons.”
AI can also help make classrooms more accessible for students with a wide range of learning disabilities, including dyslexia.
But “AI can also greatly augment existing divisions,” Winthrop warns. That is because the free AI tools most accessible to students and schools may also be the least reliable and least accurate.
“We know that wealthier communities and schools will be able to afford more advanced AI models,” Winthrop says, “and we know that those more advanced AI models are more accurate. Which means this is the first time in the history of educational technology that schools will have to pay more for more accurate information. And that really hurts schools without a lot of resources.”
Disadvantage: AI poses serious threats to social and emotional development
Survey responses revealed deep concern that the use of AI, particularly chatbots, “is undermining students’ emotional well-being, including their ability to build relationships, recover from setbacks, and maintain mental health,” the report says.
One of the many problems with children’s overuse of AI is that the technology is inherently sycophantic: it has been designed to reinforce users’ beliefs.
Winthrop says that if children develop social-emotional skills largely through interactions with chatbots that were designed to agree with them, “it becomes very uncomfortable to be in an environment where someone disagrees with you.”
Winthrop offers an example of a child interacting with a chatbot, “complaining about his parents and saying, ‘They want me to wash the dishes; this is so annoying. I hate my parents.’ The chatbot will probably say, ‘You’re right. You are misunderstood. I’m so sorry. I understand you.’ Unlike a friend who would say, ‘Dude, I wash dishes all the time at my house. I don’t know what you’re complaining about. That’s normal.’ That’s the problem.”
A recent survey from the Center for Democracy and Technology, a nonprofit that advocates for civil rights and civil liberties in the digital age, found that nearly 1 in 5 high school students said they or someone they know has had a romantic relationship with artificial intelligence. And 42% of students in that survey said they or someone they know has used AI as a companion.
The report warns that the echo chamber of AI can hinder a child’s emotional growth: “We learn empathy not when we are understood perfectly, but when we misunderstand and recover,” said one of the experts surveyed.
What to do about it?
The Brookings report offers a long list of recommendations to help parents, teachers, and policymakers (not to mention the tech companies themselves) reap the benefits of AI without subjecting children to the risks the technology currently poses. Among those recommendations:
- Schooling itself could focus less on what the report calls “completing transactional tasks” or chasing grades, and more on fostering curiosity and the desire to learn. Students will be less inclined to ask AI to do their work for them if they feel invested in that work.
- AI designed for use by children and adolescents should be less sycophantic and more “antagonistic,” pushing back on preconceived notions and challenging users to reflect and evaluate.
- Technology companies could collaborate with educators in “co-design centers.” In the Netherlands, a government-backed center already brings together technology companies and educators to develop, test and evaluate new AI applications in the classroom.
- Holistic AI literacy is crucial, for both teachers and students. Some countries, including China and Estonia, have comprehensive national guidelines on AI literacy.
- As schools continue to embrace AI, it is important that underfunded districts in underserved communities are not left behind, which would allow AI to further drive inequality.
- Governments have a responsibility to regulate the use of AI in schools, ensuring that the technology used protects students’ cognitive and emotional health, as well as their privacy. In the United States, the Trump administration has tried to bar states from regulating AI on their own, even though Congress has so far failed to create a federal regulatory framework.