Every so often, someone announces the end of intelligence. First it was writing, then television, then video games, and now screens and artificial intelligence. The history of education seems to have been written in fits of panic. Today, much of the debate boils down to one question: technology, yes or no? And, as is often the case, the siren songs end up drowning out the questions that matter: How can technology be useful to us? How should we use it? What kind of learning do we want to promote?
The fear of screens is understandable. They change our habits, our attention, and the way we access knowledge. But confusing change with deterioration is a mistake as old as culture itself. Every technological advance has provoked mistrust before being accepted as a tool. The new always seems threatening when it is not understood.
This article does not seek to make screens either heroes or villains. Its purpose is much more modest: to examine those myths that are repeated and replicated with an air of revealed truth and, through scientific studies and expert voices, to dismantle (or at least attempt to dismantle) some of them in order to set the record straight. Because if there is one thing that education (including digital education) should teach us, it is to be wary of simplifications and to look at the facts with a little more calm.
Myth 1: “Screens hinder learning and make us dumber.”
Few phrases are repeated as often (and with as little basis) as this one. Every time a new device appears, someone warns that it is making us clumsier, more distracted, less able to think for ourselves. You know how it is: if something changes too quickly, it must be bad. However, the evidence does not support this.
A few months ago, an MIT study made headlines around the world by suggesting that using ChatGPT could impair critical thinking skills. The claim seemed devastating and was reproduced without nuance by much of the press. However, educator and researcher Lara Crespo, in a magnificent article that largely inspired this text, reminded us of something essential: what was being sold as evidence was, in large part, sensationalism in disguise.
As Crespo recalled and demonstrated in her article, the study itself acknowledged its limitations. The sample was small, the tasks were very limited, and the experiment was not conducted in a real educational context. In fact, the authors admitted that the observed effects were provisional, that the study had not yet undergone the required peer review, and that more research was needed before drawing general conclusions.
The accumulated evidence points in a different direction. Psychologist Richard Mayer, a pioneer in the study of multimedia learning, has spent decades demonstrating that well-designed digital environments improve comprehension, retention, and knowledge transfer. The key lies in pedagogical design, not in the presence or absence of screens.
Other experts, such as Michael B. Horn, emphasize that technology can personalize teaching, adapting it to the pace and style of each student. In a diverse classroom, this is not a whim: it is a tool for equity.
And as Carlos Magro, an expert in educational innovation, points out, the problem is not technological, but pedagogical. “If school does not make learning meaningful, students will look for shortcuts—with or without artificial intelligence,” he warns. Screens do not make us dumber. They simply require us to learn in a different way. The concern is not the tool, but how little we have changed the way we teach.
Myth 2: “Screens are bad for mental health.”
This myth sounds convincing because it stems from a real concern. The discomfort associated with excessive screen use (anxiety, insomnia, irritability) does exist. But turning that concern into a blanket condemnation is an oversimplification. Science has not yet found a clear, direct link between screen use and psychological deterioration.
In fact, researchers such as Sonia Livingstone and Danah Boyd have shown that the psychological impact of screen use depends less on the amount of time spent in front of them than on the type of activity, the relational context, and adult supervision. Passively watching videos alone for hours is not the same as participating in collaborative projects, creating content, or solving problems with digital tools in a guided environment.
Organizations such as the American Psychological Association (APA) and UNICEF propose a “balanced digital diet”: not eliminating technology, but teaching children to alternate it with rest, physical activity, and face-to-face relationships. Along the same lines, psychologist Marieth Lozano, from the Colombian College of Psychologists, reminds us that “the use of cell phones should be regulated, but not prohibited.”
However, the narrative of harm persists because it offers a simple explanation for much more complex problems. As philosopher Gregorio Luri points out, banning cell phones “is a cheap measure that promises miraculous results for nervous parents.” Children’s mental health is affected by multiple factors: academic pressure, inequality, lack of adult support. Blaming technology alone is an elegant way of looking the other way.
A study by the University of Oxford involving more than 300,000 adolescents found no strong correlation between screen time and emotional well-being. In fact, it detected positive effects when use was active and creative.
Ultimately, screens are neither poison nor therapy. They are a reflection of our practices. If we teach children to look at them critically and to disconnect consciously, screens can be part of a more lucid emotional education.
But none of this happens on its own. Teachers have a decisive role: when they propose the use of devices and applications for clear educational purposes, within a coherent plan of activities, they give pedagogical meaning to the use of technology and turn it into a learning tool. Families are also part of this balance: if they promote the use of technology for cooperative purposes—to solve problems, create together, or research—and abandon the idea that screens are efficient and cheap babysitters, they will be helping their children learn to coexist with digital technology in a healthy way.
Myth 3: “Screens increase inequality.”
Few statements sound as reasonable as this one. If some people have access to technology and others do not, it seems logical to think that the gap is widening. And this could, in part, be true. But confusing the symptom with the cause leads to greater errors. Screens do not create inequality: they only reflect it.
In vulnerable contexts, school is often the only place where children can access the internet, learn digital skills, or even have a device. Banning screens in these environments is, in practice, equivalent to closing another door.
Educator Mariana Maggio, from the University of Buenos Aires, has spoken of the “invisible expulsions” that this disconnection produces. Students who do not learn to use technology at school arrive at university or work at a disadvantage, without the basic digital skills that are now taken for granted. This is not a technical problem, but one of educational justice.
There is also a risk of double standards. Gregorio Luri sums it up ironically: “Schools that already have resources will continue to use them; those that don’t will be left behind. And that breaks equity.” Prohibition policies tend to be enforced more rigorously in public schools than in private ones, perpetuating the gap they claim to combat.
Digital inclusion is not about filling classrooms with tablets, but about ensuring access, support, and pedagogical meaning. As the European framework DigCompEdu reminds us, educating digital citizens does not mean letting them use screens without limits; it means teaching students to understand, regulate, and use those screens critically.
Denying this opportunity to the most vulnerable is a form of silent exclusion. Screens do not increase inequality: it is the lack of policies to ensure that everyone can learn to use them that does. In a digital world, not teaching technology is teaching inequality.
Myth 4: “You can educate outside the digital world.”
There are schools that continue to try to educate as if the 21st century had not begun. By banning screens, they believe they are preserving attention, calm, or the purity of learning, when in reality what they are preserving is a way of teaching that belongs to another time. Attempting to educate citizens without taking the digital world into account is like teaching sailing without ever leaving the harbor.
Educator Brandon Cardet-Hernández, a former school principal in New York, put it in terms that are difficult to refute: “Banning devices disconnects the school from the digital reality of its students.” In his opinion, the classroom should be a space where young people learn to use technology wisely, not where they are taught to fear it.
Talking about education today inevitably means talking about digital citizenship. Students live, communicate, and learn in a hybrid ecosystem, where the physical and virtual worlds are intertwined. Ignoring this environment does not protect them: it leaves them alone to face it. That is why more and more experts agree that teaching students to coexist with technology is part of the educational mission, not a distraction from it.
Schools cannot educate outside the digital world. Their task is not to exclude it, but to help students inhabit it with autonomy, ethics, and discernment. European reference frameworks, such as the aforementioned DigCompEdu or the European Union’s Digital Competence Framework (DigComp), insist on this idea: the critical and conscious integration of technology is an essential skill for teachers and students. Teaching in the 21st century means teaching how to think and act in the digital society.
Looking calmly
In the end, the debate about screens is not really about technology. It is about us, the school we want, and the kind of citizens we aspire to educate. Therefore, the question underlying each of these myths is the following: what does it mean to learn in times of change?
In this Observatory, we have repeated it ad nauseam: screens are neither the enemy nor the panacea. Making them the culprit or the savior only distracts us from the essential: today, one of the most important educational tasks is to learn to live with them with discernment, fairness, and humanity.
This is the real challenge for schools: not to teach against technology, but to teach how to live with it without losing our sense of meaning. To train people to think for themselves in a world saturated with stimuli, to defend their attention as an act of freedom.
Looking calmly does not mean looking less, but looking better. Perhaps education, today more than ever, consists of this: learning to look, including at screens, with attention, meaning, and hope.