How many times, in the last month, have you done something simply because an app reminded you? Drinking water, standing up from your chair to stretch your legs, or ordering sushi for dinner. How much of what we decide every day is really a matter of free will? We believe we freely choose the cafeteria menu, the TV programme, or even the course we enrol in. But a closer look reveals that many of those decisions have already been prepared in advance: the positioning of products in a supermarket, the insistent recommendation of a series on a streaming platform’s front page, or the notification that tells us “today is a good day to continue the pending lesson.”
Education is not immune to this logic. More and more, digital learning environments incorporate small nudges that seek to guide students, teachers, and families towards certain behaviours. Task reminders, symbolic rewards, personalised learning paths… These are interventions so discreet that we barely notice them, but, as behavioural sciences show, they have a remarkable power to influence what we do.
We call this phenomenon digital EduNudging. Its defenders see it as a lever to reduce school drop-out rates, improve motivation, and personalise teaching. Its critics, however, warn of the ethical risks of delegating to algorithms the ability to guide human will. In the following pages we will explore these lights and shadows, asking ourselves how far the psychology of learning can – and should – go in the age of algorithms.
From nudging to edunudging: psychological and educational foundations
In 2008, Richard Thaler and Cass Sunstein published a book that would end up influencing public policy, economics, and, little by little, education as well: Nudge: Improving Decisions About Health, Wealth, and Happiness. Their thesis was simple: the way a choice is presented conditions the decision we make. They called this choice architecture. The idea is not to prohibit options, but to organise the “menu of possibilities” so that the most beneficial alternative is also the easiest to choose.
A classic example: in a school cafeteria, if fruit is placed at eye level and sweets on a lower shelf, the probability that students choose the apple instead of the chocolate increases. No one forces them, no one takes away options. But the environment gently pushes towards a healthier decision.
Now let us bring this idea into the classroom. Teachers know well that it is not enough to give instructions or insist on the importance of studying. Procrastination, fear of failure, or simply forgetfulness weigh more than intention. But what would happen if we could redesign the learning context to encourage those behaviours that, in the long run, lead to academic success? Welcome to EduNudging!
Some researchers have shown how these small nudges applied to education – a reminder by SMS to parents, an automatic notification of a pending task, a comparison of progress with the group average – can influence student motivation and perseverance.
As Lauren Braithwaite explains in her article “Edunudging: the future of learning?”, the key is not to impose, but to create conditions that invite action. Just as a horse is more likely to approach the water if the trough is within reach and carries a hint of apple, students respond better when the learning path is presented clearly, accessibly, and attractively.
The digital leap: algorithms, big data, and hypernudging
The digital revolution has made nudges much more sophisticated. Every interaction on an educational platform leaves a trail of data: which exercises we complete most quickly, at what point we tend to abandon the session, how long we take to answer a question. All those micro-gestures, invisible to a teacher in a traditional classroom, are captured and processed by algorithms that promise to know the student almost better than they know themselves.
This gives rise to the idea of hypernudges: nudges powered by Big Data and machine learning that not only suggest a behaviour, but adjust it dynamically according to each click. They are living, changing, adaptive nudges. If you struggled to complete a task yesterday, today you will receive a more insistent notification; if your progress falters, tomorrow the platform will reward you with a virtual badge to motivate you. A learning process that promises to be more personalised, more motivating, and more effective. Practice already offers notable examples:
- Gamification: platforms like Classcraft turn academic routines into a role-playing game, with points, avatars, and rewards. It is not just about entertainment: the game mechanics act as a nudge that keeps students engaged.
- Learning analytics: systems like Moodle Analytics make it possible to identify patterns of demotivation and send preventive alerts to teachers or students.
- Adaptive platforms: services like LinkedIn Learning or Khan Academy recommend courses and exercises based on previous trajectories, like a Netflix for knowledge.
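The adaptive logic behind such hypernudges can be sketched as a simple feedback rule: escalate or soften the next notification based on the learner’s recent interaction data. The class, field names, and thresholds below are illustrative assumptions, not any real platform’s API.

```python
from dataclasses import dataclass

@dataclass
class SessionStats:
    """Hypothetical micro-interaction data a platform might log per session."""
    tasks_completed: int
    tasks_abandoned: int
    minutes_active: float

def next_nudge(history: list[SessionStats]) -> str:
    """Pick tomorrow's nudge from recent behaviour (invented rule of thumb)."""
    recent = history[-3:]  # look only at the last three sessions
    abandoned = sum(s.tasks_abandoned for s in recent)
    completed = sum(s.tasks_completed for s in recent)
    if completed == 0 and abandoned > 0:
        return "insistent_reminder"   # progress has stalled entirely
    if abandoned > completed:
        return "virtual_badge"        # symbolic reward to re-motivate
    return "gentle_reminder"          # things are going well; keep the habit

history = [SessionStats(0, 2, 5.0), SessionStats(0, 1, 3.0), SessionStats(0, 1, 2.0)]
print(next_nudge(history))  # → insistent_reminder
```

The point of the sketch is the feedback loop itself: the same student receives a different nudge tomorrow than today, recomputed after every interaction.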
Braithwaite describes it clearly: we are moving from manual nudges to algorithms that design, in real time, decision-making environments for thousands of students at once. What was once a teacher’s advice or a one-off reminder now becomes an underground current of automatic suggestions that accompanies the student in every click.
Does it really work? Empirical evidence and limitations
Do these digital nudges really work? The scientific literature offers a nuanced answer: sometimes yes, sometimes no.
In her article, Braithwaite cites the case of Georgia State University. Since 2012, this institution has fed a predictive analytics system with millions of data points about its students: courses taken, grades, enrolment patterns. The algorithm identifies who is at risk of dropping out and activates personalised nudges: emails, alerts, meetings with tutors. The result, between 2011 and 2018, was an increase in the graduation rate from 48% to 55%. Modest at first glance, but enormous if we think of the thousands of academic lives that managed to reach the finish line.
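Georgia State’s actual model is proprietary, but the general mechanism of such predictive triggers can be illustrated with a toy logistic risk score. All weights, features, and thresholds below are invented for illustration only.

```python
import math

def dropout_risk(gpa: float, credits_attempted: int, credits_earned: int,
                 missed_enrolments: int) -> float:
    """Toy logistic risk score; coefficients are fabricated for illustration."""
    completion = credits_earned / max(credits_attempted, 1)
    # Higher GPA and completion lower the score; enrolment gaps raise it.
    z = 2.0 - 1.2 * gpa - 2.5 * completion + 1.5 * missed_enrolments
    return 1 / (1 + math.exp(-z))  # squash into a 0–1 probability

def nudges_for(risk: float) -> list[str]:
    """Escalating interventions as risk crosses assumed thresholds."""
    actions = []
    if risk > 0.3:
        actions.append("email_reminder")
    if risk > 0.6:
        actions.append("advisor_meeting")
    return actions

risk = dropout_risk(gpa=2.1, credits_attempted=30, credits_earned=18,
                    missed_enrolments=1)
print(round(risk, 2), nudges_for(risk))
```

The structure, not the numbers, is what matters: a continuous score computed from enrolment data, and a staircase of increasingly human interventions as it rises.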
However, not everything is a success. At the Vrije Universiteit Amsterdam, researchers tested sending personalised nudges by email to online statistics students. The messages were adapted to their motivational profile and their self-perceived ability. The result? No significant difference compared to the group that only received generic reminders. The algorithm, by itself, did not manage to move the needle.
And here lies the big question: are we changing underlying behaviours, or simply dressing up indicators? As researchers themselves point out, only about 4% of nudge research focuses on education. What we know is still insufficient and often depends on very specific contexts.
Braithwaite also highlights a key point: the most effective nudges do not obsess over the goal (better grades, higher attendance) but over the small habits that lead there. Just as you do not lose weight by going to the gym two days in a row, a good EduNudge does not simply remind you of an exam deadline, but helps to cultivate the daily perseverance of learning.
Thus, between promising studies and others that are more disappointing, the provisional conclusion is clear: digital EduNudging has potential, but its effectiveness depends as much on design as on context, and we are still far from having robust and generalisable evidence.
Risks and ethical dilemmas of digital EduNudging
So far, digital EduNudging seems like a good ally of education: lower drop-out rates, greater motivation, and teaching tailored to each student. However, it is worth pausing for a moment and asking: at what price?
The first dilemma is who decides which behaviours are desirable. In a traditional classroom, that decision was mediated by the teacher, in dialogue with the school culture and with families. In the digital world, algorithms and the companies that design them can become the new choice architects. Should a platform decide which tasks deserve more attention or what learning style is “correct”? And what happens if those decisions do not respond to the student’s interest, but to institutional or commercial objectives?
The second risk is student autonomy. Nudge theory has always maintained that freedom remains intact because options are not eliminated, only reorganised. But if nudges are so subtle that we hardly perceive them, are we not limiting our ability to decide consciously? In education, where the goal is to form critical citizens, this tension becomes especially delicate.
There are also algorithmic biases. The data with which systems are trained are not neutral: they reflect pre-existing social and educational inequalities. An algorithm that detects “drop-out risk” can end up reinforcing stigma about certain student profiles, instead of opening opportunities.
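A toy example makes the mechanism concrete: a naive model that leans on historical group base rates will flag everyone in the group with the worse past record, regardless of any individual’s actual trajectory. The groups and figures below are synthetic, fabricated purely to illustrate the point.

```python
# Synthetic historical data reflecting a pre-existing inequality.
historical = {
    "group_A": {"students": 100, "dropouts": 10},
    "group_B": {"students": 100, "dropouts": 30},  # disadvantaged in the past
}

def base_rate(group: str) -> float:
    """Fraction of past students in this group who dropped out."""
    g = historical[group]
    return g["dropouts"] / g["students"]

def flagged_at_risk(group: str, threshold: float = 0.2) -> bool:
    """A naive model: flag anyone whose group's base rate exceeds the cut-off."""
    return base_rate(group) > threshold

for g in historical:
    print(g, flagged_at_risk(g))  # group_A False, group_B True
```

Every student in group_B is stigmatised as “at risk” before doing anything at all: the past inequality in the training data is recycled as a prediction about individuals.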
Finally, there is the temptation of economic and political instrumentalisation. As Braithwaite notes in her article, behind Georgia State’s success is not only the well-being of students, but also the pressure to reduce university system costs and improve the profitability of public investment. Turning teachers into “nudge operators” may mean less space for creative pedagogy and more for indicator management.
In short, digital EduNudging opens the door to a fundamental dilemma: are we using the psychology of learning to empower students, or to adjust them to external metrics of success? The difference is subtle, but it marks a world of difference between forming autonomous people and producing obedient learners.