What Do Young People Think About Generative AI and How Do They Use It?

It’s no secret: generative artificial intelligence is transforming the way we learn and work. Young people have been among the first (though not the only ones) to jump on the bandwagon. A recent study indicates that more than half of individuals aged 14 to 22 are already using these tools for all kinds of tasks, from researching a topic to exploring their artistic side. However, the enthusiasm for generative AI also brings concerns related to privacy, misinformation, and employment. Understanding how new generations use this technology and what they expect from it is essential to harness its benefits, address its risks, and move toward a more inclusive and ethical digital future.

Generative artificial intelligence has become a key piece in rethinking how we process information and create content. Its algorithms, designed to produce text, images, or even audio and video sequences, offer possibilities that until recently were out of our reach. Unsurprisingly, this has turned the world of education upside down. The technology has taken root especially among young people, who see it as a quick channel for finding answers to their questions.

How does this technology influence the education of teenagers and young adults? To what extent can it close gaps in access to knowledge in under-resourced environments? What ethical challenges does its adoption pose? A report titled Teen and Young Adult Perspectives on Generative AI offers a detailed look at these questions, based on a survey of 1,274 individuals aged 14 to 22 in the United States. Its findings combine statistical data on young people’s use of AI tools with direct testimonials from youth of diverse socioeconomic and cultural backgrounds. What do the data tell us? Let’s take a closer look.

What Do Young People Think?

Most Common Educational Uses

According to the data collected, 51% of participants have tried a generative AI tool at least once, though only 4% use one daily. As for the most frequent educational uses, the report identifies three main areas in which young people apply generative AI in their studies:

  • Information retrieval (53%): many see it as a more advanced search engine, capable of answering specific questions and offering organized explanations.
  • Idea generation (51%): the ability to generate texts, examples, or possible research topics is appealing when a student needs initial inspiration.
  • Homework support (46%): several students use it to organize essay structures, design summaries, or plan projects.

One respondent noted that their main interest lies in understanding how AI tackles complex math problems, something a conventional search engine usually cannot do with the same level of detail. In general, these experiences suggest the technology can act as a multi-skilled assistant, so long as it’s used thoughtfully.

Beyond that, and according to respondents, the dominant emotion among those who have “dived” into generative AI is curiosity—especially for its ability to solve problems almost instantly and its apparent skill in generating multiple ideas. (“It helps generate ideas for schoolwork” or “It reduces the time needed to complete tasks, find information faster, more accurately, and better tailored to your needs,” are some of the testimonials collected in the study.)

Some young people surveyed find in generative AI a space where they can ask questions without fear of judgment, especially about personal or complicated topics (“It helps me ask questions without feeling any pressure”), reinforcing the perception of AI as a resource for exploring ideas and getting answers when human interactions might not feel as safe or available.

Concerns

On the concern side, 24% of respondents associate AI with cheating or plagiarism, an ethical dilemma that cannot be ignored. Another 22% expressed worries about data privacy and how generative AI tools manage their personal information (“AI makes it easier to compromise, steal, or hack personal information”), while 17% had heard that these tools can be inaccurate or biased in the information they provide. A further group simply felt the technology didn’t offer real value for their daily needs.

In the school environment, the report offers some reflections on how the use of generative AI can impact the development of critical thinking and encourage certain practices that discourage deep learning. Along these lines, some young people noted that overuse of generative AI could lead to relying on automatic answers instead of actively engaging in analysis and problem-solving (“AI does the work for you, but you’re left without knowing how you got to that answer,” or “It’s too easy to just accept what the AI says without checking. It doesn’t make you question anything.”)

The study also mentions that educators and young people are concerned about the risk of using generative AI merely as a tool to complete tasks quickly, rather than leveraging it as a complement to develop reasoning and creative skills. This is especially relevant in tasks like essay writing, where the planning and editing process is an integral part of learning. It’s no surprise that discussions arise about the legitimacy of an essay based on an AI-generated idea. Could we be facing a scenario where human creativity is subordinated to the product of an algorithmic model? This question opens the door to reflecting on how to balance technological assistance with holistic learning.

To mitigate these risks, the report suggests integrating pedagogical strategies that promote critical thinking alongside the use of generative AI, such as activities that require students to verify the reliability of generated answers and to reflect on how the information was gathered.

Challenges and Opportunities in Vulnerable Contexts

One of the great potential virtues of generative AI lies in its ability to reduce educational barriers in low-resource areas. Many students in vulnerable contexts could find in these tools a virtual tutor that answers questions 24/7. This, for example, would make it possible to solve science exercises or draft academic writings without needing to wait for a teacher’s availability.

This potential, however, runs into obstacles: poor internet connections, a lack of adequate devices, and unfamiliarity with tech platforms continue to limit adoption.

In this sense, it’s a priority to ensure that the incorporation of generative AI doesn’t widen educational gaps. Several testimonials reflect the concern that people without digital skills or adequate infrastructure could be left behind (“If you don’t have good internet or a computer at home, you can’t use these tools. That makes some of us fall further behind than others,” “Without someone teaching you how to use it well, you could end up using AI in ways that are unhelpful or even harmful.”)

To counter these imbalances, it’s essential to design training programs that incorporate AI tools with an ethical, responsible, and, above all, inclusive approach.

Unequal access to technology remains one of the issues most concerning to educators and families. While some students can connect with AI systems through high-performance computers and broadband connections, others depend on public networks or shared devices in libraries.

The report highlights that, in the surveys, lower-income groups reported less knowledge of AI tools. This situation not only jeopardizes their chance to benefit from the educational potential of the technology, but also feeds a perception of disadvantage compared to their more privileged peers.

A Future of Coexistence With Clear Rules

How do young people perceive the future with the widespread use of this type of technology? According to the report, 41% of participants believe AI will bring both benefits and adverse consequences. Another 19% fear job loss and the possible spread of misinformation, while the rest are split between expecting it to become a catalyst for positive change and feeling uncertain about its real impact.

Underlying these percentages is the hope that, by delegating repetitive tasks to algorithms, humans can focus on solving problems that require deeper reasoning. However, no one can guarantee that this transition will be immediate or equitable.

For now, the new generations seem open to coexisting with these platforms but emphasize the need for clear rules. Most respondents believe educational institutions should regulate their use and design spaces where the workings of AI models are explained to avoid misunderstandings about their capabilities.

Generative AI continues to evolve, and its integration into education largely depends on how schools and the rest of society adapt to its presence. The report suggests that, despite some reservations, a considerable portion of youth recognizes the practical value of this technology for completing specific tasks.

The challenge lies in crafting policies and methodologies that encourage its use with critical thinking, so that learning is reinforced and intellectual responsibility isn’t eroded. The report’s authors believe that the participation of young people themselves in the creation of such policies is fundamental, as they will be the first to live daily with AI models throughout their academic and professional lives.

As one respondent put it: “The world is changing, and we are the future.” This statement sums up the importance of listening to the voices of those growing up in direct contact with information technologies.
