Me, myself and AI
Understanding and safeguarding children's use of AI chatbots
Internet Matters
Publication date:
This Biblionetz object has only existed since August 2025. It is therefore quite possible that many of the connections to older Biblionetz objects that should exist have not yet been created. This page may therefore be very incomplete.
Summaries
AI chatbots are fast becoming part of children’s everyday lives. Children, especially those who are more vulnerable, are engaging with AI chatbots not just as tools, but as companions – asking for advice, sharing their daily lives and seeking friendship. Their growing use amplifies existing online risks and introduces new ones, while protections for children have not kept pace with this rapid innovation.
From Internet Matters in the text Me, myself and AI (2025)
This report explores how children are interacting with AI chatbots, which are computer programmes designed to simulate conversation with a person. Our findings show that while these tools can offer clear benefits, such as 24/7 learning support and a non-judgemental space to ask questions, they pose risks to children’s safety and development.
A key concern highlighted by this research is that children are using AI chatbots in emotionally driven ways, including for friendship and advice, despite many of the popular AI chatbots not being built for children to use in this way. Almost a quarter (23%) of children who use AI chatbots have sought advice from the tools, and over a third (35%) of children who have used AI chatbots said chatting with an AI chatbot feels like talking to a friend, with this figure rising to 50% for vulnerable children.1 Furthermore, one in eight (12%) children who use AI chatbots said they talk to them because they have no one else to speak with. While these experiences can feel supportive, they also carry risks. Children may become overly reliant on AI chatbots or receive inaccurate or inappropriate responses, with experts suggesting they may also be less likely to seek help from trusted adults.
These themes are explored through insights drawn from focus groups, user testing and survey data, alongside desk research and expert interviews. Together, they paint a picture of how AI chatbots are reshaping childhood – and why coordinated action is needed urgently among various stakeholders to ensure children can explore their potential safely and positively.
From Internet Matters in the text Me, myself and AI (2025)
This text mentions ...
Persons: Emily M. Bender, Wenxiang Fan, Timnit Gebru, Angelina McMillan-Major, Shmargaret Shmitchell, Jin Wang
Terms: Chatbot (chat bot), Eltern (parents), Generative Machine-Learning-Systeme (GMLS) (computer-generated text), GMLS als Therapeut:in (GMLS as therapist), Kinder (children)
This text probably does not mention ...
Terms not mentioned: Chat-GPT, Eliza, GMLS & Bildung (GMLS & education), LehrerIn (teacher), Schule (school)
Tagcloud
Citation graph
Citation graph (beta test with vis.js)
Full text of this document
Me, myself and AI: article as full text (2489 kByte)
Search elsewhere
Beat and this text
Beat only added this text to Biblionetz within the last 6 months. Beat does not own a physical copy, but he does have a digital one. A digital version is available on the internet (see above). Judging by the few entries in Biblionetz, he does not seem to have actually read it. So far, only a few objects in Biblionetz cite this work.



Biblionetz-History