ChatGPT & Kids: Impacts on Learning and Development

In this episode, we are exploring one of the most talked-about concepts today: AI. More specifically, we are diving into how tools like ChatGPT are impacting kids’ learning and development.

We’re joined by Dr. Tiffany Munzer, a developmental-behavioral pediatrician and digital media researcher at the University of Michigan. Together, we break down AI integration in daily life and what this means for children’s learning and emotional development. We also look at what the future of AI in the classroom may look like and the rise of AI “companions” for social support.

Join us for an exciting and eye-opening conversation about this rapidly evolving tool, and how we can help our kids use it wisely.

About Dr. Tiffany Munzer

Dr. Tiffany Munzer is a developmental-behavioral pediatrician and digital media researcher at the University of Michigan. Her studies have included lab experiments comparing parent-toddler interactions during print book and e-book reading. Her recent work has examined how the pandemic shaped families’ digital media experiences and infants’ digital media exposure. Lastly, she is the lucky mother of two young children, which has gifted her with a generous dose of humility and shared humanity with her patients.

Transcript 

Manahil (0:01 – 0:14)

Children today are being exposed to AI more than ever before. From homework help to AI chatbots that feel almost like friends, this technology is reshaping how children learn, think, connect, and explore the world.

 

Heemani (0:15 – 0:28)

So what does this mean for their development? What should families watch out for? And how do we guide kids towards using AI safely?

We’re unpacking all of this and more on this episode with Dr. Tiffany Munzer.

 

Manahil (0:33 – 0:35)

Welcome back to the Healthy Habits, Happy Homes podcast. I’m Manahil.

 

Heemani (0:36 – 0:51)

I’m Heemani. And today we have Dr. Tiffany Munzer joining us. Thank you so much, Tiffany. We’re so excited to have you here today. To get us started, would you be able to tell us a little bit about yourself, like your education and your experiences, and how they kind of got you to where you are today?

 

Dr. Tiffany Munzer (0:52 – 1:25)

Yeah, thank you so much, first of all, for the privilege of just getting to be with you both today and to share a little bit about AI and children. I’m a developmental behavioural paediatrician at the University of Michigan. And, I’m also a digital media researcher who focuses on how children and families use digital media. And, with AI being one novel way families are accessing digital media, that’s also been a new focus for me, as well.

 

Manahil (1:25 – 2:27)

Well, thank you so much for joining us. I think it’s more of a privilege for us to have you here so that we can learn all about what we’re talking about today. And, this topic is also very relevant to our audience, which is families with young children.

So, with that, we want to get right into the first question, which is about AI, especially because it’s such a relevant topic right now. We’re hearing about it every single day in different aspects of our lives, with ChatGPT, Gemini, and other AI tools being woven into our everyday lives. Even just searching something up quickly, the first thing that you get these days is the AI summary, instead of your regular Google search results. And, we also know that they’re being used in the classroom now.

So, before we get into it, for listeners who might not be familiar, could you tell us a little bit about what generative AI or Gen AI or ChatGPT is and how it’s starting to show up in our daily lives and in education?

 

Dr. Tiffany Munzer (2:28 – 4:01)

Yeah, it’s a great question. Generative AI really includes a lot of applications that can create human-like content, like text, images, video, or audio. And, these AI-driven tools, as you mentioned, because of recent breakthroughs, are really becoming a part of our everyday lives. So, when we contact customer service, it might be AI answering our questions, or when you read the news, it might be AI giving you that search content. And so, AI is really modelled on how the human brain works, you know, how we gather facts and descriptions and comments and make sense of it all to complete a specific task. But, unlike human knowledge, it doesn’t really have the ability to connect new information to all of our other life experiences.

So, that’s one way in which it differs from how humans naturally think and learn and integrate information. And, like you said, there’s really been increased adoption, especially among youth and our teenagers, who are often at the forefront of experimenting with novel digital tools. You know, teens, kids and families are using this day-to-day. Even if they’re not accessing platforms like ChatGPT, it could be in the algorithms that show them content online, or other places where they’re interfacing with the internet.

 

Heemani (4:01 – 4:48)

Like you said, it’s definitely the teens and youth who are very interested in using AI and these new tools. And I kind of just want to get right into the impact that AI has on children and their learning. So, obviously, there’s a lot of excitement about, you know, the development of AI. But, at the same time, this also does bring a lot of concerns, especially when thinking about children. And so, most of the families in our study have school-aged children, aged around six to 16 years. And, we know that this time in life is really important for things like cognitive development and learning. And so, from your perspective, what are people most worried about when it comes to AI and children’s learning?

 

Dr. Tiffany Munzer (4:49 – 6:34)

Yeah, yeah, it’s a really great question. And, I think you really touched on, Heemani, like, how kids really need to be able to grapple with and experiment with learning. And, a big part of that learning process is making mistakes and incorporating those insights and building that frustration tolerance of navigating really hard content. You know, sometimes, the way that some AI tools are designed, and depending on how they’re used, it might supplant or replace those opportunities for building that natural frustration tolerance, for making those mistakes, and for learning how to integrate that feedback that’s so important to learning. That feedback builds both the critical thinking skills that help kids get through life, and also helps kids develop that social-emotional competence of navigating something that’s new and hard, and building that practice with being frustrated with it at first. Because, if you’re going to AI, and it’s just creating the answers for you, that’s one way that it could replace those opportunities for critical thinking. But, on the other hand, if the AI tool were designed to build that next skill level, you know, to really tailor to that child’s experience, and take that child’s knowledge to the next level through careful questioning or practices that we know can help support children’s learning, kids can gain more out of these very carefully, thoughtfully developed AI tools.

 

Manahil (6:34 – 7:34)

I think that’s an excellent summary of some of the challenges, and some of the concerns that parents have these days when it comes to AI. And, those are obviously some very understandable challenges. But, I also like that you touched on some of the benefits of AI as well, and how it can support learning and development at the same time, if it’s used correctly.

So, I think that’s an excellent summary.

And, to go more in depth into that learning and development — like, Heemani and I are master’s students, and we’re using AI in a lot of aspects of university life — we know that in elementary schools and high schools, AI is starting to be used in classrooms and integrated into the curriculum. So, from your perspective, how do you think AI will change the classroom experience?

 

Dr. Tiffany Munzer (7:35 – 11:18)

Yeah, it’s a really good question. And, I think there are a lot of different aspects to think through around how AI will shape learning. So, one of the first things to think about is: how will students get the education and training in using AI as a tool to support their learning?

And so, anytime there’s a new or novel technological innovation, it has these affordances that can really benefit humanity, and, it also has some downsides, depending on how that tool is designed. And so, part of it, with any kind of technological tool, is that we need to know, as consumers, like, what are the benefits and what are the downsides, and how do we use this tool most effectively? And so, that kind of, like — it’s almost like the digital literacy aspect of training is something that really can be integrated, you know, in the education system, so that students can really understand how to use this in a way that’s effective. But, because of the pace of how rapidly it’s developed and the scale that it’s been adopted, it’s out there, and, there hasn’t been as much investment and time to be able to train the educators on how to effectively utilize new AI tools.

And so, one area that could really be helpful is, like, helping train the people who are educators who are on the ground and working with students on how to incorporate this in the learning process. And, right now, it’s happening in a very piecemeal way, because educators are, like, they do so much, they have so much on their plate, and they’re already developing curriculum for so many different topics that students need to learn. And so, there really needs to be designated time and space and training for them to be able to incorporate this into their day-to-day. So, I think that’s one big area that we should really think about.

And then, another area and space is just around how students are currently using it, you know, because we know that the majority of teens are already accessing these tools. And so, how can we help them understand the biases involved in some of what is being shared? There’s also a known risk of “AI hallucinations” in some of these tools. That’s just part of the probabilistic nature of AI: it predicts the next most likely word, and so some of the content that it provides might not be accurate, depending on what that student is asking about.

And then, also, how do we build skills around the things that they really need to learn first, before introducing AI tools? So, you know, for instance, a student really needs to learn how to do basic multiplication before using a calculator. And, similarly, a student really needs to know how to write an essay before relying on some of these tools to generate the content for them. And so, with the training of writing essays, AI could really be a partner in teaching kids how to do it, as compared with doing it for them. So, I think that’s the difference.

 

Heemani (11:18 – 12:23)

I feel like the main theme that I saw in, you know, the first two points was an increase in education. So, I guess, you know, training teachers and staff on how to best introduce AI into the classrooms, but then, also, at the same time, educating students themselves and making sure that they’re aware of things like the biases in AI. And then, there’s the third point, which is also extremely important: these skills need to actually be learned first.

And, there shouldn’t be, like, this huge, you know, reliance on AI, which, you know, I feel like would be very unhealthy. And so, I’m very curious to see how the classroom experience changes in the coming years. And, we also did want to talk about the social and emotional impacts relating to AI, especially because I feel we’re hearing more and more about people using AI chatbots in a social way, as companions or as friends.

And so, we were wondering, what are your thoughts about this trend, especially for children and teenagers who are still developing their social and emotional skills?

 

Dr. Tiffany Munzer (12:24 – 18:21)

Yeah, that is, it’s really a great question. And, I want to take a step back from that question first and just provide a little bit of context, kind of on the underlying potential reasons why kids and teens are seeking out these friendships using AI companions. So, just to take one step back, in the United States in the past couple of years, the US Surgeon General had highlighted the loneliness epidemic as being a really important contributor to health and well-being for teens and families. And, at the same time, in the past decade, there’s been a rise in the prevalence of teens experiencing challenges with mental health. And so, when we see these really large-scale shifts, sometimes the underlying reasons are systemic and also multifactorial. And so, essentially, I think what’s happening is that a lot of the supportive structures in society that promote healthy relationships with others have received less investment and attention than the easy spaces online, which are more accessible and have received more financial investment. And so, of course, kids, teens and families are going to the spaces that are more easily accessible and, quite frankly, more affordable than these in-person experiences, like extracurriculars that really cost a lot of money, to be able to participate in sports, for instance, and find those human connections that way. Or, third spaces, like the malls where teens used to gather, have now closed in favour of more online shopping.

So, there are a lot of these factors that are going into making it harder for teens and families and kids to access these in-person spaces, while, at the same time, there have been a lot of factors that have made it more accessible for teens, kids and families to access these online spaces. There was one article by the Brookings Institution that really called for an investment in our relational infrastructure, you know, so that kids and teens can experience these healthy in-person opportunities for social connection. So, I think, all that to say that you’ve really hit the nail on the head: there are really important lessons and learnings early on about these social-emotional skills. Human relationships are still paramount. So, that’s a little bit about the societal context.

And so, now I’m going to shift to more directly answer your question. A recent Common Sense Media survey found that about 72% of teens have ever used AI companions, and about half are actually regular users who interact with these platforms a few times a month. Daily users are about 13% of the teens that they surveyed. And so, teens report a lot of varying reasons for that, you know, mostly around the practice of conversation, emotional support, and also entertainment. And so, I think, again, it’s about how teens are using it to learn these skills, as compared with how much of it is replacing or making it harder for them to access those in-person opportunities. So, for instance, if a teen is, like, “oh, I’m really struggling with how I navigate this complex social situation,” and they’re, like, “oh, I wonder, what if I said this to a friend, what could that look like?” And they use the AI tool as a way to navigate that really complex social interaction: “oh, I want to be very mindful, what’s a response that I could consider?” Or, “what are some ways that I could interact in a really respectful way with this person when I have this social conflict?” That’s different than seeking out AI directly as the companion, and it’s also different than the only interaction a teen or child might have being with these AI companions, crowding out opportunities for that in-person connection.

So, I think there are ways, again, to think creatively about how to use it to help support a child’s learning. But then, there are also these ways that we know are happening, where teens are turning to AI companions, and there have been reports of challenging interaction patterns. There have not been enough safety considerations or guardrails in place to protect teens when they are disclosing, for instance, that they have suicidal thoughts. It’s not circling back to their parents to report and disclose that, and there aren’t the safety measures in place to protect teens from that kind of harm. So, I think — sorry, that was a little bit of a digression from your initial question — there are ways to use it as a tool, again, and those in-person relationships remain at the forefront of what’s important for kids and teens. And then, with the ways that some of these companions are designed, there just aren’t enough safety features and regulatory structures and guardrails to really make sure teens are having an age-appropriate and safe experience.

 

Manahil (18:21 – 20:42)

I feel like I have so much to say about that, because I love that you brought up the, I guess, the societal problems, or the factors, that are contributing to this, because I feel like we’re so inclined to just blame it on AI and say, “oh, it’s because AI exists that this is happening,” and, “parents shouldn’t let their kids talk to AI or use AI, because then this is what’s gonna happen.” But, like you mentioned, there are so many other factors that are causing it. And, I feel like AI is just one outlet where the loneliness epidemic, or kids not having social places to just hang out and be themselves, is showing up.

First of all, I’m surprised that that many teenagers are using AI. Like, that is definitely shocking, but also concerning, because you mentioned that there are no guardrails in place. And, there are some concerning stories, or news stories, that you’re seeing about these kinds of relationships. And, another thing that you touched on is that social development is much more limited when kids are talking to a chatbot or ChatGPT than when they’re talking to a friend, because, I’m just thinking, when you’re talking to a friend, you have disagreements, or you have conflicts, and you learn conflict-resolution skills, and you learn how to be empathetic.

But, ChatGPT is just there to agree with you. And, it’s just going to please you every single time. And, you don’t really get to learn those kinds of skills that you would from a real social interaction.

And honestly, I’ve never used these Gen AI tools, because I’m scared of, like, what’s going to happen. Like, I’m scared that I’m going to use it, and what if I like using it? And then, what happens? So, that’s what deters me from using it. But, it is definitely scary for the teens out there who are using it, and, you know, for whom that’s almost their only outlet, or only kind of social interaction, because of those societal factors you mentioned.

So, for parents who are listening who may be concerned that their kids are using it, or who just want to make sure that they’re educating them properly on the benefits and the risks, what are some signs that parents can look out for? And, what are some ways that they can bring up this topic with their kids?

 

Dr. Tiffany Munzer (20:43 – 23:10)

Yeah, yeah, it’s a really great question. And, I think it really starts with the underlying relational context between kids and teens and their parents, you know. And, I think it’s about building those strong relationships early on, so that teens can feel comfortable coming to their parents with whatever is on their mind, be it the digital world or their in-person life experiences. But it’s also about parents having that open curiosity about what it is that their kids and teens are seeing online. Or, like, what is it that they’re getting from AI? Or, what spaces and places are they exploring online? And having that open curiosity about it in a non-judgmental way.

And then, also just using it together, to be, like, “I, as a parent, I didn’t grow up using these tools. And so, let’s try some of the safe, vetted AI tools together, so that we can see what are some ways that we can use this in a really helpful way to support learning.” And, “what did you think about the response? You know, did it contain any bias?”

You know, one thing that we didn’t talk about today is that a lot of these AI models are trained on vast amounts of societal data that has always had some bias in it, and so some of the responses might incorporate some of this bias. You know, like, “is it biased? Does it contain any inaccurate information?” And also, like, “what did you think about how it responded to you?” You know, how friction-free some of the responses are, when we know that relationships — just to your earlier point — are full of friction, and part of the growing and learning process is learning how to navigate relationships effectively, which is hard to do if you don’t have some friction and conflict in there to navigate naturally.

So, learning together, keeping an open mind and providing that safe, curious space for teens and kids to have those open discussions with you as a parent.

 

Heemani (23:10 – 24:22)

These are some amazing tips that you’ve shared. I really like the tip where it’s, you know, families using AI together to explore it and see, “how can we use this in a way to help us learn?” And, I really agree that these conversations about, you know, the safe use of AI and making sure that kids understand that it can be used as a tool to help you, but not, you know, replace the learning or replace the relationships that you form with actual people. I think, you know, this is really important. And, it’s also really important for parents themselves to understand the impacts of AI, because, like you said, they didn’t grow up with this. And so, I’m really glad that we’re getting to talk about this topic on our podcast and educate families on this.

And, to end off, we know at this point that AI is here to stay. And, because of that, we really need to make sure that, you know, children are using it in a safe way that is not hurting their learning. And so, what kind of tips or advice would you give to parents and even teachers to help kids use AI more responsibly?

 

Dr. Tiffany Munzer (24:23 – 26:38)

Yeah, yeah. So, I think, in addition to what we had chatted about, talking about some of the online privacy issues that could arise with not just generative AI, but AI as a whole — AI in algorithms, AI in online searches, and AI just in general — kind of, the data collection around online privacy is really important. So, teaching kids how to manage cookies and clear browsing histories, and thinking about how to block social media users or marketers whose messages might not align with that teen or kid’s experiences, and ways to protect your privacy online.

And then, the other thing that families can chat with their kids about is around the worries about plagiarism, where a lot of these tools have made it really easy to just put in a question and then get a response that kids can copy and forward. And so, really talking about the concept of original work, and using online information as a starting point for your own thinking, not just copying and presenting the words as your own. So, I think that that’s the other really important piece. And just to normalize that struggling and learning new content is really hard, and it is the exercise that we need for our brains to grow and develop, so that when you encounter even harder problems when you’re older, you can tackle them knowing that you’ve developed these really important tools for problem solving. So, not to get, like, preachy at all about it with kids, but just, like, you know, this is very common, and of course, kids and teens have other fun things to do, and they might not want to write an essay, but it’s really important work that will build the skills they need in the future, too. So, thinking about it as an investment in yourself: you’re doing the hard work now so that things aren’t as hard for you when you have to tackle bigger problems in the future.

 

Manahil (26:39 – 28:04)

That’s a great reminder, and I feel like it’s something that I forget, even today, where I’m doing a super hard assignment and it’s like, “oh, I wish I didn’t have to do this,” or, “oh, let me just ask ChatGPT how to do this.” But, like you said, it’s important for you to learn it yourself and understand it, and maybe AI can be a tool in helping you understand it, which is one of my favourite ways of using it — like, it can help break down some complex topics into easier-to-understand tidbits.

I really like that point on the investment in yourself and maybe that’s something that parents and educators, you said could emphasize to kids and teenagers.

So, we just want to say thank you so much again for taking the time to chat with us about AI today and sharing your knowledge and expertise with us on the Healthy Habits, Happy Homes podcast. A lot of your tips were extremely practical, and you provided so much background context, and also different ways to look at AI, so that it’s not just a terrible, scary thing. It is just another tool; like other kinds of tools, it has its downsides and definitely needs a lot of regulation and education around it, but I think that you’ve really helped break it down into something that is easier to understand, less scary, and could even be a bit useful.

So thank you again so much for joining us.

 

Dr. Tiffany Munzer (28:05 – 28:12)

Oh, thank you so much for having me. This was such a treat getting to connect with you both today and chat. Thank you so much.

 

Heemani (28:13 – 28:24)

We hope that you found this episode of Healthy Habits Happy Homes helpful. Follow us on Instagram @FamilyHealthStudy and Facebook at Guelph Family Health Study to stay up to date with the latest episodes of our podcast.