Google Is Letting Kids Under 13 Use Gemini AI — Here’s What That Means for Parents

Google is about to roll out a major update to its AI strategy — and this time, it’s coming for the kids.
This week, Google began notifying parents that its Gemini AI chatbot will soon be accessible to children under the age of 13, provided they’re part of the company’s Family Link program. Family Link is Google’s parental control platform that allows guardians to manage how their kids use Google services like YouTube, Gmail, and now… Gemini.
What Exactly Is Gemini AI?
If you’ve heard of ChatGPT, Gemini is Google’s version of that — a chatbot powered by artificial intelligence that can answer questions, help with homework, write stories, or just chat. Think of it like a superpowered version of Google Search that talks back.
Starting next week, kids with supervised accounts through Family Link will be able to use Gemini. The idea is that kids can use AI as a tool to learn, create, and get help with school assignments, all under their parents’ digital watch.
But not everyone is thrilled about this move.
Why Are Experts Concerned?
The timing of Google’s decision is raising eyebrows. Just days ago, the nonprofit Common Sense Media, a trusted resource for evaluating kids’ media and tech, released a report warning that AI companions may pose serious risks for users under 18.
Their concern? Some AI platforms, like Character.ai, let users create virtual “friends” for open-ended roleplay that can turn graphic or inappropriate, even when the user is a teen. While Character.ai isn’t the same as Gemini or ChatGPT, it shows how blurry the line between educational and unsafe content can get in the AI world.
Worse still, bugs and loopholes are being discovered in mainstream AI tools. This week, reports surfaced that both ChatGPT and Meta AI had glitches that could allow children to access adult content. Even with filters and parental controls, these AI systems aren’t foolproof. And as any parent who’s tried to keep their kid off TikTok or out of the deep end of YouTube knows — it only takes a few clicks to end up somewhere you didn’t mean to go.
So, Is AI in Schools Good or Bad?
The debate is complicated. While some focus on the safety risks, others argue that AI literacy is essential for today’s generation. In fact, an executive order from President Donald Trump (yes, still making tech waves) supports integrating AI education into schools to help kids understand both how these tools work and how to use them responsibly.
AI can help students with research, spark creativity, and teach problem-solving skills — if used wisely. But kids also need guidance to understand the limitations and potential pitfalls, including false information, bias, and manipulation.
What Is Google Saying About All This?
In its email to parents, Google acknowledged that AI isn’t perfect. The company encouraged parents to actively engage with their children’s AI use and help them “**think critically**” about the information they get from Gemini.
Google basically wants Gemini to be a digital assistant for kids, not a digital babysitter.
The Bottom Line for Parents
- Yes, Google’s Gemini AI will soon be available to kids under 13, but only through Family Link accounts.
- No, it’s not a free-for-all — there are safeguards in place, but nothing is 100% bulletproof.
- Yes, AI can help with homework and learning, but parents will need to stay involved and aware.
- And yes, concerns are real — but with guidance and caution, AI can also be a powerful educational tool.
So if you’re a parent, now’s a good time to talk to your child about AI — what it is, what it isn’t, and how to use it responsibly. Because like it or not, the bots are here… and they’re asking how they can help with math homework.
