The Algorithm That Promises to Keep Our Kids Safe

On facial recognition, tech promises, and the safety we can't outsource.


I was scrolling through the feeds when a headline caught my eye: "Roblox introduces facial recognition to protect children from adults."


My first thought? Wow, someone's doing something.


My second thought? Wait. Really?


Here's the thing: I've been researching digital citizenship for years, and every time a platform announces a shiny new safety feature, parents breathe a collective sigh of relief. We want to believe the tech will handle it. That someone, somewhere, has built the thing that will keep our kids safe while we're making dinner or answering emails or just trying to get through the day.


This time, it was clear to me that we were being sold something that sounded better than it actually was.


The Promise vs. The Reality


Roblox's new system uses facial recognition to estimate a user's age and sort them into age bands. Kids can only chat with others in their age group or the one next to it. The goal? Stop adults from talking to children.


On paper, it sounds reassuring. In practice? It's a bit messier.


The technology can get ages wrong by a year or two. A 10-year-old might be read as 12. An 8-year-old as 6. For some things, that margin doesn't matter much. But when we're talking about protecting kids from predatory behaviour, those one or two years can mean everything.


It gets shakier with younger children and certain demographic groups. Research shows these systems struggle with accuracy across different racial backgrounds, genders, and socioeconomic groups. If someone really wants to lie about their age, they can work around it with photos, videos, or by gaming the system.


So what does that leave us with? A tool that might help, but definitely won't solve the problem on its own.


What This Really Means


Here's what I've learned, both as a researcher and as someone raising kids in this digital mess: there is no algorithm that can replace you.


An algorithm doesn't know your child. It doesn't know their vulnerabilities, their maturity level, or the specific situations that might make them uncomfortable. It can't have a conversation about what to do if someone asks for personal information. It can't build the trust that makes your child want to come to you when something feels off.


You can.


The most effective safety strategy isn't a single piece of technology. It's connection. It's staying curious about their world. It's building the kind of relationship where they know they can tell you anything, even the stuff that feels embarrassing or scary.


And honestly? That's harder than installing an app. But it's also the only thing that actually works.


TRY THIS


If you're wondering how to stay involved without hovering or panicking, here are a few things that help:


Keep the conversation going. Ask who they're chatting with on Roblox (or any platform). Not as an interrogation, just genuine curiosity. "Who's your favourite person to play with?" opens more doors than "Who are you talking to?"


Name what's happening. When you see news like this, talk about it. "Did you know Roblox is checking ages now? What do you reckon about that?" Let them share their perspective.


Build critical thinking together. Help them recognise red flags in online interactions. What makes someone trustworthy? How do they know if someone is who they say they are?


Trust your instincts and theirs. If something feels off to you or to your child, that matters. Create space for those gut feelings to be voiced and taken seriously.


Set boundaries you can actually keep. Work together on clear family agreements about online communication. When everyone understands the "why," they're more likely to follow through.


You're the Safety Net


Roblox's facial recognition is a step. It shows they're listening to concerns and trying to do better. That matters. But it's not the solution. The technology has real limits, and even if it worked perfectly, it still couldn't replace what you bring to your child's safety: relationship, conversation, and guidance.


So yes, it's good that platforms are trying, and yes, you should still stay involved, stay curious, and stay connected.


Your child's safety online isn't about finding the perfect technological fix. It's about being the parent who shows up, asks questions, listens well, and builds the kind of trust that lasts long after the algorithm updates.


If you're looking for tools to help you have these conversations in a calm, structured way, the Digital Family Agreement Kit and our online courses are here. Real strategies, no fear tactics, built for real life.


You've got this!
