These days, scary things hide in images and videos, and silence can become a weapon. With the rise of AI-generated images and videos, what once was unimaginable is now real: people making explicit content of others without consent, then sharing it as “jokes,” “pranks,” or worse.
That’s why talking openly and early with young people matters more than ever.
📜 Laws Are Catching Up — But Awareness Needs to, Too
In May 2025, the Take It Down Act became law in the United States. The Act makes it illegal to knowingly publish or share intimate images — real or AI-generated — without the person’s consent, and requires platforms to remove such images within 48 hours of a valid request.
I’m reserving final judgment on this law, but I’m encouraged that lawmakers see the need. We still have a long way to go, and laws alone aren’t enough: technology moves fast, and deepfakes are easy to make. Many young people may not even realize that what’s being done is wrong — or that it’s illegal. Young people today are extremely comfortable with technology; it has been commonplace since they were infants.
After all, there have long been laws on the books against adults harming children — and yet harm still happens.
That’s why education and awareness are critical.
🧠 Why Talking to Kids Matters
Teenagers are being targeted. Deepfake pornography and non-consensual intimate imagery aren’t just “online rumors.” They’re real dangers for youth.
It’s about consent, dignity, respect. When someone’s image — real or fake — is sexualized without permission, it’s a violation of their body, identity, and trust.
Prevention starts with knowledge. Many young people don’t understand what deepfakes are or how easily someone could create them. Simple conversations can turn vulnerability into strength.
Empowerment through awareness. When we talk openly, we give young people language to express discomfort, fear, or confusion. We give them tools to say “no,” report abuse, or ask for help.
💬 How to Have the Conversation (Without Shame or Fear)
Use age-appropriate language. Start with the basics: explain what deepfakes are — manipulated images or videos made by computers — and why it matters.
Talk about respect and consent like you talk about boundaries in real life.
Encourage open communication. Let them know it’s always ok to come to you if something online scares or troubles them — no judgment, only support.
Emphasize their rights. Remind them that their image belongs to them. Sharing or creating intimate images without permission is wrong — and now, it’s illegal under federal law.
Teach digital literacy. Help them learn to spot signs of manipulation or deepfakes — and to think twice before sharing anything that seems off.
🌱 What This Means for Our Community
As a community that cares deeply about dignity, safety, and justice, we have to do more than watch events unfold.
We must:
Raise awareness among parents, caregivers, mentors, faith communities, and young people about deepfakes and their dangers.
Normalize conversations about consent, digital privacy, and respect.
Build safe spaces for youth to speak up without shame.
Support victims — believe them when they’re scared. Help them access resources, report abuse, and reclaim their dignity.
Because when we wake up to these threats — and speak up — we become part of the protection.
If you ever feel unsure, overwhelmed, or like you don’t know where to start: you’re not alone. Talking is the first, powerful step.
As adults, mentors, guardians — we hold space for safety, trust, and healing.
As a community, we can make the internet a place of dignity, not danger.
As families — we can teach love, respect, and boundaries that hold even in digital shadows.
Let’s talk. Let’s protect. Let’s care.