Social Media Influencers, AI, Deepfakes and Young People Programme

This webinar supports foster carers with the baseline understanding of AI needed to have open conversations with young people about digital literacy.

Webinar
3 hours

Overview of Social Media Influencers, AI, Deepfakes and Young People Programme

From online shopping to asking a smart speaker for the weather forecast, AI is now part of most of our daily lives, often in ways we don't even notice. But alongside the benefits come deepfakes, AI-generated misinformation and inappropriate content, all of which pose new risks to children and young people.

Online content can shape our values, beliefs and aspirations, and it can be surprisingly hard to separate what is genuine from what is harmful.

This introductory webinar gives foster carers a crash course in AI: how it is shaping the online world, the risks it brings, and how to help children think critically about what they see. Learn how to spark open conversations, build digital literacy, and empower young people to make safer choices online.

Aims of Social Media Influencers, AI, Deepfakes and Young People Programme

  • Understand how young people use AI-powered apps and social media, the benefits these bring, and why they matter so much in youth culture.
  • Know how young people can be exposed to inappropriate, fake and harmful content generated via algorithms, despite filters and legislation intended to protect them.
  • Understand the ways that AI technology can be used to create deepfake videos and audio, along with the potential risks to children.
  • Increase awareness of online predators and the different ways that AI can be misused to target children.
  • Understand the importance of helping young people recognise risks and report harmful content so they can enjoy digital technology and AI safely.
  • Know what to do and where to report safeguarding concerns.