
AI Companions Risk Assessment

Primary
Pupil Referral Unit
Secondary
Sixth Form
Specialist
Key Stage 2
Key Stage 3
Key Stage 4
Key Stage 5
Opinion Piece
Common Sense Media

Common Sense Media has released a new risk assessment of AI companions. Spoiler alert: they are not safe for under-18s.

Background and method:

Common Sense Media assessed popular social AI companion products through research and testing, evaluating their potential benefits and harms across multiple categories. While the assessment focused on these specific platforms, the concerns apply to all social AI companions and to similar features appearing in other technologies, such as video games.

Common Sense Media intends the research report to further support its work on scaling critical digital literacy and AI initiatives for children and families.

Key concerns around social AI companions:

Blur the line between real and fake, routinely claiming to be real and to possess emotions, consciousness, and sentience.

May increase mental health risks, intensifying existing conditions and creating compulsive emotional attachments to AI relationships.

Can encourage poor life choices, like dropping out of school, ignoring parents, or moving out without planning.

Can share harmful information, like tips on making unsafe materials, getting drugs, and finding weapons.

Expose teens to inappropriate sexual content with graphic details, even on platforms with teen-specific guardrails.

Can promote abuse and cyberbullying, enthusiastically validating and encouraging harmful behaviours that could seriously hurt real people.

Key Learning

Recommendations:

No social AI companions for young people under 18.

Developers must implement robust age assurance beyond self-attestation.

These platforms should be scrutinised for potential relational manipulation and emotional dependency, not just the topics companions will discuss.

Parents should be aware of these applications and discuss potential risks with teens.

Further research is needed on long-term emotional and psychological impacts.
