
How Can You Critically Assess the Advice AI-Driven Chatbots and Virtual Therapists Provide to Stay Safe?

Opinion Piece
Strategy
Alex Russell

CEO, Bourne Education Trust

Children and young people must understand how to critically assess AI advice on mental health issues. Users should verify the source's credibility, cross-reference AI suggestions with trusted resources, and exercise personal judgement. AI is not a substitute for professional mental health care; users should not share sensitive information and should always report harmful advice. Overall, AI should be viewed as a supplementary tool for mental health support, not a replacement.

Critically assessing the advice AI provides about mental health is essential to staying safe and receiving appropriate guidance. We need to educate our children and young people so they can make informed decisions when relying on AI for mental health advice. It is all too easy to believe that what we are being told is reliable. It is essential, however, to identify the source of the AI tool or platform and ensure it is developed or endorsed by reputable mental health organisations, medical professionals, or institutions.

Once you have received advice, be critical of it. If the AI suggests treatment options or interventions, cross-reference them with trusted mental health resources or consult a human expert for a second opinion. AI can provide general advice, but it cannot understand an individual's unique experiences and feelings. Users need to trust their own judgement and instincts when evaluating the advice provided, and should always cross-verify information against multiple sources.

We need to educate our pupils so that they can recognise that AI is not a substitute for professional mental health care. It can provide information and support, but it cannot replace the expertise of trained therapists, psychiatrists, or counsellors. Pupils should not share overly personal or sensitive information with these tools. If they encounter AI-generated advice that is inappropriate, harmful, or offensive, they should report it to the platform's administrators and, if applicable, to the relevant authorities.

All in all, pupils should remember that AI should be a supplementary tool in supporting their mental health and is not a replacement for professional care.

Key Learning

Risks