AI Working Group

Forward Thinking with Ethics and AI for Clinicians

As we stand on the cusp of a transformative era, it is clear that Artificial Intelligence (AI) is ushering in significant changes for medical and psychological practices.

  • 39% of young people have used AI for emotional support.

  • 62% stated they were not comfortable with the idea of using AI alone for emotional support.

  • 76% were more positive about the use of AI as an addition to a care journey led by a human expert.

With AI reshaping how clinicians handle consent, policy documents, and ethical dilemmas, often ahead of formal guidance from regulatory bodies, it's essential that we unite to navigate these uncharted waters together.

That’s why we are establishing a working group...

...to create a community where doctors and psychologists can share insights, offer support, and collaborate on best practices. You will not only benefit from what the group produces collectively but also enhance your practice and maintain the highest ethical standards.

Join us by actively participating in the working group meetings, where together we'll develop resources such as efficient and effective consent forms, AI policy documents, and history questions to determine client AI usage. Let's embrace an approach that helps clinicians improve their practice responsibly and effectively.

Facilitated by Dr. Claire Sira, neuropsychologist and CPA board member, and Tom Hudock, co-founder of Hyperfocus, these group sessions begin in January 2026. We welcome Ph.D. students and psychologists from all backgrounds, specialties, and career stages. These sessions are not a venue for clinical consultation or supervision.

There will be a nominal fee to join, as you will receive written documents you can use directly in your practice. Clinicians who already pay for Hyperfocus can attend at no additional cost.