Automated psychology (CBT) is apparently a research topic now
I have a gut feeling it should be unethical, but why exactly? What if it helps people, or reaches reasonable performance at some point?
(directing people's thoughts automatically sounds dangerous though)
www.alphaxiv.org/abs/2501.09426
*Of course, as a helper for therapists this is always a good tool (aiding rather than replacing is always easier to justify)
#ethics #ai #llm #psychology
@LChoshen Since CBT to a large degree involves teaching cognitive skills and behavior changes, there does seem to be a good opportunity for “automation”, especially given that there aren’t enough trained clinicians and the number of people needing treatment far exceeds the limited number of therapists available. There is already a lot in the mental health space that attempts to automate, or provide better/easier access to, ways for individuals to learn and practice these skills independent of a clinician.

There is the possibility of reducing stigma and improving approachability for some who don’t want to disclose their mental health concerns to another human, but I think a greater number of people may be concerned about the potential surveillance aspects of sharing this information with whoever is behind the automated therapy tools. The greater use case for something like this is ease of access, because it should be available 24/7 almost anywhere.
@calbearo So you're basically saying this is great, and the only issue is making sure the client's data isn't shared (which is easy to ensure)?
Why is that so different from Eliza? The issue with Eliza was just that it didn't work?
@LChoshen
CBT is actually quite harmful to a lot of people seeking psychological help, so having it administered by an AI instead of a human who can observe how a person is responding sounds very, very dangerous to me.