9 Comments
Gary Sharpe:

I can see two kinds of therapy apps: one that enhances human-to-human connection (such as remote delivery that increases access, e.g. the Better Health app) and one that replaces it. On the latter, Lex Fridman often talks about having a personal AI "buddy" who helps us stay on track and grow as a person. I can envisage this as an artificial form of the "daemons" in Philip Pullman's "His Dark Materials" books.

Brad & Butter:

I would view the former as "mind poker" and the latter as "spiritual solitaire". Two different games with different purposes: one is for consultative thinking, the other a narrative builder for feeling (looks at JBP in disgust).

Devaraj Sandberg:

As philosophy becomes increasingly influenced by cog neuroscience, and exciting new discoveries are made, I'm realising more clearly the shortcomings of AI when it comes to acting as a potential therapist, counsellor or personal assistant.

Of the four types of knowledge, AI is good at propositional, okay at procedural, but pretty much hopeless at perspectival and participatory knowledge. The trouble is, it is the latter two that create meaning and connection in our lives. Thus, as it stands, all AI can do is mimic some of the cues that relate to those two, and this will only work for a while.

I'm increasingly of the opinion that the great techno revolution our culture is currently banging on about will actually be an utter fail. It has been constructed by people who are usually quite severely dissociated from the body and, whilst it is exciting and new in places, I think the bulk of it is just dissociated minds seeking somehow to avenge themselves on society at large! When I read players in this field like Nick Land or Elise Bohan (whom I also like), that seems to be a constant background theme.

Brad & Butter:

I stand on the counter-view that propositional and procedural knowledge are already good criteria for filtering out most mainstream therapists, if only because the value of perspectival and participatory knowledge is hard to differentiate from the other two.

There is a 2x2 breakdown of therapeutic work, in which most "spiritual" work is a market for lemons (ripe for exploitation), while "functional" work carries less exploitation risk. https://twitter.com/vgr/status/1118212507368181760

The author of this thread noted that middle- and working-class "therapy" is bunk, and that for therapy to work at all is a sign of affluence or talent. This might explain why AI counselling has value only to a small demographic with low trust in an overproduced therapist pool.

https://twitter.com/vgr/status/1450889117244399623

Therapist overproduction is in a similar vein to the lawyer glut or, in Peter Turchin's terms, the way the pool of professionals is artificially inflated in bad economies. And since Land focused on how technology breaks the economy... it would also imply an increasing inability to find "therapists" who are not merely credentialed but psychologically competent.

Perhaps the most optimistic way of seeing this is to treat LLMs as a "Turing Test" for therapeutic and counselling work. The people who don't need it do not need "talk therapy" either, and the people who do need it have no time for dealing with bad apples.

Devaraj Sandberg:

As I wrote recently, I actually think LLMs could save the world. A couple of reasons...

1) The value of human propositional intelligence and memory will soon be pretty much zero. There'll be no need to learn all the old nonsense we used to learn, or to give any regard to people high in propositional intelligence.

2) The ubiquity of fake news and fake information will be so high that humans will be forced to revert to other forms of epistemology to navigate their way through life. Different symbols will need to be exapted onto our frontal lobes to direct our thinking. 500 years of Protestant-reformation propositional (yawn) nonsense and sense of selfhood will be over.

Bring it on!

Jennifer Arnold:

I have been drawing a lot lately, and was thinking about the AI art thing that is supposedly happening big now. I was looking at this piece of beautiful sheer papyrus that I was using, and I thought, "THIS used to be the AI." Instead of drawing in clay, or on a rock, the smooth, flat, light-colored paper was the crystal window through which the information was placed and then observed. We must see AI as a medium, not as an entity, even if it grows lips to speak.

Supposedly there is a therapy called "chair therapy", where you place an empty chair (the shadow of a seated human) in front of you and speak, and that could be helpful, actually. Why not? People write journals into the night... and wake up feeling better.

To me the real question is... THE DATA MINING, where the AI system is collecting information and possibly doing something with it. For me, the main reason for human-to-human contact is so that another creature of spirit can potentially resolve or assist, or discover something that actually leads to a new creative understanding of reality and our process. This the AI cannot do... yet!

Brad & Butter:

GPT-Neo: an open-source language model that you can modify in your own home. The ultimate notebook solution, right after Notion/Roam/Obsidian/Logseq/... There are times when humans REFUSE to help others, like bystanders. Play solitaire; defeat (truly forced) loneliness.

haymanjoyce:

Here is what ChatGPT had to say about it...

[user] could AI provide therapy to a human

[AI] There is ongoing research into using artificial intelligence (AI) to provide therapy to humans. Some approaches involve using AI to deliver therapy directly to patients through chatbots or virtual assistants, while others involve using AI to assist human therapists in providing therapy.

One potential benefit of using AI for therapy is that it can provide support to individuals who may not have access to a therapist, either because of geographic location or because of limited availability of therapists. AI-based therapy may also be more convenient for some people, as it can be accessed online or through a mobile device.

However, it is important to note that AI-based therapy is not a replacement for human therapy and should be used in conjunction with human support. AI is not able to provide the same level of empathy, compassion, and understanding as a human therapist, and it is not able to fully replace the human element of the therapeutic relationship.

[user] do the shortcomings of AI-based therapy (versus human-based therapy) matter if the outcomes are the same

[AI] The short answer is that it depends on the individual and their specific needs. For some people, the outcomes of AI-based therapy may be just as effective as human-based therapy. However, for others, the lack of a human element may be a significant limitation.

One of the key elements of therapy is the therapeutic relationship between the therapist and the patient. This relationship is based on empathy, compassion, and understanding, and it is an essential part of the healing process. While AI may be able to provide support and guidance, it is not able to fully replicate the human element of the therapeutic relationship.

Additionally, the effectiveness of therapy may depend on the individual's ability to form a connection with their therapist. For some people, this may be easier to do with a human therapist than with an AI.

Overall, it is important to consider the individual's specific needs and preferences when deciding whether AI-based therapy is the right fit. It may be helpful to try both human-based and AI-based therapy and see which one works better for you.

[user] In theory (and, perhaps, in practice), can humans form connections with AI that are as strong (or stronger) than the connections they are able to form with other humans?

[AI] It is possible for humans to form strong emotional connections with artificial intelligence (AI). This can happen in a variety of contexts, including through the use of social media and virtual assistants, as well as through the use of AI-powered robots and other devices that are designed to interact with humans in a more personal way.

However, it is important to note that the nature of the connection between humans and AI is different than the connections that humans form with other humans. AI does not have emotions or personal experiences in the same way that humans do, and it is not able to fully understand or relate to human emotions in the same way that another human can.

As a result, the connections that humans form with AI may be more limited in scope and depth than the connections they form with other humans. While these connections can be meaningful and satisfying in their own way, they are not able to fully replicate the richness and complexity of human relationships.

Livio Marcheschi:

Smart app. Smart therapist.
