In this article
- Why Consider an AI Trip Sitter?
- The Empathy Gap and the Embodiment Problem
- What is the Realistic Role for AI?
- Conclusion
Disclaimer: The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of Chemical Collective or any associated parties.
The practice of psychedelic guidance stretches back millennia. The act of “holding space” is profoundly human, predicated on mutual trust and understanding. The ancient role of “trip sitter”, or guide, once held by the shaman or the head of the tribe, is now assumed by therapists and trusted peers. The subtle non-verbal cues, empathy, and compassion required to fulfil this role effectively are built on a deeply human foundation: it is your humanity itself which provides all the tools required.
However, in our fast-paced, quick-fix, heavily siloed, and sadly disconnected society, we lack the means to guard this space with the care and attention it requires. A vivid illustration: in America alone, a recent survey suggested that 40 million people have experienced a “bad trip” in their lifetime. The far-reaching implications of these often horrifying experiences could stretch across generations. If difficult psychedelic experiences are not adequately integrated, they can leave scars and confusion, and only serve to further alienate us from our peers.
Is it any wonder, then, in the fractured societal framework we are currently enmeshed within, that we are turning to AI, rather than humanity, for guidance? While AI and psychedelics may not have historically shared any common ground, a growing number of researchers believe these powerful tools could combine to generate innovative approaches to mental health treatment. On forums such as Reddit, people increasingly report turning to Large Language Models (LLMs) for assistance both during and after their psychedelic experiences. The collision of the ongoing “psychedelic renaissance” with the AI revolution is changing how we interact with these substances.
The increasing acceptance of psychedelics into the mainstream naturally creates a rapidly growing need for trained or experienced guides. Currently, we lack the capacity; generations of prohibition have stunted our relationship with these chemicals, and we are only just beginning to scratch the surface of their power once again. AI offers a potentially groundbreaking solution, one which, as with anything AI-related, seems all but inevitable, whether in social care, in care homes and care-in-the-community settings, or in the realm of psychedelics. While the positives (easy, free, accessible advice for anyone with an internet connection) may initially seem promising, considered a little more deeply, the scenario becomes, I think, a deeply unsettling prospect.
The critical question is: can the essential functions of a trip sitter genuinely be fulfilled by artificial means? Can we extract human empathy and compassion from code? This is not merely a technological curiosity or a fun idea; it is unfolding on a global scale and will likely affect millions of individuals. It is a profound philosophical challenge that asks us to think very carefully about what it truly means to be a trip sitter, especially given the position of authority a sitter holds over someone in an extremely vulnerable state.
This article will explore the conflict at the heart of this issue. Are we on the way towards genuine, scalable digital empathy: a valuable, universal, code-based psychedelic guide? Or is there a smaller, more realistic and responsible role AI can play in this space? First up, why would we even consider this path?
Why Consider an AI Trip Sitter?

As touched upon in the intro, the primary driver behind integrating AI into psychedelic care is the issue of scale. Our current model of psychedelic-assisted therapy is not only in its infancy, but also requires a great deal of time with highly trained professionals in highly specialised environments. This makes it extremely expensive, and therefore inaccessible to the vast majority of people. As it so often is, technology could be a societal leveller here, democratising psychedelic support for all. Supporters argue that an AI-driven solution could provide guided preparation content, therapeutic support throughout the experience, and integration services afterwards, for anyone, at a fraction of the price. I’ll quickly state here that when I say a fraction of the price, I mean for the individual.
This article is not the place to dive into AI’s increasing impact on the planet as a whole, a consequence of the massive computing power, and therefore electricity, it requires to function. But bear in mind that this cost is real, regardless of the day-to-day benefits we perceive.
Beyond simple accessibility, we can look at how AI might in fact provide better support than an individual. In theory (disregarding the politics of the people producing the LLM itself), AI should be better placed to provide unbiased, data-driven support. It could even integrate with the user’s smart tech: biometric data from wearables, for example, offers real-time indicators of stress such as heart rate, which would reduce the need for external analysis of body language, an area ripe for misinterpretation. AI has internet access and is trained on a vast corpus, possessing something close to the sum of recorded human knowledge. The potential benefits of this cannot be ignored. Particularly powerful here, I think, will be AI’s ability to skirt around the woo-woo and unsubstantiated claims that attach to psychedelic therapy sought outside strictly clinical contexts; ayahuasca retreats, for example, are rife with this. A minimal sketch of the wearable idea follows.
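To make the wearable integration concrete, here is a minimal, hypothetical sketch. The function name, threshold, and bare-ratio logic are all illustrative assumptions rather than any real product’s API; a genuine system would rely on validated physiological models.

```python
from statistics import mean

# Hypothetical helper: flag possible distress when the rolling average heart
# rate sits well above the wearer's resting baseline. The 1.4 ratio is an
# illustrative assumption, not a clinically validated threshold.
def flag_possible_distress(recent_bpm: list[float], resting_bpm: float,
                           ratio_threshold: float = 1.4) -> bool:
    if not recent_bpm:
        return False
    return mean(recent_bpm) > resting_bpm * ratio_threshold

# Example: a resting rate of 62 bpm with a recent run in the mid-to-high 90s
print(flag_possible_distress([96, 99, 94, 101], resting_bpm=62))  # True
```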
Jerome Sarris, in a 2024 paper published in Annals of the New York Academy of Sciences, states:
Artificial intelligence (AI) and psychedelic medicines are among the most high-profile evolving disruptive innovations within mental healthcare in recent years. Although AI and psychedelics may not have historically shared any common ground, there exists the potential for these subjects to combine in generating innovative mental health treatment approaches.
Not only can AI be used in the study of psychedelic substances themselves, but it can also be actively involved in drug discovery, as it was in vaccine development during the COVID-19 pandemic. In the future, it could vet potential patients and eventually even be involved in designing and administering psychedelics uniquely targeted to the individual. The most immediately tangible application of the technology, however, is in post-session integration.
Companies as well known as Headspace have already released AI support and journaling platforms; theirs is called Ebb, described as your “compassionate AI companion”. A platform like this could quite simply be adapted, trained on and hyper-focused on the trip sitter role, which is in many ways analogous to more generic support bots (a rough sketch of such scoping follows below). Giving users who would not otherwise have access the means to analyse, explain, and coherently integrate their experiences is, if correctly managed, potentially invaluable.
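As an illustration only: the sketch below scopes a general-purpose chat model to a narrow integration-support role via a system prompt. It assumes the OpenAI Python client; the prompt wording and model name are my own assumptions, and this is emphatically not how Headspace’s Ebb is built.

```python
from openai import OpenAI  # assumes the OpenAI Python client is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative system prompt narrowing a general model to post-session
# integration support only. The wording and model choice are assumptions.
SYSTEM_PROMPT = (
    "You are a post-session integration companion. You help the user "
    "reflect on a past psychedelic experience through open questions. "
    "You never give medical, legal, or dosing advice; if asked, you "
    "direct the user to a qualified professional or emergency services."
)

def integration_reply(user_message: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(integration_reply("I keep returning to the grief I felt during my session."))
```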
The Empathy Gap and the Embodiment Problem

Despite the promise of this technology, the concept of the AI guide begins to falter if we look a little closer. The core of the trip sitter’s role, I believe, is empathy, and an LLM does not have the capacity to feel it. As already stated, anyone under the influence of psychedelics is in a state of extreme vulnerability. The sense of trust and safety required to navigate this is what psychotherapists call the “therapeutic alliance”: a bond of trust and mutual understanding. How would you even begin to establish that sort of relationship with a chatbot, a creation which basically amounts to an extremely advanced descendant of autocomplete? Would it even be healthy to do so?
In her book Alone Together, Sherry Turkle, a sociologist at MIT and a prominent critic of technology’s effect on humanity, breaks down the problem with this kind of relationship:
We have invented inspiring and enhancing technologies, yet we have allowed them to trivialize our human connections. We have confused conversation with connection and substituted a stream of information for the slow work of intimacy. A machine can be programmed to ask the right questions, but it cannot know the experience of being human. It offers the illusion of companionship without the demands of friendship.
Simply put, AI can only ever offer a simulation of empathy, an impression of human compassion. This issue is compounded by the “embodiment problem”, the claim that “AI systems that lack a physical embodiment can never be truly intelligent.” A traditional trip sitter is not just providing verbal guidance; they are a supportive presence in the physical space. A human guide can offer a reassuring touch or a simple glass of water. Their existing knowledge of the person undergoing the experience may allow them to perceive subtle shifts in behaviour where, at least currently, an AI would be contextually blind.
These philosophical issues are accompanied by potentially huge ethical implications. Take data security, for one. The transcript of a psychedelic-assisted therapy session is plainly an extremely sensitive and personal document. Who owns this data? How is it secured? Could it be used to further train models? These questions are very real and pertinent. A further risk is the phenomenon of AI “hallucination”, a worryingly regular challenge for users of LLMs, in which the model fabricates information and confidently presents it as factual. In a therapeutic context, where the user is in a highly suggestible state, this presents a catastrophic risk.
What is the Realistic Role for AI?

Recognising the limitations of this technology does not mean we have to reject it entirely. If we avoid getting caught up in the supposedly imminent arrival of infallible super-intelligence, we can define its proper role. In my opinion, in its current form, AI is not suited to the role of trip sitter. But as a sophisticated analytical tool applied within psychedelic-assisted therapy, it could be invaluable, perhaps even game-changing.
The most ethical path forward is a “human-in-the-loop” model, which employs the AI as an augment to the therapist, not a replacement for them. This leverages the AI’s strengths (data access and speed) while preserving the humanity and genuine empathy necessary for this space to function well, to the benefit of both patient and clinician.
Dr. Thomas Insel, former Director of the U.S. National Institute of Mental Health, described this potential synergy in an NPR interview:
It’s not about replacing therapists at all. It’s about trying to give them tools that they’ve never had before.
AI should be used for specific tasks with clearly defined boundaries. Insel elaborates in an interview with Ezra Klein of The New York Times:
What we can do with technology is begin to get the kind of measurement that will help us… It’s about giving [providers] data so they can be more effective.
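To make the human-in-the-loop idea concrete, here is a deliberately simple, hypothetical sketch in which the model only drafts and flags, and every item still lands in a clinician’s review queue before anything reaches the patient. The keyword list, data structure, and function names are all illustrative assumptions.

```python
from dataclasses import dataclass, field

# Illustrative risk keywords; a real service would use validated screening
# instruments, not string matching.
RISK_KEYWORDS = {"self-harm", "suicide", "psychosis", "emergency"}

@dataclass
class ReviewItem:
    transcript: str
    ai_draft_summary: str
    flags: list[str] = field(default_factory=list)

def triage(transcript: str, ai_draft_summary: str) -> ReviewItem:
    """Attach flags to a drafted summary; nothing is sent to the patient
    until a clinician has reviewed the item."""
    item = ReviewItem(transcript, ai_draft_summary)
    lowered = transcript.lower()
    item.flags = sorted(kw for kw in RISK_KEYWORDS if kw in lowered)
    return item

item = triage("...patient described a sense of emergency...",
              "Draft: session centred on fear and its release.")
print(item.flags)  # ['emergency'] -> routed to the top of the review queue
```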
AI could be a perfect adjunct to the preparation and screening of patients, providing interested parties with standardised educational materials, questionnaires, and the like; this is already being explored in traditional forms of therapy. The most powerful and seemingly safe tool currently available is something like the Headspace “Ebb” companion mentioned above: a smart journal which slowly gets to know you over time would be invaluable. We easily get lost in ourselves, in denial, or in a simple lack of awareness, and a tool which let us see through all of that concisely would sidestep these typically human problems. Organising data and making it accessible is an area in which AI excels, as the small sketch below illustrates.
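As a toy illustration of the journaling idea, the sketch below surfaces recurring themes across entries with plain word counts. A real tool would use embeddings or topic models; every name here is an assumption.

```python
import re
from collections import Counter

# A trivial stopword list; illustrative only.
STOPWORDS = {"the", "a", "and", "i", "to", "of", "was", "it",
             "my", "in", "that", "but", "about"}

def recurring_themes(entries: list[str], top_n: int = 5) -> list[tuple[str, int]]:
    """Count non-stopword tokens across all journal entries and return the
    most frequent, as a crude stand-in for real theme extraction."""
    words: list[str] = []
    for entry in entries:
        words += [w for w in re.findall(r"[a-z']+", entry.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(top_n)

entries = [
    "I felt grief about my father again today.",
    "The grief came back, but softer, alongside gratitude.",
]
print(recurring_themes(entries))  # 'grief' surfaces as the repeated theme
```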
Conclusion

The imminent prospect of AI trip sitters, and of AI involvement in therapeutic practice as a whole, forces us to confront our own values. The deeply felt impact of genuine human presence and support is invaluable, especially in our most vulnerable moments, and the current limitations of AI do not allow it to adequately fulfil this role. Perhaps in some Blade Runner-esque future it will either gain the capability to experience these emotions, and therefore to effectively transmit them to us, or become able to fool us so completely that we believe it does. For now, the role is a human one. Honestly, I believe it should stay that way, and that we should be tackling the other end of the issue, the lack of access to adequate care, rather than trying to plaster over it with more tech.
The most responsible and effective future for this technology lies in its data-processing power. We should not outsource therapy to algorithms; we should harness them, building intelligent tools to support and empower us. The challenge here is not a technological one. We already have the means to support individuals through psychedelic therapy, and our breadth of knowledge in this arena is only increasing. We simply lack the capacity. Why is that? How do we solve it? That is where our attention should be, rather than on shiny, fast-moving technological fixes that manage the situation instead of bettering it.
Some tasks are fine to be delegated and automated. Some are, for want of a better word, sacred. If we can clearly define these boundaries, AI can be a powerful addition to all forms of therapy, psychedelic-assisted or otherwise, and further our ability to help and support people all around the world.
David Blackbourn | Community Blogger at Chemical Collective
David is one of our community bloggers here at Chemical Collective. If you’re interested in joining our blogging team and getting paid to write about subjects you’re passionate about, please reach out to Sam via email at samwoolfe@gmail.com