Many people experience psychedelic AI art as giving life to their personal experiences: a confirmation of what those experiences were like. But I’ve wondered whether certain consequences of AI art run up against some of the common positive messages contained in psychedelic experiences. One is connectedness: people often come away from meaningful psychedelic experiences feeling more connected to others. I’m not sure that seeing AI art in psychedelic content helps to foster this feeling. To me, it often feels alien and detached from human experience.
While often barely noticed, and sometimes only weakly felt, the use of human-created art helps readers connect more closely to the person behind the content. (I honestly feel no emotional reaction when reading AI-generated articles featuring AI-generated images.) Perhaps in the future, AI art will replicate human art and photography so perfectly that it creates an equivalent emotional reaction (not necessarily something to celebrate). But until then, relying on AI art in psychedelic content may leave people feeling more detached from the human voice behind it.
Another way in which psychedelic AI art can disconnect us is through the loss of employment for artists and, with it, the loss of human-made artwork. If we continue to rely on AI art for content, we will collaborate with each other less in creative contexts. Creativity is essential to our humanity; it connects us in the world of work, and without it, we become increasingly disconnected, connecting instead on matters less human and aesthetic: efficiency, productivity, and profit. And related to the earlier point about how human art evokes our emotions, being exposed to less human art in the world could further disconnect us from the lives of others. That’s something we don’t need, given how much of our lives are already lived online.
One more tangible effect of overrelying on psychedelic AI art, which I see as conflicting with a common message of the psychedelic experience, is the environmental impact of generative AI. As Adam Zewe explains in an article for MIT News:
The computational power required to train generative AI models that often have billions of parameters, such as OpenAI’s GPT-4, can demand a staggering amount of electricity, which leads to increased carbon dioxide emissions and pressures on the electric grid.
Furthermore, deploying these models in real-world applications, enabling millions to use generative AI in their daily lives, and then fine-tuning the models to improve their performance draws large amounts of energy long after a model has been developed.
Beyond electricity demands, a great deal of water is needed to cool the hardware used for training, deploying, and fine-tuning generative AI models, which can strain municipal water supplies and disrupt local ecosystems. The increasing number of generative AI applications has also spurred demand for high-performance computing hardware, adding indirect environmental impacts from its manufacture and transport.
Compare this negative consequence with a common message and attitude contained within psychedelic experiences: concern for the environment. Research also supports this link between psychedelics and an increased connection to nature. If those in the psychedelic field want to protect the environment in a meaningful way, then the environmental costs of generative AI have to be considered when creating content related to psychedelics. There are some barriers to making this happen, however. Noman Bashir, a Computing and Climate Impact Fellow at the MIT Climate and Sustainability Consortium (MCSC), notes in the MIT News article, “an everyday user doesn’t think too much about that. The ease-of-use of generative AI interfaces and the lack of information about the environmental impacts of my actions means that, as a user, I don’t have much incentive to cut back on my use of generative AI.”
A similar problem arises with platforms like Amazon, Shein, and Temu. People often aren’t aware of just how unethical the business practices behind these companies are, and even when they do have this awareness, the convenience of using these platforms often overrides the desire to boycott them. Moreover, as I discuss in another blog post, increased concern for the environment after psychedelic use doesn’t always translate into pro-environmental behaviour, for various reasons.
Nevertheless, the existing disconnect between awareness and action doesn’t mean that disconnect is inevitable. I think there can come a point at which awareness of a problem reaches a certain threshold, after which the impact of the problem just feels heavier and more real. At the very least, the use of psychedelic AI art should be treated as an ethical issue as much as an aesthetic one. Framing it this way allows us to better weigh up the benefits and costs of using this form of generative AI.
Sam Woolfe | Community Blogger at Chemical Collective | www.samwoolfe.com
Sam is one of our community bloggers here at Chemical Collective. If you’re interested in joining our blogging team and getting paid to write about subjects you’re passionate about, please reach out to Sam via email at samwoolfe@gmail.com