The Weird, the Eerie and the Artificial
We have all felt, at certain times, that something is off, something just at the edges of perception. In Aubade, Philip Larkin describes death as being “just on the edge of vision”, a horror that is almost knowable and yet unknown. We have all had experiences with death, whether seeing roadkill or a family member passing in the night. The experience of being dead, however, is something unexperienced. When we do see someone who has passed, there is something strange about it: someone, or something, that existed and lived now does not. There is a missingness: where something once was, there is no longer. There was familiarity, and now there is none.
Perhaps it is also a playground where no children are playing, though there is no apparent reason for them not to. There is no decay, no problem with the equipment. Simply no children. During the Covid-19 pandemic, empty roads, offices and other public spaces elicited the same feeling: the familiar made unfamiliar by the vacating of people.
In Stanisław Lem’s Solaris, scientists aboard a research station study a seemingly sentient, ocean-like planet called Solaris. The planet reacts to being studied, creating various material manifestations of the researchers’ traumas. These are not ghosts or hallucinations; they are real, physical beings the researchers can touch and interact with. Yet they are neither the original people nor purely immaterial psychological objects like a hallucination. Why Solaris responds to the human researchers in this way is never known. The planet neither explains nor justifies itself through the manifestations.
Mark Fisher’s The Weird and the Eerie discusses the categories the two previous examples occupy. Fisher describes the Weird as an “intrusion”, something “so strange it makes us feel it should not exist”. In Solaris, the planet’s creation of physical manifestations of trauma sits outside the bounds of human rationality. Throughout the book, Lem writes about how humanity, in general, seeks more of itself and more of Earth, that we “are searching for an ideal image of our own world”. Nothing humanity sees, or looks for, lies beyond human rationality.
In contrast, Fisher defines the Eerie as the presence of “that which does not belong”: “something present where there should be nothing, or nothing present when there should be something”.
Artificial Intelligence (AI), in the form of Large Language Models (LLMs) built on attention-based transformers, occupies only the Eerie, and only when its outputs depart from the explicitly familiar; it does not occupy the Weird. The Weird takes the form of the new and the novel; the Eerie is that which lies just off the path of the familiar. When an LLM is Eerie, it is because of a lack of alignment to the norms of a social system. Or, more in the vein of Luhmann, alignment is adherence to the codes and programs of a social system, and Eerie-ness is therefore misalignment.
to be eerie
Sesame released its AI voice chat assistant, which represents quite a leap from the Eerie toward the familiar. There is still a halting quality to talking with it, and it still exudes a strangeness. Why does it feel artificial? Perhaps it is the constant question-asking, or the lack of any specific stance: a feeling that what you’re talking to is an avoidant blob of noise, missing something.
Even when interacting with people, there can be a feeling of Eerie-ness, a missingness to the conversation or interaction, a sense that it is stilted, awkward or fake: an otherwise typical conversation in which you can feel something predatory in the other conversant. Similarly, a dog might raise its hackles at the smell of a coyote but not react to other dogs. This is all to say that it is not only AI that carries this mode of Eerie-ness in conversation or interaction; it exists in others as well.
weirding
The supernatural contains many instances of weird things, things that emanate a sensation of wrongness. Something weird is, according to Fisher, “so strange that it makes us feel that it should not exist, or at least it should not exist here”. If such a thing does exist, however, it escapes our taxonomies and categories. Whereas the eerie is the presence of things in a strange manner or place, or the absence of things from a place where they should be, the weird is the “indescribable”. Fisher makes this point through Lovecraftian tales, in which Lovecraft writes about things that exist outside our natural laws. Our previous concepts and frameworks fail us, unable to provide answers or tools for dealing with this wrong thing.
Due to alignment training aimed at producing ‘helpful, honest and harmless’ chatbots, LLMs have had any notion of ‘weirdness’ eliminated from them. They do not present weird challenges to our frameworks or conceptions; otherwise they would not be helpful. The ultimate effect of removing the ‘weirdness’ is that no forcing of new ideas, no presentation of objects or things that exist outside current frameworks, can occur. For something to fit the ‘helpful’ paradigm it must help meet user objectives; for it to be harmless it must mitigate risks and protect well-being. The weird, in fact, is not helpful, and it may be harmful, since we cannot categorize it or apply the implicit risk analysis we developed through our interactions with the natural world.
Thus our paradigms are not challenged, and new ones can never emerge, since emergence requires the weird. LLMs then represent a flattening of the horizon of the future: nothing new can emerge, and the texture of tomorrow is smooth, with no burrs to challenge us or force change.
new hauntologies
Hauntology is the concept of past aesthetics or ideas returning to the present, as if to haunt it: the persistence of the past in the present. In his Film Quarterly article “What Is Hauntology?”, Fisher defines it as the “failure of the future” or, more succinctly, “lost futures”. The preponderance of LLMs and AI represents a categorical shift towards technological hauntology, in that these systems consume the past via training on data and bring it into the future via inference. With their increasing use for everyday thinking and ideas, they represent an acceleration of the reduction in our “capacity to conceive of a world radically different from the one in which we currently live”.
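To make that training/inference asymmetry concrete, here is a minimal, purely illustrative sketch: a toy bigram model stands in for the vastly larger transformer, and the corpus and the generate function are invented for illustration. It is “trained” only on a fixed past, and at inference it can only recombine transitions it has already seen.

```python
import random
from collections import defaultdict

# Toy bigram "language model" standing in for an LLM: everything it can say
# at inference time is a recombination of the corpus it was trained on.
corpus = "the future is a recombination of the past the past persists in the present".split()

# "Training": record which word follows which in the historical corpus.
model = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    model[prev].append(nxt)

# "Inference": generate new text, but only from transitions already seen.
def generate(seed, length=10):
    words = [seed]
    for _ in range(length):
        followers = model.get(words[-1])
        if not followers:  # nothing ever followed this word in the past
            break
        words.append(random.choice(followers))
    return " ".join(words)

print(generate("the"))
# No word pair can appear in the output that was not already in the training data.
```

However much larger the model, the asymmetry is the same: inference can only redistribute what training has already absorbed.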