The Aesthetic Experience of Machine Learning: Grounding Control

November 24, 2019

Turkish media artist Refik Anadol’s public art installation synthesizing architecture and intermedia, Machine Hallucination, is quite possibly one of the most interesting new media works of our epoch. This is precisely because the piece not only integrates machine learning but presents a neural net in action. The piece draws on a dataset of over 300 million input photographic images of landscape imagery and abstract object studies. We become audiences to visual selection, as machine learning collapses these inputs by adapting varied “weights,” or emergent associational values. Embedding photography within machine intelligence, Anadol’s work—debuting at ARTECHOUSE in Chelsea Market—prompts an understanding of new media vis-à-vis architectonics. As such, Anadol affirms that Big Data and neuro-inferential machine learning can, truly, serve a pharmacological purpose (that is, as both poison and cure). Rather than abducting machine learning for predictive processing re: marketing, analysis, or metadata’s capitalist purposiveness, Anadol’s work creates a variegated, kaleidoscopic fabric that invites immersion. At its most redolent and slow-moving moments, the work recalls the object studies of American Abstraction’s field paintings, while, during moments of naturalist whimsy, familiar images are mediated by the bleeding Orphic dyes of Kupka, Léger, and Delaunay. Digitally fettered in this harmony, Anadol’s work strikes a balance between mathematical sublimity (given its scale) and the inquisitive task of navigating frayed perception.

All remnants of placid painterly quietude are quickly disrobed as the immersive amphitheater that is Machine Hallucination whirs at a rapid pace. To simply don the hackneyed descriptor of “surrealism” when describing the flashing pieces would not do them justice; the work also introduces an ontological query: ought we consider this a “piece” or a series of “moving pieces”? To qualify Machine Hallucination as “cinematic” seems folly—domestic indices of organic life, such as fauna, cacti, and cityscapes vacillate between beclouded shadows and fractured materiality, a dialectical practice that reproduces the uncanny process of hallucination. However, this hallucination is not one that can be replayed, as the machine learning algorithm is involved in a constant process of rewriting itself, continually reinscribing its historical latticework. Thus, as a cactus subsumes the sun’s flaxen rays, its thorns turned orange, an inky aquamarine plash swathes the entirety of the screen(s), as the New York cityscape bleeds through. Redolent of suspended dancers and pedestrian passageways, a florid botany melts into flickering shadows, the process recalling the reassembly of cloaked memories. Anadol’s works are not simply pedagogical studies of relational aesthetics or technological encroachments upon sublimity but, instead, recollect the unveiling of everyday life from the recesses of memory.

Culling ambient audio and sputtering visual associations learned through reviewed images, the StyleGAN algorithm powering the work produces abstract colors and forms, including patterns of pointillistic quality. The 60×40 inch screen runs eight-minute loops of “data paintings” based on perceptual difference; consequently, the images create a photographic recursion of memory constantly involved in permutation. Anadol’s work also offers a historiography of algorithmic study as based on visual operations. The amorphous play, a durational experience, demonstrates how permutative difference cannot simply be reduced to the schematic of geometric composition. The work offers us a means of conceiving of an alternative history of the algorithm: one of agential play and plasticity.
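The continuous morphing of these “data paintings” can be pictured as movement through a generative model’s latent space: each frame corresponds to a point, and the work glides between them. The sketch below is purely illustrative—not Anadol’s actual pipeline—and assumes a hypothetical 512-dimensional latent vector (the size StyleGAN conventionally uses); a real system would feed each interpolated vector to a trained generator to render an image.

```python
import numpy as np

def slerp(z0, z1, t):
    """Spherical interpolation between two latent vectors: glides along the
    arc between them, which tends to keep intermediate points plausible."""
    omega = np.arccos(np.clip(
        np.dot(z0 / np.linalg.norm(z0), z1 / np.linalg.norm(z1)), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return (1 - t) * z0 + t * z1  # vectors nearly parallel: fall back to lerp
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

rng = np.random.default_rng(0)
z_a = rng.standard_normal(512)  # two random latent "memories"
z_b = rng.standard_normal(512)

# eight in-between latents; a trained generator G would turn each into an image
frames = [slerp(z_a, z_b, t) for t in np.linspace(0.0, 1.0, 8)]
```

The endpoints of the interpolation reproduce the two source latents exactly, which is why the on-screen morph appears to dissolve one “memory” seamlessly into the next.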

Works such as these also demonstrate how we must philosophically conceive of contemporaneous machineology. One such theoretical vantage is offered by Matteo Pasquinelli’s June 2019 e-flux article “Three Thousand Years of Algorithmic Rituals: The Emergence of AI from the Computation of Space.” Here, we find Pasquinelli examining the history of machine intelligence by beginning with the topology of Vedic culture, examining the schematic geometric composition of the falcon in the Agnicayana ritual. Parsing Frank Rosenblatt’s Perceptron, an early photoreceptor machine capable of autonomous learning, the author asserts that “algorithms are among the most ancient and material practices, predating many human tools and all modern machines.” He then reflects on “Vision Machine,” a concept coined by the philosopher Paul Virilio, and predicts its disappearance in the era of artificial intelligence and industrialized vision, concluding that topological transformations similar to those proposed by the Perceptron are being grafted into a technique of memory. Indeed, as Anadol demonstrates, memory is one element of machine learning as evinced by its aesthetic experience; however, the durational engagement with the piece is undoubtedly inseparable from the processual “becoming” we are privy to, as architecture disintegrates into formative and suspended entities that bear indices of recognizability while, simultaneously, being liquidated of any stable signifier-signified relationship. Stop signs become diffracted specks of light, diffused pixels—the epistemological underpinnings grounding habitual reality are inverted. Truly, I have never been so haunted by an artwork before.
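For readers unfamiliar with Rosenblatt’s machine, the Perceptron reduces to a strikingly simple procedure: a weighted sum of “retinal” inputs passed through a threshold, with the weights corrected whenever the machine errs. The toy data and parameters below are hypothetical, chosen only to show the error-driven update at work.

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Classic perceptron learning rule: nudge the weights by the error
    (target minus prediction) on every misclassified example."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += lr * (yi - pred) * xi  # error-driven weight update
            b += lr * (yi - pred)
    return w, b

# toy, linearly separable task: "fire" exactly when the second input is active
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = [0, 1, 0, 1]
w, b = train_perceptron(X, y)
preds = [1 if xi @ w + b > 0 else 0 for xi in X]
print(preds)  # matches y: [0, 1, 0, 1]
```

Nothing here is “deep”: the Perceptron is a single layer of weights, which is precisely why its later topological descendants matter for Pasquinelli’s genealogy.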

Beginning with the construction of geometric forms and cosmic entities qua the Agnicayana rituals, Pasquinelli traces the genealogy of AI, rooting its perceptual agency “among the most ancient and material practices, predating many human tools and all modern machines.” By tying geometric and social segmentation into this algorithmic impulse, the author defers to Ernst Kapp’s now-defunct model of “organ projection”: that every tool, immaterial or not, is a liberation of the body’s organs. In this case, algorithms reify the operative, sense-making tool that extends the computational organ par excellence: the brain. However, the aesthetic experience may, as Yuk Hui notes, offer a sensibility that cannot be reduced to positivist functionalism: Anadol’s haunting belies computation, for it requires an external perceptual agency to make sense of what would, otherwise, merely be lines of associative code.

Thus, perhaps the aesthetic realm offers us a means to reconceive the algorithm. According to Pasquinelli, the algorithm is defined: (1) as an abstract diagram that emerges from the repetition of a process, an organization of time, space, labor, and operations: it is not a rule that is invented from above but emerges from below; (2) as the division of this process into finite steps in order to perform and control it efficiently; (3) as a solution to a problem, an invention that bootstraps beyond the constraints of the situation (any algorithm is a trick); (4) most importantly, as an economic process, as it must employ the least amount of resources in terms of space, time, and energy, adapting to the limits of any situation.

Rather than this definition of the algorithm, the aesthetic experience recalls the writings of the French philosopher of technology Bernard Stiegler, where we find the thesis of media mnemonics as pharmakon. To be fair, Stiegler prudently demonstrates that while the internet and automation’s increasingly self-annotating techniques offer potentials for emancipation (e.g. hacktivism or collective educational communities), practices like metadata collection, data mining, and the assemblage of social graphs impose capitalist proletarianization by exteriorizing human experience onto digital platforms, resulting in the loss of savoir-faire, or what he calls “knowledge of how to make do.” The cognitivist class of data-producers has replaced the standard terms of proletarianization, which was based on identifiable labor. For Stiegler, a new digital culture-to-come must collectively individuate, producing “new moral beings” who are “de-proletarianized.”

Stiegler’s alternative future points toward the commons and philia—Anadol’s work is a literal instantiation of the ethos of the commons, involved in a playful experience that makes no attempt to hide its monstrous scale (the cacophony of ambient-cum-noise music is a reminder). Today’s machines can no longer be considered extensions of bodily organs—as previously conceived by 20th-century anthropologists and media theorists such as Arnold Gehlen and Marshall McLuhan—but agents in a field of techniques and parts of a network of pathologically distorted relations. The distortion of relations becomes visually manifest in Anadol’s work, where spatial dispositions suggest interoceptive interference.

One criticism of generative and predictive artistic models such as Anadol’s, of course, is that they merely offer a palliative solution or, even worse, a distraction from the truly insidious nature of Big Data and metadata collection. This criticism could be bolstered by instances such as Cambridge Analytica’s recent ubiquitous endeavor of data collection, utilized for predictive marketing and data brokerage. Recall that predictive coding is a leading theory of how brains perform probabilistic inference, and such neuro-inferential models are no longer limited to the biomorphic terrain but are used to facilitate the adaptive modeling of data collection, as in the case of Google’s and Amazon’s context-rich experiences (shopping online, video streaming services). Here, information is collected and compressed into various inputs that process through the bottleneck of “pruning”—rather than being restrained by contingency, we see how such systems proffer plasticity by instrumentalizing contingency.
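The neuro-inferential logic invoked here—prediction-error minimization—can be sketched in a few lines. The loop below is a deliberately toy illustration (the signal, learning rate, and iteration count are arbitrary assumptions), standing in for the far more elaborate hierarchical models used in both neuroscience and adaptive data systems: the model does not store the input outright but repeatedly revises its expectation by the residual “surprise.”

```python
import numpy as np

rng = np.random.default_rng(1)
signal = rng.standard_normal(16)   # the incoming "world" the system anticipates
prediction = np.zeros(16)          # initial (empty) expectation
lr = 0.3                           # how strongly each error revises the model

for _ in range(50):
    error = signal - prediction    # surprise: mismatch between world and model
    prediction += lr * error       # update the expectation to reduce surprise

print(np.abs(signal - prediction).max())  # residual error shrinks toward zero
```

Each pass shrinks the residual by a constant factor, so the “memory” of the signal is reconstructed through iterative correction rather than static storage—the elastic, ongoing process the essay describes.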

If Anadol’s work is privy to such criticism, however, it also provides the general public with a visual metaphor for how such phenomena work. If Big Data’s modulation is not merely based on the storage-recollection system of static inputs but, instead, is predicated on inductive learning, then it readily escapes schematization. This is particularly the case for those not well versed in the discourse on Artificial General Intelligence, Hebbian learning, and other specialized discourses re: theoretical computation. Consequently, Anadol’s work proves to be a pharmakon, even if it is unintentionally so—we now have a conception of how Big Data’s control operates and how the arachnean web of datafication is an ongoing and elastic process. Without schematization, we are only further alienated from such processes, and the aesthetic experience becomes critical for political use. While works such as Machine Hallucination may not be sufficient as ends in themselves, they undoubtedly serve a valuable pedagogical purpose, holding a mirror—or, more accurately, a microscope—with which to analyze the operative, imperceptible processes informing digital automatization. The aesthetic experience grounds control, while the burden is now on us to further prod our knowledge into political action.

–Ekin Erkan
