A-me: Augmented memories

Hypothetical Memory Recall Device

    • Aura Lab
    • Augmented Reality, auralization, Immersive Experience

A-me is a hypothetical memory recall device that enables the user to immersively experience human memories. It aims to criticize paradigms of brain/mind imaging that are currently under discussion in the scientific community. A-me investigates the implications of approaches to brain science such as reductionism and phenomenology, and thus symbolizes the current scientific debate on explaining human behavior through the analysis of brain activity. A-me is also technical research on Augmented Reality techniques applied to brain visualization and auralization.

This contribution presents the concept of an optical see-through Augmented Reality system that overlays a volume-rendered MRI scan onto a medical head phantom. In addition, the installation features a binaural rendering system to auralize memories as enveloping 3D soundscapes. The user navigates the brain with a tracked probe, in a way similar to how neurosurgeons examine brain injuries during preoperative planning. While navigating the brain, the user will find active areas in specific parts of the nervous structure. Pointing at one of them with the probe triggers a stored emotional experience in the form of a visual and auditory representation of its neural activity.
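As a minimal sketch of how such probe-to-hotspot triggering might work, assume the tracked probe tip and the hotspots share the MRI volume's coordinate frame; the names Hotspot, probe_update, and trigger_radius are illustrative and not taken from the installation's actual code:

```python
from dataclasses import dataclass

@dataclass
class Hotspot:
    """An active brain region linked to a stored memory (hypothetical structure)."""
    position: tuple   # (x, y, z) in the MRI volume's coordinate frame, in mm
    memory_id: str    # identifier of the stored audiovisual recording

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

def probe_update(probe_pos, hotspots, trigger_radius=5.0):
    """Return the memory to trigger when the tracked probe tip comes within
    trigger_radius (mm) of a hotspot, or None if no hotspot is close enough."""
    for spot in hotspots:
        if distance(probe_pos, spot.position) <= trigger_radius:
            return spot.memory_id
    return None

# Example: a single hotspot somewhere in the volume
hotspots = [Hotspot(position=(92.0, 110.0, 64.0), memory_id="memory-01")]
print(probe_update((91.0, 111.0, 64.5), hotspots))  # -> "memory-01"
```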

This project started from the need to discuss and criticize the above-mentioned concepts and theories related to brain/mind definitions. To this end we propose the development of an interactive art installation exemplifying a hypothetical memory-reading apparatus. The artwork aims to expose the current state of the art in brain mapping while offering the possibility of activating an experience. Long-term memories are known to be associated with the hippocampus, an area in the medial temporal lobe of the brain. The device uses real tomographic data from a scanned cadaver, which can be navigated by manipulating a tracked probe. The visitor's interaction determines the selection of a memory, which triggers an audiovisual response.

The installation works as follows. In the exhibition space, an area is equipped to reveal the experience: a stereo 3D screen, six tracking cameras, a half-silvered glass, and a head manikin stand on a table (see Fig. 1). The visitor is equipped with high-end wireless headphones, tracked shutter glasses, and a tracked probe. Looking through the glass, the visitor sees an MRI volume visualization registered to the dummy head, and can navigate different areas of the brain by manipulating the probe. Visually merged with the real data are several active hotspots indicating the locations of memories. Pointing precisely at one of them triggers an immersive audiovisual response in the form of brain activity rendered on the actual tomographic data. When the visitor moves away from such a hotspot, the device blends in more and more soundscapes of neighboring memories, resulting, beyond a certain distance, in complete auditory chaos.
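One plausible reading of this distance-based blending is sketched below, reusing the hypothetical Hotspot structure from the earlier sketch and assuming each memory's gain falls off smoothly with probe distance before the gains are normalized for mixing; the falloff constant and function names are assumptions, not the installation's documented behavior:

```python
import math

def soundscape_gains(probe_pos, hotspots, falloff=30.0):
    """Per-memory playback gain as a function of probe distance.
    Near a hotspot, one memory dominates the mix; far from all hotspots,
    the gains become comparable and the mix degrades into the
    'auditory chaos' described above. (Illustrative sketch only.)"""
    raw = {spot.memory_id: math.exp(-math.dist(probe_pos, spot.position) / falloff)
           for spot in hotspots}
    total = sum(raw.values()) or 1.0
    return {memory_id: gain / total for memory_id, gain in raw.items()}
```

With an exponential falloff like this, a probe sitting on a hotspot yields one gain near 1.0 and the rest near 0.0, while a probe far from every hotspot yields many similar gains, so all neighboring soundscapes play at once.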

A project by Jordi Puig
In collaboration with:
    • Dirk Schröder: Immersive Sound Design and Auralization Concept
    • Tore Landsem: Industrial Design
    • Rune Svensrud: Image Processing
    • Carles Gutierrez: Coding Support
Developed at SenseIT, funded by Q2S and Picturing The Brain.