There is a plethora of theories and data that remain two-dimensional, stacked on top of one another and out of reach of the general public. Sensorial (In)verse is an attempt to bridge this distance and produce an immersive experience of novel astronomical observations that is accessible to all. The installation allows users to explore the movement of stars in the Orion constellation over time: through gestural control, it produces an interactive audio-visual representation of the data.
Astrometry, the science of charting the sky, is one of the oldest branches of astronomy and a discipline in which Europe excels. The European Space Agency (ESA) pioneered space astrometry with the Hipparcos Mission (launched in 1989) and, more recently, with the Gaia Mission (launched in 2013). Gaia is an ESA project to produce a three-dimensional map of our Milky Way galaxy by charting the motions of a thousand million stars, along with their luminosity, composition and temperature. The mission aims to provide deeper knowledge of the origin of the universe as well as the evolutionary history of our galaxy. The data derived by Gaia provide novel positional and radial-velocity measurements and enable us to fathom our position in this vast cosmos.
To make astronomical data accessible to a broader audience, Sensorial (In)verse translates the data derived from ESA's Hipparcos Mission (standing in for the Gaia data in the initial prototype) into an interactive installation and represents it within a multimedia framework. This astrometric representation enables a better understanding of how the life paths of star clusters are mapped, and a greater awareness of Earth's position within the galaxy.
With this project, we have devised a methodology for producing representations of data collected by ESA in the form of a multimedia output. To produce a multi-sensorial work, the project employs three elements: visualisation, sonification, and interactivity.
The universe is not static; there is constant motion and change. Visualisation therefore enables us to view the dataset in a more comprehensive form. Within the domain of visualisation, the majority of existing visual representations of the known universe are rendered from a northern-hemisphere perspective. With this in mind, we developed a representation of the dataset from a non-Earth-based perspective: the viewpoint is personified as a space shuttle navigating the constellation. As the user moves closer to an object in the visualisation, the HIPPARCOS identifiers (the names of the observed stars) appear, as sketched below.
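A minimal sketch of how such proximity-triggered labelling can work, assuming catalogue records of right ascension, declination and parallax. The star values, field layout, distance threshold and camera position below are illustrative assumptions, not the installation's actual implementation:

```python
import numpy as np

# Hypothetical, approximate Hipparcos-style records for Orion:
# (HIP identifier, right ascension [deg], declination [deg], parallax [mas])
STARS = [
    ("HIP 27989", 88.79, 7.41, 7.6),    # Betelgeuse (values approximate)
    ("HIP 24436", 78.63, -8.20, 4.2),   # Rigel (values approximate)
    ("HIP 26727", 85.19, -1.94, 4.0),   # Alnitak (values approximate)
]

def to_cartesian(ra_deg, dec_deg, parallax_mas):
    """Convert equatorial coordinates plus parallax to Cartesian parsecs."""
    dist_pc = 1000.0 / parallax_mas           # parallax in mas -> distance in pc
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return dist_pc * np.array([
        np.cos(dec) * np.cos(ra),
        np.cos(dec) * np.sin(ra),
        np.sin(dec),
    ])

def visible_labels(camera_pos, threshold_pc=60.0):
    """Return the HIP identifiers of stars within `threshold_pc` of the viewpoint."""
    labels = []
    for name, ra, dec, plx in STARS:
        if np.linalg.norm(to_cartesian(ra, dec, plx) - camera_pos) < threshold_pc:
            labels.append(name)
    return labels

# Example: a viewpoint drifting toward Betelgeuse shows only that identifier
print(visible_labels(camera_pos=np.array([5.0, 140.0, 18.0])))
```

In the installation itself, the camera position would be driven by the gestural (Kinect) navigation described below rather than a hard-coded point.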
Sonification makes the astronomical data accessible to visually impaired individuals, allowing them to explore space through sound and, as a whole, creating an immersive astronomical soundscape. Interactivity is added through a Kinect sensor, with which the user can interact with the installation and navigate the travelling paths of the stars over time.
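A minimal sonification sketch, assuming a simple mapping in which parallax (i.e. proximity) drives pitch and apparent magnitude drives loudness. This mapping and all parameter ranges are assumptions for illustration, not the installation's actual sound design:

```python
def star_to_tone(parallax_mas, magnitude,
                 f_min=110.0, f_max=880.0, plx_range=(2.0, 15.0)):
    """Map a star's parallax to pitch and its apparent magnitude to loudness.

    Nearer stars (larger parallax) are assigned higher frequencies; brighter
    stars (smaller magnitude) are assigned larger amplitudes.
    """
    lo, hi = plx_range
    t = min(max((parallax_mas - lo) / (hi - lo), 0.0), 1.0)
    frequency = f_min * (f_max / f_min) ** t                  # log-spaced pitch
    amplitude = min(max((6.0 - magnitude) / 6.0, 0.1), 1.0)   # mag 0 -> loud
    return frequency, amplitude

# Example: a bright, moderately near star versus a faint, distant one
print(star_to_tone(parallax_mas=7.6, magnitude=0.45))
print(star_to_tone(parallax_mas=2.5, magnitude=5.8))
```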