Posted: Oct 28, 2025
Traditional data visualization has operated predominantly within the visual domain, despite evidence that human cognition benefits from multisensory integration. This paper introduces Synesthetic Computing, an approach that departs radically from convention by systematically bridging the auditory and visual modalities to enhance data comprehension. Our work is inspired by neurological synesthesia, in which stimulation of one sensory pathway leads to automatic experiences in another, but it is implemented through computational means rather than biological mechanisms. The limitations of unimodal data representation have become increasingly apparent as datasets grow in complexity and dimensionality. Sophisticated visualization techniques exist, yet they often struggle to convey the subtle temporal dynamics, harmonic relationships, and multidimensional patterns that auditory representations may capture more effectively. Conversely, purely auditory representations lack the spatial precision and simultaneous-overview capability of visual methods. Our research addresses three fundamental questions: (1) Can we establish a mathematically sound mapping between auditory and visual information spaces? (2) Does cross-modal data representation enhance pattern recognition and anomaly detection? (3) What novel insights emerge when data are analyzed through multiple sensory channels simultaneously? This work contributes a formal framework for sensory transduction in computing, implements a functional bidirectional system, and provides empirical evidence for the cognitive benefits of multisensory data interaction.
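The abstract leaves the concrete transduction unspecified, so the following is only a minimal sketch of what research question (1) might look like in code: an invertible (and therefore bidirectional) mapping from a normalized datum to one auditory parameter (pitch) and one visual parameter (hue). The function names, the five-octave range, and the log-frequency/linear-hue choices are illustrative assumptions, not the paper's method.

```python
import math

# Hypothetical sketch only: the paper's actual transduction framework is not
# given in the abstract. We assume a normalized datum x in [0, 1] and map it
# to two sensory parameters through bijective functions, so either modality
# can be decoded back to the datum (the "bidirectional" property).

F_MIN, F_MAX = 110.0, 3520.0  # assumed pitch range: A2 to A7 (five octaves)

def to_sensory(x: float) -> tuple[float, float]:
    """Map a datum x in [0, 1] to (frequency_hz, hue_degrees)."""
    if not 0.0 <= x <= 1.0:
        raise ValueError("x must be normalized to [0, 1]")
    freq = F_MIN * (F_MAX / F_MIN) ** x  # log scale: equal steps in x give equal pitch intervals
    hue = 360.0 * x                      # linear sweep around the HSV color wheel
    return freq, hue

def from_frequency(freq: float) -> float:
    """Invert the pitch mapping: recover the datum from a frequency."""
    return math.log(freq / F_MIN) / math.log(F_MAX / F_MIN)

if __name__ == "__main__":
    f, h = to_sensory(0.5)
    print(f"x=0.5 -> {f:.1f} Hz, hue {h:.0f} degrees")   # 622.3 Hz, hue 180
    print(f"round trip via audio: x={from_frequency(f):.3f}")
```

Because each leg of the mapping is bijective on [0, 1], either the auditory or the visual channel alone suffices to recover the original datum, which is the minimal property any bidirectional cross-modal representation must satisfy.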