Many people with visual disabilities rely on audio feedback as their primary modality of interaction. Representing the visual environment with appropriate sounds helps make it intelligible to blind users. This audio-encoded environment still needs to be accessed in a way analogous to how sighted people scan visual content with their gaze: finger-based scanning can serve as a gaze-like strategy that lets visually impaired users sense an audio-represented context. In this work we present a computational interface that combines visual-to-audio codification with multi-touch interaction, so as to increase the legibility of the environment for blind users and to facilitate navigation to desired locations, exploration, and serendipitous discovery. The core of this interface is the codification of color and depth into musical instrument sounds, which provides spatial awareness, audio revealing of boundaries, and obstacle detection. The main contribution of our work is the assistance this interface provides toward an active, finger-driven interaction that makes it possible to explore selectively, discover points of interest, develop personalized navigation strategies, and, in general, enjoy a greater sense of independence.
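To make the color-and-depth-to-sound codification concrete, the following is a minimal sketch of one possible mapping; the instrument palette, the hue-to-instrument binning, and the depth-to-loudness rule are illustrative assumptions, not the specific encoding used by the interface.

```python
# Hypothetical sketch: map a pixel's hue to a musical instrument (color
# codification) and its depth to loudness (closer obstacles sound louder).
# Palette and mapping rules are assumptions for illustration only.

INSTRUMENTS = ["piano", "violin", "flute", "trumpet"]  # assumed timbre palette

def encode_pixel(hue_deg: float, depth_m: float, max_depth_m: float = 5.0):
    """Return (instrument, loudness) for one touched pixel."""
    # Hue (0-360 degrees) is split into equal bins, one per instrument.
    instrument = INSTRUMENTS[int(hue_deg // 90) % len(INSTRUMENTS)]
    # Depth is inverted so nearby surfaces play louder, aiding obstacle
    # detection; anything beyond max_depth_m is silent.
    loudness = max(0.0, 1.0 - min(depth_m, max_depth_m) / max_depth_m)
    return instrument, round(loudness, 2)

# A finger touch at some screen position would look up the hue and depth
# there and render the returned sound.
print(encode_pixel(hue_deg=200.0, depth_m=1.0))  # ('flute', 0.8)
```

In a multi-touch setting, each active finger would query this mapping independently, so several points of the scene can sound at once.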