beatTracts is an interactive soundscape consisting of several elliptical zones, each containing its own track of music. A track plays only when a body enters its zone. Each track has been composed to complement the others, enabling a harmonious layering of music.
Inspired by a personal struggle with vision loss, beatTracts is experienced in complete darkness; participants rely on spatial audio cues to perceive their surroundings and interactions.
There are three essential components to beatTracts: a focus on audioception (hearing) over ophthalmoception (sight), occurrence in real space and real time, and interactivity among the persons present.
To achieve the soundscape, four tracts (i.e., large areas) are delineated, invisibly to participants, each holding its own instrument identity and positioned near its correlated speaker. The identities are piano, percussion, trap, and synthesizer. Within each tract, two to three elliptical zones of varying sizes are arranged, and each zone contains a single track of music. The music tracks are continuously aligned in tempo so that they layer harmoniously should multiple zones be activated simultaneously.
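The tract-and-zone layout above can be sketched as a simple data structure together with the standard point-in-ellipse test used to decide whether a participant's position activates a zone. All names, coordinates, and dimensions here are illustrative assumptions, not figures from the installation.

```cpp
#include <cmath>
#include <string>
#include <vector>

// Hypothetical sketch of a tract holding two to three elliptical zones.
struct Zone {
    float cx, cy;   // ellipse centre in floor coordinates (assumed units)
    float rx, ry;   // semi-axes of the ellipse
};

struct Tract {
    std::string instrument;   // e.g. "piano", "percussion", "trap", "synthesizer"
    std::vector<Zone> zones;  // two to three elliptical zones per tract
};

// A point (x, y) lies inside an axis-aligned ellipse when the normalized
// squared distance from the centre is at most 1.
bool insideZone(const Zone& z, float x, float y) {
    float nx = (x - z.cx) / z.rx;
    float ny = (y - z.cy) / z.ry;
    return nx * nx + ny * ny <= 1.0f;
}
```

A centroid that passes this test for a given zone would trigger that zone's track.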
Though the space is expansive and free of physical barriers, each movement and change in population affects the dynamics and locations of the sounds, including the instrument tracks, volume, and tempo.
Six parts work together within the beatTracts system: camera, openFrameworks, LiveOSC, Ableton Live, audio interface, and speakers. The inputs are the number of persons present and their x,y locations; the outputs are the volume, tempo, and location of the different sounds (see Figure 3).
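One of the input-to-output mappings, person count to tempo, can be sketched as a clamped linear function. The base tempo, step size, and cap are assumptions for illustration; the source only states that more people means a faster tempo.

```cpp
#include <algorithm>

// Illustrative mapping from one system input (number of detected persons)
// to one output (tempo in BPM). All BPM values are assumed, not taken
// from the installation.
float tempoForCount(int numPersons) {
    const float baseBpm = 90.0f;   // tempo with a single participant (assumed)
    const float stepBpm = 10.0f;   // increase per additional participant (assumed)
    const float maxBpm  = 160.0f;  // upper cap to keep tracks playable (assumed)
    float bpm = baseBpm + stepBpm * std::max(0, numPersons - 1);
    return std::min(bpm, maxBpm);
}
```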
openFrameworks uses OpenCV to parse the visual information detected by the camera. The camera captures an initial picture of the space to use as a constant reference, and with each new frame, the pixel shifts against that reference are interpreted as blobs (i.e., people). Once blobs are detected, several conditional checks are run against the center x,y location (centroid) of each blob. If a centroid falls within a specific zone, the volume is calculated along a curve function, and a message carrying the volume for the activated zone (or track) is sent via LiveOSC to Ableton Live (an audio program). Depending on the number of blobs detected (where one blob equals one person), a message with a specific tempo is also sent via LiveOSC to Ableton Live: the greater the number of blobs, the greater the tempo.
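The "curve function" for zone volume is not specified, but one plausible shape is a cosine ease: full volume when the centroid sits at the zone's centre, falling off smoothly to silence at the ellipse boundary. The choice of curve here is an assumption for illustration.

```cpp
#include <cmath>

// Volume as a function of the centroid's normalized radial distance d from
// the zone centre (0 at the centre, 1 at the ellipse boundary). The cosine
// ease is an assumed curve; beatTracts' actual function is not documented.
float zoneVolume(float d) {
    const float kPi = 3.14159265358979f;
    if (d >= 1.0f) return 0.0f;                 // outside the zone: silent
    return 0.5f * (1.0f + std::cos(d * kPi));   // 1 at centre, 0 at edge
}
```

The resulting value (0.0 to 1.0) would then be scaled to whatever range the LiveOSC volume message expects before being sent to Ableton Live.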