A live-tracking dance installation
In this two-week Processing workshop we built an interactive music visualization for live stage performances: a music-reactive point cloud driven by two Kinects facing each other, tracking any object or person standing in the area between them. The cloud assembles a 360° model of the dancer between the two Kinects while reacting to live data from Ableton. Trying to use both Kinects' signals at once and process the data in Processing uncovered more than one major problem, which is why we ended up combining seven libraries to get the point cloud to react as we wanted.
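The core geometric step behind a 360° model from two facing sensors is mapping the second Kinect's points into the first Kinect's coordinate frame. As a minimal sketch (not the installation's actual code), assuming the sensors face each other along the z-axis at a hypothetical spacing `SENSOR_DISTANCE`, the second cloud is rotated 180° about the y-axis and translated before both clouds are drawn together:

```java
// Minimal sketch of merging two facing Kinects' depth points into one cloud.
// Assumption (not from the original text): the sensors face each other along
// the z-axis, separated by SENSOR_DISTANCE millimetres; a real rig would need
// a calibrated transform instead of this idealized one.
public class CloudMerge {
    static final float SENSOR_DISTANCE = 3000f; // mm, hypothetical rig spacing

    // Map a point from the second Kinect's frame into the first Kinect's frame:
    // a 180-degree rotation about y sends (x, z) to (-x, -z), then the z-axis
    // offset between the two sensors is added back.
    static float[] toFirstFrame(float x, float y, float z) {
        return new float[] { -x, y, SENSOR_DISTANCE - z };
    }

    public static void main(String[] args) {
        // A point 1 m in front of the second Kinect lands 2 m in front of the first.
        float[] p = toFirstFrame(100f, 50f, 1000f);
        System.out.println(p[0] + " " + p[1] + " " + p[2]); // -100.0 50.0 2000.0
    }
}
```

In a Processing sketch, the same transform would be applied to every depth point from the second sensor each frame before rendering the combined cloud.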