We possess strong associations between the materiality of objects and the sounds commonly made with them. This implies that an object’s function is intimately bound up with the sound it makes when we handle it. Within the Embodied Generative Music (EGM) project, which took place from 2007 to 2010, Gerhard Eckel, Deniz Peters and David Pirrò created a virtual instrument that deconstructs and reconstructs the connection between bodily movements and generated sounds that exists when a traditional instrument is played. They invited dancers to explore this instrument (in fact an entire virtual sonic environment) and its movement-sound relationships in the project’s aesthetic lab.1 The expectation was that dancers would have a greater sensitivity to tactile-kinesthetic-sonic stimulation in a digital environment than average movers. My own research contributes to developing understandings of human movement and kinesthesia in interaction design, which in turn can inform the design of movement-based interaction.2