Results 1 - 7 of 7
On the development of a system for gesture control of spatialization
- In Proceedings of the International Computer Music Conference, 2006
Cited by 14 (5 self)
This paper presents our current approach to the development of a system for controlling spatialization in a performance setup for small ensemble. We are developing a Gesture Description Interchange Format (GDIF) to standardize the way gesture-related information is stored and shared in a networked computer setup. Examples are given of our current GDIF namespace, the gesture tracking subsystem developed to use this namespace and patches written to control spatialization and mapping using gesture data.
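GDIF messages of the kind described above are carried as OSC (Open Sound Control) packets over the network. As a rough sketch of what such a message looks like on the wire (the address `/gdif/raw/hand/position` is a hypothetical example, not the paper's actual namespace), a minimal OSC message can be packed with the Python standard library alone:

```python
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message with float32 arguments.
    OSC strings are NUL-terminated and padded to 4-byte boundaries."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    type_tag = "," + "f" * len(args)          # e.g. ",fff" for three floats
    packet = pad(address.encode()) + pad(type_tag.encode())
    for a in args:
        packet += struct.pack(">f", a)        # big-endian float32
    return packet

# Hypothetical GDIF-style address carrying a tracked hand position.
msg = osc_message("/gdif/raw/hand/position", 0.5, 1.2, -0.3)
```

An OSC message is thus just the padded address string, a padded type-tag string, and the big-endian 32-bit floats themselves, which is what makes the format easy to share across the networked patches the paper describes.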
Gesture Control of Sound Spatialization for Live Musical Performance
Cited by 8 (2 self)
This paper presents the development of methods for gesture control of sound spatialization. It provides a comparison of seven popular software spatialization systems from a control point of view, and examines human-factors issues relevant to gesture control. An effort is made to reconcile these two design- and parameter-spaces, and draw useful conclusions regarding likely successful mapping strategies. Lastly, examples are given using several different gesture-tracking and motion capture systems controlling various parameters of the spatialization system.
Interaction with the 3D Reactive Widgets for Musical Performance
- In Proceedings of the Brazilian Symposium on Computer Music (SBCM09), 2009
Cited by 5 (2 self)
While virtual reality and 3D interaction open new prospects for musical performance, existing immersive virtual instruments are often limited to single-process instruments or musical navigation tools. We believe that immersive virtual environments may be used to design expressive and efficient multi-process instruments. In this paper we present the 3D reactive widgets, graphical elements that enable efficient and simultaneous control and visualisation of musical processes. We then describe Piivert, a novel input device that we have developed to manipulate these widgets, along with several techniques for 3D musical interaction.
Widgets réactifs 3D pour l’interaction musicale, in "Actes des Journées d’Informatique Musicale - JIM’08", to appear
Cited by 1 (0 self)
Our work concerns the use of immersive 3D interaction for musical performance. Much research has been devoted to the use of graphical interfaces for musical control. In particular, it has described the value of reactive widgets, graphical elements that allow both the control of sound processes and the visualization of information about those processes. Other research has shown the possibilities that virtual reality offers in terms of immersion and interaction. However, none of the 3D musical applications developed so far exploits the advantages of reactive widgets. We therefore seek to explore this direction. To that end, we have developed a tool for creating 3D interfaces, Poulpe3D. It seems essential to us to find the best way of associating the visual parameters of the widgets with perceptual sound parameters, in order to allow effective control and relevant feedback. To do so, we draw on several lines of research, which lead us to conclude that it is impossible to fix these associations objectively, and to plan a series of tests using a 3D mapping tool integrated into Poulpe3D. This tool lets each user configure the links according to their preferences.
Sound Spatialization Control by Means of Acoustic Source Localization System
Cited by 1 (1 self)
This paper presents a system for controlling the sound spatialization of a live performance by means of the acoustic localization of the performer. Our proposal is to allow a performer to directly control the position of a sound played back through a spatialization system by moving the source of the sound produced by their own musical instrument. The proposed system can accurately locate and track the position of a sounding object (e.g., voice, instrument, sounding mobile device) in a two-dimensional space by means of a microphone array. We consider an approach based on Generalized Cross-Correlation (GCC) with Phase Transform (PHAT) weighting for estimating the Time Difference Of Arrival (TDOA) between the microphones. In addition, a Kalman filter is applied to smooth the time series of observed TDOAs, in order to obtain a more robust and accurate position estimate. To test the system in real-world conditions and to validate its usability, we developed a hardware/software prototype composed of an array of three microphones and a Max/MSP external object for the sound localization task. We have obtained promising preliminary results with a human voice in a real, moderately reverberant and noisy environment, using a binaural spatialization system for headphone listening.
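The GCC-PHAT step mentioned in this abstract can be sketched compactly: cross-correlate two microphone signals in the frequency domain, whiten the spectrum so that only phase information remains, and read the TDOA off the correlation peak. This is a generic illustration of the technique, not the authors' Max/MSP implementation:

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None):
    """Estimate the TDOA between two microphone signals using
    Generalized Cross-Correlation with PHAT weighting."""
    n = len(sig) + len(ref)
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12            # PHAT: keep phase, discard magnitude
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    if max_tau is not None:
        max_shift = min(int(fs * max_tau), max_shift)
    # Re-center the circular correlation around lag zero.
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs                 # TDOA in seconds

# Example: a 5-sample delay at 16 kHz between two copies of a noise burst.
fs = 16000
rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
delayed = np.concatenate((np.zeros(5), x[:-5]))
tau = gcc_phat(delayed, x, fs)        # recovers 5 / 16000 s
```

A Kalman smoothing stage, as in the paper, would then filter the sequence of `tau` estimates from the microphone pairs before they are mapped to a 2D position.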
Interacting with the 3D Reactive Widgets for Musical Performance
While virtual reality and 3D interaction provide new possibilities for musical applications, existing immersive virtual instruments are limited to single-process instruments or musical navigation tools. In this paper we present the 3D reactive widgets, graphical elements that enable simultaneous control and visualization of musical processes in 3D immersive environments. They also rely on the live-looping technique, allowing performers to build complex musical sequences. We describe the interaction techniques that we have designed and implemented to manipulate these widgets, including a virtual ray and tunnels. Having noted the lack of expressivity and efficiency of existing input devices for sound-production gestures, we finally set out the requirements for an appropriate device for musical interaction in 3D immersive environments.
OSC Virtual Controller
The number of artists who express themselves through music in unconventional ways is constantly growing. This trend strongly depends on the wide diffusion of laptops, which have proved to be powerful and flexible musical devices. However, laptops still lack flexible interfaces specifically designed for music creation in live and studio performances. To address this issue, many controllers have been developed, taking into account not only the performer's needs and habits during music creation, but also the audience's desire to understand visually how the performer's gestures are linked to the way the music is made. In response to the common need for an adaptable visual interface for manipulating music, in this paper we present a custom three-dimensional controller, based on the Open Sound Control protocol and designed entirely to work inside virtual reality: simple geometrical shapes can be created to directly control loop triggering and parameter modification, using only free-hand interaction.
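The core interaction this abstract describes — a free-hand gesture triggering a loop through a virtual shape — reduces to an edge-triggered hit test between the tracked hand position and simple geometry. The class and field names below are hypothetical illustrations, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class ShapeController:
    """A virtual shape (axis-aligned box) that toggles a loop when
    the tracked hand enters it. Illustrative sketch only."""
    center: tuple
    size: tuple
    looping: bool = False
    _inside: bool = False

    def update(self, hand):
        """Call once per tracking frame with the hand position (x, y, z)."""
        inside = all(abs(h - c) <= s / 2
                     for h, c, s in zip(hand, self.center, self.size))
        if inside and not self._inside:   # rising edge: hand just entered
            self.looping = not self.looping
        self._inside = inside
        return self.looping

box = ShapeController(center=(0.0, 1.0, -0.5), size=(0.2, 0.2, 0.2))
box.update((0.0, 1.0, -0.5))   # returns True: hand entered, loop toggled on
```

In the actual controller, the toggle would presumably be sent to the audio engine as an OSC message rather than kept as a local flag.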