CiteSeerX
A VR Interface for Collaborative 3D Audio Performance

by Martin Naef, Daniel Collicott

Results 1 - 7 of 7

On the development of a system for gesture control of spatialization

by M. T. Marshall, N. Peters, A. R. Jensenius, J. Boissinot, J. Braasch - In Proceedings of the International Computer Music Conference , 2006
Abstract - Cited by 14 (5 self)
This paper presents our current approach to the development of a system for controlling spatialization in a performance setup for small ensemble. We are developing a Gesture Description Interchange Format (GDIF) to standardize the way gesture-related information is stored and shared in a networked computer setup. Examples are given of our current GDIF namespace, the gesture tracking subsystem developed to use this namespace and patches written to control spatialization and mapping using gesture data.

Citation Context

...spatialization of sound. Recently a number of projects have begun to deal with this and have developed systems which allow control of spatialization using gestures. Systems such as that described in (Naef and Collicott 2006) allow a performer to control the spatialization of a number of sound sources through the use of a virtual reality (VR) display and a custom-built dataglove. Others, such as the ZKM Klangdom (Ramakri...

Gesture Control of Sound Spatialization for Live Musical Performance

by Mark T. Marshall, Joseph Malloch, Marcelo M. W
Abstract - Cited by 8 (2 self)
Abstract. This paper presents the development of methods for gesture control of sound spatialization. It provides a comparison of seven popular software spatialization systems from a control point of view, and examines human-factors issues relevant to gesture control. An effort is made to reconcile these two design- and parameter-spaces, and draw useful conclusions regarding likely successful mapping strategies. Lastly, examples are given using several different gesture-tracking and motion capture systems controlling various parameters of the spatialization system.

Citation Context

...d tracking system along with handheld mice. The idea of using gesture control together with immersive virtual environments and sound spatialization has also been addressed in a number of other works. [5] describes a system for collaborative spatial audio performance in a virtual environment. Multiple users may interact with the system at the same time to manipulate the position of objects in 3D space...

Interaction with the 3d reactive widgets for musical performance

by Florent Berthaut, Myriam Desainte-catherine, Martin Hachet, Université De Bordeaux - In Proceedings of Brazilian Symposium on Computer Music (SBCM09 , 2009
Abstract - Cited by 5 (2 self)
Abstract. While virtual reality and 3D interaction open new prospects for musical performance, existing immersive virtual instruments are often limited to single-process instruments or musical navigation tools. We believe that immersive virtual environments may be used to design expressive and efficient multi-process instruments. In this paper we present the 3D reactive widgets. These graphical elements enable efficient and simultaneous control and visualisation of musical processes. Then we describe Piivert, a novel input device that we have developed to manipulate these widgets, and several techniques for 3D musical interaction.

Citation Context

... 1998]. Finally, among the existing multi-process 3D instruments, some, like the WAVE software from Valbom et al. [Valbom and Marcos, 2005] or the application developed by Martin Naef et al. [Naef and Collicott, 2006], provide limited visual feedback and interaction since they tend to emulate hardware controllers. Other instruments rely on gaming software or devices, like the 3D instrument Fijuu [Oliver and Pickl...

Widgets réactifs 3D pour l’interaction musicale, in "Actes des Journées d’Informatique Musicale - JIM’08", to appear

by Florent Berthaut, Scrime Labri, Université De Bordeaux, Myriam Desainte-catherine, Scrime Labri, Université De Bordeaux, Martin Hachet
Abstract - Cited by 1 (0 self)
Our work concerns the use of immersive 3D interaction for musical performance. Much research has been devoted to graphical interfaces for musical control. In particular, it has highlighted the value of reactive widgets: graphical elements that allow both the control of sound processes and the visualization of information about those processes. Other research has demonstrated the possibilities that virtual reality offers in terms of immersion and interaction. However, none of the 3D musical applications developed so far exploits the advantages of reactive widgets, so we set out to explore this avenue. To this end, we have developed a tool for building 3D interfaces, Poulpe3D. We consider it essential to find the best way to associate the visual parameters of the widgets with perceptual sound parameters, so as to allow effective control and relevant feedback. We draw on several lines of research, which lead us to conclude that these associations cannot be fixed objectively, and to plan a series of tests using a 3D mapping tool integrated into Poulpe3D. This tool lets each user configure the mappings according to their own preferences.

Citation Context

...audio or MIDI controls, with interaction performed through 6-degree-of-freedom sensors held by the user. Another application, developed by Martin Naef and Daniel Collicott [17], allows the volume and spatialization of sound sources to be controlled through the orientation and position of associated virtual objects. Manipulation is carried out using data gloves and ...

SOUND SPATIALIZATION CONTROL BY MEANS OF ACOUSTIC SOURCE LOCALIZATION SYSTEM

by Daniele Salvati, Sergio Canazza
Abstract - Cited by 1 (1 self)
This paper presents a system for controlling the sound spatialization of a live performance by means of the acoustic localization of the performer. Our proposal is to allow a performer to directly control the position of a sound played back through a spatialization system, by moving the sound produced by their own musical instrument. The proposed system is able to locate and track the position of a sounding object (e.g., voice, instrument, sounding mobile device) in a two-dimensional space with accuracy, by means of a microphone array. We consider an approach based on Generalized Cross-Correlation (GCC) with Phase Transform (PHAT) weighting for the Time Difference Of Arrival (TDOA) estimation between the microphones. Besides, a Kalman filter is applied to smooth the time series of observed TDOAs, in order to obtain a more robust and accurate estimate of the position. To test the system control in the real world and to validate its usability, we developed a hardware/software prototype, composed of an array of three microphones and a Max/MSP external object for the sound localization task. We have obtained promising preliminary results with a human voice in a real, moderately reverberant and noisy environment and a binaural spatialization system for headphone listening.
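The TDOA estimation step this abstract describes (generalized cross-correlation with PHAT weighting, then peak-picking the delay) can be sketched in a few lines of NumPy. This is an illustrative reconstruction under the usual textbook formulation, not the authors' Max/MSP external; the function and variable names are ours, and the Kalman smoothing stage is omitted.

```python
import numpy as np

def gcc_phat(sig, ref, fs=1.0):
    """Estimate the delay of `sig` relative to `ref` (in seconds, positive
    when `sig` lags) via Generalized Cross-Correlation with PHAT weighting."""
    n = len(sig) + len(ref)              # zero-pad to avoid circular wrap-around
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-15               # PHAT: keep only phase, discard magnitude
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (np.argmax(np.abs(cc)) - max_shift) / fs

# Synthetic check: a noise burst delayed by 7 samples
rng = np.random.default_rng(0)
ref = rng.standard_normal(1024)
sig = np.concatenate((np.zeros(7), ref))[:1024]
print(gcc_phat(sig, ref))  # → 7.0 (sample delay, since fs=1)
```

Under a far-field assumption, a pairwise TDOA τ between two microphones a distance d apart maps to a bearing via sin θ = c·τ/d (c being the speed of sound); combining the pairwise TDOAs of a three-microphone array, as in the paper's prototype, yields a 2D position estimate.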

Citation Context

... design of different equipment, such as multichannel devices with faders, control software with mouse and joystick for two-dimensional movement, sophisticated software with 3D virtual reality display [10], sensor interfaces such as data-glove-based systems, head trackers and camera-based tracking systems [11]. In [12], the authors propose a system to allow real-time gesture control of spatialization ...

Interacting with the 3D Reactive Widgets for Musical Performance

by unknown authors
Abstract
Abstract. While virtual reality and 3D interaction provide new possibilities for musical applications, the existing immersive virtual instruments are limited to single-process instruments or musical navigation tools. In this paper we present the 3D reactive widgets. These graphical elements enable simultaneous control and visualization of musical processes in 3D immersive environments. They also rely on the live-looping technique, allowing complex musical sequences to be built. We describe the interaction techniques that we have designed and implemented to manipulate these widgets, including a virtual ray and tunnels. After noting the lack of expressiveness and efficiency of existing input devices for sound-production gestures, we finally set out the requirements for an appropriate device for musical interaction in 3D immersive environments.

Citation Context

...98]. Finally, among the existing multi-process 3D instruments, some, like the WAVE software from Valbom et al. [Valbom and Marcos, 2005] or the application developed by Martin Naef et al. [Naef and Collicott, 2006], have limited visual feedback and interaction possibilities since they tend to emulate hardware controllers. The other instruments rely on gaming software or devices, like the 3D instr...

OSC Virtual Controller

by Andrea Brogni, Darwin Caldwell
Abstract
The number of artists who express themselves through music in unconventional ways is constantly growing. This trend depends strongly on the wide diffusion of laptops, which have proved to be powerful and flexible musical devices. However, laptops still lack flexible interfaces specifically designed for music creation in live and studio performances. To resolve this issue many controllers have been developed, taking into account not only the performer's needs and habits during music creation, but also the audience's desire to visually understand how the performer's gestures are linked to the way music is made. In response to the common need for an adaptable visual interface to manipulate music, in this paper we present a custom three-dimensional controller, based on the Open Sound Control protocol and designed entirely to work inside virtual reality: simple geometrical shapes can be created to directly control loop triggering and parameter modification, using free-hand interaction alone.
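As a concrete illustration of the Open Sound Control traffic such a controller would emit, the sketch below encodes a minimal OSC message by hand (NUL-padded address, type-tag string, big-endian arguments). The address `/vc/loop/trigger` is a hypothetical example, not the paper's actual namespace, and in practice one would normally use an OSC library rather than packing bytes manually.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are NUL-terminated, then padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a basic OSC message supporting int32 ('i'), float32 ('f')
    and string ('s') arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # IEEE 754 single, big-endian
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit signed, big-endian
        else:
            tags += "s"
            payload += _pad(str(a).encode())
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# Hypothetical control message: trigger loop 3 at half amplitude.
msg = osc_message("/vc/loop/trigger", 3, 0.5)
# The resulting packet would typically be sent over UDP to the synthesis engine.
```

Every component of the packet is padded to a multiple of four bytes, which is what lets an OSC receiver parse address, type tags, and arguments without any length prefix on the strings.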

Citation Context

...a gloves. The biggest difference from real playing was the lack of tactile feedback. More unconventional ways of interaction were presented by Rodet et al. [12], Campbell et al. [2] and Naef et al. [9], where, according to the context, it was possible to define various metaphors, which led to different implementations, more or less close to reality. In particular, in Naef et al. [9] the authors int...


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University