A VR Interface for Collaborative 3D Audio Performance
Citations: 7 (0 self)
Citations
109 | Bodyspace: Anthropometry, Ergonomics and the Design of Work. London: Taylor and Francis
- Pheasant
- 1998
Citation Context ...ding position was chosen to enhance performance aesthetics; in this position, fine motor control is best achieved with a hand position 50-100mm above elbow height and within the ‘normal working area’ [15] which equates to approximately one forearm’s span from the body. Hence the working volume chosen is a cube of approximately 0.4m centered directly in front of abdomen, and the audio scene is scaled a...
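As an illustration of the working-volume mapping described in this context, a minimal Python sketch (hypothetical names, not code from the paper) that scales a tracked hand position inside a 0.4 m cube into normalized audio-scene coordinates:

    CUBE_SIZE = 0.4  # edge length of the working volume in metres, as stated in the context above

    def hand_to_scene(hand_pos, cube_centre):
        """Scale a hand position (x, y, z in metres) into audio-scene coordinates in [-1, 1]."""
        half = CUBE_SIZE / 2.0
        scene = []
        for h, c in zip(hand_pos, cube_centre):
            # offset from the cube centre, clamped to the working volume
            offset = max(-half, min(half, h - c))
            scene.append(offset / half)  # -1 .. 1 in scene units
        return tuple(scene)

    # Example: hand 5 cm to the right of and 8 cm above the cube centre
    print(hand_to_scene((0.05, 0.08, 0.0), (0.0, 0.0, 0.0)))  # roughly (0.25, 0.4, 0.0)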
75 | CavePainting: A fully immersive 3D artistic medium and interactive experience.
- Keefe, Feliz, et al.
- 2001
Citation Context ...pping research also tending to focus on instrumental richness, rather than the intelligibility of the mapping. Historically, the use of VR as an expressive medium has concentrated on visual art, e.g. [8]. To date, there has been little experimental research on the use of VR interfaces for sound control [1],[10]. The DIVA system [6] was among the first to use VR for musical performance. A VR evaluatio...
63 | Ambisonics in multichannel broadcasting and video
- Gerzon
- 1985
Citation Context ...iversity of Technology, introducing concepts such as a virtual air guitar [9]. Spatial perception and rendering of sound is well understood, ranging from amplitude panning approaches [16], Ambisonics [4] to large speaker arrays for rendering wave fields or head-related transfer function methods predominantly used with headphones. Virtual reality toolkits usually include some form of spatial sound ren...
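As a small worked example of the amplitude panning family mentioned in this context, a constant-power pan between one speaker pair; this is the generic textbook formulation, not the specific method of [16]:

    import math

    def constant_power_pan(pan):
        """pan in [-1, 1]: -1 = fully left, +1 = fully right; returns (gain_left, gain_right)."""
        angle = (pan + 1.0) * math.pi / 4.0      # 0 .. pi/2
        return math.cos(angle), math.sin(angle)

    gl, gr = constant_power_pan(0.0)
    assert abs(gl**2 + gr**2 - 1.0) < 1e-9       # total power stays constant across the pan range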
37 | Electronic music interfaces: new ways to play
- Paradiso
- 1997
Citation Context ...reality technology. It builds upon technology and interaction paradigms developed independently in the two fields. There is a wealth of research literature concerning novel musical interface devices; [14] includes a comprehensive overview. Many artists have successfully employed VR-gloves and non-contact sensing in live performance; examples of early innovation being the work carried out at STEIM, and...
36 | Real-time spatial processing of sounds for music, multimedia and interactive human-computer interfaces
- Jot
- 1997
25 | Spatialized audio rendering for immersive virtual environments.
- Naef, Staadt, et al.
- 2002
Citation Context ...ndering system spatializes the audio source objects using a volume panning approach, deriving the data from the scene graph. All audio sources are rendered using the blue-c API sound rendering system [13] that supports spatialization of a large number of sound sources with arbitrary speaker configurations. The audio system supports audio file playback either from memory (e.g. for short loops), streami...
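The context describes volume panning of source objects driven by scene-graph positions over arbitrary speaker layouts. A rough, generic sketch of such a scheme is given below (hypothetical names; the actual blue-c API [13] is not reproduced here):

    import math

    def speaker_gains(source_pos, speaker_positions):
        """Return power-normalized gains, one per speaker, for a source at source_pos."""
        weights = []
        for sp in speaker_positions:
            d = math.dist(source_pos, sp)
            weights.append(1.0 / (d + 1e-6))          # closer speakers receive more energy
        norm = math.sqrt(sum(w * w for w in weights)) # keep overall power roughly constant
        return [w / norm for w in weights]

    # Example: quad layout, source near the front-left speaker
    quad = [(-1, 1, 0), (1, 1, 0), (-1, -1, 0), (1, -1, 0)]
    print(speaker_gains((-0.8, 0.8, 0.0), quad))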
23 | The importance of parameter mapping
- Wanderley, Paradis
- 2002
Citation Context ...lly employed VR-gloves and non-contact sensing in live performance; examples of early innovation being the work carried out at STEIM, and by Jaron Lanier and Tod Machover. In the areas of mapping [3],[5] and visualization of sound, experimental research is relatively sparse, with mapping research also tending to focus on instrumental richness, rather than the intelligibility of the mapping. Historica...
22 | The blue-c distributed scene graph.
- Naef, Lamboray, et al.
- 2003
Citation Context ... all sound sources. This essentially provides multiple instances of the user interface to a single audio rendering system. The synchronization is based upon the blue-c Distributed Scene Graph (bcDSG) [11] that synchronizes the scene graph data structure across multiple machines and manages concurrency issues including locking to make sure no two users can modify the same object concurrently. Although...
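The per-object locking rule described for the bcDSG could look roughly like the following single-process Python sketch (hypothetical classes, not the actual bcDSG interface):

    import threading

    class SharedSceneObject:
        """A scene object that at most one user may modify at a time."""
        def __init__(self, name):
            self.name = name
            self.position = (0.0, 0.0, 0.0)
            self._lock = threading.Lock()

        def try_move(self, user, new_position):
            # Non-blocking acquire: if another user already holds the object, give up.
            if not self._lock.acquire(blocking=False):
                return False
            try:
                self.position = new_position   # local change; a real system would also
                return True                    # replicate the update to the other machines
            finally:
                self._lock.release()

    obj = SharedSceneObject("source-1")
    print(obj.try_move("alice", (0.1, 0.2, 0.0)))  # True: the lock was free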
20 | Design of virtual three-dimensional instruments for sound control
- Mulder
- 1998
Citation Context ...g. Historically, the use of VR as an expressive medium has concentrated on visual art, e.g. [8]. To date, there has been little experimental research on the use of VR interfaces for sound control [1],[10]. The DIVA system [6] was among the first to use VR for musical performance. A VR evaluation framework has been built at the Helsinki University of Technology, introducing concepts such as a virtual a...
15 | Performance factors in control of high-dimensional spaces
- Garnett, Goudeseune
- 1999
Citation Context ...ssfully employed VR-gloves and non-contact sensing in live performance; examples of early innovation being the work carried out at STEIM, and by Jaron Lanier and Tod Machover. In the areas of mapping [3],[5] and visualization of sound, experimental research is relatively sparse, with mapping research also tending to focus on instrumental richness, rather than the intelligibility of the mapping. Histo...
14 | DIVA virtual audio reality system
- Huopaniemi, Savioja, et al.
Citation Context ...se of VR as an expressive medium has concentrated on visual art, e.g. [8]. To date, there has been little experimental research on the use of VR interfaces for sound control [1],[10]. The DIVA system [6] was among the first to use VR for musical performance. A VR evaluation framework has been built at the Helsinki University of Technology, introducing concepts such as a virtual air guitar [9]. Spatia...
12 | Uniform spreading of amplitude panned virtual sources
- Pulkki
- 1999
Citation Context ...t the Helsinki University of Technology, introducing concepts such as a virtual air guitar [9]. Spatial perception and rendering of sound is well understood, ranging from amplitude panning approaches [16], Ambisonics [4] to large speaker arrays for rendering wave fields or head-related transfer function methods predominantly used with headphones. Virtual reality toolkits usually include some form of s...
11 | A manifold interface for kinesthetic notation in high-dimensional systems
- Choi
- 2000
6 | Blue-C API: A Multimedia and 3D Video Enhanced Toolkit for Collaborative VR and Telepresence
- Naef, Staadt, et al.
- 2004
Citation Context ...e user interface to modify the scene; and the audio rendering system (see Fig. 1). The audio rendering system as well as the “glue”-code required to combine the elements is provided by the blue-c API [12], a virtual reality toolkit originally designed for collaborative and tele-presence applications. [Fig. 1: System overview.] ...
2 | The role of emerging visualisation technologies in delivering competitive market advantage
- Anderson, Kenny, et al.
- 2002
Citation Context ...pping. Historically, the use of VR as an expressive medium has concentrated on visual art, e.g. [8]. To date, there has been little experimental research on the use of VR interfaces for sound control [1],[10]. The DIVA system [6] was among the first to use VR for musical performance. A VR evaluation framework has been built at the Helsinki University of Technology, introducing concepts such as a virt...
1 | Experiments with Virtual Reality Instruments
- Mäki-Patola
Citation Context ... system [6] was among the first to use VR for musical performance. A VR evaluation framework has been built at the Helsinki University of Technology, introducing concepts such as a virtual air guitar [9]. Spatial perception and rendering of sound is well understood, ranging from amplitude panning approaches [16], Ambisonics [4] to large speaker arrays for rendering wave fields or head-related transfe...