Results 1 - 10 of 32
Touch Projector: Mobile Interaction through Video
Cited by 63 (13 self)
In 1992, Tani et al. proposed remotely operating machines in a factory by manipulating a live video image on a computer screen. In this paper we revisit this metaphor and investigate its suitability for mobile use. We present Touch Projector, a system that enables users to interact with remote screens through a live video image on their mobile device. The handheld device tracks itself with respect to the surrounding displays. Touch on the video image is “projected” onto the target display in view, as if it had occurred there. This literal adaptation of Tani’s idea, however, fails because handheld video does not offer enough stability and control to enable precise manipulation. We address this with a series of improvements, including zooming and freezing the video image. In a user study, participants selected targets and dragged targets between displays using the literal and three improved versions. We found that participants achieved the highest performance with automatic zooming and temporary image freezing.

Author Keywords: Mobile device, input device, interaction techniques, multitouch,
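To make the touch-through-video idea concrete, here is a minimal sketch (not the authors' implementation) of how a touch point on the handheld's camera image could be mapped onto the remote display once a homography between the camera image and the display plane is known; the matrix values below are made-up placeholders.

```python
import numpy as np

def touch_to_display(touch_xy, H):
    """Map a touch point from the handheld's camera image to remote-display
    coordinates using a 3x3 homography H (camera image -> display plane)."""
    x, y = touch_xy
    p = H @ np.array([x, y, 1.0])        # homogeneous transform
    return p[0] / p[2], p[1] / p[2]      # perspective divide

# H would normally come from tracking the display in the live video frame
# (e.g. via feature matching); these values are placeholders for illustration.
H = np.array([[1.2, 0.05, 40.0],
              [0.02, 1.1, 25.0],
              [0.0001, 0.0, 1.0]])
print(touch_to_display((320, 240), H))
```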
Like Bees Around the Hive: A Comparative Study of a Mobile Augmented Reality Map
Cited by 33 (1 self)
We present findings from field trials of MapLens, a mobile augmented reality (AR) map using a magic lens over a paper map. Twenty-six participants used MapLens to play a location-based game in a city centre. Comparisons to a group of 11 users with a standard 2D mobile map uncover phenomena that arise uniquely when interacting with AR features in the wild. The main finding is that AR features facilitate place-making by creating a constant need to reference the physical environment; this eases bodily configurations for the group, encourages the establishment of common ground, and thereby invites discussion, negotiation and public problem-solving. The main potential of AR maps lies in their use as a collaborative tool.

Author Keywords: Augmented reality, mobile maps, mobile use, field studies.
Projector Phone: A Study of Using Mobile Phones with Integrated Projector for Interaction with Maps
Cited by 22 (6 self)
First working prototypes of mobile phones with integrated pico projectors have already been demonstrated, and it is expected that such projector phones will be sold within the next three years. Applications that require interaction with large amounts of information will benefit from the large projection and its high resolution. This paper analyses the advantages and disadvantages of an integrated projector when interacting with maps, and discusses findings useful for the development of mobile applications for projector phones. In particular, we report on the implementation of an application that uses either the screen of the mobile phone, the projection, or a combination of both. These three options were compared in a user study in which the participants had to perform three different tasks with each option. The results provide clear evidence for the positive aspects of using a built-in projector, but also show some negative aspects related to text input.
PACER: fine-grained interactive paper via camera-touch hybrid gestures on a cell phone
- In Proc. CHI ’10, ACM, 2010
Cited by 14 (1 self)
PACER is a gesture-based interactive paper system that supports fine-grained paper document content manipulation through the touch screen of a cameraphone. Using the phone’s camera, PACER links a paper document to its digital version based on visual features. It adopts camera-based phone motion detection for embodied gestures (e.g. marquees, underlines and lassos), with which users can flexibly select and interact with document details (e.g. individual words, symbols and pixels). The touch input is incorporated to facilitate target selection at fine granularity, and to address some limitations of the embodied interaction, such as hand jitter and low input sampling rate. This hybrid interaction is coupled with other techniques such as semi-real-time document tracking and loose physical-digital document registration, offering a gesture-based command system. We demonstrate the use of PACER in various scenarios including work-related reading, maps and music score playing. A preliminary user study on the design has produced encouraging user feedback, and suggested future research for better understanding of embodied vs. touch interaction and one- vs. two-handed interaction.
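The hybrid camera-plus-touch idea can be illustrated with a small sketch: a coarse, camera-tracked document position is refined by a touch on a magnified view of that region. The function name, the zoom model, and the view size are hypothetical and not PACER's actual API.

```python
def refine_selection(coarse_pt, zoom, touch_px, view_size=(320, 480)):
    """Combine a coarse, camera-tracked document position (centre of the
    magnified view) with a touch on that view to get a fine-grained document
    point. Hypothetical sketch of the hybrid idea, not PACER's actual API."""
    (cx, cy), (tx, ty), (vw, vh) = coarse_pt, touch_px, view_size
    # The touch offset from the view centre, scaled down by the zoom factor,
    # nudges the camera-estimated position toward the exact target.
    return cx + (tx - vw / 2) / zoom, cy + (ty - vh / 2) / zoom

# Example: camera tracking says we are near (410, 220) in document space;
# a touch 40 px right and 20 px below the view centre refines that point.
print(refine_selection((410.0, 220.0), zoom=4.0, touch_px=(200, 260)))
```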
Virtual Projection: Exploring Optical Projection as a Metaphor for Multi-Device Interaction
- In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Cited by 12 (2 self)
Figure 1 (caption): Virtual Projection is inspired by its optical counterpart for transferring information between handhelds and stationary displays such as tabletops, PC displays or large public displays. By fixing the virtual projection to the display, the frustum can also be used to (a) select regions, (b) interactively apply filters, and (c) post multiple views.

Handheld optical projectors provide a simple way to overcome the limited screen real estate on mobile devices. We present virtual projection (VP), an interaction metaphor inspired by how we intuitively control the position, size, and orientation of a handheld optical projector’s image. VP is based on tracking a handheld device without an optical projector and allows selecting a target display on which to position, scale, and orient an item in a single gesture. By relaxing the optical projection metaphor, we can deviate from modeling perspective projection, for example, to constrain scale or orientation, create multiple copies, or offset the image. VP also supports dynamic filtering based on the projection frustum, creating overview-and-detail applications, and selecting portions of a larger display for zooming and panning. We show exemplary use cases implemented using our optical feature-tracking framework and present the results of a user study demonstrating the effectiveness of VP in complex interactions with large displays.

Author Keywords: Interaction technique; mobile device; handheld projection.
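As an illustration of frustum-based selection and filtering, the sketch below assumes tracking yields a homography from the handheld's view to display coordinates (an assumption for illustration, not the paper's implementation) and computes the quad on the target display covered by the handheld's view.

```python
import numpy as np

def frustum_footprint(H, view_w, view_h):
    """Project the handheld's view rectangle onto the target display plane
    using a 3x3 homography H (handheld view -> display coordinates), giving
    the quad that frustum-based filtering or region selection would use."""
    corners = np.array([[0, 0, 1], [view_w, 0, 1],
                        [view_w, view_h, 1], [0, view_h, 1]], float)
    proj = (H @ corners.T).T
    return proj[:, :2] / proj[:, 2:3]    # perspective divide -> 4x2 quad

# Hypothetical homography obtained from tracking; values are placeholders.
H = np.array([[2.0, 0.1, 300.0],
              [0.0, 2.1, 150.0],
              [0.0002, 0.0001, 1.0]])
print(frustum_footprint(H, 640, 480))
```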
Evaluating Automatically Generated Location-Based Stories for Tourists
- In Proceedings of CHI 2008
, 2008
Cited by 11 (5 self)
Tourism provides over six percent of the world's gross domestic product. As a result, there have been many efforts to use technology to improve the tourist's experience via mobile tour guide systems. One key bottleneck in such location-based systems is content development; existing systems either provide trivial information at a global scale or present quality narratives but at an extremely local scale. The primary reason for this dichotomy is that, although good narrative content is more educationally effective (and more entertaining) than a stream of simple, disconnected facts, it is time-intensive and expensive to develop. However, the WikEar system uses narrative theory-informed data mining methodologies in an effort to produce high-quality narrative content for any location on Earth. It allows tourists to interact with these narratives using their camera-enabled cell phones and an innovative interface designed around a magic lens and paper map metaphor. In this paper, we describe a first evaluation of these narratives and the WikEar interface, which reported promising, but not conclusive, results. We also present ideas for future work that will use this feedback to improve the narratives.
PhotoMap: Using Spontaneously taken Images of Public Maps for Pedestrian Navigation Tasks on Mobile Devices
Cited by 11 (3 self)
In many mid- to large-sized cities public maps are ubiquitous. One can also find a great number of maps in parks or near hiking trails. Public maps help to facilitate orientation and provide special information not only to tourists but also to locals who just want to look up an unfamiliar place while on the go. These maps offer many advantages compared to mobile maps from services like Google Maps Mobile or Nokia Maps. They often show local landmarks and sights that are not shown on standard digital maps. Often these ‘YOU ARE HERE’ (YAH) maps are adapted to a special use case, e.g. a zoo map or a hiking map of a certain area. Being designed for a specific purpose, these maps are often aesthetically well designed and their usage is therefore more pleasant. In this paper we present a novel technique and application called PHOTOMAP that uses images of ‘YOU ARE HERE’ maps taken with a GPS-enhanced
Evaluation of an off-screen visualization for magic lens and dynamic peephole interfaces
- In Proc. MobileHCI
, 2010
Cited by 6 (2 self)
Map navigation is often limited due to the inherent size restrictions of mobile devices’ displays. Using a magic lens to interact with physical objects has been proposed as a way to reduce this limitation. The dynamic peephole interface is an alternative approach where a device is moved across a virtual surface. In this paper we study the effect of an additional visualization of objects beyond the screen on magic lens and dynamic peephole interfaces. In the conducted experiment the participants had to select points of interest shown on a map. We show that an additional visualization of off-screen objects decreases the task completion time and reduces the perceived task load. The advantage of an off-screen visualization is much larger than the difference between using a magic lens and a dynamic peephole interface.
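One common way to visualize off-screen objects, used here only as a stand-in for the cue studied in the paper (the exact visualization may differ), is to clamp an indicator to the viewport border in the direction of the off-screen point of interest:

```python
def edge_indicator(poi, viewport):
    """Place an indicator on the viewport border pointing toward an
    off-screen point of interest (an arrow/halo-style cue)."""
    (px, py), (x0, y0, w, h) = poi, viewport
    cx, cy = x0 + w / 2, y0 + h / 2          # viewport centre
    dx, dy = px - cx, py - cy
    if x0 <= px <= x0 + w and y0 <= py <= y0 + h:
        return None                           # already on screen, no cue
    # Scale the direction vector so it just reaches the nearer border.
    t = min((w / 2) / abs(dx) if dx else float("inf"),
            (h / 2) / abs(dy) if dy else float("inf"))
    return cx + dx * t, cy + dy * t

# A POI far to the right of a 320x480 map view yields a cue on the right edge.
print(edge_indicator((900, 120), (0, 0, 320, 480)))
```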
GeoGazemarks: Providing gaze history for the orientation on small display maps
- In Proceedings of the 14th International Conference on Multimodal Interaction, ICMI ’12
, 2012
Cited by 5 (5 self)
Orientation on small display maps is often difficult because the visible spatial context is restricted. This paper proposes to provide the history of a user’s visual attention on a map as a visual clue to facilitate orientation. Visual attention on the map is recorded with eye tracking, clustered geo-spatially, and visualized when the user zooms out. This implicit gaze-interaction concept, called GeoGazemarks, has been evaluated in an experiment with 40 participants. The study demonstrates a significant increase in efficiency and an increase in effectiveness for a map search task, compared to standard panning and zooming.
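A minimal sketch of the geo-spatial clustering step, assuming gaze fixations are already geo-referenced as latitude/longitude pairs (a simple grid-based grouping; the paper's actual clustering method may differ):

```python
from collections import defaultdict

def cluster_gaze(fixations, cell_deg=0.001):
    """Group gaze fixations (lat, lon) into grid cells and return one centroid
    per cell, weighted by fixation count; centroids could then be drawn as
    gaze marks when the user zooms out."""
    cells = defaultdict(list)
    for lat, lon in fixations:
        cells[(round(lat / cell_deg), round(lon / cell_deg))].append((lat, lon))
    return [(sum(p[0] for p in pts) / len(pts),   # centroid latitude
             sum(p[1] for p in pts) / len(pts),   # centroid longitude
             len(pts))                            # weight = fixation count
            for pts in cells.values()]

# Hypothetical fixation log; weights could scale the marker size on zoom-out.
print(cluster_gaze([(47.3769, 8.5417), (47.3770, 8.5418), (47.40, 8.55)]))
```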
Playing it Real: Magic Lens and Static Peephole Interfaces for Games in a Public Space
- Proceedings of MobileHCI
, 2012
Cited by 4 (3 self)
Magic lens and static peephole interfaces are used in numerous consumer mobile phone applications such as Augmented Reality browsers, games or digital map applications in a variety of contexts, including public spaces. Interface performance has been evaluated for various interaction tasks involving spatial relationships in a scene. However, interface usage outside laboratory conditions has not been considered in depth in the evaluation of these interfaces. We present findings about the usage of magic lens and static peephole interfaces for playing a find-and-select game in a public space and report on the reactions of the public audience to participants’ interactions. Contrary to our expectations, participants favored the magic lens over a static peephole interface despite tracking errors, fatigue and potentially conspicuous gestures. Most passersby did not pay attention to the participants and vice versa. A comparative laboratory experiment revealed only a few differences in system usage.