Results 1-10 of 17
An Empirical Study of Machine Learning Techniques for Affect Recognition in Human-Robot Interaction
Pattern Analysis & Applications, 2006
"... Abstract – Given the importance of implicit communication in human interactions, it would be valuable to have this capability in robotic systems wherein a robot can detect the motivations and emotions of the person it is working with. Recognizing affective states from physiological cues is an effect ..."
Abstract
-
Cited by 46 (6 self)
- Add to MetaCart
(Show Context)
Given the importance of implicit communication in human interactions, it would be valuable to have this capability in robotic systems, wherein a robot can detect the motivations and emotions of the person it is working with. Recognizing affective states from physiological cues is an effective way of implementing implicit human-robot interaction. Several machine learning techniques have been successfully employed in affect recognition to predict the affective state of an individual given a set of physiological features. However, a systematic comparison of the strengths and weaknesses of these methods has not yet been done. In this paper we present a comparative study of four machine learning methods, K-Nearest Neighbor, Regression Tree, Bayesian Network, and Support Vector Machine, as applied to the domain of affect recognition using physiological signals. The results showed that the Support Vector Machine gave the best classification accuracy, although all the methods performed competitively. The Regression Tree gave the next best classification accuracy and was the most space- and time-efficient.
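A minimal sketch of the simplest of the four compared methods, k-nearest neighbour, written from scratch so it stays self-contained. The feature names (heart rate, skin conductance) and the toy samples are illustrative assumptions, not the paper's dataset.

```python
import math
from collections import Counter

def knn_predict(train, sample, k=3):
    """Classify `sample` by majority vote among its k nearest training points.

    train  -- list of ((feature, ...), label) pairs
    sample -- feature tuple to classify
    """
    # Sort all training points by Euclidean distance to the query sample.
    dists = sorted((math.dist(x, sample), label) for x, label in train)
    # Majority vote over the k closest labels.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy training set: (heart_rate_bpm, skin_conductance_uS) -> affect label.
train = [
    ((62, 1.1), "calm"), ((66, 1.3), "calm"), ((70, 1.6), "calm"),
    ((95, 4.2), "anxious"), ((102, 5.0), "anxious"), ((110, 5.8), "anxious"),
]

print(knn_predict(train, (68, 1.4)))   # query near the calm cluster
print(knn_predict(train, (105, 5.1)))  # query near the anxious cluster
```

In practice the paper's comparison would use a full physiological feature set and cross-validation; the point here is only the shape of the nearest-neighbour vote.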
Safe planning for human–robot interaction
In: IEEE International Conference on Robotics and Automation, 2004
"... This paper presents a strategy for improving the safety of human–robot interaction by minimizing a danger criterion during the planning stage. This strategy is one part of the overall methodology for safe planning and control in human–robot interaction. The focus application is a hand-off task betwe ..."
Abstract
-
Cited by 29 (3 self)
- Add to MetaCart
(Show Context)
This paper presents a strategy for improving the safety of human–robot interaction by minimizing a danger criterion during the planning stage. This strategy is one part of an overall methodology for safe planning and control in human–robot interaction. The focus application is a hand-off task between an articulated robot and an inexpert human user. Two formulations of the danger criterion are proposed: a criterion assuming independent safety-related factors, and a criterion assuming mutually dependent factors. Simulations of the proposed planning strategy are presented for both 2D and 3D robots. The results indicate that a criterion based on scaled, mutually dependent factors such as the robot inertia and the human–robot distance generates safe, feasible paths for interaction.
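The two formulations can be sketched roughly as follows. The scaling constants, thresholds, and exact functional forms are assumptions for illustration; the paper defines its own scaled factors.

```python
def inertia_factor(inertia, inertia_max=10.0):
    """Scaled robot inertia, clipped to [0, 1]."""
    return min(inertia / inertia_max, 1.0)

def distance_factor(dist, d_min=0.2, d_max=2.0):
    """Grows toward 1 as the robot approaches the human (assumed linear ramp)."""
    if dist <= d_min:
        return 1.0
    if dist >= d_max:
        return 0.0
    return (d_max - dist) / (d_max - d_min)

def danger_independent(inertia, dist):
    # Independent factors: a weighted sum, so either factor alone raises danger.
    return 0.5 * inertia_factor(inertia) + 0.5 * distance_factor(dist)

def danger_dependent(inertia, dist):
    # Mutually dependent factors: a product, so danger is high only when the
    # robot is both in a high-inertia configuration and close to the human.
    return inertia_factor(inertia) * distance_factor(dist)

print(danger_dependent(8.0, 0.3), danger_dependent(8.0, 1.5))
```

The product form captures why the dependent criterion tends to yield feasible paths: the planner is not penalized for high inertia when the robot is far from the person.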
Empirical Results from Using a Comfort Level Device in Human-Robot Interaction Studies
In: Proceedings of the 2006 ACM Conference on Human-Robot Interaction
"... This paper describes an extensive analysis of the comfort level data of 7 subjects with respect to 12 robot behaviours as part of a human-robot interaction trial. This includes robot action, proximity and motion relative to the subjects. Two researchers coded the video material, identifying visible ..."
Abstract
-
Cited by 23 (7 self)
- Add to MetaCart
This paper describes an extensive analysis of the comfort level data of 7 subjects with respect to 12 robot behaviours as part of a human-robot interaction trial. This includes robot action, proximity and motion relative to the subjects. Two researchers coded the video material, identifying visible states of discomfort displayed by subjects in relation to the robot's behaviour. Agreement between the coders varied from moderate to high, except for more ambiguous situations involving robot approach directions. The detected visible states of discomfort were correlated with the situations where the comfort level device (CLD) indicated states of discomfort. Results show that the uncomfortable states identified by both coders, and by either of the coders, corresponded with 31% and 64% of the uncomfortable states identified by the subjects' CLD data (N=58), respectively. Conversely, there was 72% agreement between subjects' CLD data and the uncomfortable states identified by both coders (N=25). Results show that the majority of the subjects expressed discomfort when the robot blocked their path or was on a collision course towards them, especially when the robot was within 3 m proximity. The majority of subjects also experienced discomfort when the robot came closer than 3 m, within the social zone reserved for human-human face-to-face conversation, while they were performing a task. The advantages and disadvantages of the CLD in comparison to other techniques for assessing subjects' internal states are discussed, and future work concludes the paper.
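Inter-coder agreement of the kind reported above is commonly quantified with Cohen's kappa; this sketch computes it for two hypothetical coders labelling the same video segments. The segment labels are invented, not the study's data.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders over the same items."""
    n = len(coder_a)
    # Observed proportion of items on which the coders agree.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected agreement by chance, from each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codings of six video segments.
a = ["comfort", "discomfort", "comfort", "comfort", "discomfort", "comfort"]
b = ["comfort", "discomfort", "comfort", "discomfort", "discomfort", "comfort"]
print(round(cohens_kappa(a, b), 3))
```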
Anxiety Detection during Human-Robot Interaction
In: IEEE International Conference on Intelligent Robots and Systems, 2005
"... Abstract- This paper describes an experiment to determine the feasibility of using physiological signals to determine the human response to robot motions during direct human-robot interaction. A robot manipulator is used to generate common interaction motions, and human subjects are asked to report ..."
Abstract
-
Cited by 20 (3 self)
- Add to MetaCart
(Show Context)
This paper describes an experiment to determine the feasibility of using physiological signals to determine the human response to robot motions during direct human-robot interaction. A robot manipulator is used to generate common interaction motions, and human subjects are asked to report their response to the motions. The human physiological response is also measured. Motion paths are generated using a classic potential field planner and a safe motion planner, which minimizes the potential collision force along the path. A fuzzy inference engine is developed to estimate the human response based on the physiological measures. Results show that emotional arousal can be detected using physiological signals and the inference engine. Comparison of initial results between the two planners shows that subjects report less anxiety and surprise with the safe planner at high planner speeds.
Index terms: human-robot interaction, physiological signal monitoring, affective state estimation, safety.
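A toy fuzzy estimator illustrating the general idea of mapping a physiological measure to an arousal score. The membership breakpoints and rule outputs are invented for illustration and are not the paper's tuned engine.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def arousal_estimate(scr):
    """Map a skin-conductance response (uS, assumed scale) to arousal in [0, 1]
    using three fuzzy rules and weighted-average defuzzification."""
    low    = tri(scr, -1.0, 0.0, 2.0)   # rule: low SCR  -> arousal 0.1
    medium = tri(scr,  1.0, 3.0, 5.0)   # rule: med SCR  -> arousal 0.5
    high   = tri(scr,  4.0, 6.0, 9.0)   # rule: high SCR -> arousal 0.9
    rules = [(low, 0.1), (medium, 0.5), (high, 0.9)]
    total = sum(strength for strength, _ in rules)
    if total == 0.0:
        return 0.0
    return sum(strength * out for strength, out in rules) / total

print(arousal_estimate(0.5), arousal_estimate(6.0))
```

The paper's engine additionally fuses several signals (e.g. heart rate) and estimates anxiety and surprise; the single-input version above only shows the inference-and-defuzzify step.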
A probabilistic model of human motion and navigation intent for mobile robot path planning
In: International Conference on Autonomous Robots and Agents
"... AbstractIn order to effectively plan paths in environments inhabited by humans, robots must accurately predict human motion. Typical approaches to human prediction simply assume a constant velocity which is not always valid. This paper proposes to determine the likely navigation intent of humans and ..."
Abstract
-
Cited by 10 (0 self)
- Add to MetaCart
(Show Context)
In order to plan paths effectively in environments inhabited by humans, robots must accurately predict human motion. Typical approaches to human prediction simply assume a constant velocity, which is not always valid. This paper proposes to determine the likely navigation intent of humans and use that to predict human motion. Navigation intent is determined by the function and structure of the environment. Manually assigned functional places are combined with automatically extracted navigation way-points to define a number of likely navigation targets within the environment. To predict human motion toward these targets, a probabilistic model of human motion is proposed which is based on motion probability grids generated from observed motion. The models of human navigation intent and motion are integrated with an autonomous mobile robot system, with a laser range sensor detecting humans moving within the environment, and a path planning system. The models of human navigation intent and motion are verified using real captured human motion data from an office environment. Examples of human motion prediction are also presented.
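The motion-probability-grid idea can be sketched in miniature: accumulate observed human positions into a grid of counts, then normalise to get the probability of occupying each cell on the way to a navigation target. The grid cells, trajectories, and target location are invented for illustration.

```python
from collections import Counter

def build_motion_grid(trajectories):
    """Normalised cell-occupancy probabilities from observed trajectories."""
    counts = Counter()
    for traj in trajectories:
        for cell in traj:
            counts[cell] += 1
    total = sum(counts.values())
    return {cell: n / total for cell, n in counts.items()}

# Two observed walks toward the same hypothetical doorway at cell (4, 2).
trajectories = [
    [(0, 0), (1, 1), (2, 1), (3, 2), (4, 2)],
    [(0, 1), (1, 1), (2, 2), (3, 2), (4, 2)],
]
grid = build_motion_grid(trajectories)
most_likely = max(grid, key=grid.get)
print(most_likely)  # cells shared by both walks score highest
```

A planner could then query `grid` along candidate robot paths and penalise cells the human is likely to occupy.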
Estimating robot induced affective state using hidden Markov models
In: Dautenhahn K (ed) RO-MAN 2006: the 15th IEEE International Symposium on Robot and Human Interactive Communication, 2006
"... Abstract — In order for humans and robots to interact in an effective and intuitive manner, robots must obtain information about the human affective state in response to the robot’s actions. This secondary mode of interactive communication is hypothesized to permit a more natural collaboration, simi ..."
Abstract
-
Cited by 8 (0 self)
- Add to MetaCart
(Show Context)
In order for humans and robots to interact in an effective and intuitive manner, robots must obtain information about the human affective state in response to the robot's actions. This secondary mode of interactive communication is hypothesized to permit a more natural collaboration, similar to the "body language" interaction between two cooperating humans. This paper describes the implementation and validation of a Hidden Markov Model for estimating human affective state in real time, using robot motions as the stimulus. Inputs to the system are physiological signals such as heart rate, perspiration rate, and facial muscle contraction. Affective state was estimated using a two-dimensional valence-arousal representation. A robot manipulator was used to generate motions simulating human-robot interaction, and human subjects were asked to report their response to the motions. The human physiological response was also measured. Robot motions were generated using both a nominal potential field planner and a recently reported safe motion planner that minimizes the potential collision forces along the path. The robot motions were tested with 36 subjects, and the resulting data were used to train and validate the HMM. The results of the HMM affective estimation are also compared to a previously implemented fuzzy inference engine.
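A bare-bones HMM forward pass over discretised physiological readings, showing the estimation step in miniature. The two hidden states, the transition matrix, and the emission probabilities are invented, not the model trained on the 36-subject data.

```python
def forward(obs, init, trans, emit):
    """Return P(state | observations so far) after the last observation."""
    n = len(init)
    # Initialise with prior times emission likelihood of the first reading.
    belief = [init[s] * emit[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        # Propagate through the transition model, then weight by emission.
        belief = [
            emit[s][o] * sum(belief[p] * trans[p][s] for p in range(n))
            for s in range(n)
        ]
    total = sum(belief)
    return [b / total for b in belief]

# States: 0 = low arousal, 1 = high arousal.
# Observations: 0 = resting heart rate, 1 = elevated heart rate.
init  = [0.8, 0.2]
trans = [[0.9, 0.1], [0.2, 0.8]]
emit  = [[0.85, 0.15], [0.25, 0.75]]

posterior = forward([1, 1, 1], init, trans, emit)
print(posterior)  # repeated elevated readings shift belief toward high arousal
```

The paper's model works over a continuous valence-arousal plane and multiple signal channels; the two-state, binary-observation version above only demonstrates the recursive belief update.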
Survey of psychophysiology measurements applied to human-robot interaction
In: 16th IEEE International Symposium on Robot & Human Interactive Communication, 2007
"... Abstract—This paper reviews the literature related to the use of psychophysiology measures in human-robot interaction (HRI) studies in an effort to address the fundamental question of appropriate metrics and methodologies for evaluating HRI research, especially affect. It identifies four main method ..."
Abstract
-
Cited by 5 (1 self)
- Add to MetaCart
(Show Context)
This paper reviews the literature related to the use of psychophysiology measures in human-robot interaction (HRI) studies in an effort to address the fundamental question of appropriate metrics and methodologies for evaluating HRI research, especially affect. It identifies four main methods of evaluation in HRI studies: (1) self-report measures, (2) behavioral measures, (3) psychophysiology measures, and (4) task performance. However, the paper also shows that using only one of these measures for evaluation is insufficient to provide a complete evaluation and interpretation of the interactions between a robot and the human with which it is interacting. In addition, the paper describes exemplar HRI studies which use psychophysiological measures; these implementations fall into three categories: detection and/or identification of specific emotions of participants from physiological signals, evaluation of participants' responses to a robot through physiological signals, and development and implementation of real-time control and modification of robot behaviors using physiological signals. Two open research questions on psychophysiological metrics were identified as a result of this review.
Alissandrakis A.: Methodological Issues of Annotating Vision Sensor Data Using Subjects' Own Judgement of Comfort in a Robot Human Following Experiment
In: Proceedings of the 15th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN06), 2006
"... Abstract—When determining subject preferences for Human-Robot Interaction, an important issue is the interpretation of the subjects ’ responses during the trials. Employing a non-intrusive approach, this paper discusses the methodological issues for annotating vision data by allowing the subjects to ..."
Abstract
-
Cited by 4 (2 self)
- Add to MetaCart
(Show Context)
When determining subject preferences for human-robot interaction, an important issue is the interpretation of the subjects' responses during the trials. Employing a non-intrusive approach, this paper discusses the methodological issues of annotating vision data by allowing the subjects to indicate their comfort using a handheld Comfort Level Device during the trials. In previous research, the analysis of collected comfort and vision data was made difficult by problems concerning the manual synchronisation of different modalities. In the current paper, we overcome this issue by real-time integration of the subject's feedback on subjective comfort into the video stream. The implications for more efficient analysis of human-robot interaction data, as well as possible future developments of this approach, are discussed.
A New Approach to Implicit Human-Robot Interaction Using Affective Cues
"... Abstract – It is well known that in social interactions, implicit communication between the communicators plays a significant role. It would be immensely useful to have a robotic system that is capable of such implicit communication with the operator and can modify its behavior if required. This pap ..."
Abstract
-
Cited by 2 (0 self)
- Add to MetaCart
(Show Context)
It is well known that in social interactions, implicit communication between the communicators plays a significant role. It would be immensely useful to have a robotic system that is capable of such implicit communication with the operator and can modify its behavior if required. This paper presents a framework for human-robot interaction in which the operator's physiological signals were analyzed to infer his or her probable anxiety level, and robot behavior was adapted as a function of the operator's affective state. Peripheral physiological signals were measured through wearable biofeedback sensors, and a control architecture inspired by Riley's original information-flow model was developed to implement such human-robot interaction. The target affective state chosen in this work was anxiety. The results from affect-elicitation tasks for human participants showed that it is possible to detect anxiety through physiological sensing in real time. A robotic experiment was also conducted to demonstrate that the presented control architecture allowed the robot to adapt its behavior based on operator anxiety level.
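A schematic version of the adaptation idea: scale the commanded speed down as the inferred operator anxiety rises. The thresholds and the speed schedule are assumptions for illustration, not the paper's Riley-inspired architecture.

```python
def adapt_speed(anxiety, v_max=1.0):
    """Return a commanded robot speed (m/s) given an anxiety estimate in [0, 1].

    The three-band schedule below is a hypothetical policy, not the paper's.
    """
    if anxiety < 0.3:
        return v_max          # operator calm: proceed at full speed
    if anxiety < 0.7:
        return 0.5 * v_max    # moderate anxiety: slow down
    return 0.1 * v_max        # high anxiety: creep / near-stop

for level in (0.1, 0.5, 0.9):
    print(level, adapt_speed(level))
```

In a closed loop, `anxiety` would come from a real-time physiological classifier of the kind the abstract describes, and the output would feed the robot's velocity controller.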