Results 11 - 20 of 10,823
Table 3: Types of wrinkles due to facial expression
Table 7.1: Recognition of Facial Expressions
1995
Table 6: Results of comparing facial expressions and situations.
Table B.1: Predictions of Facial Expression Changes
2004
Table 1. Effects of Expressivity parameters over head, facial expression and gesture.
2006
"... In PAGE 7: ...t al., 2005]. 4.2 Synthesis Table 1 shows the effect that each expressivity parameter has on the production of head movements, facial expressions and gestures. The Spatial Extent (SPC) parameter modulates the amplitude of the movement of arms, wrists (involved in the animation of a gesture), head and eyebrows (involved in the animation of a facial expression); it influences how wide or narrow their displacement will be during the final animation.... ..."
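The snippet above describes SPC as scaling how wide or narrow a movement's displacement is. A minimal sketch of one plausible such mapping (the function and the linear scaling about the trajectory mean are illustrative assumptions, not the paper's actual synthesis model):

```python
import numpy as np

def apply_spatial_extent(trajectory: np.ndarray, spc: float) -> np.ndarray:
    """Scale a joint trajectory's displacement about its mean position by
    the Spatial Extent (SPC) parameter: spc > 1 widens the movement,
    spc < 1 narrows it. Hypothetical linear mapping for illustration."""
    center = trajectory.mean(axis=0)
    return center + spc * (trajectory - center)

# Wrist positions (x, y) over 4 animation frames.
traj = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 0.0], [1.0, -0.5]])
wide = apply_spatial_extent(traj, 1.5)    # wider displacement
narrow = apply_spatial_extent(traj, 0.5)  # narrower displacement
```

The same scaling could be applied to head or eyebrow trajectories; only the amplitude about the rest position changes, not the timing.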
Cited by 4
Table 1. Comparisons of facial expression recognition algorithms.
"... In PAGE 9: ... People have used different classification algorithms to categorize these emotions. In Table 1, we compare several facial expression recognition algorithms. In general, these algorithms perform well compared to trained human recognition of about 87% as reported by Bassili.... ..."
Table 4: Confusion matrix of the combined facial expression classifier
2004
"... In PAGE 5: ... Table 4 shows the confusion matrix of the combined facial expression classifier to analyze in detail the limitation of this emotion recognition system. The overall performance of this classifier was 85.... In PAGE 5: ... Table 6 shows the performance of the bimodal system when the acoustic emotion classifier (Table 1) and the combined facial expressions classifier (Table 4) were integrated at decision-level, using different fusing criteria. In the weight-combining rule, the modalities are weighted according to rules extracted from the confusion matrices of each classifier.... In PAGE 6: ... classifier (79% and 81%, Table 4), and significantly worse than in the feature-level bimodal classifier (95%, 92%, Table 5). However, happiness (98%) and sadness (90%) are recognized with high accuracy compared to the feature-level bimodal classifier (91% and 79%, Table 5).... ..."
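The snippet describes a weight-combining rule at decision level, where each modality is weighted using information from its confusion matrix. A minimal sketch of one such rule, assuming per-class accuracies (the confusion-matrix diagonals) are used as weights; the function name, emotion set, and all numbers are hypothetical:

```python
import numpy as np

def weight_combine(p_acoustic, p_facial, acc_acoustic, acc_facial):
    """Decision-level fusion: weight each modality's class posteriors by
    that modality's per-class accuracy (the diagonal of its confusion
    matrix), then renormalize. Illustrative rule, not the paper's exact one."""
    scores = (np.asarray(acc_acoustic) * np.asarray(p_acoustic)
              + np.asarray(acc_facial) * np.asarray(p_facial))
    return scores / scores.sum()

# Posteriors over (anger, happiness, sadness, neutral) from each classifier.
p_a = [0.6, 0.2, 0.1, 0.1]
p_f = [0.3, 0.4, 0.2, 0.1]
acc_a = [0.70, 0.90, 0.80, 0.80]   # hypothetical per-class accuracies
acc_f = [0.80, 0.98, 0.90, 0.85]
fused = weight_combine(p_a, p_f, acc_a, acc_f)
label = int(np.argmax(fused))      # index of the predicted emotion
```

Weighting by per-class accuracy lets the more reliable modality dominate for the classes it recognizes well, which is the intuition behind extracting the weights from the confusion matrices.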
Cited by 11
Table 6 Hypothesized vocal co-occurrences with facial expressions
1994
"... In PAGE 7: ...
Table 1 Summary of Darwin's observations of vocal expression of emotion 19
Table 2 Summary of results on vocal indicators of emotional states 20
Table 3 Tension-flow rhythms as unique combinations of attributes 28
Table 4 Affinities and emotional states as described by Kestenberg 29
Table 5 Mean rankings and modal descriptors of auditory assessments of acoustic parameters of infant vocalizations during four facial expressions 31
Table 6 Hypothesized vocal co-occurrences with facial expressions 34
Table 7 Hypothesized vocal predictors of hedonic tone 35
Table 8 Hypothesized vocal predictors of discrete emotions 3
Table 9 Variables hypothesized to distinguish emotion pairs of the same hedonic tone 36
Table 10 Hypothesized Affinities of emotional states 37
Table 11 Hypothesized Tension-Flow Attributes of emotional states 37
Table 12 Hypothesized Shape-Flow Elements of emotional states 38
Table 13 Variables under study 40
Table 14 Max Codes 43
Table 15 Affex Codes 5
Table 16 Tension-flow attributes results for tracing of segment 26 by Rater 1 58
Table 17 Inter-observer correlations of body movement observations 61
Table 18 Inter-observer agreement percentages and correlations of vocalization variables 63
Table 19 Facial expression by vowel content of vocal expression 65
Table 20 Facial expression by vocal expression descriptor 66
Table 21 Means and standard deviations of body movement variables for four facial expressions 68
Table 23 Selected Significant Body Movement Inter-Correlations 72
Table 24 Body Movement Shape Flow Inter-correlations 73
Table 25 Selected Significant Correlations of Body Movement and Vocalization Variables 74
Table 26 Selected Significant Vocalization Inter-Correlations 75
Table 27 Tests of Significance, Multivariate. Univariate 79
Table 28 Variables significantly correlating with segment length 80
Table 29 Analyses of Covariance....
In PAGE 43: ... These hypotheses are also shown in Table 6 below: Table 6 Hypothesized vocal co-occurrences with facial expressions ... In PAGE 102: ... Referring back to the hypotheses in Table 9, anger does appear to be signaled with the vowel sound "a" as in "pat" while sadness is most likely signaled either with "e" as in "pet" or "i" as in "pit". Referring to Table 6, anger does seem to be signaled with a wide range of pitch while interest and sadness are more likely expressed with narrow pitch range. Interest is probably soft while anger is usually loud.... ..."
Table 2: Facial expression recognition performance for different classification features.
"... In PAGE 18: ... performance. The observations obtained as a result of the PCA produced poorer facial expression recognition performance than the original FAP observations (see Table 2). We concluded that the amount of training data was not sufficient to provide reliable PCA and produce a recognition performance improvement.... In PAGE 18: ... In addition, in order to determine the best choice of classification features, we performed experiments using first- (delta-D) and second- (acceleration-A) order derivatives as observations, by appending them to the original FAP vectors or using them individually. The recognition results obtained in these experiments are also shown in Table 2. Since the original FAPs produced the best results, they were used as observations in the remaining experiments.... ..."
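The snippet describes appending first-order (delta) and second-order (acceleration) derivatives to the original FAP vectors as candidate classification features. A minimal sketch using simple frame differencing; the function name and the padding of the first frame are assumptions, and the paper's exact derivative scheme may differ:

```python
import numpy as np

def add_dynamic_features(fap: np.ndarray) -> np.ndarray:
    """Append first- (delta) and second-order (acceleration) frame
    differences to a sequence of FAP vectors, tripling the feature
    dimension. fap has shape (frames, dims)."""
    # Prepend the first row so the differenced sequences keep their length.
    delta = np.diff(fap, n=1, axis=0, prepend=fap[:1])
    accel = np.diff(delta, n=1, axis=0, prepend=delta[:1])
    return np.concatenate([fap, delta, accel], axis=1)

# 5 frames of a 3-dimensional FAP vector.
fap = np.arange(15, dtype=float).reshape(5, 3)
obs = add_dynamic_features(fap)   # shape (5, 9)
```

Using the derivatives individually, as in the other experimental condition, would correspond to passing only the `delta` or `accel` arrays to the classifier instead of concatenating them.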