Journals starting with affc

AffCom( Vol No. ) * IEEE Trans. on Affective Computing

AffCom(1) * Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications
* Broadening the Scope of Affect Detection Research
* CAO: A Fully Automatic Emoticon Analysis System Based on Theory of Kinesics
* Cross-Corpus Acoustic Emotion Recognition: Variances and Strategies
* Emotion Recognition from Brain Signals Using Hybrid Adaptive Filtering and Higher Order Crossings Analysis
* Empathic Touch by Relational Agents
* Inducing Genuine Emotions in Simulated Speech-Based Human-Machine Interaction: The NIMITEK Corpus
* Intimate Heartbeats: Opportunities for Affective Communication Technology
* Optimal Arousal Identification and Classification for Affective Computing Using Physiological Signals: Virtual Reality Stroop Task
* Smile When You Read This, Whether You Like It or Not: Conceptual Challenges to Affect Detection
* Support Vector Machines to Define and Detect Agitation Transition
11 for AffCom(1)

AffCom(10) * Adaptive Bayesian Source Separation Method for Intensity Estimation of Facial AUs, An
* AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild
* AM-FED+: An Extended Dataset of Naturalistic Facial Expressions Collected in Everyday Settings
* Audio-Visual Emotion Recognition in Video Clips
* Automatic Analysis of Facial Actions: A Survey
* Automatic Assessment of Depression Based on Visual Cues: A Systematic Review
* Building Naturalistic Emotionally Balanced Speech Corpus by Retrieving Emotional Speech from Existing Podcast Recordings
* Capturing Feature and Label Relations Simultaneously for Multiple Facial Action Unit Recognition
* Copula Ordinal Regression Framework for Joint Estimation of Facial Action Unit Intensity
* Cross-Corpus Acoustic Emotion Recognition with Multi-Task Learning: Seeking Common Ground While Preserving Differences
* Discriminative Spatiotemporal Local Binary Pattern with Revisited Integral Projection for Spontaneous Facial Micro-Expression Recognition
* Dynamic Pose-Robust Facial Expression Recognition by Multi-View Pairwise Conditional Random Forests
* Editorial of Special Issue on Human Behaviour Analysis In-the-Wild
* Emotion Classification Using Segmentation of Vowel-Like and Non-Vowel-Like Regions
* Emotions Recognition Using EEG Signals: A Survey
* Estimating Audience Engagement to Predict Movie Ratings
* Exploring User Experience with Image Schemas, Sentiments, and Semantics
* Fuzzy Histogram of Optical Flow Orientations for Micro-Expression Recognition
* Hidden Smile Correlation Discovery Across Subjects Using Random Walk with Restart
* Human-Human Interactional Synchrony Analysis Based on Body Sensor Networks
* Identifying Stable Patterns over Time for Emotion Recognition from EEG
* Interest as a Proxy of Engagement in News Reading: Spectral and Entropy Analyses of EEG Activity Patterns
* ISLA: Temporal Segmentation and Labeling for Audio-Visual Emotion Recognition
* Listen to Your Face: Inferring Facial Action Units from Audio Channel
* Magnifying Subtle Facial Motions for Effective 4D Expression Recognition
* Methodology for the Automatic Extraction and Generation of Non-Verbal Signals Sequences Conveying Interpersonal Attitudes, A
* Model for an Affective Non-Expensive Utility-Based Decision Agent, A
* Modelling Affect for Horror Soundscapes
* MorpheuS: Generating Structured Music with Constrained Patterns and Tension
* Multi-Objective Based Spatio-Temporal Feature Representation Learning Robust to Expression Intensity Variations for Facial Expression Recognition
* Multi-Velocity Neural Networks for Facial Expression Recognition in Videos
* Multimodal Human-Human-Robot Interactions (MHHRI) Dataset for Studying Personality and Engagement
* New Methods for Stress Assessment and Monitoring at the Workplace
* Novel Audio Feature Projection Using KDLPCCA-Based Correlation with EEG Features for Favorite Music Classification
* Novel Method Based on OMPGW Method for Feature Extraction in Automatic Music Mood Classification, A
* Predicting Social Emotions from Readers' Perspective
* Probabilistic Multigraph Modeling for Improving the Quality of Crowdsourced Affective Data
* Profiling Players Using Real-World Datasets: Clustering the Data and Correlating the Results with the Big-Five Personality Traits
* Studying the Scope of Negation for Spanish Sentiment Analysis on Twitter
* Toward Automating Oral Presentation Scoring During Principal Certification Program Using Audio-Video Low-Level Behavior Profiles
* Transfer Linear Subspace Learning for Cross-Corpus Speech Emotion Recognition
* Visual Biofeedback and Game Adaptation in Relaxation Skill Transfer
42 for AffCom(10)

AffCom(11) * Affective Recognition in Dynamic and Interactive Virtual Environments
* Analysis and Classification of Cold Speech Using Variational Mode Decomposition
* Approaches to Automated Detection of Cyberbullying: A Survey
* Automatic ECG-Based Emotion Recognition in Music Listening
* Automatic Estimation of Taste Liking Through Facial Expression Dynamics
* Automatic Recognition of Children Engagement from Facial Video Using Convolutional Neural Networks
* Case-Based Reasoning Model for Depression Based on Three-Electrode EEG Data, A
* Classifying Affective Haptic Stimuli through Gender-Specific Heart Rate Variability Nonlinear Analysis
* Co-Clustering to Reveal Salient Facial Features for Expression Recognition
* Computational Analyses of Thin-Sliced Behavior Segments in Session-Level Affect Perception
* Computational Study of Primitive Emotional Contagion in Dyadic Interactions
* Context Sensitivity of EEG-Based Workload Classification Under Different Affective Valence
* Continuous, Real-Time Emotion Annotation: A Novel Joystick-Based Analysis Framework
* Deep Physiological Affect Network for the Recognition of Human Emotions
* Defining Laughter Context for Laughter Synthesis with Spontaneous Speech Corpus
* Detecting Unipolar and Bipolar Depressive Disorders from Elicited Speech Responses Using Latent Affective Structure Model
* Dimensional Affect Recognition from HRV: An Approach Based on Supervised SOM and ELM
* Discrete Probability Distribution Prediction of Image Emotions with Shared Sparse Learning
* EEG Emotion Recognition Using Dynamical Graph Convolutional Neural Networks
* Emotion Recognition Based on High-Resolution EEG Recordings and Reconstructed Brain Sources
* Emotion Recognition in Simulated Social Interactions
* Emotion Recognition on Twitter: Comparative Study and Training a Unison Model
* Exploring Domain Knowledge for Facial Expression-Assisted Action Unit Activation Recognition
* Facial Expression Recognition with Neighborhood-Aware Edge Directional Pattern (NEDP)
* Feature Learning from Spectrograms for Assessment of Personality Traits
* Feature Selection Based Transfer Subspace Learning for Speech Emotion Recognition
* Film Mood and Its Quantitative Determinants in Different Types of Scenes
* Gaming Away Stress: Using Biofeedback Games to Learn Paced Breathing
* Generalized Two-Stage Rank Regression Framework for Depression Score Prediction from Speech
* Haptic Expression and Perception of Spontaneous Stress
* HED-ID: An Affective Adaptation Model Explaining the Intensity-Duration Relationship of Emotion
* Human Observer and Automatic Assessment of Movement Related Self-Efficacy in Chronic Pain: From Exercise to Functional Activity
* Idiom-Based Features in Sentiment Analysis: Cutting the Gordian Knot
* Intensional Learning to Efficiently Build Up Automatically Annotated Emotion Corpora
* Investigation of Partition-Based and Phonetically-Aware Acoustic Features for Continuous Emotion Prediction from Speech, An
* Look! It's Moving! Is It Alive? How Movement Affects Humans' Affinity of Living and Non-Living Entities
* Low-Level Characterization of Expressive Head Motion Through Frequency Domain Analysis
* Mutual Information Based Adaptive Windowing of Informative EEG for Emotion Recognition, A
* Novel Audio Features for Music Emotion Recognition
* Novel Technique to Develop Cognitive Models for Ambiguous Image Identification Using Eye Tracker, A
* Objectivity and Subjectivity in Aesthetic Quality Assessment of Digital Photographs
* Personalised, Multi-Modal, Affective State Detection for Hybrid Brain-Computer Music Interfacing
* Personalized Multitask Learning for Predicting Tomorrow's Mood, Stress, and Health
* Physiological Detection of Affective States in Children with Autism Spectrum Disorder
* Pipelined Neural Networks for Phrase-Level Sentiment Intensity Prediction
* Predicting Personality from Book Preferences with User-Generated Content Labels
* Realistic Transformation of Facial and Vocal Smiles in Real-Time Audiovisual Streams
* Segment-Based Methods for Facial Attribute Detection from Partial Faces
* Singing Robots: How Embodiment Affects Emotional Responses to Non-Linguistic Utterances
* Social Signal Detection by Probabilistic Sampling DNN Training
* Speech Synthesis for the Generation of Artificial Personality
* Strategies to Utilize the Positive Emotional Contagion Optimally in Crowd Evacuation
* Toward Constructing a Real-time Social Anxiety Evaluation System: Exploring Effective Heart Rate Features
* Tracking Dynamics of Opinion Behaviors with a Content-Based Sequential Opinion Influence Model
* Unobtrusive Inference of Affective States in Virtual Rehabilitation from Upper Limb Motions: A Feasibility Study
* Unsupervised Adaptation of a Person-Specific Manifold of Facial Expressions
* Using Temporal Features of Observers' Physiological Measures to Distinguish Between Genuine and Fake Smiles
* Visually Interpretable Representation Learning for Depression Recognition from Facial Images
58 for AffCom(11)

AffCom(12) * Adapting Software with Affective Computing: A Systematic Review
* AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups
* Applying Probabilistic Programming to Affective Computing
* Architecture for Emotional Facial Expressions as Social Signals, An
* Autoencoder for Semisupervised Multiple Emotion Detection of Conversation Transcripts
* Automatic Detection of Mind Wandering from Video in the Lab and in the Classroom
* Automatic Recognition of Facial Displays of Unfelt Emotions
* Bi-Hemisphere Domain Adversarial Neural Network Model for EEG Emotion Recognition, A
* Bio-Inspired Deep Attribute Learning Towards Facial Aesthetic Prediction
* Biofeedback Arrests Sympathetic and Behavioral Effects in Distracted Driving
* Brain Dynamics During Arousal-Dependent Pleasant/Unpleasant Visual Elicitation: An Electroencephalographic Study on the Circumplex Model of Affect
* Brain's Night Symphony (BraiNSy): A Methodology for EEG Sonification
* Capturing Emotion Distribution for Multimedia Emotion Tagging
* Compensation Techniques for Speaker Variability in Continuous Emotion Prediction
* Computational Model of Focused Attention Meditation and Its Transfer to a Sustained Attention Task, A
* Computer Vision Analysis for Quantification of Autism Risk Behaviors
* Cross-Cultural and Cultural-Specific Production and Perception of Facial Expressions of Emotion in the Wild
* Decoding the Perception of Sincerity in Dialogues
* Deep Facial Action Unit Recognition and Intensity Estimation from Partially Labelled Data
* Deep Learning for Human Affect Recognition: Insights and New Developments
* Deep Learning for Spatio-Temporal Modeling of Dynamic Spontaneous Emotions
* Designing an Experience Sampling Method for Smartphone Based Emotion Detection
* Early Detection of User Engagement Breakdown in Spontaneous Human-Humanoid Interaction
* EEG-Based Brain Computer Interface for Emotion Recognition and Its Application in Patients with Disorder of Consciousness, An
* Effects of Computerized Emotional Training on Children with High Functioning Autism
* Embodied Robot Models for Interdisciplinary Emotion Research
* EmoBed: Strengthening Monomodal Emotion Recognition via Training with Crossmodal Emotion Embeddings
* Empirical Evidence Relating EEG Signal Duration to Emotion Classification Performance
* Exploiting Multi-CNN Features in CNN-RNN Based Dimensional Emotion Recognition on the OMG in-the-Wild Dataset
* Exploring Macroscopic and Microscopic Fluctuations of Elicited Facial Expressions for Mood Disorder Classification
* Facial Expression Recognition with Identity and Emotion Joint Learning
* Faded Smiles? A Largescale Observational Study of Smiling from Adolescence to Old Age
* Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity
* Feature Pooling of Modulation Spectrum Features for Improved Speech Emotion Recognition in the Wild
* First Impressions Count! The Role of the Human's Emotional State on Rapport Established with an Empathic versus Neutral Virtual Therapist
* Hybrid Cognitive Architecture with Primal Affect and Physiology, A
* Improving Attention Model Based on Cognition Grounded Data for Sentiment Analysis
* Improving Cross-Corpus Speech Emotion Recognition with Adversarial Discriminative Domain Generalization (ADDoG)
* Induction and Profiling of Strong Multi-Componential Emotions in Virtual Reality
* Induction of Emotional States in Educational Video Games Through a Fuzzy Control System
* Integrating Deep and Shallow Models for Multi-Modal Depression Analysis: Hybrid Architectures
* Inter-Brain EEG Feature Extraction and Analysis for Continuous Implicit Emotion Tagging During Video Watching
* Introduction to the Special Issue On Computational Modelling of Emotion
* Jointly Aligning and Predicting Continuous Emotion Annotations
* Layered-Modeling of Affective and Sensory Experiences using Structural Equation Modeling: Touch Experiences of Plastic Surfaces as an Example
* Leveraging Affective Hashtags for Ranking Music Recommendations
* Longitudinal Observational Evidence of the Impact of Emotion Regulation Strategies on Affective Expression
* MatchNMingle Dataset: A Novel Multi-Sensor Resource for the Analysis of Social Interactions and Group Dynamics In-the-Wild During Free-Standing Conversations and Speed Dates, The
* Mathematical Description of Emotional Processes and Its Potential Applications to Affective Computing, A
* Modeling Emotion in Complex Stories: The Stanford Emotional Narratives Dataset
* Multi-Feature Based Network Revealing the Structural Abnormalities in Autism Spectrum Disorder
* Multi-Modal Pain Intensity Recognition Based on the SenseEmotion Database
* Multilevel Longitudinal Analysis of Shooting Performance as a Function of Stress and Cardiovascular Responses
* Multimodal Classification of Stressful Environments in Visually Impaired Mobility Using EEG and Peripheral Biosignals
* Neural Attentive Network for Cross-Domain Aspect-Level Sentiment Classification
* Novel Classification Strategy to Distinguish Five Levels of Pain Using the EEG Signal Features, A
* On Social Involvement in Mingling Scenarios: Detecting Associates of F-Formations in Still Images
* On the Effect of Observed Subject Biases in Apparent Personality Analysis From Audio-Visual Signals
* On the Influence of Affect in EEG-Based Subject Identification
* Ordinal Nature of Emotions: An Emerging Approach, The
* Over-Sampling Emotional Speech Data Based on Subjective Evaluations Provided by Multiple Individuals
* Partial Reinforcement in Game Biofeedback for Relaxation Training
* Personality Traits Classification Using Deep Visual Activity-Based Nonverbal Features of Key-Dynamic Images
* Predicting Emotionally Salient Regions Using Qualitative Agreement of Deep Neural Network Regressors
* Prediction of Car Design Perception Using EEG and Gaze Patterns
* Recognizing Induced Emotions of Movie Audiences from Multimodal Information
* Review on Nonlinear Methods Using Electroencephalographic Recordings for Emotion Recognition, A
* Scalable Off-the-Shelf Framework for Measuring Patterns of Attention in Young Children and Its Application in Autism Spectrum Disorder, A
* SchiNet: Automatic Estimation of Symptoms of Schizophrenia from Facial Behaviour Analysis
* Sentiment Polarity Classification at EVALITA: Lessons Learned and Open Challenges
* Sparse MDMO: Learning a Discriminative Feature for Micro-Expression Recognition
* Spatio-Temporal Encoder-Decoder Fully Convolutional Network for Video-Based Dimensional Emotion Recognition
* Special Issue on Automated Perception of Human Affect from Longitudinal Behavioral Data
* Speech-Driven Expressive Talking Lips with Conditional Sequential Generative Adversarial Networks
* Survey on Emotional Body Gesture Recognition
* Survey on Style in 3D Human Body Motion: Taxonomy, Data, Recognition and Its Applications
* Task Load Estimation from Multimodal Head-Worn Sensors Using Event Sequence Features
* Towards a Prediction and Data Driven Computational Process Model of Emotion
* Towards Analyzing and Predicting the Experience of Live Performances with Wearable Sensing
* Towards Disorder-Independent Automatic Assessment of Emotional Competence in Neurological Patients with a Classical Emotion Recognition System: Application in Foreign Accent Syndrome
* Towards Transparent Robot Learning Through TDRL-Based Emotional Expressions
* Using Circular Models to Improve Music Emotion Recognition
* Video Affective Content Analysis by Exploring Domain Knowledge
* Video-Based Depression Level Analysis by Encoding Deep Spatiotemporal Features
84 for AffCom(12)

AffCom(13) * Abnormal Attentional Bias of Non-Drug Reward in Abstinent Heroin Addicts: An ERP Study
* ACSEE: Antagonistic Crowd Simulation Model With Emotional Contagion and Evolutionary Game Theory
* Active Learning Paradigm for Online Audio-Visual Emotion Recognition, An
* Adapted Dynamic Memory Network for Emotion Recognition in Conversation
* Adapting the Interplay Between Personalized and Generalized Affect Recognition Based on an Unsupervised Neural Framework
* Affect Estimation in 3D Space Using Multi-Task Active Learning for Regression
* Affect in Multimedia: Benchmarking Violent Scenes Detection
* Affective Audio Annotation of Public Speeches with Convolutional Clustering Neural Network
* Affective Dynamics and Cognition During Game-Based Learning
* Affective Dynamics: Causality Modeling of Temporally Evolving Perceptual and Affective Responses
* Affective Dynamics: Principal Motion Analysis of Temporal Dominance of Sensations and Emotions Data
* Affective Impression: Sentiment-Awareness POI Suggestion via Embedding in Heterogeneous LBSNs
* Affective Video Content Analysis via Multimodal Deep Quality Embedding Network
* Affective Words and the Company They Keep: Studying the Accuracy of Affective Word Lists in Determining Sentence and Word Valence in a Domain-Specific Corpus
* All-in-One: Emotion, Sentiment and Intensity Prediction Using a Multi-Task Ensemble Framework
* Analyzing Group-Level Emotion with Global Alignment Kernel based Approach
* Are You Really Looking at Me? A Feature-Extraction Framework for Estimating Interpersonal Eye Gaze From Conventional Video
* Arousal Video Game AnnotatIoN (AGAIN) Dataset, The
* Aspect-Based Sentiment Analysis with New Target Representation and Dependency Attention
* Aspect-Based Sentiment Quantification
* Aspect-Opinion Correlation Aware and Knowledge-Expansion Few Shot Cross-Domain Sentiment Classification
* Automatic Detection of Reflective Thinking in Mathematical Problem Solving Based on Unconstrained Bodily Exploration
* Automatic Estimation of Self-Reported Pain by Trajectory Analysis in the Manifold of Fixed Rank Positive Semi-Definite Matrices
* Automatic Prediction of Group Cohesiveness in Images
* Automatic Recognition Methods Supporting Pain Assessment: A Survey
* Bayesian Deep Learning Framework for End-To-End Prediction of Emotion From Heartbeat, A
* Beyond Mobile Apps: A Survey of Technologies for Mental Well-Being
* BReG-NeXt: Facial Affect Computing Using Adaptive Residual Networks With Bounded Gradient
* Causal Narrative Comprehension: A New Perspective for Emotion Cause Extraction
* Change Matters! Measuring the Effect of Changing the Leader in Joint Music Performances, The
* Classification of Video Game Player Experience Using Consumer-Grade Electroencephalography
* Classifying Emotions and Engagement in Online Learning Based on a Single Facial Expression Recognition Neural Network
* Comprehensive and Context-Sensitive Neonatal Pain Assessment Using Computer Vision, A
* Conveying Emotions Through Device-Initiated Touch
* Deep Facial Expression Recognition: A Survey
* Deep Learning for Micro-Expression Recognition: A Survey
* Deep Multi-Task Multi-Label CNN for Effective Facial Attribute Classification
* Deep Multiscale Spatiotemporal Network for Assessing Depression From Facial Dynamics, A
* Deep Temporal Analysis for Non-Acted Body Affect Recognition
* Deeper Look at Facial Expression Dataset Bias, A
* DepecheMood++: A Bilingual Emotion Lexicon Built Through Simple Yet Powerful Techniques
* Depression Level Prediction Using Deep Spatiotemporal Features and Multilayer Bi-LTSM
* Development and Cross-Cultural Evaluation of a Scoring Algorithm for the Biometric Attachment Test: Overcoming the Challenges of Multimodal Fusion with Small Data
* Dimensional Affect Uncertainty Modelling for Apparent Personality Recognition
* Disentangling Identity and Pose for Facial Expression Recognition
* Doing and Feeling: Relationships Between Moods, Productivity and Task-Switching
* Dual-Branch Dynamic Graph Convolution Based Adaptive TransFormer Feature Fusion Network for EEG Emotion Recognition, A
* Dynamic Micro-Expression Recognition Using Knowledge Distillation
* Dynamics of Blink and Non-Blink Cyclicity for Affective Assessment: A Case Study for Stress Identification
* EEG-Based Emotion Recognition Using Regularized Graph Neural Networks
* EEG-Video Emotion-Based Summarization: Learning With EEG Auxiliary Signals
* Efficient Framework for Constructing Speech Emotion Corpus Based on Integrated Active Learning Strategies, An
* Efficient LSTM Network for Emotion Recognition From Multichannel EEG Signals, An
* EmoLabel: Semi-Automatic Methodology for Emotion Annotation of Social Media Text
* EmoSen: Generating Sentiment and Emotion Controlled Responses in a Multimodal Dialogue System
* Emotion Dependent Domain Adaptation for Speech Driven Affective Facial Feature Synthesis
* Emotion Prediction with Weighted Appraisal Models: Towards Validating a Psychological Theory of Affect
* Emotion Recognition and EEG Analysis Using ADMM-Based Sparse Group Lasso
* Emotional Conversation Generation Orientated Syntactically Constrained Bidirectional-Asynchronous Framework
* Emotions are the Great Captains of Our Lives: Measuring Moods Through the Power of Physiological and Environmental Sensing
* ENGAGE-DEM: A Model of Engagement of People With Dementia
* Enhancement of Movement Intention Detection Using EEG Signals Responsive to Emotional Music Stimulus
* ER-Chat: A Text-to-Text Open-Domain Dialogue Framework for Emotion Regulation
* Ethics and Good Practice in Computational Paralinguistics
* Evoking Physiological Synchrony and Empathy Using Social VR With Biofeedback
* Exploiting Evolutionary Algorithms to Model Nonverbal Reactions to Conversational Interruptions in User-Agent Interactions
* Exploring Individual Differences of Public Speaking Anxiety in Real-Life and Virtual Presentations
* Exploring Self-Attention Graph Pooling With EEG-Based Topological Structure and Soft Label for Depression Detection
* FaceEngage: Robust Estimation of Gameplay Engagement from User-Contributed (YouTube) Videos
* Facial Action Unit Detection Using Attention and Relation Learning
* Facial Depression Recognition by Deep Joint Label Distribution and Metric Learning
* Facial Expression Recognition Using a Temporal Ensemble of Multi-Level Convolutional Neural Networks
* Facial Expression Recognition with Active Local Shape Pattern and Learned-Size Block Representations
* Facial Expression Recognition With Deeply-Supervised Attention Network
* Facial Expression Translation Using Landmark Guided GANs
* Facial Expressions of Comprehension (FEC)
* First Impressions: A Survey on Vision-Based Apparent Personality Trait Analysis
* FLEPNet: Feature Level Ensemble Parallel Network for Facial Expression Recognition
* Framework to Model and Control the State of Presence in Virtual Reality Systems, A
* From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition
* Fusing of Electroencephalogram and Eye Movement With Group Sparse Canonical Correlation Analysis for Anxiety Detection
* GCB-Net: Graph Convolutional Broad Network and Its Application in Emotion Recognition
* Holistic Affect Recognition Using PaNDA: Paralinguistic Non-Metric Dimensional Analysis
* IAF-LG: An Interactive Attention Fusion Network With Local and Global Perspective for Aspect-Based Sentiment Analysis
* ICA-Evolution Based Data Augmentation with Ensemble Deep Neural Networks Using Time and Frequency Kernels for Emotion Recognition from EEG-Data
* Identifying Cortical Brain Directed Connectivity Networks From High-Density EEG for Emotion Recognition
* Immersion Measurement in Watching Videos Using Eye-tracking Data
* Improved Empirical Mode Decomposition of Electroencephalogram Signals for Depression Detection, An
* Improving the Performance of Sentiment Analysis Using Enhanced Preprocessing Technique and Artificial Neural Network
* Inconsistency-Based Multi-Task Cooperative Learning for Emotion Recognition
* Investigation of Speech Landmark Patterns for Depression Detection
* Issues and Challenges of Aspect-based Sentiment Analysis: A Comprehensive Survey
* Joint Feature Adaptation and Graph Adaptive Label Propagation for Cross-Subject Emotion Recognition From EEG Signals
* Joint Multi-Dimensional Model for Global and Time-Series Annotations
* Leaders and Followers Identified by Emotional Mimicry During Collaborative Learning: A Facial Expression Recognition Study on Emotional Valence
* Learning Continuous Facial Actions From Speech for Real-Time Animation
* Learning Pain from Action Unit Combinations: A Weakly Supervised Approach via Multiple Instance Learning
* Leveraging the Deep Learning Paradigm for Continuous Affect Estimation from Facial Expressions
* Leveraging the Dynamics of Non-Verbal Behaviors For Social Attitude Modeling
* Lexicon-Based Sentiment Convolutional Neural Networks for Online Review Analysis
* Measuring Temporal Distance Focus From Tweets and Investigating its Association With Psycho-Demographic Attributes
* MES-P: An Emotional Tonal Speech Dataset in Mandarin with Distal and Proximal Labels
* Micro and Macro Facial Expression Recognition Using Advanced Local Motion Patterns
* Modeling Feature Representations for Affective Speech Using Generative Adversarial Networks
* Modeling Real-World Affective and Communicative Nonverbal Vocalizations From Minimally Speaking Individuals
* Modeling Vocal Entrainment in Conversational Speech Using Deep Unsupervised Learning
* Modeling, Recognizing, and Explaining Apparent Personality From Videos
* Modulation of Driver's Emotional States by Manipulating In-Vehicle Environment: Validation With Biosignals Recorded in An Actual Car Environment
* Multi-Componential Approach to Emotion Recognition and the Effect of Personality, A
* Multi-Fusion Residual Memory Network for Multimodal Human Sentiment Comprehension
* Multi-Label and Multimodal Classifier for Affective States Recognition in Virtual Rehabilitation
* Multi-Label Multi-Task Deep Learning for Behavioral Coding
* Multi-Stage Graph Fusion Networks for Major Depressive Disorder Diagnosis
* Multi-Task Semi-Supervised Adversarial Autoencoding for Speech Emotion Recognition
* Multimodal Approach for Mania Level Prediction in Bipolar Disorder, A
* Multimodal Deception Detection Using Real-Life Trial Data
* Multimodal Non-Intrusive Stress Monitoring From the Pleasure-Arousal Emotional Dimensions, A
* Multimodal Self-Assessed Personality Estimation During Crowded Mingle Scenarios Using Wearables Devices and Cameras
* Multiple Instance Learning for Emotion Recognition Using Physiological Signals
* Multiview Facial Expression Recognition, A Survey
* Normative Emotional Agents: A Viewpoint Paper
* Novel Sentiment Polarity Detection Framework for Chinese, A
* Objective Class-Based Micro-Expression Recognition Under Partial Occlusion Via Region-Inspired Relation Reasoning Network
* On Emotions as Features for Speech Overlaps Classification
* On the Influence of Shot Scale on Film Mood and Narrative Engagement in Film Viewers
* On-the-Fly Facial Expression Prediction Using LSTM Encoded Appearance-Suppressed Dynamics
* PARSE: Pairwise Alignment of Representations in Semi-Supervised EEG Learning for Emotion Recognition
* Participatory Design of Affective Technology: Interfacing Biomusic and Autism
* PersEmoN: A Deep Network for Joint Analysis of Apparent Personality, Emotion and Their Relationship
* Persuasion-Induced Physiology as Predictor of Persuasion Effectiveness
* Phase Space Reconstruction Driven Spatio-Temporal Feature Learning for Dynamic Facial Expression Recognition
* Psychologically Inspired Fuzzy Cognitive Deep Learning Framework to Predict Crowd Behavior, A
* Psychophysiological Reactions to Persuasive Messages Deploying Persuasion Principles
* Quality-Aware Bag of Modulation Spectrum Features for Robust Speech Emotion Recognition
* Quantitative Personality Predictions From a Brief EEG Recording
* Rating Vs. Paired Comparison for the Judgment of Dominance on First Impressions
* Reading Personality Preferences From Motion Patterns in Computer Mouse Operations
* Recognition of Advertisement Emotions With Application to Computational Advertising
* Recognition of Multiple Anxiety Levels Based on Electroencephalograph, The
* Regression Guided by Relative Ranking Using Convolutional Neural Network (R^3CNN) for Facial Beauty Prediction
* Review on Psychological Stress Detection Using Biosignals
* RFAU: A Database for Facial Action Unit Analysis in Real Classrooms
* Robust Audiovisual Emotion Recognition: Aligning Modalities, Capturing Temporal Information, and Handling Missing Features
* Self-Supervised Approach for Facial Movement Based Optical Flow
* Self-Supervised ECG Representation Learning for Emotion Recognition
* Semantic-Rich Facial Emotional Expression Recognition
* Short and Long Range Relation Based Spatio-Temporal Transformer for Micro-Expression Recognition
* Spectral Representation of Behaviour Primitives for Depression Analysis
* Spontaneous Speech Emotion Recognition Using Multiscale Deep Convolutional LSTM
* State-Specific and Supraordinal Components of Facial Response to Pain
* Stimulus Sampling With 360-Videos: Examining Head Movements, Arousal, Presence, Simulator Sickness, and Preference on a Large Sample of Participants and Videos
* Study on Horse-Rider Interaction Based on Body Sensor Network in Competitive Equitation
* Toward Robust Stress Prediction in the Age of Wearables: Modeling Perceived Stress in a Longitudinal Study With Information Workers
* Towards Contrastive Context-Aware Conversational Emotion Recognition
* Training Socially Engaging Robots: Modeling Backchannel Behaviors with Batch Reinforcement Learning
* Two-Stage Fuzzy Fusion Based-Convolution Neural Network for Dynamic Emotion Recognition
* Unconstrained Facial Action Unit Detection via Latent Feature Domain
* Unraveling ML Models of Emotion With NOVA: Multi-Level Explainable AI for Non-Experts
* Unsupervised Learning in Reservoir Computing for EEG-Based Emotion Recognition
* Unsupervised Personalization of an Emotion Recognition System: The Unique Properties of the Externalization of Valence in Speech
* Utilizing Deep Learning Towards Multi-Modal Bio-Sensing and Vision-Based Affective Computing
* What an Ehm Leaks About You: Mapping Fillers into Personality Traits with Quantum Evolutionary Feature Selection Algorithms
* What Lies Beneath: A Survey of Affective Theory Use in Computational Models of Emotion
* What's Your Laughter Doing There? A Taxonomy of the Pragmatic Functions of Laughter
164 for AffCom(13)

AffCom(14) * 4DME: A Spontaneous 4D Micro-Expression Dataset With Multimodalities
* Acoustically Emotion-Aware Conversational Agent With Speech Emotion Recognition and Empathetic Responses, The
* Active Learning With Complementary Sampling for Instructing Class-Biased Multi-Label Text Emotion Classification
* Acute Stress State Classification Based on Electrodermal Activity Modeling
* Adversarial Training Based Speech Emotion Classifier With Isolated Gaussian Regularization, An
* AffectON: Incorporating Affect Into Dialog Generation
* Altered Brain Dynamics and Their Ability for Major Depression Detection Using EEG Microstates Analysis
* Analysis of Driver's Behavioral Tendency Under Different Emotional States Based on a Bayesian Network, The
* Applying Segment-Level Attention on Bi-Modal Transformer Encoder for Audio-Visual Emotion Recognition
* Artificial Emotional Intelligence in Socially Assistive Robots for Older Adults: A Pilot Study
* AT2GRU: A Human Emotion Recognition Model With Mitigated Device Heterogeneity
* Audio Features for Music Emotion Recognition: A Survey
* Audio-Visual Automatic Group Affect Analysis
* Audio-Visual Emotion Recognition With Preference Learning Based on Intended and Multi-Modal Perceived Labels
* Audio-Visual Gated-Sequenced Neural Networks for Affect Recognition
* Auditory Feedback of False Heart Rate for Video Game Experience Improvement
* Automated Autism Detection Based on Characterizing Observable Patterns From Photos
* Automated Classification of Dyadic Conversation Scenarios Using Autonomic Nervous System Responses
* Automatic Detection of Emotional Changes Induced by Social Support Loss Using fMRI
* Automatic Emotion Recognition for Groups: A Review
* Automatic Emotion Recognition in Clinical Scenario: A Systematic Review of Methods
* Automatic Estimation of Action Unit Intensities and Inference of Emotional Appraisals
* AutoML-Emo: Automatic Knowledge Selection Using Congruent Effect for Emotion Identification in Conversations
* Behavior-Based Ethical Understanding in Chinese Social News
* Behavioral and Physiological Signals-Based Deep Multimodal Approach for Mobile Emotion Recognition
* Beneath the Tip of the Iceberg: Current Challenges and New Directions in Sentiment Analysis Research
* Bias-Based Soft Label Learning for Facial Expression Recognition
* Biases of Pre-Trained Language Models: An Empirical Study on Prompt-Based Sentiment Analysis and Emotion Detection, The
* Boosting Facial Expression Recognition by A Semi-Supervised Progressive Teacher
* Brain-Computer Interface for Generating Personally Attractive Images
* Capturing Interaction Quality in Long Duration (Simulated) Space Missions With Wearables
* Challenges in Evaluating Technological Interventions for Affect Regulation
* Chunk-Level Speech Emotion Recognition: A General Framework of Sequence-to-One Dynamic Temporal Modeling
* Classification of Interbeat Interval Time-Series Using Attention Entropy
* Classifying Suicide-Related Content and Emotions on Twitter Using Graph Convolutional Neural Networks
* Cluster-Level Contrastive Learning for Emotion Recognition in Conversations
* Collecting Mementos: A Multimodal Dataset for Context-Sensitive Modeling of Affect and Memory Processing in Responses to Videos
* Compound Aspect Extraction by Augmentation and Constituency Lattice
* Computation of Sensory-Affective Relationships Depending on Material Categories of Pictorial Stimuli
* Computational Model of Coping and Decision Making in High-Stress, Uncertain Situations: An Application to Hurricane Evacuation Decisions, A
* Computer-Aided Autism Spectrum Disorder Diagnosis With Behavior Signal Processing
* Contradicted by the Brain: Predicting Individual and Group Preferences via Brain-Computer Interfacing
* Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition
* Cost-Sensitive Boosting Pruning Trees for Depression Detection on Twitter
* Counterfactual Representation Augmentation for Cross-Domain Sentiment Analysis
* CPED: A Chinese Positive Emotion Database for Emotion Elicitation and Analysis
* Cross-Domain Aspect-Based Sentiment Classification by Exploiting Domain-Invariant Semantic-Primary Feature
* Cross-Task and Cross-Participant Classification of Cognitive Load in an Emergency Simulation Game
* Data Augmentation via Face Morphing for Recognizing Intensities of Facial Emotions
* DBATES: Dataset for Discerning Benefits of Audio, Textual, and Facial Expression Features in Competitive Debate Speeches
* Deep Multi-Modal Network Based Automated Depression Severity Estimation
* Deep Multimodal Learning Approach to Perceive Basic Needs of Humans From Instagram Profile, A
* Deep Siamese Neural Networks for Facial Expression Recognition in the Wild
* Depression Recognition Using Remote Photoplethysmography From Facial Videos
* Detecting Dependency-Related Sentiment Features for Aspect-Level Sentiment Classification
* Detecting Mental Disorders in Social Media Through Emotional Patterns: The Case of Anorexia and Depression
* Detection and Identification of Choking Under Pressure in College Tennis Based Upon Physiological Parameters, Performance Patterns, and Game Statistics
* Discerning Affect From Touch and Gaze During Interaction With a Robot Pet
* Discriminative Few Shot Learning of Facial Dynamics in Interview Videos for Autism Trait Classification
* Distilling Region-Wise and Channel-Wise Deep Structural Facial Relationships for FAU (DSR-FAU) Intensity Estimation
* Does Visual Self-Supervision Improve Learning of Speech Representations for Emotion Recognition?
* Domain-Incremental Continual Learning for Mitigating Bias in Facial Expression and Action Unit Recognition
* Driver Emotion Recognition With a Hybrid Attentional Multimodal Fusion Framework
* Dual Attention and Element Recalibration Networks for Automatic Depression Level Prediction
* Dual Learning for Joint Facial Landmark Detection and Action Unit Recognition
* Dyadic Affect in Parent-Child Multimodal Interaction: Introducing the DAMI-P2C Dataset and its Preliminary Analysis
* Dynamic State-Space Modeling With Factorial Memories in Temporal Dominance of Sensations, Emotions and Temporal Liking
* E-Key: An EEG-Based Biometric Authentication and Driving Fatigue Detection System
* ECPEC: Emotion-Cause Pair Extraction in Conversations
* Editorial: Special Issue on Unobtrusive Physiological Measurement Methods for Affective Applications
* EEG Feature Selection via Global Redundancy Minimization for Emotion Recognition
* EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention
* EEG-Based Emotion Recognition via Neural Architecture Search
* EEG-Based Emotion Recognition With Emotion Localization via Hierarchical Self-Attention
* EEG-Based Emotional Video Classification via Learning Connectivity Structure
* EEG-Based Online Regulation of Difficulty in Simulated Flying
* EEG-Based Subject-Independent Emotion Recognition Using Gated Recurrent Unit and Minimum Class Confusion
* Effective 3D Text Recurrent Voting Generator for Metaverse, An
* Effective Connectivity Based EEG Revealing the Inhibitory Deficits for Distracting Stimuli in Major Depression Disorders
* Effects of Physiological Signals in Different Types of Multimodal Sentiment Estimation
* Embedding Refinement Framework for Targeted Aspect-Based Sentiment Analysis
* EmoNet: A Transfer Learning Framework for Multi-Corpus Speech Emotion Recognition
* Emotion Arousal Assessment Based on Multimodal Physiological Signals for Game Users
* Emotion Distribution Learning Based on Peripheral Physiological Signals
* Emotion Expression in Human Body Posture and Movement: A Survey on Intelligible Motion Factors, Quantification and Validation
* Emotion Intensity and its Control for Emotional Voice Conversion
* Emotion Recognition for Everyday Life Using Physiological Signals From Wearables: A Systematic Literature Review
* Emotion Recognition Method for Game Evaluation Based on Electroencephalogram, An
* Emotion-Regularized Conditional Variational Autoencoder for Emotional Response Generation
* Emotional Attention Detection and Correlation Exploration for Image Emotion Distribution Learning
* Emotional Contagion-Aware Deep Reinforcement Learning for Antagonistic Crowd Simulation
* Emotional Expressivity is a Reliable Signal of Surprise
* Empathetic Conversational Systems: A Review of Current Advances, Gaps, and Opportunities
* Enforcing Semantic Consistency for Cross Corpus Emotion Prediction Using Adversarial Discrepancy Learning in Emotion
* Enroll-to-Verify Approach for Cross-Task Unseen Emotion Class Recognition, An
* Estimating Affective Taste Experience Using Combined Implicit Behavioral and Neurophysiological Measures
* Estimating the Uncertainty in Emotion Class Labels With Utterance-Specific Dirichlet Priors
* Examining Emotion Perception Agreement in Live Music Performance
* Exploring Complexity of Facial Dynamics in Autism Spectrum Disorder
* Exploring the Contextual Factors Affecting Multimodal Emotion Recognition in Videos
* Exploring the Function of Expressions in Negotiation: The DyNego-WOZ Corpus
* Facial Expression Animation by Landmark Guided Residual Module
* Facial Expression Recognition in the Wild Using Multi-Level Features and Attention Mechanisms
* Facial Expression Recognition With Visual Transformers and Attentional Selective Fusion
* Facial Image-Based Automatic Assessment of Equine Pain
* FENP: A Database of Neonatal Facial Expression for Pain Analysis
* Few-Shot Learning in Emotion Recognition of Spontaneous Speech Using a Siamese Neural Network With Adaptive Sample Pair Formation
* Finding Needles in a Haystack: Recognizing Emotions Just From Your Heart
* Fine-Grained Domain Adaptation for Aspect Category Level Sentiment Analysis
* Frustration Recognition Using Spatio Temporal Data: A Novel Dataset and GCN Model to Recognize In-Vehicle Frustration
* GANSER: A Self-Supervised Data Augmentation Framework for EEG-Based Emotion Recognition
* Geometry-Aware Facial Expression Recognition via Attentive Graph Convolutional Networks
* GMSS: Graph-Based Multi-Task Self-Supervised Learning for EEG Emotion Recognition
* Graph-Based Facial Affect Analysis: A Review
* Group Synchrony for Emotion Recognition Using Physiological Signals
* Guest Editorial Neurosymbolic AI for Sentiment Analysis
* Guest Editorial: Special Issue on Affective Speech and Language Synthesis, Generation, and Conversion
* Heterogeneous Reinforcement Learning Network for Aspect-Based Sentiment Classification With External Knowledge
* Hidden Bawls, Whispers, and Yelps: Can Text Convey the Sound of Speech, Beyond Words?
* Hierarchical Interactive Multimodal Transformer for Aspect-Based Multimodal Sentiment Analysis
* Hierarchical Multiscale Recurrent Neural Networks for Detecting Suicide Notes
* High-Quality Landmarked Infrared Eye Video Dataset (IREye4Task): Eye Behaviors, Insights and Benchmarks for Wearable Mental State Analysis, A
* Human Emotion Recognition With Relational Region-Level Analysis
* Hybrid Contrastive Learning of Tri-Modal Representation for Multimodal Sentiment Analysis
* Hybrid Regularizations for Multi-Aspect Category Sentiment Analysis
* I Enjoy Writing and Playing, Do You?: A Personalized and Emotion Grounded Dialogue Agent Using Generative Adversarial Network
* Impact of Facial Landmark Localization on Facial Expression Recognition
* Improving Humanness of Virtual Agents and Users' Cooperation Through Emotions
* Improving Textual Emotion Recognition Based on Intra- and Inter-Class Variations
* Incorporating Forthcoming Events and Personality Traits in Social Media Based Stress Prediction
* Indirect Identification of Perinatal Psychosocial Risks From Natural Language
* Individual and Joint Body Movement Assessed by Wearable Sensing as a Predictor of Attraction in Speed Dates
* Interaction of Cognitive and Affective Load Within a Virtual City
* Interpretation of Depression Detection Models via Feature Selection Methods
* Investigating Multisensory Integration in Emotion Recognition Through Bio-Inspired Computational Models
* Joint Local-Global Discriminative Subspace Transfer Learning for Facial Expression Recognition
* Learnable Hierarchical Label Embedding and Grouping for Visual Intention Understanding
* Learning Enhanced Acoustic Latent Representation for Small Scale Affective Corpus with Adversarial Cross Corpora Integration
* Learning Person-Specific Cognition From Facial Reactions for Automatic Personality Recognition
* Learning to Learn Better Unimodal Representations via Adaptive Multimodal Meta-Learning
* Learning Transferable Sparse Representations for Cross-Corpus Facial Expression Recognition
* Learning Users Inner Thoughts and Emotion Changes for Social Media Based Suicide Risk Detection
* Local Temporal Pattern and Data Augmentation for Spotting Micro-Expressions
* Long Short-Term Memory Network Based Unobtrusive Workload Monitoring With Consumer Grade Smartwatches
* Looking at the Body: Automatic Analysis of Body Gestures and Self-Adaptors in Psychological Distress
* LQGDNet: A Local Quaternion and Global Deep Network for Facial Depression Recognition
* MDN: A Deep Maximization-Differentiation Network for Spatio-Temporal Depression Detection
* Media-Guided Attentive Graphical Network for Personality Recognition Using Physiology, A
* Mediating Effect of Emotions on Trust in the Context of Automated System Usage, The
* MERASTC: Micro-Expression Recognition Using Effective Feature Encodings and 2D Convolutional Neural Network
* Meta Auxiliary Learning for Facial Action Unit Detection
* Meta-Based Self-Training and Re-Weighting for Aspect-Based Sentiment Analysis
* Methodology to Assess Quality, Presence, Empathy, Attitude, and Attention in 360-degree Videos for Immersive Communications
* MIA-Net: Multi-Modal Interactive Attention Network for Multi-Modal Affective Analysis
* MMPosE: Movie-Induced Multi-Label Positive Emotion Classification Through EEG Signals
* Modeling Multiple Temporal Scales of Full-Body Movements for Emotion Classification
* Modelling Stochastic Context of Audio-Visual Expressive Behaviour With Affective Processes
* Morality Classification in Natural Language Text
* Multi-Label Emotion Detection via Emotion-Specified Feature Extraction and Emotion Correlation Learning
* Multi-Modal Sarcasm Detection and Humor Classification in Code-Mixed Conversations
* Multi-Modal Stacked Ensemble Model for Bipolar Disorder Classification, A
* Multi-Order Networks for Action Unit Detection
* Multi-Target Positive Emotion Recognition From EEG Signals
* Multidimensional Culturally Adapted Representation of Emotions for Affective Computational Simulation and Recognition, A
* Multimodal Affective States Recognition Based on Multiscale CNNs and Biologically Inspired Decision Fusion Model
* Multimodal Emotion-Cause Pair Extraction in Conversations
* Multimodal Engagement Analysis From Facial Videos in the Classroom
* Multimodal Hierarchical Attention Neural Network: Looking for Candidates Behaviour Which Impact Recruiter's Decision
* Multimodal Sentiment Analysis Based on Attentional Temporal Convolutional Network and Multi-Layer Feature Fusion
* Multimodal Sentiment Analysis in Car Reviews (MuSe-CaR) Dataset: Collection, Insights and Improvements, The
* Multimodal Spatiotemporal Representation for Automatic Depression Level Detection
* Multitask Learning From Augmented Auxiliary Data for Improving Speech Emotion Recognition
* Mutual Information Based Fusion Model (MIBFM): Mild Depression Recognition Using EEG and Pupil Area Signals
* Neural Predictive Model of Negative Emotions for COVID-19, A
* Neurofeedback Training With an Electroencephalogram-Based Brain-Computer Interface Enhances Emotion Regulation
* Non-Invasive Measurement of Trust in Group Interactions
* Novel Computational Linguistic Measures, Dialogue System and the Development of SOPHIE: Standardized Online Patient for Healthcare Interaction Education
* Novel Markovian Framework for Integrating Absolute and Relative Ordinal Emotion Information, A
* Ordinal Logistic Regression With Partial Proportional Odds for Depression Prediction
* Overview of Facial Micro-Expression Analysis: Data, Methodology and Challenge, An
* Pars-OFF: A Benchmark for Offensive Language Detection on Farsi Social Media
* Perceived Conversation Quality in Spontaneous Interactions
* PerceptSent: Exploring Subjectivity in a Novel Dataset for Visual Sentiment Analysis
* Personal-Zscore: Eliminating Individual Difference for EEG-Based Cross-Subject Emotion Recognition
* Personality Trait Recognition Based on Smartphone Typing Characteristics in the Wild
* PIDViT: Pose-Invariant Distilled Vision Transformer for Facial Expression Recognition in the Wild
* Pixels and Sounds of Emotion: General-Purpose Representations of Arousal in Games, The
* Prediction of Depression Severity Based on the Prosodic and Semantic Features With Bidirectional LSTM and Time Distributed CNN
* Probabilistic Attribute Tree Structured Convolutional Neural Networks for Facial Expression Recognition in the Wild
* Quantifying Emotional Similarity in Speech
* Re-Analysis and Synthesis of Data on Affect Dynamics in Learning, A
* Receiving a Mediated Touch From Your Partner vs. a Male Stranger: How Visual Feedback of Touch and Its Sender Influence Touch Experience
* Recognizing, Fast and Slow: Complex Emotion Recognition With Facial Expression Detection and Remote Physiological Measurement
* Region Attentive Action Unit Intensity Estimation With Uncertainty Weighted Multi-Task Learning
* Region Group Adaptive Attention Model For Subtle Expression Recognition, A
* Reinforcement Learning Based Two-Stage Model for Emotion Cause Pair Extraction, A
* Review of Affective Computing Research Based on Function-Component-Representation Framework, A
* Rhythm of Flow: Detecting Facial Expressions of Flow Experiences Using CNNs, The
* Self Supervised Adversarial Domain Adaptation for Cross-Corpus and Cross-Language Speech Emotion Recognition
* Self-Supervised Learning of Person-Specific Facial Dynamics for Automatic Personality Recognition
* Semi-Structural Interview-Based Chinese Multimodal Depression Corpus Towards Automatic Preliminary Screening of Depressive Disorders
* Sentiment- Emotion- and Context-Guided Knowledge Selection Framework for Emotion Recognition in Conversations
* Shared-Private Memory Networks For Multimodal Sentiment Analysis
* Shyness Trait Recognition for Schoolchildren via Multi-View Features of Online Writing
* Silicon Coppélia and the Formalization of the Affective Process
* Simple But Powerful, a Language-Supervised Method for Image Emotion Classification
* Smart Affect Monitoring With Wearables in the Wild: An Unobtrusive Mood-Aware Emotion Recognition System
* SMIN: Semi-Supervised Multi-Modal Interaction Network for Conversational Emotion Recognition
* Social Image-Text Sentiment Classification With Cross-Modal Consistency and Knowledge Distillation
* SocialInteractionGAN: Multi-Person Interaction Sequence Generation
* SparseDGCNN: Recognizing Emotion From Multichannel EEG Signals
* Spatial-Temporal Graphs Plus Transformers for Geometry-Guided Facial Expression Recognition
* Speech Synthesis With Mixed Emotions
* Spontaneous Driver Emotion Facial Expression (DEFE) Dataset for Intelligent Vehicles: Emotions Triggered by Video-Audio Clips in Driving Scenarios, A
* STCAM: Spatial-Temporal and Channel Attention Module for Dynamic Facial Expression Recognition
* Stress Detection During Motor Activity: Comparing Neurophysiological Indices in Older Adults
* Survey of Deep Representation Learning for Speech Emotion Recognition
* Survey of Textual Emotion Recognition and Its Challenges, A
* Survey on Emotion Sensing Using Mobile Devices
* Teardrops on My Face: Automatic Weeping Detection From Nonverbal Behavior
* TensorFormer: A Tensor-Based Multimodal Transformer for Multimodal Sentiment Analysis and Depression Detection
* THIN: THrowable Information Networks and Application for Facial Expression Recognition in the Wild
* To Be or Not to Be in Flow at Work: Physiological Classification of Flow Using Machine Learning
* Touching Virtual Humans: Haptic Responses Reveal the Emotional Impact of Affective Agents
* Toward Automated Classroom Observation: Multimodal Machine Learning to Estimate CLASS Positive Climate and Negative Climate
* Towards Participant-Independent Stress Detection Using Instrumented Peripherals
* Transfer Learning Approach to Heatmap Regression for Action Unit Intensity Estimation, A
* TSception: Capturing Temporal Dynamics and Spatial Asymmetry From EEG for Emotion Recognition
* TSSRD: A Topic Sentiment Summarization Framework Based on Reaching Definition
* Two Birds With One Stone: Knowledge-Embedded Temporal Convolutional Transformer for Depression Detection and Emotion Recognition
* Typical Facial Expression Network Using a Facial Feature Decoupler and Spatial-Temporal Learning
* UBFC-Phys: A Multimodal Database For Psychophysiological Studies of Social Stress
* Unsupervised Cross-Corpus Speech Emotion Recognition Using a Multi-Source Cycle-GAN
* Unsupervised Cross-View Facial Expression Image Generation and Recognition
* Use of Affective Visual Information for Summarization of Human-Centric Videos
* User State Modeling Based on the Arousal-Valence Plane: Applications in Customer Satisfaction and Health-Care
* Using Affect as a Communication Modality to Improve Human-Robot Communication in Robot-Assisted Search and Rescue Scenarios
* Using Subgroup Discovery to Relate Odor Pleasantness and Intensity to Peripheral Nervous System Reactions
* Variational Instance-Adaptive Graph for EEG Emotion Recognition
* Virtual Reality for Emotion Elicitation: A Review
* Vision Transformer With Attentive Pooling for Robust Facial Expression Recognition
* Weakly-Supervised Learning for Fine-Grained Emotion Recognition Using Physiological Signals
* Werewolf-XL: A Database for Identifying Spontaneous Affect in Large Competitive Group Interactions
* When and Why Static Images Are More Effective Than Videos
* When is a Haptic Message Like an Inside Joke? Digitally Mediated Emotive Communication Builds on Shared History
* WiFE: WiFi and Vision Based Unobtrusive Emotion Recognition via Gesture and Facial Expression
* You're Not You When You're Angry: Robust Emotion Features Emerge by Recognizing Speakers
247 for AffCom(14)

AffCom(15) * ARAUS: A Large-Scale Dataset and Baseline Models of Affective Responses to Augmented Urban Soundscapes
* Automatic Deceit Detection Through Multimodal Analysis of High-Stake Court-Trials
* Context-Aware Dynamic Word Embeddings for Aspect Term Extraction
* Cross-Day Data Diversity Improves Inter-Individual Emotion Commonality of Spatio-Spectral EEG Signatures Using Independent Component Analysis
* Crowdsourcing Affective Annotations Via fNIRS-BCI
* Data Leakage and Evaluation Issues in Micro-Expression Analysis
* (E)Affective Bind: Situated Affectivity and the Prospect of Affect Recognition, An
* Efficient Multimodal Transformer With Dual-Level Feature Restoration for Robust Multimodal Sentiment Analysis
* Empirical Validation of an Agent-Based Model of Emotion Contagion
* Ethical Considerations and Checklist for Affective Research With Wearables
* Ethics of AI in Games, The
* Facial Expression Recognition in Classrooms: Ethical Considerations and Proposed Guidelines for Affect Detection in Educational Settings
* GA2MIF: Graph and Attention Based Two-Stage Multi-Source Information Fusion for Conversational Emotion Detection
* Guest Editorial: Ethics in Affective Computing
* LGSNet: A Two-Stream Network for Micro- and Macro-Expression Spotting With Background Modeling
* Measuring and Fostering Diversity in Affective Computing Research
* Modeling Uncertainty for Low-Resolution Facial Expression Recognition
* Opacity, Transparency, and the Ethics of Affective Computing
* Pose-Aware Facial Expression Recognition Assisted by Expression Descriptions
* Prompt Consistency for Multi-Label Textual Emotion Detection
* Quantum Probability Driven Framework for Joint Multi-Modal Sarcasm, Sentiment and Emotion Analysis, A
* Review of Tools and Methods for Detection, Analysis, and Prediction of Allostatic Load Due to Workplace Stress, A
* Role of Preprocessing for Word Representation Learning in Affective Tasks, The
* Spatio-Temporal Graph Analytics on Secondary Affect Data for Improving Trustworthy Emotional AI
* Transformer-Based Self-Supervised Multimodal Representation Learning for Wearable Emotion Recognition
* Unconstrained Facial Expression Recognition With No-Reference De-Elements Learning
* WavDepressionNet: Automatic Depression Level Prediction via Raw Speech Signals
27 for AffCom(15)

AffCom(2) * Aspect-Based Opinion Polling from Customer Reviews
* Automatic Recognition of Boredom in Video Games Using Novel Biosignal Moment-Based Features
* Constraint-Based Model for Synthesis of Multimodal Sequential Expressions of Emotions
* Continuous Prediction of Spontaneous Affect from Multiple Cues and Modalities in Valence-Arousal Space
* Dynamic Cascades with Bidirectional Bootstrapping for Action Unit Detection in Spontaneous Facial Behavior
* Emotion Recognition of Affective Speech Based on Multiple Classifiers Using Acoustic-Prosodic Information and Semantic Labels
* Ensemble Method for Classifying Startle Eyeblink Modulation from High-Speed Video Records, An
* Experience-Driven Procedural Content Generation
* Exploring Fusion Methods for Multimodal Emotion Recognition with Missing Data
* Facial Expression Recognition Using Facial Movement Features
* Finding Mutual Benefit between Subjectivity Analysis and Information Extraction
* Interdependencies among Voice Source Parameters in Emotional Speech
* Introduction to the Affect-Based Human Behavior Understanding Special Issue
* Real-Time Recognition of Affective States from Nonverbal Features of Speech and Its Application for Public Speaking Skill Analysis
* Recognizing Affect from Linguistic Information in 3D Continuous Space
* Role of Visual Complexity in Affective Reactions to Webpages: Subjective, Eye Movement, and Cardiovascular Responses, The
* SentiFul: A Lexicon for Sentiment Analysis
* Thermal Analysis of Facial Muscles Contractions
* Toward a Minimal Representation of Affective Gestures
19 for AffCom(2)

AffCom(3) * Affect Dilemma for Artificial Agents: Should We Develop Affective Artificial Agents?, The
* Affective Computing in Consumer Electronics
* Affective Learning: Empathetic Agents with Emotional Facial and Tone of Voice Expressions
* Are Emotional Robots Deceptive?
* Automatic Personality Perception: Prediction of Trait Attribution Based on Prosodic Features
* Belfast Induced Natural Emotion Database, The
* Bridging the Gap between Social Animal and Unsocial Machine: A Survey of Social Signal Processing
* Building and Exploiting EmotiNet, a Knowledge Base for Emotion Detection Based on the Appraisal Theory Model
* Building Autonomous Sensitive Artificial Listeners
* Co-Adaptive and Affective Human-Machine Interface for Improving Training Performances of Virtual Myoelectric Forearm Prosthesis
* Conative Dimensions of Machine Ethics: A Defense of Duty
* Connecting Meeting Behavior with Extraversion: A Systematic Study
* Context-Sensitive Learning for Enhanced Audiovisual Emotion Classification
* Crowdsourcing Facial Responses to Online Videos
* DEAP: A Database for Emotion Analysis Using Physiological Signals
* Detecting Naturalistic Expressions of Nonbasic Affect Using Physiological Signals
* ECG Pattern Analysis for Emotion Detection
* Editorial for the Special Section on Ethics and Affective Computing
* Effects of an Interactive Software Agent on Student Affective Dynamics while Using an Intelligent Tutoring System, The
* Evaluation of Four Designed Virtual Agent Personalities
* Exploring Temporal Patterns in Classifying Frustrated and Delighted Smiles
* Feelings Elicited by Auditory Feedback from a Computationally Augmented Artifact: The Flops
* Galvanic Intrabody Communication for Affective Acquiring and Computing
* Generation of Personalized Ontology Based on Consumer Emotion and Behavior Analysis
* Good Our Field Can Hope to Do, the Harm It Should Avoid, The
* Guest Editorial: Special Section on Naturalistic Affect Resources for System Building and Evaluation
* Identifying Emotion through Implicit and Explicit Measures: Cultural Differences, Cognitive Load, and Immersion
* Interpersonal Synchrony: A Survey of Evaluation Methods across Disciplines
* Modeling the Temporal Evolution of Acoustic Parameters for Speech Emotion Recognition
* Multi-Aspect Rating Inference with Aspect-Based Segmentation
* Multimodal Database for Affect Recognition and Implicit Tagging, A
* Multimodal Emotion Recognition in Response to Videos
* New Approach to Modeling Emotions and Their Use on a Decision-Making System for Artificial Agents, A
* Perinasal Imaging of Physiological Stress and Its Affective Potential
* Physiological-Based Affect Event Detector for Entertainment Video Applications
* Quantitative Study of Individual Emotional States in Social Networks
* Red-Pill Robots Only, Please
* Representative Segment-Based Emotion Analysis and Classification with Automatic Respiration Signal Segmentation
* Robots, Love, and Sex: The Ethics of Building a Love Machine
* Role of Nonlinear Dynamics in Affective Valence and Arousal Recognition, The
* SEMAINE Database: Annotated Multimodal Records of Emotionally Colored Conversations between a Person and a Limited Agent, The
* Toward E-Motion-Based Music Retrieval: A Study of Affective Gesture Recognition
* Voice of Leadership: Models and Performances of Automatic Analysis in Online Speeches, The
43 for AffCom(3)

AffCom(4) * Affect and Social Processes in Online Communication: Experiments with an Affective Dialog System
* Affective Assessment by Digital Processing of the Pupil Diameter
* Affective Body Expression Perception and Recognition: A Survey
* Analyses of a Multimodal Spontaneous Facial Expression Database
* Anchor Models for Emotion Recognition from Speech
* Body Movements for Affective Expression: A Survey of Automatic Recognition and Generation
* Challenges in Computational Modeling of Affective Processes
* Classifier-Based Learning of Nonlinear Feature Manifold for Visualization of Emotional Speech Prosody
* Component-Based Recognition of Faces and Facial Expressions
* Computational Modeling of Emotion: Toward Improving the Inter- and Intradisciplinary Exchange
* Data-Free Prior Model for Facial Action Unit Recognition
* Detecting Depression Severity from Vocal Prosody
* Directing Physiology and Mood through Music: Validation of an Affective Music Player
* DISFA: A Spontaneous Facial Action Intensity Database
* EEG-Based Classification of Music Appraisal Responses Using Time-Frequency Analysis and Familiarity Ratings
* Emotional Responses to Victory and Defeat as a Function of Opponent
* Exploring Cross-Modality Affective Reactions for Audiovisual Emotion Recognition
* Expressive Virtual Audience with Flexible Behavioral Styles, An
* Facial Expression Recognition in the Encrypted Domain Based on Local Fisher Discriminant Analysis
* Facial Expression Recognition Influenced by Human Aging
* HED: A Computational Model of Affective Adaptation and Emotion Dynamics
* How Was Your Day? Evaluating a Conversational Companion
* Introducing Emotions to the Modeling of Intra- and Inter-Personal Influences in Parent-Adolescent Conversations
* Iterative Feature Normalization Scheme for Automatic Emotion Detection from Speech
* Modeling Arousal Phases in Daily Living Using Wearable Sensors
* Nonlinear Appraisal Modeling: An Application of Machine Learning to the Study of Emotion Production
* Porting Multilingual Subjectivity Resources across Languages
* Positive Affective Interactions: The Role of Repeated Exposure and Copresence
* Predicting Emotional Responses to Long Informal Text
* Predicting User-Topic Opinions in Twitter with Social and Topical Context
* Projection into Expression Subspaces for Face Recognition from Single Sample per Person
* Role of Affect Analysis in Dialogue Act Identification, The
* Seeing Stars of Valence and Arousal in Blog Posts
* Sentiment Word Relations with Affect, Judgment, and Appreciation
* Synesketch: An Open Source Library for Sentence-Based Emotion Recognition
* Unimodal and Multimodal Human Perception of Naturalistic Non-Basic Affective States During Human-Computer Interactions
* Using a Smartphone to Measure Heart Rate Changes during Relived Happiness and Anger
37 for AffCom(4)

AffCom(5) * Affect and Engagement in Game-Based Learning Environments
* Affect and Wellbeing: Introduction to Special Section
* Affective and Content Analysis of Online Depression Communities
* Affective Visual Perception Using Machine Pareidolia of Facial Expressions
* Ambulatory Assessment of Affect: Survey of Sensor Systems for Monitoring of Autonomic Nervous Systems Activation in Emotion
* Are They Different? Affect, Feeling, Emotion, Sentiment, and Opinion Detection in Text
* Automatic Detection of Nonverbal Behavior Predicts Learning in Dyadic Interactions
* Automatic Framework for Textured 3D Video-Based Facial Expression Recognition, An
* Clustering Affective Qualities of Classical Music: Beyond the Valence-Arousal Plane
* CREMA-D: Crowd-Sourced Emotional Multimodal Actors Dataset
* Current Directions in Personality Science and the Potential for Advances through Computing
* Detection of Psychological Stress Using a Hyperspectral Imaging Technique
* Distributing Recognition in Computational Paralinguistics
* Do Prospect-Based Emotions Enhance Believability of Game Characters? A Case Study in the Context of a Dice Game
* Don't Classify Ratings of Affect; Rank Them!
* Emotion Recognition Based on Multi-Variant Correlation of Physiological Signals
* Faces of Engagement: Automatic Recognition of Student Engagement from Facial Expressions, The
* Feature Extraction and Selection for Emotion Recognition from EEG
* Fully Automated Recognition of Spontaneous Facial Expressions in Videos Using Random Forest Classifiers
* GAMYGDALA: An Emotion Engine for Games
* Guest Editorial: Emotion in Games
* Interpersonal Coordination of Head Motion in Distressed Couples
* Intra-Class Variation Reduction Using Training Expression Images for Sparse Representation Based Facial Expression Recognition
* Joint Attention Simulation Using Eye-Tracking and Virtual Humans
* Linking Recognition Accuracy and User Experience in an Affective Feedback Loop
* Making Tactile Textures with Predefined Affective Properties
* Measuring Affective-Cognitive Experience and Predicting Market Success
* Mind Module: Using an Affect and Personality Computational Model as a Game-Play Element, The
* More Personality in Personality Computing
* Multi-View Facial Expression Recognition Based on Group Sparse Reduced-Rank Regression
* Predicting Continuous Conflict Perception with Bayesian Gaussian Processes
* Robust Unsupervised Arousal Rating: A Rule-Based Framework with Knowledge-Inspired Vocal Features
* Survey of Personality Computing, A
* Vision and Attention Theory Based Sampling for Continuous Facial Emotion Recognition
* Zapping Index: Using Smile to Measure Advertisement Zapping Likelihood
35 for AffCom(5)

AffCom(6) * Analyzing Interpersonal Empathy via Collective Impressions
* Arousal Recognition Using Audio-Visual Features and FMRI-Based Brain Response
* Assistive Image Comment Robot: A Novel Mid-Level Concept-Based Representation
* Autism Blogs: Expressed Emotion, Language Styles and Concerns in Personal and Community Settings
* Automatic Facial Expression Recognition Using Features of Salient Facial Patches
* Automatic Group Happiness Intensity Analysis
* Automatic Music Mood Classification Based on Timbre and Modulation Features
* Consensus Analysis and Modeling of Visual Aesthetic Perception
* Correcting Time-Continuous Emotional Labels by Modeling the Reaction Lag of Evaluators
* DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses
* Dynamic Time Warping for Music Retrieval Using Time Series Modeling of Musical Emotions
* Effects of Interactive Sonification on Emotionally Expressive Walking Styles
* Guest Editorial: Challenges and Perspectives for Affective Analysis in Multimedia
* HapFACS 3.0: FACS-Based Facial Expression Generator for 3D Speaking Virtual Characters
* Head Movement Dynamics during Play and Perturbed Mother-Infant Interaction
* Hierarchical Dirichlet Process Mixture Model for Music Emotion Recognition
* Humans versus Computers: Impact of Emotion Expressions on People's Decision Making
* I Can Already Guess Your Answer: Predicting Respondent Reactions during Dyadic Negotiation
* Introduction to the Best of ACII 2013 Special Section
* Investigating the Impact of Sound Angular Position on the Listener Affective State
* LIRIS-ACCEDE: A Video Database for Affective Content Analysis
* Modeling Emotion Influence in Image Social Networks
* Modeling the Affective Content of Music with a Gaussian Mixture Model
* Multimodal Affect Classification at Various Temporal Lengths
* Neuroticism, Extraversion, Conscientiousness and Stress: Physiological Correlates
* Perception and Automatic Recognition of Laughter from Whole-Body Motion: Continuous and Categorical Perspectives
* Predicting Ad Liking and Purchase Intent: Large-Scale Analysis of Facial Responses to Ads
* Predicting Mood from Punctual Emotion Annotations on Videos
* Recognizing Emotions Induced by Affective Sounds through Heart Rate Variability
* Speech Emotion Recognition Using Fourier Parameters
* Subjective Perceptions in Wartime Negotiation
* UMEME: University of Michigan Emotional McGurk Effect Data Set
* Video Affective Content Analysis: A Survey of State-of-the-Art Methods
* What Strikes the Strings of Your Heart?: Feature Mining for Music Emotion Analysis
* What Your Face Vlogs About: Expressions of Emotion and Big-Five Traits Impressions in YouTube
35 for AffCom(6)

AffCom(7) * Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection
* Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset, The
* Automatic Methods for the Detection of Accelerative Cardiac Defense Response
* Best of Bodynets 2014: Editorial
* Dancing with Physio: A Mobile Game with Physiologically Aware Virtual Humans
* Design and Evaluation of a Touch-Centered Calming Interaction with a Social Robot
* Dynamics of Non-Verbal Vocalizations and Hormones during Father-Infant Interaction
* Facial Expression Recognition in the Presence of Speech Using Blind Lexical Compensation
* Geneva Minimalistic Acoustic Parameter Set (GeMAPS) for Voice Research and Affective Computing, The
* Genre-Adaptive Semantic Computing and Audio-Based Modelling for Music Mood Annotation
* Identifying User-Specific Facial Affects from Spontaneous Expressions with Minimal Annotation
* Increasing the Reliability of Crowdsourcing Evaluations Using Online Quality Assessment
* Inertial BSN-Based Characterization and Automatic UPDRS Evaluation of the Gait Task of Parkinsonians
* Main Directional Mean Optical Flow Feature for Spontaneous Micro-Expression Recognition, A
* Method for Automatic Detection of Psychomotor Entrainment, A
* Naturalistic Recognition of Activities and Mood Using Wearable Electronics
* On the Influence of an Iterative Affect Annotation Approach on Inter-Observer and Self-Observer Reliability
* Partial Matching of Facial Expression Sequence Using Over-Complete Transition Dictionary for Emotion Recognition
* Piecewise Linear Dynamical Model for Action Clustering from Real-World Deployments of Inertial Body Sensors
* Predicting Movie Trailer Viewer's Like/Dislike via Learned Shot Editing Patterns
* Prediction-Based Audiovisual Fusion for Classification of Non-Linguistic Vocalisations
* PREVENTER, a Selection Mechanism for Just-in-Time Preventive Interventions
* Real-Time Tele-Monitoring of Patients with Chronic Heart-Failure Using a Smartphone: Lessons Learned
* ReBreathe: A Calibration Protocol that Improves Stress/Relax Classification by Relabeling Deep Breathing Relaxation Exercises
* Recognizing Stress Using Semantics and Modulation of Speech and Gestures
* Self-Reported Symptoms of Depression and PTSD Are Associated with Reduced Vowel Space in Screening Interviews
* Semi-Automatic Creation of Youth Slang Corpus and Its Application to Affective Computing
* Sentiment Analysis: From Opinion Mining to Human-Agent Interaction
* SentiWords: Deriving a High Precision and High Coverage Lexicon for Sentiment Analysis
* Subject-Independent Odor Pleasantness Classification Using Brain and Peripheral Signals
* Two Techniques for Assessing Virtual Agent Personality
* Unified Framework for Dividing and Predicting a Large Set of Action Units, A
* Wearable Sensor System with Circadian Rhythm Stability Estimation for Prototyping Biomedical Studies, A
33 for AffCom(7)

AffCom(8) * Action Units and Their Cross-Correlations for Prediction of Cognitive Load during Driving
* Adolescent Suicidal Risk Assessment in Clinician-Patient Interaction
* Affective Reasoning for Big Social Data Analysis
* Applications of Automated Facial Coding in Media Measurement
* Audio-Driven Laughter Behavior Controller
* Audio-Facial Laughter Detection in Naturalistic Dyadic Conversations
* Automated Detection of Engagement Using Video-Based Estimation of Facial Expressions and Heart Rate
* Automatic Pain Assessment with Facial Activity Descriptors
* Automatic Prediction of Impressions in Time and across Varying Context: Personality, Attractiveness and Likeability
* BAUM-1: A Spontaneous Audio-Visual Face Database of Affective and Mental States
* Bootstrapping Social Emotion Classification with Semantically Rich Hybrid Neural Networks
* Coarse-Grained +/-Effect Word Sense Disambiguation for Implicit Sentiment Analysis
* Cognitive Load Measurement in a Virtual Reality-Based Driving System for Autism Intervention
* Computational Modeling of Players: Emotional Response Patterns to the Story Events of Video Games
* Continuous Estimation of Emotions in Speech by Dynamic Cooperative Speaker Models
* Cost of Dichotomizing Continuous Labels for Binary Classification Problems: Deriving a Bayesian-Optimal Classifier, The
* Cross-Dataset and Cross-Cultural Music Mood Prediction: A Case on Western and Chinese Pop Songs
* Cyberbullying Detection Based on Semantic-Enhanced Marginalized Denoising Auto-Encoder
* Distantly Supervised Lifelong Learning for Large-Scale Social Media Sentiment Analysis
* Emotion Recognition in Never-Seen Languages Using a Novel Ensemble Method with Emotion Profiles
* Emotion Rendering in Auditory Simulations of Imagined Walking Styles
* Emotion Rendering in Plantar Vibro-Tactile Simulations of Imagined Walking Styles
* Guest Editorial: Toward Commercial Applications of Affective Computing
* Guest Editorial: Towards Machines Able to Deal with Laughter
* Identifying Human Behaviors Using Synchronized Audio-Visual Cues
* Indian Spontaneous Expression Database for Emotion Recognition, The
* Inferring Affective Meanings of Words from Word Embedding
* Interaction Style Recognition Based on Multi-Layer Multi-View Profile Representation
* Laughter and Smiling in 16 Positive Emotions
* Laughter and Tickles: Toward Novel Approaches for Emotion and Behavior Elicitation
* Microexpression Identification and Categorization Using a Facial Dynamics Map
* Modeling Dynamics of Expressive Body Gestures In Dyadic Interactions
* MSP-IMPROV: An Acted Corpus of Dyadic Interactions to Study Emotion Perception
* Multi-Task Learning Framework for Emotion Recognition Using 2D Continuous Space, A
* Multiple Instance Learning for Behavioral Coding
* MultiSense: Context-Aware Nonverbal Behavior Analysis Framework: A Psychological Distress Use Case
* Neural Word Embeddings Approach for Multi-Domain Sentiment Analysis, A
* Operationalizing Engagement with Multimedia as User Coherence with Context
* Physiological Responses to Affective Tele-Touch during Induced Emotional Stimuli
* Pictures We Like Are Our Image: Continuous Mapping of Favorite Pictures into Self-Assessed and Attributed Personality Traits, The
* Rapport with Virtual Agents: What Do Human Social Cues and Personality Explain?
* Sparsity in Dynamics of Spontaneous Subtle Emotions: Analysis and Application
* Toward Use of Facial Thermal Features in Dynamic Assessment of Affect and Arousal Level
* Wearable Device for Fast and Subtle Spontaneous Smile Recognition, A
44 for AffCom(8)

AffCom(9) * Affective Video Content Analysis: A Multidisciplinary Insight
* ASCERTAIN: Emotion and Personality Recognition Using Commercial Sensors
* Assessing the Influence of Mirroring on the Perception of Professional Competence Using Wearable Technology
* Asynchronous and Event-Based Fusion Systems for Affect Recognition on Naturalistic Data in Comparison to Conventional Approaches
* Audio-Based Granularity-Adapted Emotion Classification
* Automated Analysis and Prediction of Job Interview Performance
* Automated Depression Diagnosis Based on Deep Networks to Encode Facial Appearance and Dynamics
* CAS(ME)^2: A Database for Spontaneous Macro-Expression and Micro-Expression Spotting and Recognition
* Combined Rule-Based Machine Learning Audio-Visual Emotion Recognition Approach, A
* Combining Facial Expression and Touch for Perceiving Emotional Valence
* Computational Study of Expressive Facial Dynamics in Children with Autism, A
* Cross-Domain Color Facial Expression Recognition Using Transductive Transfer Subspace Learning
* Data-Driven Facial Beauty Analysis: Prediction, Retrieval and Manipulation
* Deep Bimodal Regression of Apparent Personality Traits from Short Video Sequences
* Detecting Aggression in Voice Using Inverse Filtered Speech Features
* Detecting Work Stress in Offices by Combining Unobtrusive Sensors
* Emotion Analysis for Personality Inference from EEG Signals
* Emotionally-Relevant Features for Classification and Regression of Music Lyrics
* End-User Development for Interactive Data Analytics: Uncertainty, Correlation and User Confidence
* Facial Expression Recognition in Video with Multiple Feature Fusion
* FiToViz: A Visualisation Approach for Real-Time Risk Situation Awareness
* Gaze-Sensitive Virtual Reality Based Social Communication Platform for Individuals with Autism
* Guest Editorial: Apparent Personality Analysis
* Heterogeneous Knowledge Transfer in Video Emotion Recognition, Attribution and Summarization
* Identifying Emotions from Non-Contact Gaits Information Based on Microsoft Kinects
* Improving Socially-Aware Recommendation Accuracy Through Personality
* Individuals' Stress Assessment Using Human-Smartphone Interaction Analysis
* Interactions Between Threat and Executive Control in a Virtual Reality Stroop Task
* Leveraging the Bayesian Filtering Paradigm for Vision-Based Facial Affective State Estimation
* Modeling Multiple Time Series Annotations as Noisy Distortions of the Ground Truth: An Expectation-Maximization Approach
* Multimodal Depression Detection: Fusion Analysis of Paralinguistic, Head Pose and Eye Gaze Behaviors
* Multimodal First Impression Analysis with Deep Residual Networks
* Multimodal Stress Detection from Multiple Assessments
* On the Interrelation Between Listener Characteristics and the Perception of Emotions in Classical Orchestra Music
* Perception of Emotions and Body Movement in the Emilya Database
* Portable Personality Recognizer Based on Affective State Classification Using Spectral Fusion of Features, A
* Predicting Personalized Image Emotion Perceptions in Social Networks
* Predicting the Probability Density Function of Music Emotion Using Emotion Space Mapping
* Real-Time Movie-Induced Discrete Emotion Recognition from EEG Signals
* Robust Facial Expression Recognition for MuCI: A Comprehensive Neuromuscular Signal Analysis
* SAMM: A Spontaneous Micro-Facial Movement Dataset
* Spontaneous Expression Detection from 3D Dynamic Sequences by Analyzing Trajectories on Grassmann Manifolds
* Supervised Committee of Convolutional Neural Networks in Automated Facial Expression Analysis
* Towards Reading Hidden Emotions: A Comparative Study of Spontaneous Micro-Expression Spotting and Recognition Methods
* Virtual Character Facial Expressions Influence Human Brain and Facial EMG Activity in a Decision-Making Game
* Who Likes What, and Why? Insights into Modeling Users' Personality Based on Image Likes
46 for AffCom(9)
