Emotions play an important role in our everyday lives. One's emotional state affects how memories are created and retrieved, how we think through problems and make decisions, and how we approach social interactions, to name a few examples. This project explores how our emotional states can be affected by our environment and the objects we interact with daily. More precisely, it proposes to investigate how feedback mechanisms already integrated into consumer electronics affect users' affective state, and what multi-sensory stimulation parameters are necessary to observe these modifications. It is anticipated that results from this research effort will have a significant impact on the future of emotion-aware interface design and the entertainment industry, as well as provide emotion researchers with new knowledge.
Recognition of users' emotional state, or "affect", provides invaluable feedback to user interface (UI) designers. More importantly, it enables personalized UIs and games that adapt in real time to users' emotions. Prior efforts in this area have employed computer vision, peripheral physiological signals, and analysis of mouse movement and keyboard typing patterns. However, the increasing popularity of consumer mobile devices motivates us to investigate how affect classification performance could benefit from the ever-growing number of sensors integrated into modern smartphones. Specifically, this project will determine how text input patterns can be complemented by data from other integrated sensors to achieve improved emotion recognition on consumer-grade smartphones.
What causes us to perceive something as hot? We can see an object turning red or releasing steam; we can hear a kettle boiling, or a pinging as the metal expands and cools; or we can touch it, allowing us to perceive its temperature directly. When something is dangerously hot, the body protects itself by withdrawing the limb from the harmful surface. What if the withdrawal reflex is not merely a reaction that physically protects the skin, but part of the sensory information used by the brain in its assessment of temperature? Virtual and augmented reality training or gaming systems could benefit from such an illusion, allowing the rendering of temperatures perceived as dangerously hot without physically endangering the user.
Otis, M. J.-D., Ayena, J. C., Tremblay, L. E., Fortin, P. E., & Ménélas, B.-A. J. (2016). Use of an Enactive Insole for Reducing the Risk of Falling on Different Types of Soil Using Vibrotactile Cueing for the Elderly. PLoS ONE, 11(9).
Fortin, P. E., & Cooperstock, J. R. (2016). Exploring the relationship between laughter stimuli and the perception of emotion-rich tactile interactions. IEEE Transactions on Affective Computing, Special Issue on Laughter. [Undergoing revisions]
Fortin, P. E., Otis, M. J.-D., Duchaine, V., & Cooperstock, J. R. (2014). Event-based haptic vibration synthesis using a recursive filter for lower limb prosthetics. In 2014 IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE) Proceedings (pp. 47–52).
Fortin, P. E., Blum, J., & Cooperstock, J. R. (2017). "Electrical Muscle Stimulation for Simulated Heat Withdrawal Response."
Fortin, P. E., Cooperstock, J. R., & Blain-Moraes, S. (2017). "A User-Driven Physiological Signals Quality Enhancement Technique for Telemedicine Applications."
Fortin, P. E., Serfaty, N., & Cooperstock, J. R. (2017). "Text Entry Based, Contextually Aware Mobile Emotion Classifier."
Copyright © 2016 by Pascal E. Fortin