Dialogue: If you are surprised,
I will react to you

Depth camera, Python3 with Py-feat and OpenCV,
3D modeled face, TouchDesigner, electric sound
/ 2024

This work is a prototype of an interactive installation that reads the audience's facial expressions and responds with 3D face images. This chain of events is modeled on the physiological feedback loop studied in affective computing.

The central theme of this project is the gap between the emotion read from the audience's face and the artist's visual expression. Whereas traditional visual art has tried to evoke viewers' emotions that remain opaque and unquantified, emotions detected as physiological signals are defined precisely. This project attempts a dialogue between the emotions detected by the system and artistic facial-expression images.

This work also aims to evoke an uncanny feeling in the audience. Cyberspace, which often feels like a dark, black-boxed space, is mirrored in the work's main technological component, the Python library Py-feat, whose emotion detection is itself opaque to the viewer.


Trigger:
The user’s facial expression is captured by the sensor camera
- The camera registers a human silhouette as a ‘Person’
- When a ‘Person’ is detected, the user’s face is saved as a JPG image
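The Trigger step above could be sketched as follows. This is a hypothetical reconstruction, not the installation's actual code: the original uses a depth camera's person detection, while this sketch substitutes OpenCV's Haar face detector, and the output filename `face.jpg` is an assumption.

```python
"""Sketch of the Trigger step: camera frame -> face box -> saved JPG."""


def crop_box(frame, box):
    """Crop an (x, y, w, h) box out of a frame given as rows of pixels."""
    x, y, w, h = box
    return [row[x:x + w] for row in frame[y:y + h]]


def main():
    # Heavy dependency kept inside main() so the helper stays importable.
    import cv2

    cap = cv2.VideoCapture(0)  # default webcam; the work uses a depth camera
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces):
            x, y, w, h = faces[0]
            # Save only the face region, as the text describes
            cv2.imwrite("face.jpg", frame[y:y + h, x:x + w])
        if cv2.waitKey(1) == 27:  # Esc quits
            break
    cap.release()


if __name__ == "__main__":
    main()
```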


Signal:
The FaceScore of ‘surprise’ detected by Py-feat
- Py-feat computes the FaceScore for each captured face image
- The threshold is set at a FaceScore of 0.4
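The Signal step amounts to a threshold comparison on the 'surprise' score. A minimal sketch, assuming Py-feat's `Detector.detect_image` returns a result with a `surprise` emotion column (the image path is illustrative):

```python
SURPRISE_THRESHOLD = 0.4  # threshold given in the text


def is_surprised(surprise_score, threshold=SURPRISE_THRESHOLD):
    """Return True when the detected 'surprise' score meets the threshold."""
    return surprise_score >= threshold


def detect_surprise(image_path):
    """Run Py-feat on a saved face image and return its 'surprise' score.

    Assumed API: Detector.detect_image returns a Fex dataframe whose
    emotion columns include 'surprise'.
    """
    from feat import Detector  # heavy import, kept local

    detector = Detector()
    fex = detector.detect_image(image_path)
    return float(fex["surprise"].iloc[0])
```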


Output:
3D facial images, created from the artistic concept, are shown as a reaction to the detected emotion.
- If the FaceScore is below the threshold, the initial neutral-face image is shown
- If the FaceScore meets or exceeds the threshold, the 3D PNG reaction images are shown
- The 3D PNG images change depending on the FaceScore
The PNG images are loaded into TouchDesigner, which is programmed to switch between them.
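The Output step maps a FaceScore to one of the PNG images. The filenames and the intensity bins below are illustrative assumptions; the text only specifies a neutral image below the 0.4 threshold and score-dependent reaction images above it:

```python
def select_image(score, threshold=0.4):
    """Map a 'surprise' FaceScore to a PNG filename (names are assumed)."""
    if score < threshold:
        return "neutral.png"
    # Assumed bins: reaction intensity scales with the score
    if score < 0.6:
        return "reaction_low.png"
    if score < 0.8:
        return "reaction_mid.png"
    return "reaction_high.png"


# Inside TouchDesigner, the same logic would typically live in a Script DAT
# and drive a Switch TOP, e.g.:
#   op('switch1').par.index = image_index
# (the operator name 'switch1' is an assumption)
```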