Machine and Deep Learning Techniques to Classify Arousal Judgments in Dynamic Virtual Experience of Architecture

Pecori R.; Aversano L.; Montano D.; Bernardi M. L.
2023-01-01

Abstract

The architectural space impacts the emotional state of its inhabitants. However, no studies have investigated, to date, how it influences the perception of others' affective states, possibly impacting our social behavior. This paper analyzes the eye-gaze data collected during a social scenario recreated after a promenade within virtual architectures. Immersive and dynamic virtual architectures were characterized by decreasing or increasing sidewall distance, ceiling height, window height, and different colors. At the end of such an experience, participants judged the arousal level expressed by a virtual avatar. For the first time, we apply machine and deep learning techniques to the behavioral, environmental, and eye-gaze features extracted during the dynamic experience of virtual architectures. To verify the feasibility of automatically classifying the final arousal judgment of the avatar's emotional expression, we considered both interpretable models, i.e., decision trees, and black-box models, i.e., dense neural networks. The decision tree reached an accuracy of 66%, showing the importance of eye-gaze parameters in classifying the participants' arousal judgments. The black-box dense neural network increased the accuracy up to 80%. Overall, our findings demonstrate the capability of artificial intelligence methodologies to classify and possibly predict the arousal judgment of body expressions at the end of a virtual promenade. Such knowledge will serve the design and evaluation of future spaces by combining virtual reality and artificial intelligence within the experience of architecture. In this way, it will be possible to predict the influence of the surrounding architecture on human social behavior.
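The interpretable-versus-black-box comparison described in the abstract can be illustrated with a minimal sketch. The feature names, synthetic data, and model hyperparameters below are purely illustrative assumptions (they are not the authors' dataset or configuration); the sketch only shows the general scikit-learn pattern of fitting a decision tree and a dense neural network on the same features and comparing accuracy, with the tree additionally exposing feature importances for interpretation.

```python
# Illustrative sketch (synthetic data, hypothetical feature names):
# compare an interpretable decision tree with a dense neural network
# on stand-ins for behavioral/environmental/eye-gaze features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 600
# Four synthetic features, e.g. fixation duration, saccade rate,
# sidewall distance, ceiling height (names assumed for illustration).
X = rng.normal(size=(n, 4))
# Synthetic binary arousal judgment (low/high), loosely tied to features.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.8, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Interpretable model: shallow decision tree.
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
# Black-box model: small dense (fully connected) neural network.
mlp = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

print("tree accuracy:", accuracy_score(y_te, tree.predict(X_te)))
print("mlp accuracy:", accuracy_score(y_te, mlp.predict(X_te)))
# The tree also exposes which features drove its splits:
print("tree feature importances:", tree.feature_importances_)
```

On real data, the same pattern lets one trade the tree's transparency (feature importances, inspectable splits) against the usually higher accuracy of the dense network, mirroring the 66% vs. 80% gap reported in the abstract.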
Keywords

Arousal score
Deep Learning
Eye-gaze
Machine Learning
Virtual environments
Files for this record:
No files are associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12070/67213
Citations
  • PMC: not available
  • Scopus: 0
  • ISI: not available