Aleksander Väljamäe
Journal Articles
Presence: Teleoperators and Virtual Environments (2009) 18 (4): 277–285.
Published: 01 August 2009
Abstract
Virtual and mixed reality environments (VMRE) often imply full-body human-computer interaction scenarios. We used a public multimodal mixed reality installation, the Synthetic Oracle, and a between-groups design to study the effects of implicit (e.g., passively walking) or explicit (e.g., pointing) interaction modes on users' emotional and engagement experiences, which we assessed using questionnaires. Additionally, real-time arm motion data were used to categorize user behavior and to provide interaction possibilities for the explicit interaction group. The results show that the online behavior classification corresponded well to the users' interaction mode. In addition, and in contrast to the explicit interaction group, the engagement ratings of implicit users were positively correlated with valence but uncorrelated with arousal ratings. Interestingly, arousal levels were correlated with different behaviors displayed by the visitors depending on the interaction mode. Hence, this study confirms that users' activity level and behavior modulate their experience, and that the interaction mode in turn modulates their behavior. These results show the importance of the selected interaction mode when designing user experiences in VMRE.
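
The abstract does not describe how the online behavior classification worked. As a purely illustrative sketch, a simple threshold rule over real-time arm-motion features might look like the following; every feature name, unit, and threshold here is invented for illustration and is not taken from the paper:

    def classify_behavior(arm_speed, arm_elevation):
        """Toy online classifier: label the user's current behavior from
        two arm-motion features. Units and thresholds are hypothetical;
        the paper does not specify its classification method."""
        # A raised, fast-moving arm is treated as an explicit gesture
        # (e.g., pointing); anything else as implicit behavior
        # (e.g., passively walking).
        if arm_elevation > 0.8 and arm_speed > 0.3:
            return "explicit"
        return "implicit"

    # Example: a stream of (speed, elevation) samples from a motion tracker.
    samples = [(0.05, 0.2), (0.45, 0.9), (0.10, 0.3)]
    print([classify_behavior(s, e) for s, e in samples])
    # -> ['implicit', 'explicit', 'implicit']

In practice such a classifier would run on each incoming tracker frame, so the explicit-interaction group's gestures could drive the installation in real time while the same labels are logged for later analysis.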
Journal Articles
Presence: Teleoperators and Virtual Environments (2008) 17 (1): 43–56.
Published: 01 February 2008
Abstract
Sound is an important, but often neglected, component for creating a self-motion illusion (vection) in virtual reality applications such as motion simulators. Apart from auditory motion cues, sound can provide contextual information representing self-motion in a virtual environment. In two experiments we investigated the benefits of hearing an engine sound when presenting auditory (Experiment 1) or auditory-vibrotactile (Experiment 2) virtual environments inducing linear vection. Adding the engine sound to the auditory scene significantly enhanced subjective ratings of vection intensity in Experiment 1, and vection onset times but not subjective ratings in Experiment 2. Further analysis using individual imagery vividness scores showed that this disparity between vection measures arose from participants with higher kinesthetic imagery. For participants with lower kinesthetic imagery scores, on the other hand, the engine sound enhanced the vection sensation in both experiments. A high correlation with participants' kinesthetic imagery vividness scores suggests the influence of a first-person perspective on the perception of the engine sound. We hypothesize that self-motion sounds (e.g., the sound of footsteps or an engine) represent a specific type of acoustic body-centered feedback in virtual environments. The results may therefore contribute to a better understanding of the role of self-representation sounds (sonic self-avatars) in virtual and augmented environments.
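
The reported link between imagery vividness and the vection measures is a correlation. For illustration only, a plain Pearson correlation between two such measures could be computed as below; the variable names and data values are made up and do not reproduce the paper's results:

    import math
    from statistics import mean

    def pearson_r(xs, ys):
        """Pearson correlation coefficient between two equal-length samples."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    # Hypothetical data: kinesthetic imagery vividness vs. vection onset time (s).
    imagery = [2.1, 3.4, 4.0, 4.8, 5.5]
    onset_s = [14.0, 11.5, 9.8, 8.2, 7.1]
    print(round(pearson_r(imagery, onset_s), 3))  # strongly negative in this toy data

A negative coefficient in this toy setup would mean that participants with more vivid kinesthetic imagery experience vection sooner, which is the direction of effect the abstract describes for the engine-sound condition.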