Dates
Monday, December 05, 2022 - 06:00pm to Monday, December 05, 2022 - 08:00pm
Location
NCS 120
Event Description

Abstract:
The virtual camera connects people and interactive computer graphics. Camera control methods enable users to explore a virtual environment in the ways they desire. Traditional camera control methods mainly optimize objective metrics, such as maximizing the information visible in the camera views, but they ignore the significant influence of people's subjective experience. Users' perceptual judgments are especially important for camera control in immersive environments, where exploration is user-dependent and the experience egocentric, and therefore cannot be adequately evaluated by a single universally formulated objective metric.

This dissertation aims to push the boundary of camera control techniques for immersive exploration using perception-based metrics. We first summarize cutting-edge camera control techniques in non-immersive and immersive scenarios. We then present a perception model that quantifies the motion sickness level perceived during seated exploration in virtual reality, and, based on this model, formulate a real-time camera control method that reduces motion sickness during immersive exploration. On top of this model, we present how to improve viewpoint finding and camera path control for large-scale urban data visualization.

Unlike motion sickness, which is inherently egocentric, many aspects of virtual exploration can be either objective-driven or subjective-oriented. Among them, geometry perception in an immersive environment is one of the attributes that most distinguishes immersive exploration from exploration on traditional displays. To improve the perception of geometric information, we present a novel technique for visualizing 3D treelike biomedical structures that combines planar embedding with 3D exploration. To further investigate how to enhance depth perception, we apply the metric learned from our preliminary user study to the visual analytics of treelike structures in immersive environments.

In both motion sickness reduction and depth enhancement, camera view selection relies, at least in part, on the user's real-time input. To further investigate guided camera path generation based on perceptual metrics, we studied people's visual attention patterns when they are exposed to multiple concurrent, informatively comparable visual stimuli. The findings can benefit attentional view guidance and object placement in virtual reality. Moreover, we have developed an automatic aesthetic view-finding method based on deep reinforcement learning, demonstrating the possibility of aesthetics-driven exploration of an arbitrary indoor scene.

Event Title
Ph.D. Thesis Defense: Peggy Hu, 'Perceptual Camera Control for Immersive Explorations'