Dates
Thursday, September 7, 2023, 2:30pm to 4:00pm
Location
NCS 120
Event Description

Abstract: Advancements in data capture and imaging technologies have transformed the acquisition of high-resolution 3D data, enabling scientists to capture object characteristics across diverse domains. Visualization systems have become essential tools in scientific and biomedical workflows, aiding in the exploration and analysis of such data. While most visualization systems are designed for desktop setups, recent developments in eXtended Reality (XR) technologies -- virtual reality (VR), augmented reality (AR), and mixed reality (MR) -- open novel avenues for enhancing human information-processing capabilities. This dissertation explores the design of end-to-end visualization systems for a spectrum of visual display modalities, with a specific focus on applications in neuroscience and atmospheric sciences, and demonstrates them on the Reality Deck (RD), the world's largest immersive display.

For neuroscience, we introduce a novel algorithm that overcomes the out-of-focus blur caused by the inherent design of wide-field microscopes. For a complete end-to-end system, we present NeuroConstruct, an application for the segmentation, registration, and visualization of neuronal structures. To address the limitation that biological specimens can only be imaged at a single timepoint, we introduce NeuRegenerate, a framework for predicting and visualizing changes in neural fiber morphology within a subject across specified age-timepoints.

In the atmospheric science domain, we present Submerse for visualizing flooding scenarios on large and immersive display ecologies. Specifically, we present our method for reconstructing a surface mesh from input flood simulation data, synthesizing water waves to convey the direction of flooding, and generating a to-scale 3D virtual scene by incorporating geographical data such as terrain, textures, buildings, and additional scene objects. As interaction is key to effective decision-making and analysis, we introduce two novel techniques for flood visualization in immersive systems: (1) an automatic scene-navigation method that generates optimal camera viewpoints for marked points of interest based on the display layout, and (2) an AR-based focus+context technique that uses an auxiliary display system.

Finally, recognizing the potential of AR to integrate scientific visualizations into the physical world, we present VoxAR, a method that determines an appropriate placement for a volume-rendered object in the real world and adapts the object's colors based on the placement region.

Event Title
Ph.D. Thesis Defense: Saeed Boorboor, 'Immersive and Augmented 3D Scientific Visualization Systems'