Projects



Active Projects

Android Head Tracking (Ken Dang, 2013)

The goal of this project is to create an Android app that uses face detection algorithms to enable head tracking on mobile devices.

Camelot (Thomas Gray, 2013)

The goal of this project is to create a multi-camera video capture system.

ZSculpt - 3D Sculpting with the zSpace (Thinh Nguyen, 2013)

The goal of this project is to explore the use of the zSpace VR system for 3D sculpting.

Magic Lens (Tony Chan, Michael Chao, 2013)

The goal of this project is to research the use of smartphones in a virtual reality environment.

Pose Estimation for a Mobile Device (Kuen-Han Lin, 2013)

The goal of this project is to develop an algorithm that runs on a PC to estimate the pose of a mobile Android device connected via Wi-Fi.

AndroidAR (Kristian Hansen, Mads Pedersen, 2012)

Tracked Android phones are used to view 3D models and to draw in 3D.

ArtifactVis with Android Device (Sahar Aseeri, 2012)

In this project, an Android device is used to control the ArtifactVis application and to view images of the artifacts on the mobile device.

Bubble Menu (Cathy Hughes, 2012-)

The purpose of this project is to create a new 3D user interface controlled with a Microsoft Kinect. The menu system is based on wireframe spheres that are selected with gestures, and it features animated menu transitions, customizable menus, and sound effects.

Multi-User Graphics with Interactive Control (MUGIC) (Philip Weber, Nadia Zeng, Andy Muehlhausen, 2012-)

This project allows artists without a programming background to render images with CalVR, essentially allowing them to express their artwork and creativity on a much larger scale. See the project in action, and a more recent performance with it (starts at 14:00 min).

Multi-user virtual reality on mobile phones (James Lue, 2012-)

This project is based on Android phones. Users can navigate a virtual world on the phone and see other users in that world. They can interact with objects in the scene (such as opening doors), and other users can see these actions. The phones are connected over a wireless network. Currently, only two phones can connect to each other.

Mobile Old Town Osaka Viewer (Sumin Wang, 2012-)

This project aims to create an Android-based viewer, similar to Google Street View, which allows the user to view a 3D model of old town Osaka, Japan. We collaborate with the Cybermedia Institute of Osaka University on this project.

3D Chromosome Viewer (Yixin Zhu, 2012-)

The purpose of this project is to create a three-dimensional chromosome viewer to facilitate the visualization of intra-chromosomal interactions, genomic features and their relationships, and the discovery of new genes.

CameraFlight (William Seo, 2012-)

This project's goal is to create automatic camera flights from one place to another in osgEarth.

AppGlobe Infrastructure (Chris McFarland, Philip Weber, 2012)

This project's goal is to create an application switcher for osgEarth-based CalVR plugins.

GUI Sketching Tool (Cathy Hughes, Andrew Prudhomme, 2012)

This project's goal is to develop a 3D sketching tool for a VR GUI.

San Diego Wildfires (Philip Weber, Jessica Block, 2011)

The Cedar Fire was a human-caused wildfire that destroyed a large number of buildings and much infrastructure in San Diego County in October 2003. This application shows high-resolution aerial data of the areas affected by the fires. The data resolution is extremely high at 0.5 meters for the imagery and 2 meters for the elevation. Our demonstration shows this data embedded into an osgEarth-based visualization framework, which allows adding such data to any place on the planet and viewing it in a way similar to Google Earth, but with full support for high-end visualization systems.

PanoView360 (Andrew Prudhomme, Dan Sandin, 2010-)

Researchers at UIC/EVL and UCSD/Calit2 have developed a method to acquire very high-resolution, surround and stereo panorama images using dual SLR cameras. This VR application allows viewing these approximately gigabyte-sized images in real time and supports real-time changes of the viewing direction and zooming.

VOX and Virvo (Jürgen Schulze, 1999-)

Ongoing development of real-time volume rendering algorithms for interactive display at the desktop (DeskVOX) and in virtual environments (CaveVOX). Virvo is the name of the GUI-independent, OpenGL-based volume rendering library that both DeskVOX and CaveVOX use.