Contents
- 1 Past Projects
- 2 Active Projects
- 2.1 MoveLib
- 2.2 CalVR (Andrew Prudhomme, Philip Weber, Jurgen Schulze, since 2010)
- 2.3 Fringe Physics (Robert Maloney, 2014-2015)
- 2.4 Boxing Simulator (Russell Larson, 2014)
- 2.5 Parallel Raytracing (Rex West, 2014)
- 2.6 Altered Reality (Jonathan Shamblen, Cody Waite, Zach Lee, Larry Huynh, Liz Cai 2013)
- 2.7 Focal Stacks (Jurgen Schulze, 2013)
- 2.8 Android Head Tracking (Ken Dang, 2013)
- 2.9 Automatic Stereo Switcher for the ZSpace (Matt Kubasak, Thomas Gray, 2013-2014)
- 2.10 ZSculpt - 3D Sculpting with the Leap (Thinh Nguyen, 2013)
- 2.11 Magic Lens (Tony Chan, Michael Chao, 2013)
- 2.12 Pose Estimation for a Mobile Device (Kuen-Han Lin, 2013)
- 2.13 Multi-User Graphics with Interactive Control (MUGIC) (Shahrokh Yadegari, Philip Weber, Andy Muehlhausen, 2012-)
- 2.14 PanoView360 (Andrew Prudhomme, Dan Sandin, 2010-)
- 2.15 VOX and Virvo (Jurgen Schulze, 1999-)
Active Projects
MoveLib
If you want to use the Sony Move as a 3D controller for the PC, look here.
CalVR (Andrew Prudhomme, Philip Weber, Jurgen Schulze, since 2010)
CalVR is our virtual reality middleware (a.k.a. VR engine), which we have been developing for our graphics clusters. It runs on anything from a laptop to a large multi-node CAVE, and builds under Linux, Windows and MacOS. More information about how to obtain the code and build it can be found on our main CalVR page. We also wrote a paper on CalVR, and gave a presentation on it.
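CalVR applications are typically written as plugins. The plugin API is not spelled out on this page; the skeleton below is only a rough sketch, and the header path, the cvr::CVRPlugin base class, and the CVRPLUGIN export macro are assumptions drawn from typical CalVR plugin code that should be checked against the sources linked from the main CalVR page.

```cpp
// Hypothetical CalVR plugin skeleton (illustration only; verify names
// against the CalVR sources before use).
#include <cvrKernel/CVRPlugin.h>

class HelloCalVR : public cvr::CVRPlugin
{
public:
    // Called once after the plugin is loaded; return false to abort loading.
    bool init()
    {
        return true;
    }

    // Called every frame before the scene is rendered.
    void preFrame()
    {
    }
};

// Exports the factory functions CalVR's plugin manager looks for.
CVRPLUGIN(HelloCalVR)
```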
Altered Reality (Jonathan Shamblen, Cody Waite, Zach Lee, Larry Huynh, Liz Cai 2013)
Focal Stacks (Jurgen Schulze, 2013)
SIO will soon have a new microscope which can generate focal stacks faster than before. We are working on algorithms to visualize and analyze these focal stacks. |
Android Head Tracking (Ken Dang, 2013)
The goal of this project is to create an Android app that uses face detection algorithms to allow head tracking on mobile devices.
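The page does not say which face detection library the app uses; purely as an illustration of the idea, the sketch below uses OpenCV's CascadeClassifier (which also runs on Android via the NDK) and treats the detected face center and size as a rough head position estimate.

```cpp
// Illustrative only: the project's actual face detection code is not shown here.
#include <opencv2/objdetect.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/videoio.hpp>
#include <cstdio>
#include <vector>

int main()
{
    cv::CascadeClassifier faces;
    // Standard cascade file shipped with OpenCV; the path is an assumption.
    faces.load("haarcascade_frontalface_default.xml");

    cv::VideoCapture cam(0);          // front-facing camera
    cv::Mat frame, gray;
    while (cam.read(frame))
    {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> hits;
        faces.detectMultiScale(gray, hits);
        if (!hits.empty())
        {
            // Face center offset from the image center approximates horizontal
            // and vertical head motion; face size approximates distance.
            cv::Point2f c(hits[0].x + hits[0].width * 0.5f,
                          hits[0].y + hits[0].height * 0.5f);
            std::printf("head offset: %.1f %.1f size: %d\n",
                        c.x - frame.cols * 0.5f,
                        c.y - frame.rows * 0.5f,
                        hits[0].width);
        }
    }
    return 0;
}
```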
Automatic Stereo Switcher for the ZSpace (Matt Kubasak, Thomas Gray, 2013-2014)
We created an Arduino-based solution to the problem that under Linux the ZSpace's left and right views initially come up in random order. The Arduino, along with custom software, senses which eye is displayed when, so that CalVR can swap the eyes if necessary to show a correct stereo image.
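The sensing hardware is not documented on this page; the Arduino sketch below is only a hypothetical illustration of the idea. It assumes CalVR renders a small bright marker into the left-eye image only and that a light sensor on analog pin A0 watches that screen corner, so the PC can tell which frame belongs to which eye from the serial stream.

```cpp
// Hypothetical Arduino sketch: reports 'L' when the left-eye marker is
// visible to the sensor and 'R' otherwise, so the host can swap the eyes
// if the stereo order came up wrong.
const int SENSOR_PIN = A0;
const int THRESHOLD  = 512;   // brightness cutoff, tuned per setup

void setup()
{
    Serial.begin(115200);     // link to the PC running CalVR
}

void loop()
{
    int brightness = analogRead(SENSOR_PIN);
    Serial.write(brightness > THRESHOLD ? 'L' : 'R');
    delay(1);                 // sample roughly once per millisecond
}
```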
ZSculpt - 3D Sculpting with the Leap (Thinh Nguyen, 2013)
The goal of this project is to explore the use of the Leap Motion device for 3D sculpting. |
Magic Lens (Tony Chan, Michael Chao, 2013)
The goal of this project is to research the use of smart phones in a virtual reality environment. |
Pose Estimation for a Mobile Device (Kuen-Han Lin, 2013)
The goal of this project is to develop an algorithm which runs on a PC to estimate the pose of a mobile Android device, linked via WiFi.
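The algorithm itself is not described on this page. One common approach for this kind of problem, sketched below purely for illustration, is to match reference points with known 3D positions to their 2D pixel locations in a camera frame streamed from the device and recover the pose with OpenCV's solvePnP; none of the values below come from the project.

```cpp
// Illustration only: recover a camera (and hence device) pose from
// 2D-3D point correspondences.
#include <opencv2/calib3d.hpp>
#include <vector>

int main()
{
    // Known 3D points in the scene (e.g. corners of a printed marker), in meters.
    std::vector<cv::Point3f> object = {
        {0.0f, 0.0f, 0.0f}, {0.1f, 0.0f, 0.0f},
        {0.1f, 0.1f, 0.0f}, {0.0f, 0.1f, 0.0f}};
    // Where those points were detected in the streamed camera image, in pixels.
    std::vector<cv::Point2f> image = {
        {320.0f, 240.0f}, {420.0f, 238.0f},
        {422.0f, 338.0f}, {318.0f, 340.0f}};

    // Placeholder intrinsics; real values come from calibrating the phone camera.
    cv::Mat K = (cv::Mat_<double>(3, 3) << 500, 0, 320, 0, 500, 240, 0, 0, 1);
    cv::Mat dist = cv::Mat::zeros(4, 1, CV_64F);

    cv::Mat rvec, tvec;   // rotation (Rodrigues vector) and translation
    cv::solvePnP(object, image, K, dist, rvec, tvec);
    return 0;
}
```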
PanoView360 (Andrew Prudhomme, Dan Sandin, 2010-)
Researchers at UIC/EVL and UCSD/Calit2 have developed a method to acquire very high-resolution, surround, and stereo panorama images using dual SLR cameras. This VR application allows viewing these approximately gigabyte-sized images in real time and supports real-time changes of viewing direction and zooming.
VOX and Virvo (Jurgen Schulze, 1999-)
Ongoing development of real-time volume rendering algorithms for interactive display at the desktop (DeskVOX) and in virtual environments (CaveVOX). Virvo is the name of the GUI-independent, OpenGL-based volume rendering library which both DeskVOX and CaveVOX use.