From Immersive Visualization Lab Wiki
Contents
- 1 Past Projects
- 2 Active Projects
- 2.1 VR Deep Packet Inspector (Marlon West, 2017)
- 2.2 MoveInVR (Lachlan Smith, 2016)
- 2.3 CalVR (Andrew Prudhomme, Philip Weber, Jurgen Schulze, since 2010)
- 2.4 Fringe Physics (Robert Maloney, 2014-2015)
- 2.5 Boxing Simulator (Russell Larson, 2014)
- 2.6 Parallel Raytracing (Rex West, 2014)
- 2.7 Altered Reality (Jonathan Shamblen, Cody Waite, Zach Lee, Larry Huynh, Liz Cai, 2013)
- 2.8 Focal Stacks (Jurgen Schulze, 2013)
- 2.9 Android Head Tracking (Ken Dang, 2013)
- 2.10 Automatic Stereo Switcher for the ZSpace (Matt Kubasak, Thomas Gray, 2013-2014)
- 2.11 ZSculpt - 3D Sculpting with the Leap (Thinh Nguyen, 2013)
- 2.12 Magic Lens (Tony Chan, Michael Chao, 2013)
- 2.13 Pose Estimation for a Mobile Device (Kuen-Han Lin, 2013)
- 2.14 Multi-User Graphics with Interactive Control (MUGIC) (Shahrokh Yadegari, Philip Weber, Andy Muehlhausen, 2012-)
- 2.15 PanoView360 (Andrew Prudhomme, Dan Sandin, 2010-)
- 2.16 VOX and Virvo (Jurgen Schulze, 1999-)
Active Projects
MoveInVR (Lachlan Smith, 2016)
If you want to use the Sony PlayStation Move as a 3D controller for the PC, look here.
CalVR (Andrew Prudhomme, Philip Weber, Jurgen Schulze, since 2010)
CalVR is our virtual reality middleware (a.k.a. VR engine), which we have been developing for our graphics clusters. It runs on anything from a laptop to a large multi-node CAVE, and builds under Linux, Windows and macOS. More information about how to obtain and build the code can be found on our main CalVR page. We also wrote a paper on CalVR and gave a presentation on it.
Altered Reality (Jonathan Shamblen, Cody Waite, Zach Lee, Larry Huynh, Liz Cai, 2013)
Focal Stacks (Jurgen Schulze, 2013)
SIO will soon have a new microscope which can generate focal stacks faster than before. We are working on algorithms to visualize and analyze these focal stacks.
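To illustrate the kind of analysis a focal stack enables, here is a minimal all-in-focus sketch under assumptions of our own (grayscale slices, an absolute-Laplacian sharpness measure; function and parameter names are hypothetical) — it is not the lab's actual algorithm. For each pixel, it keeps the value from the slice where local contrast is highest:

```cpp
#include <cmath>
#include <vector>

// Hypothetical all-in-focus composite from a focal stack: per pixel, pick
// the slice with the highest local contrast (absolute Laplacian).
// Each slice is a grayscale image stored row-major as width*height doubles.
std::vector<double> allInFocus(const std::vector<std::vector<double>>& stack,
                               int width, int height) {
    std::vector<double> out(width * height, 0.0);
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            int i = y * width + x;
            double best = -1.0;
            for (const auto& s : stack) {
                // 4-neighbor Laplacian as a simple sharpness measure
                double lap = std::fabs(4.0 * s[i] - s[i - 1] - s[i + 1]
                                       - s[i - width] - s[i + width]);
                if (lap > best) { best = lap; out[i] = s[i]; }
            }
        }
    }
    return out;
}
```

A sharpness measure over a larger window would be more robust to noise; the single-pixel Laplacian keeps the sketch short.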
Android Head Tracking (Ken Dang, 2013)
The goal of this project is to create an Android app that uses face detection algorithms to enable head tracking on mobile devices.
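One common way to turn a detected face into a head position is the pinhole-camera model. The sketch below is an assumption on our part, not the app's actual algorithm: the function name, the 0.15 m average face width, and the focal length in the usage example are all hypothetical.

```cpp
#include <cmath>

struct HeadPose { double x, y, z; };  // meters, camera-centered

// Hypothetical pinhole-camera sketch: estimate head position from a detected
// face bounding box, assuming an average physical face width and a known
// camera focal length in pixels.
HeadPose headFromFace(double faceCxPx, double faceCyPx, double faceWidthPx,
                      double imgCxPx, double imgCyPx, double focalPx,
                      double faceWidthM = 0.15) {
    double z = focalPx * faceWidthM / faceWidthPx;  // depth from similar triangles
    double x = (faceCxPx - imgCxPx) * z / focalPx;  // lateral offset
    double y = (faceCyPx - imgCyPx) * z / focalPx;  // vertical offset
    return {x, y, z};
}
```

For example, with a focal length of 600 px, a face 150 px wide centered in a 640x480 image would be estimated at roughly 0.6 m straight ahead.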
Automatic Stereo Switcher for the ZSpace (Matt Kubasak, Thomas Gray, 2013-2014)
We created an Arduino-based solution to fix the problem that under Linux the zSpace's left and right eye views start up in random order. The Arduino, along with custom software, senses which eye is displayed when, so that CalVR can swap the eyes if necessary to show a correct stereo image.
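The swap decision itself reduces to a comparison. As a minimal sketch (the type and function names are ours, not the project's actual code): render a test frame for a known eye, let the sensor report which shutter-glass eye was actually open, and swap the buffers if they disagree.

```cpp
enum class Eye { Left, Right };

// Minimal sketch of the eye-order check: CalVR renders a test frame for a
// known eye while the Arduino's light sensor reports which shutter-glass
// eye was actually open during that frame. If they disagree, the left and
// right stereo buffers must be swapped before normal rendering starts.
bool needsEyeSwap(Eye rendered, Eye sensed) {
    return rendered != sensed;
}
```

In practice one would sample several frames and take a majority vote, since a single sensor reading can straddle a vsync boundary.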
ZSculpt - 3D Sculpting with the Leap (Thinh Nguyen, 2013)
The goal of this project is to explore the use of the Leap Motion device for 3D sculpting. |
Magic Lens (Tony Chan, Michael Chao, 2013)
The goal of this project is to research the use of smart phones in a virtual reality environment. |
Pose Estimation for a Mobile Device (Kuen-Han Lin, 2013)
The goal of this project is to develop an algorithm which runs on a PC to estimate the pose of a mobile Android device connected via Wi-Fi.
PanoView360 (Andrew Prudhomme, Dan Sandin, 2010-)
Researchers at UIC/EVL and UCSD/Calit2 have developed a method to acquire very high-resolution, surround, stereo panorama images using dual SLR cameras. This VR application allows viewing these approximately gigabyte-sized images in real time, and supports real-time changes of the viewing direction and zooming.
VOX and Virvo (Jurgen Schulze, 1999-)
Ongoing development of real-time volume rendering algorithms for interactive display at the desktop (DeskVOX) and in virtual environments (CaveVOX). Virvo is the name of the GUI-independent, OpenGL-based volume rendering library which both DeskVOX and CaveVOX use.