Projects
From Immersive Visualization Lab Wiki
Revision as of 18:45, 4 August 2017
Past Projects
Active Projects
Soft Robotic Glove for Haptic Feedback in Virtual Reality (Saurabh Jadhav, 2017)
User interacting with the haptic glove in Virtual Reality
VR Deep Packet Inspector (Marlon West, 2017)
Explore network communications in Virtual Reality.
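As a rough illustration of the capture side (not the project's actual code), the sketch below uses a Linux raw socket to pull out the per-packet fields a VR visualization would most likely map to geometry: source and destination address, protocol, and size. The socket type and header structs are standard Linux APIs; the hand-off to the renderer is left out, and the program needs root to run.

<pre>
// Rough illustration of the capture side of a deep packet inspector on Linux.
// Requires root; handles IPv4 frames only.
#include <arpa/inet.h>
#include <linux/if_ether.h>
#include <netinet/in.h>
#include <netinet/ip.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <cstdio>
#include <vector>

int main()
{
    // AF_PACKET/SOCK_RAW delivers every frame seen by the host's interfaces.
    int sock = socket(AF_PACKET, SOCK_RAW, htons(ETH_P_ALL));
    if (sock < 0) { perror("socket (needs root)"); return 1; }

    std::vector<unsigned char> buf(65536);
    while (true)
    {
        ssize_t n = recv(sock, buf.data(), buf.size(), 0);
        if (n <= (ssize_t)(sizeof(ethhdr) + sizeof(iphdr))) continue;

        const ethhdr* eth = reinterpret_cast<const ethhdr*>(buf.data());
        if (ntohs(eth->h_proto) != ETH_P_IP) continue;   // IPv4 frames only

        const iphdr* ip = reinterpret_cast<const iphdr*>(buf.data() + sizeof(ethhdr));
        char src[INET_ADDRSTRLEN], dst[INET_ADDRSTRLEN];
        inet_ntop(AF_INET, &ip->saddr, src, sizeof(src));
        inet_ntop(AF_INET, &ip->daddr, dst, sizeof(dst));

        // In the real application this record would feed the VR scene;
        // here it is just printed.
        std::printf("%s -> %s  proto=%u  bytes=%zd\n", src, dst, ip->protocol, n);
    }
}
</pre>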
MoveInVR (Lachlan Smith, 2016)
If you want to use the Sony Move as a 3D controller for the PC, look here.
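For a sense of how such a controller feed can be consumed on the PC side, here is a minimal UDP listener sketch. The port number and packet layout are illustrative assumptions, not the actual MoveInVR protocol; see the Playstation Move UDP API page for the real format.

<pre>
// Minimal sketch of a UDP listener for controller pose packets.
// Port and packet layout are assumptions, not the MoveInVR protocol.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <unistd.h>
#include <cstdio>

struct MovePacket          // hypothetical layout: position, orientation, buttons
{
    float pos[3];          // meters
    float quat[4];         // x, y, z, w
    unsigned int buttons;  // bit mask
};

int main()
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(9090);                     // assumed port
    bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    MovePacket pkt;
    while (true)
    {
        ssize_t n = recv(sock, &pkt, sizeof(pkt), 0);
        if (n != (ssize_t)sizeof(pkt)) continue;     // ignore malformed datagrams
        std::printf("pos=(%.2f %.2f %.2f) buttons=%08x\n",
                    pkt.pos[0], pkt.pos[1], pkt.pos[2], pkt.buttons);
    }
    close(sock);
}
</pre>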
CalVR (Andrew Prudhomme, Philip Weber, Jurgen Schulze, since 2010)
CalVR is our virtual reality middleware (a.k.a. VR engine), which we have been developing for our graphics clusters. It runs on anything from a laptop to a large multi-node CAVE, and builds under Linux, Windows and MacOS. More information about how to obtain the code and build it can be found on our main CalVR page. We also wrote a paper on CalVR, and gave a presentation on it.
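For orientation, applications extend CalVR by writing plugins that the kernel loads at startup. The skeleton below follows the pattern used by the plugins shipped with CalVR (a CVRPlugin subclass, an init() hook, and a registration macro); treat the exact header path and names as assumptions and consult the CalVR page for the authoritative plugin API.

<pre>
// Skeleton of a CalVR plugin; exact names are assumptions, see the CalVR page.
#include <cvrKernel/CVRPlugin.h>

class HelloPlugin : public cvr::CVRPlugin
{
public:
    // Called once after the CalVR kernel starts; scene graph nodes, menu
    // entries and config reads would go here.
    bool init()
    {
        return true;
    }
};

// Registers the class so CalVR's plugin manager can load the shared library.
CVRPLUGIN(HelloPlugin)
</pre>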
Fringe Physics (Robert Maloney, 2014-2015)
Boxing Simulator (Russell Larson, 2014)
Parallel Raytracing (Rex West, 2014)
Altered Reality (Jonathan Shamblen, Cody Waite, Zach Lee, Larry Huynh, Liz Cai, 2013)
Focal Stacks (Jurgen Schulze, 2013)
SIO will soon have a new microscope which can generate focal stacks faster than before. We are working on algorithms to visualize and analyze these focal stacks.
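One common building block for such analysis is a per-pixel focus measure. The sketch below is an assumed approach, not the lab's actual pipeline: it uses OpenCV to score each slice by its box-filtered squared Laplacian response and composites an all-in-focus image by keeping, per pixel, the sharpest slice. File names and the 9x9 window are placeholders.

<pre>
// Assumed focal-stack operation: focus measure + all-in-focus composite.
#include <opencv2/opencv.hpp>
#include <string>
#include <vector>

int main()
{
    // Load the stack (hypothetical file names slice0.png, slice1.png, ...).
    std::vector<cv::Mat> stack;
    for (int i = 0; ; ++i)
    {
        cv::Mat img = cv::imread("slice" + std::to_string(i) + ".png");
        if (img.empty()) break;
        stack.push_back(img);
    }
    if (stack.empty()) return 1;

    cv::Mat bestFocus(stack[0].size(), CV_32F, cv::Scalar(-1));
    cv::Mat composite = stack[0].clone();

    for (const cv::Mat& slice : stack)
    {
        cv::Mat gray, lap, focus;
        cv::cvtColor(slice, gray, cv::COLOR_BGR2GRAY);
        cv::Laplacian(gray, lap, CV_32F);
        // Local focus measure: box-filtered squared Laplacian response.
        cv::boxFilter(lap.mul(lap), focus, CV_32F, cv::Size(9, 9));

        // Keep, per pixel, the slice with the strongest focus response so far.
        cv::Mat mask = focus > bestFocus;
        focus.copyTo(bestFocus, mask);
        slice.copyTo(composite, mask);
    }

    cv::imwrite("all_in_focus.png", composite);
    return 0;
}
</pre>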
Android Head Tracking (Ken Dang, 2013)
The goal of this project is to create an Android app that uses face detection algorithms to enable head tracking on mobile devices.
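The core idea can be prototyped on a desktop before moving to Android. The OpenCV sketch below is illustrative only (the project itself targets the Android camera and face APIs): it detects a face with a Haar cascade and converts the detection into a rough head offset plus a crude depth cue, the kind of input a renderer needs for head-coupled perspective. The cascade file and camera index are assumptions.

<pre>
// Desktop illustration of face-detection-based head tracking (project targets Android).
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main()
{
    cv::CascadeClassifier face;
    if (!face.load("haarcascade_frontalface_default.xml")) return 1;

    cv::VideoCapture cam(0);
    cv::Mat frame, gray;
    while (cam.read(frame))
    {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> faces;
        face.detectMultiScale(gray, faces, 1.1, 5);
        if (faces.empty()) continue;

        const cv::Rect& f = faces[0];
        // Normalized offset of the head from the image center.
        float x = (f.x + f.width * 0.5f) / frame.cols - 0.5f;
        float y = (f.y + f.height * 0.5f) / frame.rows - 0.5f;
        // A larger face is closer; use apparent size as an approximate depth cue.
        float depth = (float)frame.cols / f.width;

        std::printf("head offset x=%.2f y=%.2f depth~%.2f\n", x, y, depth);
    }
    return 0;
}
</pre>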
Automatic Stereo Switcher for the ZSpace (Matt Kubasak, Thomas Gray, 2013-2014)
ZSculpt - 3D Sculpting with the Leap (Thinh Nguyen, 2013)
The goal of this project is to explore the use of the Leap Motion device for 3D sculpting.
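Independent of the Leap SDK specifics, the sculpting step itself can be sketched as a brush that displaces mesh vertices near the tracked fingertip. The function below is an illustrative assumption about one such brush (a push along the vertex normal with a cosine falloff), not the project's code; fingertip positions would come from the Leap each frame.

<pre>
// Illustrative sculpting brush, independent of the Leap SDK.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static float dist(const Vec3& a, const Vec3& b)
{
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

// Apply one "push" stroke centered on the fingertip.
void sculpt(std::vector<Vec3>& vertices, const std::vector<Vec3>& normals,
            const Vec3& fingertip, float radius, float strength)
{
    for (std::size_t i = 0; i < vertices.size(); ++i)
    {
        float d = dist(vertices[i], fingertip);
        if (d > radius) continue;
        // Cosine falloff: full strength at the center, zero at the brush rim.
        float w = 0.5f * (1.0f + std::cos(3.14159265f * d / radius));
        vertices[i].x += normals[i].x * strength * w;
        vertices[i].y += normals[i].y * strength * w;
        vertices[i].z += normals[i].z * strength * w;
    }
}

int main()
{
    // Tiny demo: a single vertex at the origin with a +Z normal.
    std::vector<Vec3> verts = {{0.f, 0.f, 0.f}};
    std::vector<Vec3> norms = {{0.f, 0.f, 1.f}};
    sculpt(verts, norms, Vec3{0.f, 0.f, 0.f}, 0.1f, 0.01f);
    return 0;
}
</pre>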
Magic Lens (Tony Chan, Michael Chao, 2013)
The goal of this project is to research the use of smart phones in a virtual reality environment.
Pose Estimation for a Mobile Device (Kuen-Han Lin, 2013)
The goal of this project is to develop an algorithm which runs on a PC to estimate the pose of a mobile Android device, linked via Wi-Fi.
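A typical way to recover such a pose on the PC is from 2D-3D point correspondences, for example the corners of a known marker seen by the device's camera and sent over the network. The sketch below shows that step with OpenCV's solvePnP; the marker size, camera intrinsics and point values are made-up placeholders, and the Wi-Fi transport is omitted.

<pre>
// Assumed pose-recovery step on the PC side using 2D-3D correspondences.
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main()
{
    // 3D corners of a 10 cm square marker in its own frame (meters).
    std::vector<cv::Point3f> object = {
        {-0.05f, -0.05f, 0.f}, { 0.05f, -0.05f, 0.f},
        { 0.05f,  0.05f, 0.f}, {-0.05f,  0.05f, 0.f}};

    // Matching 2D detections (pixels); in the real system these arrive per frame.
    std::vector<cv::Point2f> image = {
        {310.f, 250.f}, {410.f, 255.f}, {405.f, 355.f}, {305.f, 350.f}};

    // Assumed pinhole intrinsics for a 640x480 camera.
    cv::Mat K = (cv::Mat_<double>(3, 3) << 500, 0, 320,
                                             0, 500, 240,
                                             0,   0,   1);
    cv::Mat rvec, tvec;
    cv::solvePnP(object, image, K, cv::Mat(), rvec, tvec);

    std::printf("t = (%.3f, %.3f, %.3f) m\n",
                tvec.at<double>(0), tvec.at<double>(1), tvec.at<double>(2));
    return 0;
}
</pre>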
Multi-User Graphics with Interactive Control (MUGIC) (Shahrokh Yadegari, Philip Weber, Andy Muehlhausen, 2012-)
MUGIC gives users simplified and versatile access to CalVR systems via network rendering commands. Users can create computer graphics in their own environments and easily display the output on any CalVR wall or system. See the project in action, and a condensed lecture on the mechanisms.
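To make the idea concrete, the sketch below shows what a client-side rendering command could look like at its simplest: a short text message sent over UDP to the machine driving the display, which would parse it and add the described object to its scene. The host, port and command syntax here are hypothetical; MUGIC defines its own message format.

<pre>
// Sketch of a client sending a graphics command over the network.
// Host, port and command syntax are hypothetical, not MUGIC's format.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <string>

int main()
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(9000);                          // assumed port
    inet_pton(AF_INET, "192.168.1.50", &dest.sin_addr);   // assumed render host

    // One object per message: name, primitive, position (x y z), size.
    std::string cmd = "sphere1 sphere 0.0 1.5 -2.0 0.25\n";
    sendto(sock, cmd.data(), cmd.size(), 0,
           reinterpret_cast<sockaddr*>(&dest), sizeof(dest));
    close(sock);
    return 0;
}
</pre>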