Project2S18

From Immersive Visualization Lab Wiki

Revision as of 10:03, 21 April 2018

Levels of Immersion

In this project we are going to explore different levels of immersion with the Oculus Rift. As in project 1, we will use the Oculus SDK, OpenGL, and C++.

Starter Code

We recommend starting either with your project 1 code or by going back to the starter code. The starter code is very compact, yet it uses most of the API functions you will need.

Here is the minimal example .cpp file, with comments on where the hooks are for the various functions required for this homework project. Note that this file has a main() function, so it can be built as a console application and will start up with a console window into which debug messages can be printed.
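
For illustration, here is a minimal sketch of that entry-point structure (the names are hypothetical, not taken from the starter code): because the entry point is main() rather than WinMain(), the program is a console application, so anything written to standard output is visible alongside the Rift rendering.

  // entry.cpp - sketch of a console-app entry point (hypothetical names).
  #include <cstdio>

  static void runRenderLoop() {
      // Stand-in for the starter code's Rift render loop.
      std::printf("frame rendered\n");
  }

  int main() {
      // A main() entry point gives the program a console window on Windows,
      // so printf/std::cout debug output lands there.
      std::printf("debug console ready\n");
      runRenderLoop();
      return 0;
  }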

Project Description (100 Points)

You need to do the following things:

  1. Modify the starter code (in which many cubes are rendered) to render just two cubes, one behind the other along the user's line of sight.
  2. Use texture mapping to display this calibration image on all faces of both cubes. The image must be right side up on the vertical faces and must not be mirrored. Each cube should be 20cm wide, and its closest face should be about 30cm in front of the user. Here is source code to load the PPM file into memory so that you can apply it as a texture; sample code for texture loading can be found in the OpenVR OpenGL example's function SetupTexturemaps(). (See the texture-loading sketch after this list.)
  3. Render this stereo panorama image around the user as a sky box. The trick here is to render a different texture on the sky box for each eye, which is necessary to achieve a 3D effect. Cycle with the 'X' button between showing the entire scene (cubes and sky box), just the sky box in stereo, and just the sky box in mono (i.e., one of the panorama images is rendered to both eyes). (See the sky box sketch after this list.)
  4. Cycle between the following five modes with the 'A' button: 3D stereo, mono (the same image rendered to both eyes, from a viewpoint halfway between the eyes), left eye only (right eye black), right eye only (left eye black), and inverted stereo (left eye image rendered to the right eye and vice versa). Regardless of which viewing mode is active, head tracking should continue to work correctly, according to whichever tracking mode is selected as described below. (See the mode-routing sketch after this list.)
  5. Cycle between different head tracking modes with the 'B' button: no tracking (position and orientation frozen to their values from just before the button was pressed), orientation only (position frozen to its value from just before the mode was selected), position only (orientation likewise frozen), and both (the normal tracking mode). (See the tracking sketch after this list.)
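
For step 2, here is a minimal sketch of turning the loaded PPM pixels into an OpenGL texture. It assumes the loader from the provided loadppm.txt returns a tightly packed RGB buffer; the exact loadPPM() signature and its allocator may differ, so verify against the provided file.

  #include <GL/glew.h>

  // Assumed to match the provided loadppm.txt loader; verify the signature.
  unsigned char* loadPPM(const char* filename, int& width, int& height);

  GLuint createCalibrationTexture(const char* filename) {
      int w = 0, h = 0;
      unsigned char* pixels = loadPPM(filename, w, h);   // RGB, 3 bytes/pixel
      if (!pixels) return 0;

      GLuint tex;
      glGenTextures(1, &tex);
      glBindTexture(GL_TEXTURE_2D, tex);
      glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // RGB rows are not 4-byte aligned
      glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                   GL_RGB, GL_UNSIGNED_BYTE, pixels);
      glGenerateMipmap(GL_TEXTURE_2D);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
      glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
      delete[] pixels;   // use the deallocator matching the loader's allocation
      return tex;
  }

PPM stores the top image row first, while OpenGL's texture coordinate origin is at the bottom, so flip either the rows or the texture coordinates to keep the pattern right side up and unmirrored. For the sizing in steps 1 and 2: a 0.2m cube whose center sits 0.4m in front of the viewer has its closest face at the required 0.3m.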
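
One way to organize the per-eye sky box selection for step 3 is sketched below. The names skyboxTexLeft/skyboxTexRight, drawSkybox(), and drawCubes() are hypothetical helpers, not part of the starter code; the two cubemaps are assumed to be built from the left- and right-eye panorama images.

  #include <GL/glew.h>

  enum class SceneMode { FullScene, SkyboxStereo, SkyboxMono };
  SceneMode sceneMode = SceneMode::FullScene;

  GLuint skyboxTexLeft = 0, skyboxTexRight = 0;  // cubemaps from the panorama

  void drawSkybox(GLuint cubemap);  // assumed helpers, not in the starter code
  void drawCubes();

  // Called once per eye per frame (0 = left eye, 1 = right eye).
  void renderEye(int eye) {
      // Stereo: each eye samples its own cubemap. Mono: both eyes share the
      // same panorama image, so the sky box loses its 3D effect.
      GLuint sky = (sceneMode == SceneMode::SkyboxMono)
                       ? skyboxTexLeft
                       : (eye == 0 ? skyboxTexLeft : skyboxTexRight);
      drawSkybox(sky);
      if (sceneMode == SceneMode::FullScene) drawCubes();
  }

  // 'X' button: full scene -> sky box stereo -> sky box mono -> full scene ...
  void onXButtonPressed() {
      sceneMode = static_cast<SceneMode>((static_cast<int>(sceneMode) + 1) % 3);
  }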
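
For the 'A' button modes in step 4, here is a sketch of the per-eye routing logic. Camera::Black means clear that eye's buffer to black and skip the scene; Camera::Center means render from a viewpoint halfway between the two eyes. The enum names and cycling order are illustrative choices, not mandated by the assignment.

  enum class ViewMode { Stereo, Mono, LeftOnly, RightOnly, Inverted };
  enum class Camera { Left, Right, Center, Black };

  ViewMode viewMode = ViewMode::Stereo;

  // Decide which camera feeds a given physical eye (0 = left, 1 = right).
  Camera cameraFor(int physicalEye) {
      switch (viewMode) {
          case ViewMode::Stereo:    return physicalEye == 0 ? Camera::Left : Camera::Right;
          case ViewMode::Mono:      return Camera::Center;  // halfway between the eyes
          case ViewMode::LeftOnly:  return physicalEye == 0 ? Camera::Left : Camera::Black;
          case ViewMode::RightOnly: return physicalEye == 1 ? Camera::Right : Camera::Black;
          case ViewMode::Inverted:  return physicalEye == 0 ? Camera::Right : Camera::Left;
      }
      return Camera::Center;
  }

  // 'A' button: advance to the next viewing mode, wrapping back to stereo.
  void onAButtonPressed() {
      viewMode = static_cast<ViewMode>((static_cast<int>(viewMode) + 1) % 5);
  }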
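
For step 5, a sketch of the tracking-mode logic using GLM types: the head pose is captured at the moment the 'B' button is pressed, so whichever components the new mode freezes hold the values from just before the switch. How you obtain the live pose from the Oculus SDK each frame is left out here.

  #include <glm/glm.hpp>
  #include <glm/gtc/quaternion.hpp>

  enum class TrackMode { None, OrientationOnly, PositionOnly, Both };
  TrackMode trackMode = TrackMode::Both;

  glm::vec3 frozenPos(0.0f);
  glm::quat frozenOri(1.0f, 0.0f, 0.0f, 0.0f);  // identity (w, x, y, z)

  // 'B' button: capture the current pose, then advance the mode, so the
  // frozen components keep the values from just before the switch.
  void onBButtonPressed(const glm::vec3& livePos, const glm::quat& liveOri) {
      frozenPos = livePos;
      frozenOri = liveOri;
      trackMode = static_cast<TrackMode>((static_cast<int>(trackMode) + 1) % 4);
  }

  // Build the pose actually used for rendering from live and frozen parts.
  void effectivePose(const glm::vec3& livePos, const glm::quat& liveOri,
                     glm::vec3& outPos, glm::quat& outOri) {
      bool usePos = (trackMode == TrackMode::Both || trackMode == TrackMode::PositionOnly);
      bool useOri = (trackMode == TrackMode::Both || trackMode == TrackMode::OrientationOnly);
      outPos = usePos ? livePos : frozenPos;
      outOri = useOri ? liveOri : frozenOri;
  }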

Notes:

* Cycling means that each time the respective button is pressed, the mode changes from one mode to the next, and eventually wraps back around to the first mode.

* The view in the Rift always needs to look like the control window on the screen: the render texture should never shift off the display panels in the Rift.

* To get the full score you need to pass the following test during grading: we may test the functionality of the A and B buttons by having you put on the Rift while the grader selects a mode. You will need to say which mode the system is in; if you can't, 5 points will be deducted.

More to follow...