Levels of Immersion
In this project we are going to explore different levels of immersion with your VR headset.
Project Description (100 Points)
We recommend that you start with your project 2 files and remove the measurement code. You may find these files helpful to get started: Project 3 Files
1. Cube: Render a 50 cm wide cube in front of the user, at a distance of about 1.5 meters. You can give the cube any color or texture you would like. (5 points)
2. 3D Sky box: Render a stereoscopic sky box using these skybox textures and the custom shader provided as part of the starter code. (Alternatively, you can use these higher resolution skybox textures.) Here is a great tutorial on 3D skyboxes in Unity. (15 points)
3. Field of View: Reduce the field of view of both eyes to half of the original size (e.g., if it is 90 degrees originally, it should now be 45 degrees). You can do this by rendering a black frame on top of your left and right eye views, with the center being transparent. Toggle this mode on/off with the 'X' button. (10 points total)
4. Stereo modes: Cycle between the following five modes with the 'A' button: 3D stereo, mono (the same image rendered to both eyes), left eye only (right eye black), right eye only (left eye black), inverted stereo (left eye image rendered to the right eye and vice versa). Regardless of which stereo mode is active, head tracking should keep working correctly, according to whichever tracking mode (described below) is selected. (10 points, 2 points per mode)
5. Head tracking: Cycle between different head tracking modes with the 'B' button: regular tracking (both position and orientation), orientation only (position frozen to its value just before the mode was selected), position only (orientation frozen to its value just before the mode was selected), no tracking (position and orientation frozen to their values when the user pressed the button). (A sketch follows this list.) (20 points, 5 points per tracking mode)
6. Variable IOD: Gradually vary the interocular distance (IOD) with the right thumb stick left/right. Pushing down on the thumb stick should reset the IOD to the default. You'll have to learn how the Oculus SDK specifies the IOD: it doesn't use a single separation value; instead, each eye has a 3D offset from a central point. Find out what these offsets are in the default case, and modify only their horizontal components to change the IOD, leaving the other two components untouched. Support an (absolute) IOD range from -0.1 m (left/right eyes swapped) to +0.3 m. (A sketch follows this list.) (15 points)
7. Tracking Lag: Explore what a lag (i.e., time delay) in tracking (head and controllers) would look like. Start by rendering a sphere or controller model at your dominant hand's controller position. Then, instead of using the current camera and controller matrices you get from the Oculus SDK for rendering, save them into a ring buffer (or other FIFO data structure) with at least 30 entries, and replace the actual head and controller poses with those from an earlier frame in the ring buffer. The default lag should be zero frames; every click of the right index trigger should add one frame of tracking lag, and the left index trigger should reduce the tracking lag by one frame. Display the number of frames of lag with a label, e.g., "Tracking lag: 0 frames". (A sketch follows this list.) (15 points)
8. Rendering lag: Explore what it would look like if rendering a frame took longer than 1/90th of a second (one frame at the Oculus Rift's 90 Hz refresh rate). The default is no additional delay, but every time the user pulls the right middle finger trigger, add one frame of rendering lag. This means that for as many frames as the current delay, you skip rendering updated images. The left middle finger trigger should reduce the rendering delay by one frame. Display the rendering delay (i.e., the number of frames a rendered image is repeated for) with a label such as "Rendering delay: 2 frames". Allow for up to 10 frames of delay. (A sketch follows this list.) (10 points)
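For part 5 (head tracking), the mode bookkeeping can be kept separate from the rendering plumbing. The sketch below is a minimal, hypothetical helper in Unity C#: it only shows cycling through the four modes and choosing which pose components stay frozen; how the resulting pose is fed back into rendering (for example by offsetting the camera rig's parent to cancel the unwanted tracking) depends on your rig and is not shown.

```csharp
using UnityEngine;

// Sketch only: mode cycling and pose selection; names are made up for illustration.
public enum TrackingMode { Full, OrientationOnly, PositionOnly, None }

public class HeadTrackingModes
{
    public TrackingMode Mode { get; private set; } = TrackingMode.Full;
    private Vector3 frozenPosition;
    private Quaternion frozenRotation;

    // Call when the 'B' button is pressed, passing the currently tracked pose.
    // The pose at the moment of the press becomes the frozen reference.
    public void CycleMode(Vector3 trackedPos, Quaternion trackedRot)
    {
        frozenPosition = trackedPos;
        frozenRotation = trackedRot;
        Mode = (TrackingMode)(((int)Mode + 1) % 4);
    }

    // Returns the pose that should be used for rendering this frame.
    public void GetEffectivePose(Vector3 trackedPos, Quaternion trackedRot,
                                 out Vector3 pos, out Quaternion rot)
    {
        pos = (Mode == TrackingMode.Full || Mode == TrackingMode.PositionOnly)
              ? trackedPos : frozenPosition;
        rot = (Mode == TrackingMode.Full || Mode == TrackingMode.OrientationOnly)
              ? trackedRot : frozenRotation;
    }
}
```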
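For part 6 (variable IOD), here is a minimal sketch of the control logic. It assumes you have Transform references for the two eyes and that your rig actually renders each eye from those Transforms; with a stock OVRCameraRig the SDK drives the eye poses itself, so you may need to hook into the starter code differently. The OVRInput calls reflect the Oculus Unity integration as we understand it, and the 0.064 m default is only a placeholder; look up the default offsets your SDK actually reports.

```csharp
using UnityEngine;

// Sketch only: adjusts the horizontal eye offsets; all names are illustrative.
public class IodController : MonoBehaviour
{
    public Transform leftEye;            // assign your left eye anchor in the Inspector
    public Transform rightEye;           // assign your right eye anchor in the Inspector
    public float defaultIod = 0.064f;    // placeholder default separation, in meters
    public float changePerSecond = 0.05f;

    private float iod;

    void Start() { iod = defaultIod; }

    void LateUpdate()
    {
        // Right thumbstick X gradually changes the IOD; clicking the stick resets it.
        float x = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick).x;
        iod = Mathf.Clamp(iod + x * changePerSecond * Time.deltaTime, -0.1f, 0.3f);
        if (OVRInput.GetDown(OVRInput.Button.SecondaryThumbstick))
            iod = defaultIod;

        // Only the horizontal (x) offsets change; y and z stay untouched.
        Vector3 l = leftEye.localPosition;
        Vector3 r = rightEye.localPosition;
        l.x = -iod * 0.5f;
        r.x = +iod * 0.5f;
        leftEye.localPosition = l;
        rightEye.localPosition = r;
    }
}
```

Negative values of `iod` put the left eye on the right and vice versa, which gives the swapped-eye case at the -0.1 m end of the required range.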
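For part 7 (tracking lag), one possible shape for the FIFO is a fixed-size ring buffer of poses with a lag lookup, sketched below. The Pose6DoF struct and class names are hypothetical; how you capture the tracked poses each frame and where you re-apply the delayed ones depends on your rig.

```csharp
using UnityEngine;

// Sketch only: a position + orientation sample for one tracked device.
public struct Pose6DoF
{
    public Vector3 position;
    public Quaternion rotation;
}

public class PoseRingBuffer
{
    private readonly Pose6DoF[] buffer;
    private int head = -1;   // index of the most recently written entry
    private int count = 0;   // number of valid entries written so far

    public PoseRingBuffer(int capacity = 30)
    {
        buffer = new Pose6DoF[capacity];
    }

    // Call once per frame with the freshly tracked pose.
    public void Push(Pose6DoF pose)
    {
        head = (head + 1) % buffer.Length;
        buffer[head] = pose;
        if (count < buffer.Length) count++;
    }

    // Pose from `lagFrames` frames ago; 0 returns the most recent Push.
    // Push at least once before calling Get.
    public Pose6DoF Get(int lagFrames)
    {
        int lag = Mathf.Clamp(lagFrames, 0, Mathf.Max(count - 1, 0));
        return buffer[(head - lag + buffer.Length) % buffer.Length];
    }
}
```

Per frame you would Push the freshly tracked head and controller poses into separate buffers, change `lagFrames` by one on each right/left index trigger click, and use `Get(lagFrames)` instead of the live pose both for rendering and for the "Tracking lag: N frames" label.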
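For part 8 (rendering lag), one way to make a rendered image effectively repeat is to hold the scene state and the poses used for rendering unchanged for the desired number of frames, i.e., gate all updates with a frame counter. A rough sketch follows; it assumes OVRInput's hand-trigger buttons correspond to the middle finger (grip) triggers, so adapt the input calls if your setup differs.

```csharp
using UnityEngine;

// Sketch only: gates updates so each image is held for `renderDelay` extra frames.
public class RenderingLagController : MonoBehaviour
{
    public int renderDelay = 0;              // extra frames each image is held (0..10)
    private int framesSinceLastUpdate = 0;

    // Other scripts check this before applying new tracking data or animating
    // objects; when false, they keep last frame's state so the image repeats.
    public bool UpdateThisFrame { get; private set; } = true;

    void Update()
    {
        // Assumed mapping: Secondary = right grip trigger, Primary = left grip trigger.
        if (OVRInput.GetDown(OVRInput.Button.SecondaryHandTrigger))
            renderDelay = Mathf.Min(renderDelay + 1, 10);
        if (OVRInput.GetDown(OVRInput.Button.PrimaryHandTrigger))
            renderDelay = Mathf.Max(renderDelay - 1, 0);

        framesSinceLastUpdate++;
        UpdateThisFrame = framesSinceLastUpdate > renderDelay;
        if (UpdateThisFrame)
            framesSinceLastUpdate = 0;
    }
}
```

Your cube, controller sphere, and tracking-lag logic would all consult `UpdateThisFrame` before reading new poses, which is what makes the controller motion visibly stutter when the delay is large.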
Button assignment overview:
Mode | Button Assignment
--- | ---
Field of View | X button
Stereo Modes | A button
Head Tracking | B button
Variable IOD | Right thumb stick left/right
Tracking Lag | Left/right index finger trigger buttons
Rendering Lag | Left/right middle finger trigger buttons
Grading:
- Part 6 (Variable IOD): 5 points for the IOD control logic, 5 points for correct input processing, 5 points for applying the IOD change in the correct (head) coordinate system
- Part 7 (Tracking Lag): -5 points if the tracking delay affects only the head or only the controllers, but not both
- Part 8 (Rendering Lag): -5 points if the controller still moves smoothly
Extra Credit (up to 10 points)
There are four options for extra credit.
- Stereo Image Viewer: Take two regular, non-panoramic photos, spaced an eye distance (about 65 mm) apart, with a regular camera such as the one in your cell phone. Use the widest angle your camera can be set to, as close to a 90 degree field of view as you can get. Cut the edges off to make the images square and exactly the same size. Modify your texturing code for the cubes so that they support stereo images (showing a different image to each eye). Use your custom images as the textures and render them in stereo so that you see a 3D image on the cube. You may have to make the cube bigger to see a correct stereo image: the image should fill as much of your field of view as the camera's field of view covered when it took the pictures. (5 points)
- Custom Sky Box: Create your own (monoscopic) sky box: Use your cell phone's panorama function to capture a 360 degree panorama picture (or use Google's StreetView app, which is free for Android and iPhone). Process it into cube maps - this on-line conversion tool can do this for you. Texture the sky box with the resulting textures. Note you'll have to download each of the six textures separately. Make it an alternate option to the Bear image when the 'X' button is pressed. (5 points)
- Super-Rotation: Modify the regular orientation tracking so that it exaggerates horizontal head rotations by a factor of two. This means that, starting when the user's head faces straight forward, any rotation to the left or right (i.e., heading) is multiplied by two, and this new head orientation is used to render the image. Do not modify pitch or roll. In this mode the user will be able to look behind them by rotating their head only 90 degrees to either side. Get this mode to work with your skybox and calibration cubes, with tracking fully on and correct stereo rendering. This publication gives more information about this technique. (A sketch follows this list.) (5 points)
- Smoother controller tracking: Render a sphere at your dominant hand's controller position. Move your hand around and notice how the sphere follows it. Now push a previously unused controller button to enter 'Smoothing Mode'. In this mode, calculate the moving average over the past n frames' positional tracking values (we won't average over orientation here). Use this averaged position to render the sphere. Allow the user to change the number of frames (n) you're averaging over within a range of 1 to 45, in increments of 1. Notice how with larger values of n the tracking gets smoother, but there's also more lag between controller motion and the motion of the sphere. (A sketch follows this list.) (5 points)
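For the Super-Rotation option, one possible starting point in Unity C# is to read the tracked head's heading (yaw) and apply that same yaw again to the camera rig's parent, which doubles the effective rotation while leaving pitch and roll alone. The sketch below assumes `headAnchor` follows the tracked head, that "straight forward" corresponds to zero local yaw, and that the script sits on the rig's parent object; a more careful version would also pivot the extra rotation around the head position rather than the parent's origin so the head does not translate.

```csharp
using UnityEngine;

// Sketch only: doubles horizontal head rotation (heading) while in super-rotation mode.
public class SuperRotation : MonoBehaviour
{
    public Transform headAnchor;        // follows the tracked head (e.g. the center eye anchor)
    public bool superRotationOn = false;

    void LateUpdate()
    {
        if (!superRotationOn)
        {
            transform.localRotation = Quaternion.identity;
            return;
        }

        // Heading reported by tracking, measured around the up axis.
        float yaw = headAnchor.localEulerAngles.y;

        // Rotating the parent by the same yaw doubles the rotation the user sees;
        // pitch and roll are left untouched.
        transform.localRotation = Quaternion.Euler(0f, yaw, 0f);
    }
}
```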
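For the smoother-tracking option, the moving average itself is simple; the sketch below keeps the last n positions in a queue and returns their mean. Obtaining the tracked controller position each frame, placing the sphere, and the button that toggles 'Smoothing Mode' are left to your setup, and the class name is made up for illustration.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch only: moving average over the last n positional samples.
public class PositionSmoother
{
    private readonly Queue<Vector3> samples = new Queue<Vector3>();

    // Call once per frame; n is the window size (1..45 per the assignment).
    public Vector3 Smooth(Vector3 trackedPosition, int n)
    {
        samples.Enqueue(trackedPosition);
        while (samples.Count > Mathf.Max(n, 1))
            samples.Dequeue();

        Vector3 sum = Vector3.zero;
        foreach (Vector3 p in samples)
            sum += p;
        return sum / samples.Count;
    }
}
```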
Submission Instructions
Once you are done implementing the project, record a video demonstrating all the functionality you have implemented, just like for projects 1 and 2. Below we are repeating those instructions.
The video should be no longer than 5 minutes, and can be substantially shorter. The video format should ideally be MP4, but any other format the graders can view will also work.
While recording the video, record your voice explaining what aspects of the project requirements are covered.
In this project, since some of the components require rendering different images to the two eyes, you need to use this tool to record video from both of the Oculus Quest's screens: Record Oculus Quest using scrcpy
- On any platform, you should be able to use Zoom to record a video.
- For Windows:
- Windows 10 has a built-in screen recorder
- If that doesn't work, the free OBS Studio is very good.
- On Macs you can use QuickTime.
If you use a Windows computer and can connect your headset to it with a video cable, you can use the Compositor Mirror tool, which is located on your computer at C:\Program Files\Oculus\Support\oculus-diagnostics\OculusMirror.exe. It's important that you record both the left and the right eye.
Components of your submission:
- Video: Upload the video at the Assignment link on Canvas. Also add a text comment stating which functionality you have or have not implemented and what extra credit you have implemented. If you couldn't implement something in its entirety, please state which parts you did implement and expect to get points for.
- Example 1: I've done the base project with no issues. No extra credit.
- Example 2: Everything works except an issue with x: I couldn't get y to work properly.
- Example 3: Sections 1, 2 and 4 are fully implemented.
- Example 4: The base project is complete and I did z for extra credit.
- Source code: Create a .zip file of your Unity project and upload it to Canvas. You can reduce the size of this file by hosting your project on Github with a .gitignore file for Unity, then downloading your project as a .zip file directly from Github.
- Executable: If the .zip file with your Unity project already includes the executable of your app, you are done. Otherwise, build your Unity project into an Android .apk, a Windows .exe, or the Mac equivalent and upload it to Canvas as a .zip file.