Project2S21
From Immersive Visualization Lab Wiki
Revision as of 12:32, 19 April 2021
Characterizing VR Systems
Measure the following parameters of your VR system:
- Eye distance: Use a ruler or tape measure and measure your own inter-pupillary distance (IPD) while standing in front of a mirror.
- Field of View: measure the horizontal and vertical field of view (FOV) of your VR headset. Write an app to set up a measurement scene, and show the values in degrees in your VR app. Calculate the diagonal FOV based on horizontal and vertical FOV, and show that in the app as well.
- Spatial Resolution: spatial resolution is measured in pixels per degree of the FOV (= angular resolution). Create a line pattern of equally wide black lines with white spaces in between them and show it in the headset. With the controller, move the pattern away from you until the lines are no longer distinguishable. Show the value of pixels per degree in your VR app. Do this for vertical and horizontal lines separately, which gives you the angular resolution in the vertical and horizontal directions, respectively.
- Controller tracking precision: place one controller on a table. Put the headset on, look at the controller from a distance of 1 to 2 feet, and measure the position and orientation of the controller over the course of about 5 seconds. Calculate the standard deviation for position and orientation separately.
- Pointing precision with controller: create a sphere 0.1 meters in diameter and place it at a comfortable distance of about 2 feet directly in front of you, at the same height as your head. Point with a virtual laser pointer and click on the sphere 20 times within a maximum of 20 seconds. Count the number of hits. Then move the sphere further away from you and repeat. How far (in meters) can you move the sphere from your eyes before your hit accuracy drops below 50%?
- Closest distance eyes can converge on an object: create an object that can be moved along the Z axis. Place the object 1 meter from your eyes, at the same height as your head, and show the distance of the object from your eyes (i.e., the midpoint between them) on the screen. Move the object closer with the controller in small increments until your eyes can no longer focus on it without uncomfortable eye strain. Take note of this distance. If you can't see stereoscopic 3D, feel free to recruit a friend or family member to do the test.
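The diagonal FOV asked for in the Field of View task can be computed from the horizontal and vertical FOV. A minimal sketch in Python, assuming the headset's view behaves like a flat (rectilinear) image plane; the function name is illustrative:

```python
import math

def diagonal_fov(h_deg: float, v_deg: float) -> float:
    """Diagonal FOV in degrees, given horizontal FOV h_deg and
    vertical FOV v_deg (both in degrees), for a rectilinear projection."""
    h = math.radians(h_deg)
    v = math.radians(v_deg)
    # Half-extents of the image plane at unit distance from the eye.
    half_w = math.tan(h / 2)
    half_h = math.tan(v / 2)
    # The diagonal half-extent follows from the Pythagorean theorem.
    return math.degrees(2 * math.atan(math.hypot(half_w, half_h)))
```

For example, a 90° x 90° view gives a diagonal FOV of about 109.5°, not 127°: the FOVs are angles, so they don't combine with a plain Pythagorean sum.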
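For the Spatial Resolution task, one way to turn the blur distance into pixels per degree: at the distance where the lines just stop being distinguishable, each line covers roughly one pixel, so the reciprocal of the angle one line subtends is the angular resolution. A sketch under that one-line-per-pixel assumption:

```python
import math

def pixels_per_degree(line_width_m: float, distance_m: float) -> float:
    """Angular resolution estimate. line_width_m is the width of one
    black line (equal to one white gap); distance_m is the distance at
    which the pattern just blurs into uniform gray. At that threshold,
    one line ~ one pixel, so ppd = 1 / (angle subtended by one line)."""
    angle_deg = math.degrees(2 * math.atan(line_width_m / (2 * distance_m)))
    return 1 / angle_deg
```

For example, 1 cm lines that blur at 5 meters correspond to roughly 8.7 pixels per degree; the further away the pattern can go before blurring, the higher the angular resolution.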
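For the tracking-precision task, you would log the controller pose each frame in your Unity app; the standard-deviation step itself can be sketched in Python (the per-axis form below assumes position samples as (x, y, z) tuples, and works the same way for Euler-angle orientation samples):

```python
import statistics

def tracking_precision(samples):
    """Per-axis sample standard deviation of recorded controller poses.
    `samples` is a list of (x, y, z) tuples; returns one standard
    deviation per axis. Use the same function on orientation samples
    (e.g., Euler angles in degrees) to get rotational jitter."""
    return tuple(statistics.stdev(axis) for axis in zip(*samples))
```

Reporting the deviation per axis (rather than one pooled number) makes it easy to see whether jitter is isotropic or concentrated along one direction.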
Extra Credit (Max. 10 Points)
Options for extra credit are:
- Devise a method to measure the focal distance of your headset. This may require special hardware such as an SLR lens. (10 points)
More options will be added.
Submission Instructions
Once you are done implementing the project, record a video demonstrating all the functionality you have implemented.
The video should be no longer than 5 minutes, and can be substantially shorter. The video format should ideally be MP4, but any other format the graders can view will also work.
While recording the video, record your voice explaining which aspects of the project requirements are covered. If you use a VR headset, record the video off the mirrored view on your screen.
You don't need video editing software to create the video.
- On any platform, you should be able to use Zoom to record a video.
- For Windows:
- Windows 10 has a built-in screen recorder (the Xbox Game Bar, opened with Win+G)
- If that doesn't work, the free OBS Studio is very good.
- On Macs you can use QuickTime Player.
Components of your submission:
- Video: Upload the video at the Assignment link on Canvas. Also add a text comment stating which functionality you have or have not implemented and what extra credit you have implemented. If you couldn't implement something in its entirety, please state which parts you did implement and expect to get points for.
- Example 1: I've done the base project with no issues. No extra credit.
- Example 2: Everything works except an issue with x: I couldn't get y to work properly.
- Example 3: Sections 1, 2 and 4 are fully implemented.
- Example 4: The base project is complete and I did z for extra credit.
- Source code: Create a .zip file of your Unity project and upload it at the Assignment link on Canvas.
- Executable: If the .zip file with your Unity project already includes the executable of your app, you are done. Otherwise, build your Unity project into an Android .apk, a Windows .exe, or the Mac equivalent and upload it to Canvas as a .zip file.