Project2S21

From Immersive Visualization Lab Wiki
Revision as of 12:18, 23 April 2021 by Jschulze (Talk | contribs)


Characterizing VR Systems

This project's topic is measuring the technical features of your VR system.

You do not need to have a VR system available to start working on this project. We recommend implementing the project in Unity for the desktop first. Once everything works in Unity, you can start testing on your VR device.

The next Discussion on Monday, April 26th will focus on how to get your VR headset to work with Unity. If you get most of the code working by then, you will have plenty of time to finish up the project before the May 2nd deadline.

If you are waiting to receive your VR headset from the CSE department, our goal is to have all headsets delivered by April 24th.

Measuring Tasks (100 Points)

  • Eye distance: Use a ruler or tape measure to measure your own interpupillary distance (IPD) while standing in front of a mirror, or with the help of a friend. Report your IPD (in inches or centimeters), along with how you measured it, in your submission video. (5 points)

Create a Unity app to measure the following parameters of your VR headset. All measurements must be made within a single app, performed in sequence.

  • Field of View: measure the horizontal and vertical field of view (FOV) of your VR headset. Set up a measurement scene in your app and display the measured values in degrees. Calculate the diagonal FOV from the horizontal and vertical FOV and display it in the app as well. (20 points)
  • Spatial Resolution: spatial resolution is measured in pixels per degree of the FOV (= angular resolution). Create a pattern of equally wide black lines with white spaces in between them and show it in the headset. With the controller, move the pattern away from you until the lines are no longer distinguishable. Display the resulting pixels-per-degree value in your VR app. Do this separately for vertical and horizontal lines: vertical lines yield the angular resolution in the horizontal direction, and horizontal lines in the vertical direction. (25 points)
  • Controller tracking precision: place one controller on a table. Put the headset on, look at the controller from a distance of 1-2 feet, and record the position and orientation of the controller over the course of about 5 seconds. Calculate the standard deviation for position and orientation separately. (15 points)
  • Pointing precision with controller: create a sphere 0.1 meters in diameter and place it at a comfortable distance of about 2 feet directly in front of you, at the same height as your head. Point at it with a virtual laser pointer and click on the sphere 20 times within a maximum of 20 seconds. Count the number of hits. Then move the sphere further away from you and repeat. How far (in meters) can you move the sphere from your eyes before your hitting accuracy drops below 50%? (25 points)
  • Closest distance eyes can converge on an object: create an object that can be moved along the Z axis. Place the object 1 meter from your eyes, at the same height as your head, and show the distance of the object from your eyes (i.e., the midpoint between them) on the screen. Move the object closer with the controller in small increments until your eyes can no longer focus on it without uncomfortable eye strain. Take note of this distance. If you can't see stereoscopic 3D, feel free to recruit a friend or family member to do the test. (10 points)
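The arithmetic behind the FOV and spatial-resolution tasks above can be sketched as follows. This is a plain-Python sketch of the math only (your Unity app itself will be in C#); the function names, and the flat-projection-plane assumption used for the diagonal FOV, are illustrative choices, not prescribed by the assignment.

```python
import math

def diagonal_fov_deg(h_fov_deg, v_fov_deg):
    """Diagonal FOV from horizontal and vertical FOV, assuming a flat
    projection plane: convert each FOV to a half-extent on a plane at
    unit distance, combine the extents diagonally, convert back."""
    half_w = math.tan(math.radians(h_fov_deg) / 2)
    half_h = math.tan(math.radians(v_fov_deg) / 2)
    return math.degrees(2 * math.atan(math.hypot(half_w, half_h)))

def angular_size_deg(width_m, distance_m):
    """Visual angle subtended by a pattern of the given width seen
    from the given distance."""
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

def pixels_per_degree(num_lines, pattern_width_m, distance_m):
    """Angular-resolution estimate: at the distance where the equally
    wide lines and gaps just blur together, each line-plus-gap pair
    spans about two pixels, i.e. one pixel per line or gap."""
    ang = angular_size_deg(pattern_width_m, distance_m)
    return 2 * num_lines / ang
```

For example, a headset with 90 degrees of both horizontal and vertical FOV would come out to roughly 109.5 degrees diagonally under this approximation.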
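For the tracking- and pointing-precision tasks, the statistics can be sketched the same way. In Unity you would log the controller's position and rotation every frame for about 5 seconds; the Python below only shows the math on such logged samples, and the function names are illustrative.

```python
import statistics

def position_stddev(samples):
    """Per-axis standard deviation of logged positions.
    samples: list of (x, y, z) tuples recorded over ~5 seconds."""
    return tuple(statistics.stdev(axis) for axis in zip(*samples))

def orientation_stddev(euler_samples_deg):
    """A simple per-axis spread measure for orientation: standard
    deviation of logged Euler angles in degrees. For small jitter
    around a fixed pose this is adequate; large rotations would call
    for quaternion-based statistics instead."""
    return tuple(statistics.stdev(axis) for axis in zip(*euler_samples_deg))

def hit_accuracy(hits, attempts=20):
    """Fraction of successful clicks out of 20 attempts; the sphere
    distance at which this first drops below 0.5 is the value to
    report."""
    return hits / attempts
```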

Extra Credit (Max. 10 Points)

Options for extra credit are:

  • Create a menu to select each of the six measurements. For eye distance, ask the user to type in their eye distance in millimeters. For the other measurements, show buttons that, when pressed, run the respective measurement code. Once a measurement has been acquired, display it next to its menu button. (10 points)
  • Devise a method to measure the focal distance of your VR headset. This is not easy, and will likely require special hardware, such as an SLR lens. Report on your measuring technique and result as part of the submission video. (10 points)

More options will be added.

Submission Instructions

Once you are done implementing the project, record a video demonstrating all the functionality you have implemented.

The video should be no longer than 5 minutes, and can be substantially shorter. The video format should ideally be MP4, but any other format the graders can view will also work.

While recording the video, narrate which aspects of the project requirements are covered. If you use a VR headset, record the video off the screen.

You do not need video editing software to create the video.

Components of your submission:

  • Video: Upload the video at the Assignment link on Canvas. Also add a text comment stating which functionality you have or have not implemented and what extra credit you have implemented. If you couldn't implement something in its entirety, please state which parts you did implement and expect to get points for.
    • Example 1: I've done the base project with no issues. No extra credit.
    • Example 2: Everything works except an issue with x: I couldn't get y to work properly.
    • Example 3: Sections 1, 2 and 4 are fully implemented.
    • Example 4: The base project is complete and I did z for extra credit.
  • Source code: Create a .zip file of your Unity project and upload it to Canvas. You can reduce the size of this file by hosting your project on GitHub with a Unity-specific .gitignore file, then downloading the project as a .zip file directly from GitHub.
  • Executable: If the .zip file with your Unity project includes the executable of your app, you are done. Otherwise, build your Unity project into an Android .apk, a Windows .exe, or the Mac equivalent, and upload it to Canvas as a .zip file.
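As a starting point for the GitHub tip above, a minimal Unity .gitignore typically excludes the folders Unity regenerates automatically. The entries below are the commonly used ones, not an official or exhaustive list; adjust for your Unity version and editor:

```
# Unity-generated folders that are rebuilt automatically
Library/
Temp/
Obj/
Logs/
UserSettings/
Build/
Builds/

# IDE caches and files Unity regenerates
.vs/
.idea/
*.csproj
*.sln
```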