Homework Assignment 3: Classroom Design Tool

Prerequisites:

  • Windows or Mac PC
  • Unity
  • GitHub Repo (accept from GitHub Classroom link)
  • VR headset with two 3D-tracked controllers, such as the Oculus Quest 2, Rift (S), Vive, etc.

Learning objectives:

  • Physics in Unity
  • Selection and manipulation with virtual hand and ray-casting
  • Travel with the grabbing-the-air technique

For this assignment you can obtain 100 points, plus up to 10 points of extra credit.

The goal of this assignment is to create a 3D application which can help with the design of a classroom such as UCSD's VR lab in the CSE building.

All interaction (except otherwise noted) has to be done with the VR system (headset and controllers). The user is not allowed to use keyboard or mouse.

We will give an introduction to this project in discussion on Monday, February 8th at 4pm.

Getting Unity ready for VR

To enable Oculus VR support in Unity, use the Oculus Integration package from the Unity Asset Store. For SteamVR-compatible devices, there is a separate Unity asset. For other VR systems there should also be assets available. Contact us if you can't make your headset work.
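
If you use the Oculus Integration package, controller input is typically polled through its OVRInput class. The minimal sketch below only logs the buttons this assignment uses (A/B to spawn furniture, X to switch interaction modes, trigger and grip for grabbing); the class name and the exact button-to-action mapping are placeholders, and other SDKs expose equivalent input APIs.

  using UnityEngine;

  // Minimal input-polling sketch, assuming the Oculus Integration package and
  // an OVRCameraRig in the scene. With the combined Touch controllers,
  // Button.One/Two map to A/B (right hand) and Button.Three maps to X (left hand).
  public class InputProbe : MonoBehaviour
  {
      void Update()
      {
          if (OVRInput.GetDown(OVRInput.Button.One))   Debug.Log("A pressed (spawn chair)");
          if (OVRInput.GetDown(OVRInput.Button.Two))   Debug.Log("B pressed (spawn desk)");
          if (OVRInput.GetDown(OVRInput.Button.Three)) Debug.Log("X pressed (switch interaction mode)");

          // Analog trigger and grip values of the right controller, in [0, 1].
          float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.RTouch);
          float grip    = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger,  OVRInput.Controller.RTouch);
          if (trigger > 0.5f || grip > 0.5f)
              Debug.Log($"trigger={trigger:F2} grip={grip:F2}");
      }
  }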

Selection and Manipulation

Write a 3D application which puts the user in a 3D model of the VR lab at 1:1 scale (i.e., life size). This ZIP file contains the room, as well as the furniture needed for this project.

[Image: Vr-lab.jpg — the VR lab room model]

  • Initially, place the user roughly in the center of the empty lab room (without any of the furniture in it).
  • Enable collision detection and physics in Unity, and give the furniture and the room appropriate physics properties and colliders.
  • Implement the following two 3D selection and manipulation methods: ray-casting and virtual hand interaction. Switch between them with the X button on your controller (a mode-switching sketch follows this list).
  • Each method needs to allow the user to create a piece of furniture, place it in the desired location, and rotate it to the desired orientation.
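
Below is a minimal sketch of the mode switch mentioned in the list above, assuming the Oculus Integration package. The two referenced components stand in for whatever scripts implement your ray-casting and virtual-hand interaction (for example the sketches in the Ray-Casting and Virtual Hand sections below); only one of them is enabled at a time.

  using UnityEngine;

  // Toggles between the ray-casting and virtual-hand interaction scripts with
  // the X button. The two fields are assigned in the Inspector and are
  // placeholders for your own interactor components.
  public class InteractionModeSwitcher : MonoBehaviour
  {
      [SerializeField] MonoBehaviour rayCastInteractor;
      [SerializeField] MonoBehaviour virtualHandInteractor;

      bool useRay = true;

      void Start()
      {
          Apply();
      }

      void Update()
      {
          // Button.Three is the X button on the left Touch controller.
          if (OVRInput.GetDown(OVRInput.Button.Three))
          {
              useRay = !useRay;
              Apply();
          }
      }

      void Apply()
      {
          rayCastInteractor.enabled = useRay;
          virtualHandInteractor.enabled = !useRay;
      }
  }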

The bullet points above are the high-level goals. In detail you will need to deal with the following:

Spawning of Furniture

When the user presses the A button on the controller, a chair is created on the ray, about 2 meters from the user's hand. When the user presses B, a desk is created. While the user holds down the spawn button, the desk or chair stays on the ray so that it can be positioned and oriented as desired. Once the button is released, the desk or chair falls, pulled by gravity, and comes to rest on the floor or on another piece of furniture. Repeated presses of A or B should create more pieces of furniture of the respective type.
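
One possible way to wire this up is sketched below, under the assumption that the chair and desk prefabs already carry a Rigidbody and a Collider, and that handAnchor points at the dominant controller (e.g. the RightHandAnchor of an OVRCameraRig). All names are placeholders; the key idea is to keep the spawned object kinematic while it rides on the ray and to hand it back to the physics engine on release.

  using UnityEngine;

  // Spawning sketch: A spawns a chair, B spawns a desk, both about 2 m out on
  // the hand's ray. The piece stays kinematic while the spawn button is held
  // and is released to the physics engine when the button goes up.
  public class FurnitureSpawner : MonoBehaviour
  {
      [SerializeField] Transform handAnchor;      // dominant controller anchor, e.g. RightHandAnchor
      [SerializeField] GameObject chairPrefab;    // must have Rigidbody + Collider
      [SerializeField] GameObject deskPrefab;     // must have Rigidbody + Collider
      [SerializeField] float spawnDistance = 2f;

      GameObject held;                // piece currently riding on the ray
      OVRInput.Button heldButton;     // which spawn button is being held

      void Update()
      {
          if (held == null)
          {
              if (OVRInput.GetDown(OVRInput.Button.One)) Spawn(chairPrefab, OVRInput.Button.One);
              else if (OVRInput.GetDown(OVRInput.Button.Two)) Spawn(deskPrefab, OVRInput.Button.Two);
          }
          else
          {
              // Keep the object on the ray while the button is held down.
              held.transform.position = handAnchor.position + handAnchor.forward * spawnDistance;
              held.transform.rotation = handAnchor.rotation;

              if (OVRInput.GetUp(heldButton))
              {
                  // Release: gravity and collisions take over.
                  held.GetComponent<Rigidbody>().isKinematic = false;
                  held = null;
              }
          }
      }

      void Spawn(GameObject prefab, OVRInput.Button button)
      {
          Vector3 pos = handAnchor.position + handAnchor.forward * spawnDistance;
          held = Instantiate(prefab, pos, handAnchor.rotation);
          held.GetComponent<Rigidbody>().isKinematic = true;   // no physics while on the ray
          heldButton = button;
      }
  }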

Ray-Casting

  • Display a line starting at your dominant hand's controller. The ray should point forward from the controller, much like a laser pointer would. The ray should be long enough to reach all of the walls of the lab.
  • To select one of the desks or chairs in the room: find out which piece of furniture is intersected by the ray and highlight it. Update the highlight when the ray intersects a different piece of furniture. If the ray intersects multiple objects, highlight the one closest to the controller. Make sure you ignore the user's avatar in the intersection test. You can choose any highlighting method you like, such as a wireframe box around the collider, a halo, a change of the object's color, a pulsating effect, etc.
  • Allow manipulation of the highlighted object when the user pulls the trigger button of the dominant hand's controller: move the object with the ray until the trigger button is released. The motion should resemble that of a marshmallow held on a stick over a campfire. When the trigger button is released, the physics engine should take over and make the object fall, just like when it was initially spawned. (A sketch of this interaction follows this list.)
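
The following sketch covers the ray-casting interaction described above, assuming the Oculus Integration package, a LineRenderer on the same GameObject for the visible ray, and a dedicated layer for the furniture so the intersection test ignores the user's avatar. Highlighting is done here by tinting the material; any of the methods listed above would work. Field and layer names are assumptions.

  using UnityEngine;

  // Ray-casting sketch: draws a laser-pointer ray from the dominant hand,
  // highlights the closest intersected piece of furniture, and moves it along
  // the ray at a fixed distance ("marshmallow on a stick") while the trigger
  // is held.
  [RequireComponent(typeof(LineRenderer))]
  public class RayCastInteractor : MonoBehaviour
  {
      [SerializeField] Transform handAnchor;      // dominant controller anchor
      [SerializeField] float rayLength = 15f;     // long enough to reach the walls
      [SerializeField] LayerMask furnitureMask;   // e.g. only the "Furniture" layer

      LineRenderer line;
      Renderer highlighted;
      Color originalColor;
      Transform grabbed;
      float grabDistance;

      void Awake()
      {
          line = GetComponent<LineRenderer>();
          line.positionCount = 2;
      }

      void Update()
      {
          Vector3 origin = handAnchor.position;
          Vector3 dir = handAnchor.forward;
          line.SetPosition(0, origin);
          line.SetPosition(1, origin + dir * rayLength);

          bool triggerHeld = OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

          if (grabbed != null)
          {
              // Marshmallow on a stick: keep the object at a fixed distance on the ray.
              grabbed.position = origin + dir * grabDistance;
              if (!triggerHeld)
              {
                  grabbed.GetComponent<Rigidbody>().isKinematic = false;   // drop it
                  grabbed = null;
              }
              return;
          }

          // Physics.Raycast returns the closest hit along the ray.
          if (Physics.Raycast(origin, dir, out RaycastHit hit, rayLength, furnitureMask)
              && hit.collider.attachedRigidbody != null)
          {
              Highlight(hit.collider.attachedRigidbody.GetComponentInChildren<Renderer>());
              if (triggerHeld)
              {
                  grabbed = hit.collider.attachedRigidbody.transform;
                  grabbed.GetComponent<Rigidbody>().isKinematic = true;    // follow the ray, no physics
                  grabDistance = hit.distance;
              }
          }
          else
          {
              Highlight(null);
          }
      }

      void Highlight(Renderer r)
      {
          if (r == highlighted) return;
          if (highlighted != null) highlighted.material.color = originalColor;
          highlighted = r;
          if (r != null)
          {
              originalColor = r.material.color;
              r.material.color = Color.yellow;
          }
      }
  }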

Virtual Hand

  • Display a sphere at your dominant hand's controller. The sphere should be about 0.1 meters in diameter. Add a collider to the sphere.
  • When the sphere collides with a piece of furniture, highlight it. When the user pulls the trigger on the controller, start moving the piece of furniture with the hand. The interaction should behave much like ray-casting with a very short ray (a sketch follows this list).
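
A sketch of the virtual-hand interaction follows, assuming the sphere is parented to the dominant controller anchor and carries a SphereCollider with isTrigger enabled plus a kinematic Rigidbody, so that trigger events fire against the furniture's colliders. Names are placeholders.

  using UnityEngine;

  // Virtual-hand sketch: highlights furniture the hand sphere touches and, while
  // the trigger is held, lets the touched piece ride along with the hand.
  public class VirtualHandInteractor : MonoBehaviour
  {
      Renderer touchedRenderer;    // renderer of the currently highlighted furniture
      Transform touchedRoot;       // root transform of that furniture (has the Rigidbody)
      Transform grabbed;
      Color originalColor;

      void OnTriggerEnter(Collider other)
      {
          if (other.attachedRigidbody == null) return;   // only furniture with rigidbodies
          touchedRoot = other.attachedRigidbody.transform;
          touchedRenderer = touchedRoot.GetComponentInChildren<Renderer>();
          if (touchedRenderer != null)
          {
              originalColor = touchedRenderer.material.color;
              touchedRenderer.material.color = Color.yellow;   // highlight on contact
          }
      }

      void OnTriggerExit(Collider other)
      {
          if (touchedRoot != null && other.attachedRigidbody != null &&
              other.attachedRigidbody.transform == touchedRoot)
          {
              if (touchedRenderer != null) touchedRenderer.material.color = originalColor;
              touchedRenderer = null;
              touchedRoot = null;
          }
      }

      void Update()
      {
          bool triggerHeld = OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

          if (grabbed == null && triggerHeld && touchedRoot != null)
          {
              // Grab: the furniture rides along with the hand, like a very short ray.
              grabbed = touchedRoot;
              grabbed.GetComponent<Rigidbody>().isKinematic = true;
              grabbed.SetParent(transform);
          }
          else if (grabbed != null && !triggerHeld)
          {
              // Release: detach and hand back to the physics engine.
              grabbed.SetParent(null);
              grabbed.GetComponent<Rigidbody>().isKinematic = false;
              grabbed = null;
          }
      }
  }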

Travel

Implement the grabbing-the-air technique for the user to move themselves through the classroom. Do not check for collisions; the user is allowed to pass through furniture and walls.

Use the grab buttons for this functionality: on Oculus controllers they are the buttons under the middle fingers; on Vive or Microsoft XR controllers they are called grip buttons.
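
One way to sketch the grabbing-the-air technique, assuming the script sits on the camera rig (e.g. an OVRCameraRig) and handAnchor is the dominant controller anchor: while the grip is held, the rig is moved by the inverse of the hand's frame-to-frame displacement, so the world appears to be dragged past the user.

  using UnityEngine;

  // Grabbing-the-air sketch. Attach to the camera rig (e.g. OVRCameraRig).
  public class GrabTheAirLocomotion : MonoBehaviour
  {
      [SerializeField] Transform handAnchor;   // dominant controller anchor (child of the rig)

      Vector3 lastHandPos;

      void Start()
      {
          lastHandPos = handAnchor.position;
      }

      void Update()
      {
          bool gripHeld = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.RTouch);

          if (gripHeld)
          {
              // No collision checks: the user may pass through furniture and walls.
              Vector3 delta = handAnchor.position - lastHandPos;
              transform.position -= delta;
          }

          // Re-read the hand position after the rig has (possibly) moved, so the
          // next frame's delta only contains new tracking motion.
          lastHandPos = handAnchor.position;
      }
  }

Because the hand anchor is a child of the rig, its world position is re-read after the rig has moved, so each frame's delta reflects only the new tracking motion rather than the rig's own displacement.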

Extra Credit (10 Points)

You can choose between the following options for a maximum of 10 points of extra credit.

  1. Add an interaction mode for two-handed scaling of the entire world around you. By clicking and holding down both grab buttons, the user can gradually scale the entire room with all the furniture up or down, to work on a global scale (by scaling down) or more accurately (by scaling up). The room should change its scale proportionally to the distance between the controllers while the grab buttons are held. Add a function to reset to the initial 1:1 scale by pulling both grab buttons simultaneously for less than about half a second. (On Oculus controllers the grab buttons are the ones under the middle fingers; on Vive controllers they are the ones on the sides of the controllers.) A sketch of the scaling follows this list. (4 points)
  2. Implement the Go-Go hand technique and replace the regular virtual hand with it. (4 points)
  3. Allow saving and loading of the furniture configuration. You can use keyboard keys (such as 's' for save and 'l' for load). You need to save to a file and be able to load from the file after quitting the app and restarting it. (4 points)
  4. Create a mini-map of the room through which the user can interact with the furniture. Also add teleportation to wherever the user points on the mini-map. (6 points)
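
For extra credit option 1, here is a sketch of the two-handed scaling, under the assumption that the room and all furniture are parented to a single "world root" object carrying this script, and that leftHand/rightHand are the two controller anchors. The half-second 1:1 reset is left out; all names and the scaling pivot are assumptions.

  using UnityEngine;

  // Two-handed world-scaling sketch (extra credit option 1). While both grips
  // are held, the root's scale follows the ratio of the current to the initial
  // distance between the two controller anchors.
  public class TwoHandedWorldScaler : MonoBehaviour
  {
      [SerializeField] Transform leftHand;     // left controller anchor
      [SerializeField] Transform rightHand;    // right controller anchor

      float startDistance;
      Vector3 startScale;
      bool scaling;

      void Update()
      {
          bool leftGrip  = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.LTouch);
          bool rightGrip = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.RTouch);

          if (leftGrip && rightGrip)
          {
              float distance = Vector3.Distance(leftHand.position, rightHand.position);
              if (!scaling)
              {
                  // Remember the reference distance and scale at the start of the gesture.
                  scaling = true;
                  startDistance = distance;
                  startScale = transform.localScale;
              }
              else
              {
                  // Scale proportionally to how far apart the hands have moved.
                  transform.localScale = startScale * (distance / startDistance);
              }
          }
          else
          {
              scaling = false;
          }
      }
  }

Scaling about the root's own origin is the simplest variant; scaling about the midpoint between the hands feels more natural but also requires shifting the root's position. The 1:1 reset could be added by timing how long both grips stay held and restoring the original scale when they are released within roughly half a second.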