Homework Assignment 2a: VR Classroom Design Tool - Selection and Manipulation

For this assignment you can obtain 100 points, plus up to 10 points of extra credit.

The goal of this assignment is to create a 3D application which can help with the design of a VR classroom.

This assignment is part 1 of a two-part project. The second part is due a week later and will build on this one.

The assignment is to be done individually, not in teams. The due date for this project is Friday, February 1st, 2019 at 3pm.

This project is to be done with the Oculus Rift head-mounted display and at least one Oculus Touch controller. Unless you have your own VR equipment, you should do the project in the VR lab, room B210.

This project will be covered in discussion on Monday, January 28th, 2019.

You can do this project in Unity 3D, UE4, Lumberyard or C++ with OpenGL and optionally OpenSceneGraph.

Unity 3D

To enable Touch controller support in Unity, use the Oculus Integration package from the Unity Asset Store.

Note that if you use Unity, you aren't allowed to use any Unity assets other than the Oculus Integration package, 3D models, and textures without explicit permission from a course staff member.

Selection and Manipulation (100 Points)

In this project, the only permitted interaction devices are the Oculus Rift HMD and one or two Oculus Touch controllers. Keyboard, mouse, and other input devices must not be used once the application is running.

Imagine that you have been tasked with refurnishing the VR lab in room B210. Write a 3D application which puts the user in a 3D model of the lab at 1:1 scale (i.e., life size). This ZIP file contains the room, as well as the furniture needed for this project (note that the ZIP file contains more furniture items than are needed for part 2a).

Vr-lab.jpg

1. Put the user roughly in the center of the empty lab room (without any of the furniture in it initially). (15 Points)

2. Display a ray emanating from your dominant hand's controller. The ray should point forward from the controller, much like a laser pointer would. The ray should be long enough to reach all of the walls of the lab. (15 Points)
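
One way to do this in Unity with the Oculus Integration package is to attach a LineRenderer to the dominant hand's controller anchor of the OVRCameraRig. The following is only a minimal sketch; the script name and field values are placeholders, not a required implementation.

 using UnityEngine;
 
 public class ControllerRay : MonoBehaviour          // attach to the dominant hand's controller anchor
 {
     public float rayLength = 20f;                   // long enough to reach every wall of the lab
     private LineRenderer line;
 
     void Start()
     {
         line = GetComponent<LineRenderer>();        // LineRenderer added to the same GameObject
         line.positionCount = 2;
         line.startWidth = 0.01f;
         line.endWidth = 0.01f;
     }
 
     void Update()
     {
         // The ray starts at the controller and points along its forward axis, like a laser pointer.
         line.SetPosition(0, transform.position);
         line.SetPosition(1, transform.position + transform.forward * rayLength);
     }
 }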

3. Allow the user to spawn desks and chairs: when the user presses the A (or X) button on the dominant hand's controller, a chair is created on the ray at a distance of about 6 feet (2 meters) from the user's hand. When the user presses B (or Y), a desk is created instead (10 points). While the user holds down the spawn button, the desk/chair remains on the ray so that it can be positioned (5 points) and oriented (5 points) as desired. Once the button is released, the desk/chair falls down, pulled by gravity, and comes to rest on the floor, or on another piece of furniture if one happens to be below it (10 points). (Total: 30 points)
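
If you work in Unity with the Oculus Integration package, the spawn logic could look roughly like the sketch below. It assumes the chair and desk prefabs have a Rigidbody and a Collider, and that the script sits on the dominant hand's controller anchor; all names are placeholders.

 using UnityEngine;
 
 public class FurnitureSpawner : MonoBehaviour       // attach to the dominant hand's controller anchor
 {
     public GameObject chairPrefab;                  // assumed prefabs with a Rigidbody and a Collider
     public GameObject deskPrefab;
     public float spawnDistance = 2f;                // about 6 feet / 2 meters along the ray
 
     private GameObject held;                        // object currently riding the ray
     private OVRInput.Button heldButton;             // the button that spawned it
 
     void Update()
     {
         if (held == null)
         {
             // Button.One is A (right hand) or X (left hand); Button.Two is B or Y.
             if (OVRInput.GetDown(OVRInput.Button.One)) Spawn(chairPrefab, OVRInput.Button.One);
             else if (OVRInput.GetDown(OVRInput.Button.Two)) Spawn(deskPrefab, OVRInput.Button.Two);
         }
         else
         {
             // While the spawn button is held, keep the object on the ray so it can be
             // positioned and oriented by moving and rotating the controller.
             held.transform.position = transform.position + transform.forward * spawnDistance;
             held.transform.rotation = transform.rotation;
 
             if (OVRInput.GetUp(heldButton))
             {
                 // Hand the object back to the physics engine so it falls and comes to rest.
                 held.GetComponent<Rigidbody>().isKinematic = false;
                 held = null;
             }
         }
     }
 
     void Spawn(GameObject prefab, OVRInput.Button button)
     {
         held = Instantiate(prefab, transform.position + transform.forward * spawnDistance, transform.rotation);
         held.GetComponent<Rigidbody>().isKinematic = true;   // no gravity while it rides the ray
         heldButton = button;
     }
 }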

4. Implement ray casting to select one of the desks or chairs in the room: find out which piece of furniture is intersected by the ray and highlight it. Update the highlight when the ray intersects a different piece of furniture. If the ray intersects multiple objects, highlight the object that is closest to the controller. Make sure you ignore the user's avatar for the intersection test. You can choose any highlighting method you would like, such as a wireframe box around the collider, a halo, a change of the object's color, or a pulsating effect. (20 Points)
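
A minimal sketch of the selection test in Unity, assuming the furniture prefabs are placed on a dedicated "Furniture" layer (so the avatar and the room itself are ignored) and using a simple color change as the highlight; the script and layer names are placeholders.

 using UnityEngine;
 
 public class FurnitureHighlighter : MonoBehaviour   // attach to the dominant hand's controller anchor
 {
     public float rayLength = 20f;
     private Renderer highlighted;                    // renderer of the currently highlighted object
     private Color originalColor;
 
     void Update()
     {
         Renderer hitRenderer = null;
 
         // Physics.Raycast returns the closest hit along the ray; the layer mask makes the
         // test ignore everything (avatar, walls, floor) except objects on the "Furniture" layer.
         if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit,
                             rayLength, LayerMask.GetMask("Furniture")))
             hitRenderer = hit.collider.GetComponentInParent<Renderer>();
 
         if (hitRenderer == highlighted) return;      // same object as last frame, nothing to update
 
         // Restore the previously highlighted object, then tint the newly hit one.
         if (highlighted != null) highlighted.material.color = originalColor;
         highlighted = hitRenderer;
         if (highlighted != null)
         {
             originalColor = highlighted.material.color;
             highlighted.material.color = Color.yellow;   // simple color-change highlight
         }
     }
 }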

5. Allow manipulation of the highlighted object when the user pulls the trigger button of the dominant hand's controller: move the object with the ray, as if it were skewered on it, until the trigger button is released. The motion should resemble that of a marshmallow held on a stick over a campfire. When the trigger button is released, the physics engine should take over and make the object fall down, just as when it was initially spawned. (20 Points)
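
A minimal sketch of the skewer motion in Unity with the Oculus Integration package, reusing the assumed "Furniture" layer from the highlighting sketch above: the grabbed object keeps its distance along the ray and its orientation relative to the controller until the trigger is released.

 using UnityEngine;
 
 public class FurnitureManipulator : MonoBehaviour   // attach to the dominant hand's controller anchor
 {
     public float rayLength = 20f;
     private Rigidbody grabbed;                      // furniture piece currently skewered on the ray
     private float grabDistance;                     // distance along the ray at the moment of grabbing
     private Quaternion rotationOffset;              // object rotation relative to the controller
 
     void Update()
     {
         if (grabbed == null)
         {
             // On trigger press, grab whatever piece of furniture the ray currently hits.
             if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger) &&
                 Physics.Raycast(transform.position, transform.forward, out RaycastHit hit,
                                 rayLength, LayerMask.GetMask("Furniture")) &&
                 hit.rigidbody != null)
             {
                 grabbed = hit.rigidbody;
                 grabDistance = hit.distance;
                 rotationOffset = Quaternion.Inverse(transform.rotation) * grabbed.transform.rotation;
                 grabbed.isKinematic = true;          // suspend physics while the object rides the ray
             }
         }
         else
         {
             // Keep the object at a fixed distance along the ray, like a marshmallow on a stick.
             grabbed.transform.position = transform.position + transform.forward * grabDistance;
             grabbed.transform.rotation = transform.rotation * rotationOffset;
 
             if (OVRInput.GetUp(OVRInput.Button.PrimaryIndexTrigger))
             {
                 grabbed.isKinematic = false;         // physics takes over and the object falls
                 grabbed = null;
             }
         }
     }
 }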

Extra Credit (10 Points)

There are two options for extra credit.

1. Improve the spawning method: create a menu that shows up while you push and hold the trigger button on your non-dominant hand's controller. Show not just the chair and desk as options, but also the other pieces of furniture in the ZIP file. Allow spawning a piece of furniture of your choice by pointing the dominant hand's ray at its menu entry and clicking the trigger button on that controller. (5 points)
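
One possible structure in Unity, assuming a hypothetical menuRoot object parented to the non-dominant (here: left) hand that holds small preview copies of the furniture on a "Menu" layer, with matching full-size prefabs in the same order. This is only a sketch; all names and the hand assignment are assumptions.

 using UnityEngine;
 
 public class SpawnMenu : MonoBehaviour              // attach to the dominant hand's controller anchor
 {
     public GameObject menuRoot;                     // hypothetical: preview items parented to the non-dominant hand
     public Transform[] previews;                    // the preview copies, on a "Menu" layer
     public GameObject[] furniturePrefabs;           // full-size prefabs, same order as previews
     public float spawnDistance = 2f;
 
     void Update()
     {
         // Show the menu only while the non-dominant (left) index trigger is held down.
         bool menuOpen = OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.LTouch);
         menuRoot.SetActive(menuOpen);
         if (!menuOpen) return;
 
         // Click with the dominant (right) trigger while the ray hits one of the preview items.
         if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch) &&
             Physics.Raycast(transform.position, transform.forward, out RaycastHit hit,
                             20f, LayerMask.GetMask("Menu")))
         {
             for (int i = 0; i < previews.Length; i++)
                 if (hit.collider.transform.IsChildOf(previews[i]))
                     Instantiate(furniturePrefabs[i],
                                 transform.position + transform.forward * spawnDistance,
                                 transform.rotation);
         }
     }
 }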

2. Add an interaction mode for two-handed scaling of the entire world around you. By holding down both middle-finger (grip) trigger buttons, the user can gradually scale the entire room, with all its furniture, up or down, either to work at a global scale (by scaling down) or to work more accurately (by scaling up). The room's scale should change proportionally to the distance between the controllers while the middle-finger triggers are held. Add a function to reset to the initial 1:1 scale by pulling both middle-finger triggers simultaneously for less than a second. (5 points)
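
A minimal sketch in Unity with the Oculus Integration package, assuming a hypothetical worldRoot transform that parents the room and all furniture. Note that scaling localScale scales about worldRoot's pivot; a complete solution would scale about a point between the two hands.

 using UnityEngine;
 
 public class WorldScaler : MonoBehaviour
 {
     public Transform worldRoot;                     // hypothetical parent of the room and all furniture
     public Transform leftHand, rightHand;           // controller anchors of the OVRCameraRig
 
     private float startHandDistance;
     private Vector3 startScale;
     private float pressTime;                        // 0 means "not currently scaling"
 
     void Update()
     {
         // The middle-finger (grip) triggers map to PrimaryHandTrigger on each Touch controller.
         bool leftGrip  = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.LTouch);
         bool rightGrip = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.RTouch);
         bool both = leftGrip && rightGrip;
 
         if (both && pressTime == 0f)
         {
             // Both grips just went down: remember the starting hand distance and world scale.
             pressTime = Time.time;
             startHandDistance = Vector3.Distance(leftHand.position, rightHand.position);
             startScale = worldRoot.localScale;
         }
         else if (both)
         {
             // Scale the world proportionally to how far the hands have moved apart or together.
             float factor = Vector3.Distance(leftHand.position, rightHand.position) / startHandDistance;
             worldRoot.localScale = startScale * factor;
         }
         else if (pressTime != 0f)
         {
             // Released: a squeeze shorter than one second resets to the original 1:1 scale.
             if (Time.time - pressTime < 1f) worldRoot.localScale = Vector3.one;
             pressTime = 0f;
         }
     }
 }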