Homework Assignment 4: Social VR
For this project you need to implement a two-user VR application. Since we don't have two HMDs and computers for each team, one user is going to use the Oculus Rift with the Touch controllers, while the other user is going to use either the Leap Motion and the monitor, or a smart phone in a VR viewer. For inspiration you can look at the Oculus Toy Box application, although in it both users have a full set of Oculus Rift equipment. We do have a special option for those who have access to two Rift units, see below.
For this assignment you can obtain 100 points, plus up to 10 points for extra credit.
This homework assignment is due on Tuesday, June 13th at 3:00pm.
The project is designed to be a team project for two people, just like the other projects this quarter.
The goal of this project is to create a dual-user 3D VR application. One of the users has to use the Oculus Rift with the Touch controllers. For the other user you have a choice between:
- The Leap Motion gesture tracker.
- A smart phone with a VR viewer, such as Merge VR, which you can borrow from the Media Teaching Lab. In this case you could write an OpenGL application for the 2nd user and feed its output into the Trinus VR system. Or you could write the 2nd user's code in the native language of your cell phone OS (Android or iOS).
If you have access to an additional Rift with Touch controllers, here's a third option: instead of using the Leap or a cell phone for one of the users, you can create an all-Rift application. As an additional challenge in exchange for not having to work with the Leap or phone, you need to support at least three Rifts. During grading, we'll have one PC ready with the TAs' Rift, on which you'll need to install your application by bringing it on a USB storage device (thumb drive, HDD, etc.); all the required files need to be in a single directory, ready to run by double clicking a .exe file. To test your application with three Rifts, you can borrow the TA Rift during the TAs' office hours, or coordinate with another team to use theirs at times they don't need it. Please do the majority of your 3-user testing well before the deadline, because the lab may get crowded in the days leading up to presentation day.
The following rules apply:
- The application can't be based on homework project 1 (the CO2 removal trainer), unless you work on the final project alone.
- Oculus Rift users are only allowed to use the HMD and the Touch controllers for input - no keyboard, mouse, etc.
- Leap Motion users can use mouse or keyboard for secondary interactions, such as moving the user around. The primary interaction must be done through the Leap.
- Smart phone VR users can use the mouse in addition to the HMD and the button on it.
The application has to have the following components:
- If the 2nd user uses the Leap, the graphics window on the monitor needs to be rendered with anaglyph stereo (see the sketch after this list).
- If the 2nd user uses a smart phone, the PC either needs to stream the rendered images to it, or you have to write a renderer for the smart phone in its native language (e.g., Android, iOS).
- There needs to be at least one piece of stationary geometry in the space, such as a table or a tennis net. This object needs to be a 3D model file with at least one texture.
- There needs to be at least one piece of moving geometry, such as a building block or a tennis ball. This object also needs to be a 3D model file with at least one texture.
- The two users need to interact with objects in the same space and work together on something, for example: hand an object to the other user, play 3D Pong, play chess, build something with Legos, etc.
- Collision detection needs to be part of the interaction algorithm. This can be done simply with proximity checks or bounding box tests.
- Each user needs to use at least one of their hands in the application.
- Both the head position and the position of the interacting hand(s) of each user need to be indicated with at least a simple piece of geometry, visible to both users. This piece of geometry (e.g., a cube) needs to follow the position and orientation of the head/hand. Spheres are only allowed as "avatars" if they are textured so that changes of orientation are visible.
- The application needs to operate in 3D - it is not acceptable if it's just a 2D application.
- The application needs to run at 90 fps in the Rift and at 60 fps on the smart phone to avoid judder.
- While you may be running both users' applications on the same computer, your network communication needs to be written so that it would still run if one application was running on a different computer. The only change necessary then should be to replace "localhost" in your network code with the respective IP address.
- More criteria may be added.
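For the anaglyph requirement above, one common approach is to render the scene twice per frame and use glColorMask to route the left eye into the red channel and the right eye into the cyan (green/blue) channels. The sketch below assumes your existing OpenGL setup; setCameraForEye() and drawScene() are hypothetical placeholders for your own camera and rendering code, and the eye separation value is just a typical default, not a required number.

 // Anaglyph stereo sketch: left eye -> red channel, right eye -> green/blue channels.
 // setCameraForEye() and drawScene() are placeholders for your own code.
 void renderAnaglyphFrame() {
     const float eyeSeparation = 0.065f;  // ~6.5 cm interocular distance (typical default)
 
     glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
 
     // Left eye: write only the red channel.
     glColorMask(GL_TRUE, GL_FALSE, GL_FALSE, GL_TRUE);
     setCameraForEye(-eyeSeparation / 2.0f);
     drawScene();
 
     // Keep the red image, but clear depth before the second pass.
     glClear(GL_DEPTH_BUFFER_BIT);
 
     // Right eye: write only the green and blue channels (cyan).
     glColorMask(GL_FALSE, GL_TRUE, GL_TRUE, GL_TRUE);
     setCameraForEye(+eyeSeparation / 2.0f);
     drawScene();
 
     // Restore the full color mask for anything drawn afterwards.
     glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
 }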
Useful Tips
Regarding software libraries, you are allowed to use any libraries which you used in homework assignments 1 through 3. In addition, you are allowed to use the following libraries:
- The Leap Motion SDK
- SOIL to load texture images (a short loading sketch follows this list), or any other library listed here
- The Photon Engine for multi-user support
- The Oculus Avatar SDK
- Assimp for importing OBJs
- SDL, to replace GLFW
- OpenAL for audio support
- XML parsers, such as MiniXML or PugiXML - useful for configuration files
- A very simple, single-header-file solution to load images.
- Tiny OBJ Loader to load OBJ files.
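As an example of the texture-loading option mentioned above, here is a minimal sketch of loading an image into an OpenGL texture with SOIL, assuming your OpenGL headers are already included. The helper name and file path are made up for illustration, and the flags shown are common choices rather than requirements.

 #include <cstdio>
 #include <SOIL.h>
 
 // Hypothetical helper: load an image file into a new OpenGL texture object.
 GLuint loadTexture(const char* path) {
     GLuint tex = SOIL_load_OGL_texture(
         path,
         SOIL_LOAD_AUTO,                            // keep the image's own channel count
         SOIL_CREATE_NEW_ID,                        // let SOIL create the texture object
         SOIL_FLAG_MIPMAPS | SOIL_FLAG_INVERT_Y);   // build mipmaps, flip to OpenGL's origin
     if (tex == 0)
         fprintf(stderr, "SOIL failed to load %s: %s\n", path, SOIL_last_result());
     return tex;
 }
 
 // Usage: GLuint tableTex = loadTexture("textures/table_wood.png");
 //        glBindTexture(GL_TEXTURE_2D, tableTex);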
Physics engines are not permitted - however, you are encouraged to use simple physics that you implement yourself.
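To illustrate what "simple physics that you implement yourself" could look like, below is a minimal sketch for a single moving object (e.g., a tennis ball) using explicit Euler integration, plus the proximity-style collision test mentioned in the requirements above. It assumes you already use GLM for vector math from the earlier assignments; all struct and function names are placeholders.

 #include <glm/glm.hpp>
 
 struct Ball {
     glm::vec3 position;
     glm::vec3 velocity;
     float     radius;
 };
 
 // Advance the ball by one time step with gravity and a floor bounce at y = 0.
 void updateBall(Ball& ball, float dt) {
     const glm::vec3 gravity(0.0f, -9.81f, 0.0f);
     ball.velocity += gravity * dt;        // explicit Euler integration
     ball.position += ball.velocity * dt;
     if (ball.position.y < ball.radius) {  // hit the floor
         ball.position.y = ball.radius;
         ball.velocity.y = -ball.velocity.y * 0.8f;  // bounce with some energy loss
     }
 }
 
 // Proximity collision: treat the tracked hand as a sphere around its position.
 bool handTouchesBall(const glm::vec3& handPos, float handRadius, const Ball& ball) {
     return glm::distance(handPos, ball.position) < handRadius + ball.radius;
 }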
You are allowed to use any source for 3D models and textures. Here are a few good ones:
If your 3D model is too big to render smoothly, try using MeshLab to reduce the polygon count of your model.
To communicate between the two users (Oculus and Leap), you will need to implement network communication. You can keep this very simple. You can use any network communication library, including cloud services, databases, or anything related. The simplest approach is to use direct socket communication, as described in this example, and the code can be downloaded here; or you can use a remote procedure call (RPC) library such as this one. Or ZeroMQ, which is closer to the socket level, but much nicer to program and quite optimized for speed. You can choose to create a server program which both applications connect to, or have each application connect directly to the other. Or you can use a professional multi-user SDK such as the Photon Engine.
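For orientation, a bare-bones TCP client along the lines of the socket approach above might look like the sketch below (Winsock, since the Rift machines run Windows). The port number and the text message format are made up for illustration; note that replacing "127.0.0.1" with the other machine's IP address is the only change needed to split the two applications across computers, matching the requirement listed earlier.

 #include <winsock2.h>
 #include <ws2tcpip.h>
 #include <cstdio>
 #include <cstring>
 #pragma comment(lib, "Ws2_32.lib")
 
 int main() {
     WSADATA wsa;
     if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;
 
     SOCKET sock = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
 
     sockaddr_in server = {};
     server.sin_family = AF_INET;
     server.sin_port   = htons(13000);                    // arbitrary example port
     inet_pton(AF_INET, "127.0.0.1", &server.sin_addr);   // swap in the remote IP here
 
     if (connect(sock, (sockaddr*)&server, sizeof(server)) == SOCKET_ERROR) {
         fprintf(stderr, "connect failed: %d\n", WSAGetLastError());
         return 1;
     }
 
     // Example payload: a hand position as plain text (the format is up to you).
     const char* msg = "HAND 0.10 1.30 -0.40\n";
     send(sock, msg, (int)strlen(msg), 0);
 
     closesocket(sock);
     WSACleanup();
     return 0;
 }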
You are welcome to use the TA/tutors' office hours to brainstorm ideas with them, as well as for help during your work on the project, like for the other homework projects.
Grading
Your final project score consists of three parts:
- Documentation (10 points)
- Application (90 points)
- Extra Credit (10 points)
Documentation (10 Points)
You need to create a blog to report on the progress you're making on your project. You need to make at least two blog entries to get the full score. The first is due on Monday, June 5th at 11:59pm, the second is due on Monday, June 12th at 11:59pm.
The first blog entry needs to contain (at a minimum) the following pieces of information:
- The name of your project (you need to come up with one).
- The names of your team members.
- A short description of the project.
- Two screen shots of your application in its current state, one for the Oculus and one for the Leap user.
In week 2 you need to write about the progress you have made and report any changes to your team or team name. You also need to post another two screen shots.
You are free to create the blog on any web based blog site, such as Blogger or WordPress. You should use the same blog each time and just add new blog entries. You are free to add more entries than the required ones.
Each team also needs to make slides (two slides recommended) for a one-minute-long presentation on the results of your project, and present them during our lightning talks session on June 13th from 3-4pm. The slides need to include screen shots of what each user sees. The slides need to be made with Google Slides and they need to be added to a Google Slides deck everyone shares. The link to the Google Slides deck has been sent to you via email.
The points are distributed like this:
- Blog entry #1: 3 points
- Blog entry #2: 3 points
- Presentation slides: 4 points
We sent an email to everyone in class on June 4th with a link to a Google Spreadsheet, to which you should please add your team members' names and your blog URL.
The Application (90 Points)
The final project has to be presented to the course staff during our final exam slot on Tuesday, June 13th. The presentations start off with a lightning talks session at 3pm in room 1202. Then we will do science fair-style demos (VR lab B210) in two groups: even hour teams 4-5pm, odd hour teams 5-6pm.
The points for your project demonstration will be distributed as follows:
- Technical quality: 60% (strictly based on your programming)
- UI Usability: 20% (subjective score for how easy to learn and use your UI is)
- Creativity: 20% (subjective score factoring in your project idea, 3D models, textures, aesthetics)
The scores will be determined by the course staff (instructor, TAs, tutors). You need to let a course staff member try your application so that we can determine the UI usability score.
Extra Credit (10 Points)
You can get extra credit for the following things:
- Implementing both Leap and smart phone versions, along with the Rift, for a triple-user application. If the smart phone just shows what the other two users are doing (spectator VR), that's 5 extra points; if the smart phone user can teleport around the space, that's 7 extra points; if the smart phone user can fully interact with the application like the other two users, that's 10 extra credit points.
- Video: Create a one minute long video that showcases your collaborative project. For this video it's not sufficient to record the screen. You also need to show both users interacting with the system. Then you need to merge the videos into one with a video editing tool. Add a title screen with your project name and the team members' names. Show the video with your slides on presentation day (you get an additional minute if you choose this option). The video does not need to have audio, but during the presentation you should talk over it. This video can serve as inspiration for how the video could be edited. (5 points)
- Even without you asking, we may decide to give you extra credit for your project.
Generally, we want to reward projects that go beyond what we ask for. Please contact the course staff with ideas for things you think should qualify for extra credit. Areas we will particularly consider for extra credit are: innovative interaction concepts, overall aesthetics, UI design, creativity, exceptional execution of technical features, well thought out usability, entertainment factor.