Homework Assignment 4: Social VR

For this project you need to implement a two user VR application. Since we don't have two HMDs and computers for each team, one user is going to use the Oculus Rift with the Touch controllers; the other user is going to use either the Leap Motion and the monitor, or a smart phone in a VR viewer. For inspiration you can look at the Oculus Toy Box application, although in it both users have a full set of Oculus Rift equipment. We do have a special option for those who have access to two Rift units; see below.

For this assignment you can obtain 100 points, plus up to 10 points for extra credit.

This homework assignment is due on Tuesday, June 13th at 3:00pm.

The project is designed to be a team project for two people, just like the other projects this quarter.

The goal of this project is to create a dual user 3D VR application. One of the users has to use the Oculus Rift with the Touch controllers. For the other user you have a choice between:

  • The Leap Motion gesture tracker.
  • A smart phone with a VR viewer, such as Merge VR, which you can borrow from the Media Teaching Lab. In this case you could write an OpenGL application for the 2nd user and feed its output into the Trinus VR system. Or you could write the 2nd user's code in the native language of your cell phone OS (Android or iOS).

If you have access to an additional Rift with Touch controllers, here's a third option: instead of using the Leap or a cell phone for one of the users, you can create an all-Rift application, but you need to support at least three Rifts, as an additional challenge in exchange for not having to work with the Leap or a phone. During grading, we'll have one PC ready with the TAs' Rift; you'll need to install your application on it from a USB storage device (thumb drive, HDD, etc.) on which all the required files are in a single directory, ready to run by double clicking a .exe file. To test your application with three Rifts, you can borrow the TA Rift during the TAs' office hours, or coordinate with another team to use theirs at times they don't need it. Please do the majority of your 3-user testing well before the deadline because the lab may get crowded in the days leading up to presentation day.

The following rules apply:

  • The application can't be based on homework project 1 (the CO2 removal trainer), unless you work on the final project alone.
  • Oculus Rift users are only allowed to use the HMD and the Touch controllers for input - no keyboard, mouse, etc.
  • Leap Motion users can use mouse or keyboard for secondary interactions, such as moving the user around. The primary interaction must be done through the Leap.
  • Smart phone VR users can use the mouse in addition to the HMD and the button on it.
  • The application needs to operate in 3D - it is not acceptable if it's just a 2D application.
  • While you may be running both users' applications on the same computer, your network communication needs to be written so that it would still run if one application was running on a different computer. The only change necessary then should be to replace "localhost" in your network code with the respective IP address.
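
To make the last rule concrete, below is a minimal sketch of a client that takes the peer's address as a command line argument instead of hard-coding it, so moving the 2nd application to another machine needs no code change. It uses POSIX sockets for brevity (on Windows you would use the Winsock equivalents, e.g. WSAStartup and closesocket); the port number and message are arbitrary placeholders.

  #include <arpa/inet.h>
  #include <netinet/in.h>
  #include <sys/socket.h>
  #include <unistd.h>
  #include <cstdio>

  int main(int argc, char** argv) {
      // Peer address comes from the command line; default to localhost.
      const char* host = (argc > 1) ? argv[1] : "127.0.0.1";

      int sock = socket(AF_INET, SOCK_STREAM, 0);
      sockaddr_in addr{};
      addr.sin_family = AF_INET;
      addr.sin_port   = htons(5000);              // arbitrary example port
      inet_pton(AF_INET, host, &addr.sin_addr);   // parse dotted-quad IP

      if (connect(sock, (sockaddr*)&addr, sizeof(addr)) != 0) {
          perror("connect");
          return 1;
      }
      const char msg[] = "hello from the 2nd user";
      send(sock, msg, sizeof(msg), 0);            // one test message
      close(sock);
      return 0;
  }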

Application Requirements

The following requirements apply to your application. Each numbered item will get you 10% of your technical score.

  1. If the 2nd user uses the Leap, the graphics window on the monitor needs to be rendered in [http://paulbourke.net/stereographics/anaglyph/ anaglyph stereo] (see the anaglyph sketch after this list). If the 2nd user uses a smart phone, the PC either needs to stream the rendered images to it, or you have to write a renderer for the smart phone in its native language (e.g., Android, iOS).
  2. All images the user sees (on Rift, anaglyph monitor, smart phone with cardboard/Merge VR viewer) need to be in 3D stereo.
  3. The Rift user's head must be properly tracked; the Leap user's finger position must show up in 3D where it physically is (assuming the user is sitting in the right spot in front of the monitor); and the smart phone must properly update the images with head rotation.
  4. There needs to be at least one piece of stationary geometry in the space, such as a table or a tennis net. This object needs to be a 3D model file with at least one texture.
  5. There needs to be at least one piece of moving geometry, such as a building block or a tennis ball. This object also needs to be a 3D model file with at least one texture.
  6. The two users need to interact with objects in the same space and work together on something, for example: hand an object to the other user, play [http://www.ponggame.org/3dpong.php 3D Pong], play chess, build something with Legos, etc.
  7. Collision detection needs to be part of the interaction algorithm. It can be done simply by proximity checks or bounding-box tests (see the collision sketch after this list).
  8. Each user needs to use at least one of their hands in the application (with a Touch controller or the Leap). If the 2nd user is a smart phone user, either the button on the cardboard viewer or the mouse or keyboard has to be used.
  9. Both head positions and the positions of the interacting hand(s) of each user need to be indicated with at least a simple piece of geometry for both users. This piece of geometry (e.g., a cube) needs to follow the position and orientation of the head/hand. Spheres are only allowed as "avatars" if they are textured so that changes of orientation are visible.
  10. The application needs to run fluidly and without judder. This means 90/45fps for the Rift, and 60fps for anaglyph display or smart phone.
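
Regarding item 1, a common way to get anaglyph stereo on a regular monitor is to render the scene twice per frame and use glColorMask to restrict each pass to one eye's color channels (red for the left eye, cyan for the right). A minimal sketch, assuming drawScene() and the per-eye view matrices are hypothetical helpers from your own OpenGL renderer:

  // Hypothetical helpers: drawScene(), leftEyeView(), rightEyeView().
  void renderAnaglyphFrame() {
      glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

      glColorMask(GL_TRUE, GL_FALSE, GL_FALSE, GL_TRUE);  // left eye: red only
      drawScene(leftEyeView());

      // Clear only depth between the passes, so the right eye's geometry
      // is not occluded by the left eye's depth values.
      glClear(GL_DEPTH_BUFFER_BIT);

      glColorMask(GL_FALSE, GL_TRUE, GL_TRUE, GL_TRUE);   // right eye: cyan
      drawScene(rightEyeView());

      glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);    // restore full mask
  }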

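Regarding item 7, both of the allowed collision tests are only a few lines. A self-contained sketch, with Vec3 standing in for whatever vector type you already use (e.g., glm::vec3):

  struct Vec3 { float x, y, z; };

  // Proximity test: hit if the two centers are closer than the sum of the
  // objects' bounding-sphere radii (compared squared to avoid a sqrt).
  bool proximityHit(const Vec3& a, float ra, const Vec3& b, float rb) {
      float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
      float r  = ra + rb;
      return dx*dx + dy*dy + dz*dz < r*r;
  }

  // Axis-aligned bounding box (AABB) test: hit if the boxes overlap on
  // all three axes.
  struct AABB { Vec3 min, max; };

  bool aabbHit(const AABB& a, const AABB& b) {
      return a.min.x <= b.max.x && a.max.x >= b.min.x &&
             a.min.y <= b.max.y && a.max.y >= b.min.y &&
             a.min.z <= b.max.z && a.max.z >= b.min.z;
  }
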
Useful Tips

Regarding software libraries, you are allowed to use any libraries which you used in homework assignments 1 through 3. In addition, you are allowed the following libraries:

Physics engines are not permitted - however, you are encouraged to use simple physics that you implement yourself.
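
As a starting point for such self-implemented physics, a few lines of explicit Euler integration already give a convincing falling and bouncing object. A minimal sketch; all names are placeholders and dt is your frame time in seconds:

  struct Ball {
      float pos[3];   // meters
      float vel[3];   // meters per second
  };

  void stepBall(Ball& b, float dt) {
      const float gravity = -9.81f;  // m/s^2 along the y axis
      const float floorY  = 0.0f;    // height of the floor plane
      const float bounce  = 0.8f;    // fraction of speed kept per bounce

      b.vel[1] += gravity * dt;      // integrate velocity
      for (int i = 0; i < 3; ++i)
          b.pos[i] += b.vel[i] * dt; // integrate position

      if (b.pos[1] < floorY) {       // hit the floor: clamp, reflect, damp
          b.pos[1] = floorY;
          b.vel[1] = -b.vel[1] * bounce;
      }
  }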

You are allowed to use any source for 3D models and textures. Here are a few good ones:

  • [https://3dwarehouse.sketchup.com Google 3D Warehouse]
  • [https://www.turbosquid.com/Search/3D-Models/free Turbosquid]
  • [https://www.cgtrader.com/free-3d-models CGTrader]

If your 3D model is too big to render smoothly, try using [http://www.meshlab.net/ MeshLab] to [https://www.shapeways.com/tutorials/polygon_reduction_with_meshlab reduce the polygon count] of your model.

To communicate between the two users (Oculus and Leap), you will need to implement network communication. You can keep this very simple. You can use any network communication library, including cloud services, databases, or anything related. The simplest approach is to use direct socket communication, [https://www.codeproject.com/Articles/412511/Simple-client-server-network-using-Cplusplus-and-W as described in this example], and the code can be downloaded [[Media:simple_network.zip|here]]; or you can use a [http://rpclib.net/ remote procedure call (RPC) library such as this one]. Or [http://zeromq.org/ ZeroMQ], which is closer to the socket level, but much nicer to program and quite optimized for speed. You can choose to create a server program which both applications connect to, or have each application connect directly to the other. Or you can use a professional multi-user SDK such as the [https://www.photonengine.com Photon Engine].
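
Whichever transport you pick, the data you exchange per frame can be tiny. Here is one hypothetical packet layout that covers the avatar requirement above (each side sends its head and hand pose, and renders the peer's poses as simple geometry); a fixed-size POD struct keeps serialization to a single memcpy, assuming both machines share the same architecture and byte order:

  #include <cstddef>
  #include <cstdint>
  #include <cstring>

  struct Pose {
      float position[3];     // meters, in the shared world space
      float orientation[4];  // quaternion (x, y, z, w)
  };

  struct UserState {
      uint32_t userId;       // e.g., 0 = Rift user, 1 = Leap/phone user
      Pose     head;
      Pose     hand;
  };

  // Copy the struct into a send buffer; valid because UserState is POD.
  size_t packUserState(const UserState& s, char* buf) {
      std::memcpy(buf, &s, sizeof(s));
      return sizeof(s);
  }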

You are welcome to use the TA/tutors' office hours to brainstorm ideas with them, as well as for help during your work on the project, like for the other homework projects.

Grading

Your final project score consists of three parts:

  • Documentation (10 points)
  • Application (90 points)
  • Extra Credit (10 points)

Documentation (10 Points)

You need to create a blog to report on the progress you're making on your project. You need to make at least two blog entries to get the full score. The first is due on Monday, June 5th at 11:59pm, the second is due on Monday, June 12th at 11:59pm.

The first blog entry needs to contain (at a minimum) the following pieces of information:

  • The name of your project (you need to come up with one).
  • The names of your team members.
  • A short description of the project.
  • Two screen shots of your application in its current state, one for the Oculus and one for the Leap user.

In week 2 you need to write about the progress you made and report any changes to your team or team name. You also need to post another two screen shots.

You are free to create the blog on any web based blog site, such as [http://www.blogger.com Blogger] or [http://wordpress.com WordPress]. You should use the same blog each time and just add new blog entries. You are free to add more entries than the required ones.

Each team also needs to make slides (two slides recommended) for a one minute long presentation on the results of your project, and present them during our lightning talks session on June 13th from 3-4pm. The slides need to include screen shots of what each user sees. They must be made with Google Slides and added to a Google Slides deck everyone shares; the link to the deck has been sent to you via email.

The points are distributed like this:

  • Blog entry #1: 3 points
  • Blog entry #2: 3 points
  • Presentation slides: 4 points

On June 4th we sent an email to everyone in class with a link to a Google Spreadsheet; please add your team members' names and your blog URL to it.

The Application (90 Points)

The final project has to be presented to the course staff during our final exam slot on Tuesday, June 13th. The presentations start off with a lightning talks session at 3pm in room 1202. Then we will do science fair-style demos (VR lab B210) in two groups: even hour teams 4-5pm, odd hour teams 5-6pm.

The points for your project demonstration will be distributed as follows:

  • Technical quality: 60% (strictly based on your programming)
  • UI Usability: 20% (subjective score for how easy to learn and use your UI is)
  • Creativity: 20% (subjective score factoring in your project idea, 3D models, textures, aesthetics)

The scores will be determined by the course staff (instructor, TAs, tutors). You need to let a course staff member try your application so that we can determine the UI usability score.

Extra Credit (10 Points)

You can get extra credit for the following things:

  • Implementing both Leap and smart phone versions, along with the Rift, for a triple-user application. If the smart phone just shows what the other two users are doing (spectator VR), that's 5 extra points; if the smart phone user can teleport around the space, that's 7 extra points; if the smart phone user can fully interact with the application like the other two users, that's 10 extra credit points.
  • Video: Create a one minute long video that showcases your collaborative project. For this video it's not sufficient to record the screen: you also need to show both users interacting with the system, for instance filmed with cell phone cameras. Then you need to merge the videos into one with a video editing tool. Add a title screen with your project name and the team members' names. Show the video with your slides on presentation day (you get an additional minute if you choose this option). The video does not need to have audio, but during the presentation you should talk over it. [https://vimeo.com/151371475 This video] can serve as inspiration for how the video could be edited. (5 points)
  • Even without you asking, we may decide to give you extra credit for your project.

Generally, we want to reward projects that go beyond what we ask for. Please contact the course staff with ideas for things you think should qualify for extra credit. Areas we will particularly consider for extra credit are: innovative interaction concepts, overall aesthetics, UI design, creativity, exceptional execution of technical features, well thought out usability, entertainment factor.