Multi-user virtual reality on mobile phones

Project Overview

  • This project lets two users connect two Android phones to each other over TCP and then navigate and interact with each other in a shared virtual world. Both users view the world from a first-person perspective and, if the phone has a gyroscope, can change their viewing direction by rotating the phone.
  • Explanation of the picture: the triangle on the left phone's screen represents the position and orientation of the right phone.


[Image: Multi users VR android phone2.JPG]

Status

Implemented

  • TCP connection between the two phones, with periodic updates of the world state (including the status of dynamic objects and the status of the other user); the message framing is sketched after this list.
  • Changing the viewing direction with the gyroscope (see the orientation-update sketch after this list).
  • Built the scene with OpenSceneGraph for Android.
  • Users can perform actions on dynamic objects (such as opening a door), and the other user sees the change.
  • Users can press buttons to move their position.
  • Made a C-socket-based version: data no longer has to be handed to Java and sent through a Java socket; all network transfer is done in native C code. Used the C serialization library "tpl" (http://tpl.sourceforge.net/index.html) to pack data for networking.
  • Wrote a function that draws a 3D line (pipe) by moving the camera and connecting the desired camera positions (see the line-strip sketch after this list).
  • A line drawn by one user appears on both users' screens.
  • After a line is drawn, both users can rotate it by a fixed angle, and the other user sees the effect at the same time (only rotation is implemented so far, but translation and scaling can be added by changing the transformation matrix; a rotation sketch appears after this list).
  • Discarded the "tpl" serialization library because it caused problems after data was transferred over the network; I now use my own data transfer protocol for the different data types and actions.
  • Both phones can be tracked by the tracking system, building on Kristian and Mads' work; I slightly modified their code.
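
A minimal sketch of the gyroscope-driven view update, in C++ with OpenSceneGraph: the angular rates (wx, wy, wz, in rad/s) and the timestep dt are assumed to come from the platform's sensor callback, and updateViewOrientation is a hypothetical helper, not the project's actual function.

    #include <osg/Quat>
    #include <osg/Vec3d>

    // Integrate one gyroscope sample into the first-person view orientation.
    // Assumes body-frame angular rates; the composition order follows OSG's
    // matrix-style convention (earlier rotation applied first).
    void updateViewOrientation(osg::Quat& view,
                               double wx, double wy, double wz, double dt)
    {
        osg::Vec3d omega(wx, wy, wz);
        double angle = omega.length() * dt;  // angle turned during this timestep
        if (angle > 0.0)
        {
            omega.normalize();
            view = view * osg::Quat(angle, omega);
        }
    }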
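
The line-drawing function could look roughly like the following OpenSceneGraph sketch, which turns a list of recorded camera positions into a renderable line strip; makeLineStrip is a hypothetical name for illustration.

    #include <vector>
    #include <osg/Geode>
    #include <osg/Geometry>

    // Build a line strip through the recorded camera positions.
    osg::ref_ptr<osg::Geode> makeLineStrip(const std::vector<osg::Vec3>& points)
    {
        osg::ref_ptr<osg::Vec3Array> verts = new osg::Vec3Array;
        for (size_t i = 0; i < points.size(); ++i)
            verts->push_back(points[i]);

        osg::ref_ptr<osg::Geometry> geom = new osg::Geometry;
        geom->setVertexArray(verts.get());
        geom->addPrimitiveSet(new osg::DrawArrays(osg::PrimitiveSet::LINE_STRIP,
                                                  0, static_cast<int>(verts->size())));

        osg::ref_ptr<osg::Geode> geode = new osg::Geode;
        geode->addDrawable(geom.get());
        return geode;  // attach under the scene root to make the line visible
    }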
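
Rotating the shared line by a fixed angle can be done by updating the matrix of a transform node placed above the line geometry, as in this sketch; rotateSharedLine and the choice of the z-axis are assumptions. Swapping osg::Matrix::rotate for a translation or scale extends the same scheme, as the item above notes.

    #include <osg/MatrixTransform>

    // Post-multiply a fixed rotation onto the transform above the line;
    // the updated matrix is then sent to the other phone so both scenes match.
    void rotateSharedLine(osg::MatrixTransform* xform, double angleRadians)
    {
        osg::Matrix m = xform->getMatrix();
        m = m * osg::Matrix::rotate(angleRadians, osg::Vec3d(0.0, 0.0, 1.0));
        xform->setMatrix(m);
    }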
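
The custom transfer protocol itself is not documented here; a common shape for such a protocol is a tag + length + payload frame, sketched below. The tag values and field layout are illustrative assumptions, not the project's actual wire format.

    #include <vector>

    // Hypothetical message tags for the kinds of updates described above.
    enum MsgTag { MSG_POSE = 1, MSG_OBJECT_STATE = 2, MSG_LINE = 3 };

    // Build one frame: [tag: 1 byte][payload length: 4 bytes, big-endian][payload].
    // The receiver reads the 5-byte header, then the payload, and dispatches
    // on the tag.
    std::vector<unsigned char> makeFrame(MsgTag tag,
                                         const unsigned char* payload,
                                         unsigned int len)
    {
        std::vector<unsigned char> frame;
        frame.reserve(5 + len);
        frame.push_back(static_cast<unsigned char>(tag));
        frame.push_back(static_cast<unsigned char>(len >> 24));
        frame.push_back(static_cast<unsigned char>(len >> 16));
        frame.push_back(static_cast<unsigned char>(len >> 8));
        frame.push_back(static_cast<unsigned char>(len));
        frame.insert(frame.end(), payload, payload + len);
        return frame;
    }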

Notes

Miscellaneous

  • Tried "Wiigee" (http://www.wiigee.org/), an accelerometer-based 3D gesture recognition library written in Java, to map gestures to actions in OSG, but gesture recognition fails when combined with OSG: collecting acceleration data at the high frequency recognition requires puts too much load on the hardware and crashes the application.
  • Some research uses the phone's sensors together with 3D gesture recognition to estimate the phone's 2D position in a room; the results appear acceptable at a coarse scale.

To-do

  • Maybe use a PC as a server so that more than 2 users can connect; the load is probably too heavy for one phone to act as a network server for many clients.
  • Model modification support
  • Test whether the camera pose estimation algorithms can be used together with OSG; the concern is that the hardware load will force the application to stop.


Software Developer:

  • James Lue


Project Advisors: