Multi-user virtual reality on mobile phones
From Immersive Visualization Lab Wiki
Revision as of 07:28, 22 June 2012
Project Overview
This project lets two users connect two Android phones to each other over TCP and then navigate and interact with each other in a shared virtual world. Both users view the world from a first-person perspective, and on phones with a gyroscope they can change their viewing direction by rotating the device.
Explanation of the picture: the triangle shown on the left phone represents the orientation and position of the right phone's user.
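The gyroscope-based view control described above amounts to integrating the sensor's angular rates over time. The sketch below is a minimal, hypothetical version of that math in C; on Android the rates would come from SensorEvent values for the gyroscope sensor, and the field names here are illustrative, not taken from the project's code.

```c
#include <assert.h>
#include <math.h>

#define HALF_PI 1.5707963267948966

/* Hypothetical first-person view state: yaw and pitch in radians. */
typedef struct { double yaw, pitch; } View;

/* Integrate angular rates (rad/s) over one sensor interval dt (s). */
static void view_update(View *v, double rate_yaw, double rate_pitch, double dt) {
    v->yaw   += rate_yaw * dt;
    v->pitch += rate_pitch * dt;
    /* clamp pitch so the camera cannot flip over the poles */
    if (v->pitch >  HALF_PI) v->pitch =  HALF_PI;
    if (v->pitch < -HALF_PI) v->pitch = -HALF_PI;
}
```

Rotating at a steady 0.5 rad/s for 2 seconds accumulates about 1 radian of yaw; the clamp keeps pitch within a half turn regardless of how fast the phone is rotated.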
Status
Implemented
TCP connection between the two phones, with periodic updates of the world state (including the status of dynamic objects and of the other user)
Changing the viewing direction with the gyroscope
Built the scene with OpenSceneGraph for Android phones
Users can act on dynamic objects (such as opening a door) and the other user sees the change
Users can press buttons to move their position
Implemented a C-socket-based version: data no longer has to be handed to Java and sent through a Java socket; the transfer is done entirely in native C code. The C serialization library "tpl" (http://tpl.sourceforge.net/index.html) is used to pack data for networking.
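The project uses tpl for serialization; for readers without that library, the idea of packing a per-user state update into a byte buffer for the socket can be sketched by hand. The struct fields and the fixed 24-byte layout below are illustrative assumptions, not the project's actual wire format.

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>

/* Hypothetical per-user state message (not the project's real protocol). */
typedef struct {
    float x, y, z;        /* position */
    float yaw, pitch;     /* viewing direction */
    uint32_t door_open;   /* example dynamic-object flag */
} UserState;

/* Write one float at dst in network byte order. */
static void pack_f32(unsigned char *dst, float v) {
    uint32_t bits;
    memcpy(&bits, &v, sizeof bits);
    bits = htonl(bits);
    memcpy(dst, &bits, sizeof bits);
}

static float unpack_f32(const unsigned char *src) {
    uint32_t bits;
    float v;
    memcpy(&bits, src, sizeof bits);
    bits = ntohl(bits);
    memcpy(&v, &bits, sizeof v);
    return v;
}

/* Serialize to a fixed 24-byte wire format, ready for send(). */
static void pack_state(unsigned char buf[24], const UserState *s) {
    pack_f32(buf + 0,  s->x);
    pack_f32(buf + 4,  s->y);
    pack_f32(buf + 8,  s->z);
    pack_f32(buf + 12, s->yaw);
    pack_f32(buf + 16, s->pitch);
    uint32_t flag = htonl(s->door_open);
    memcpy(buf + 20, &flag, sizeof flag);
}

static void unpack_state(const unsigned char buf[24], UserState *s) {
    s->x     = unpack_f32(buf + 0);
    s->y     = unpack_f32(buf + 4);
    s->z     = unpack_f32(buf + 8);
    s->yaw   = unpack_f32(buf + 12);
    s->pitch = unpack_f32(buf + 16);
    uint32_t flag;
    memcpy(&flag, buf + 20, sizeof flag);
    s->door_open = ntohl(flag);
}
```

A library like tpl automates exactly this kind of packing from a format string, which is why it was a good fit once the sockets moved into native code.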
Notes
Computing the phone's translation by integrating its accelerometer readings appears infeasible; a Google Tech Talk explains why: http://www.youtube.com/watch?v=C7JQ7Rpwn2k
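The core problem is that position comes from integrating acceleration twice, so any constant sensor bias b turns into a position error of roughly 0.5*b*t^2, which grows without bound. The simulation below (the bias value is an assumed, plausible magnitude for a phone accelerometer, not a measured one) shows how quickly that blows up.

```c
#include <assert.h>
#include <math.h>

/* Double-integrate a constant acceleration bias with Euler steps:
   bias -> velocity -> position. Returns the accumulated position error. */
static double position_error(double bias, double dt, int steps) {
    double v = 0.0, x = 0.0;
    for (int i = 0; i < steps; ++i) {
        v += bias * dt;   /* integrate acceleration into velocity */
        x += v * dt;      /* integrate velocity into position */
    }
    return x;
}
```

With an assumed bias of just 0.01 m/s^2 sampled at 100 Hz, the position error reaches about 18 meters after one minute, and quadrupling whenever the elapsed time doubles, which is why the talk concludes the approach is impractical.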
Miscellaneous
Tried an accelerometer-based 3D gesture recognition library written in Java, "Wiigee" (http://www.wiigee.org/), to map gestures to actions in OSG. Gesture recognition failed when running alongside OSG: collecting acceleration data at the high sampling rate recognition requires overloaded the hardware and crashed the app.
To-do
Allow a user to create geometries in the world that the other users can see
Possibly use a PC as the server so that more than two users can connect; the load might be too heavy for one phone to act as the network server for many clients
Model modification support
Use the work done by Kristian and Mads, or by Greg, to compute the translation of the phone
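For the PC-as-server idea above, the server's central job would be relaying each client's state update to every other connected client. A minimal sketch of that relay step is below; the function name, the file-descriptor array, and the message format are all placeholders for whatever the real protocol would be, and a complete server would additionally accept connections and multiplex the sockets with select() or poll().

```c
#include <assert.h>
#include <string.h>
#include <unistd.h>

/* Forward one client's state update to every other connected client.
   client_fds holds the connected sockets; sender_idx is the originator.
   Returns the number of clients the update was written to. */
static int broadcast(const int *client_fds, int nclients, int sender_idx,
                     const void *msg, size_t len) {
    int sent = 0;
    for (int i = 0; i < nclients; ++i) {
        if (i == sender_idx)
            continue;                         /* don't echo to the sender */
        if (write(client_fds[i], msg, len) == (ssize_t)len)
            ++sent;
    }
    return sent;
}
```

Moving this fan-out onto a PC is what removes the load from the phones: each phone then sends one update per tick and the PC does the N-1 writes.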
Software Developers:
- James Lue
Project Advisors: