Multi-user virtual reality on mobile phones

From Immersive Visualization Lab Wiki


Project Overview

  • This project allows two users, each with an Android phone, to connect to each other over TCP and then navigate and interact with each other in a shared virtual world. The supported actions include basic geometry creation and geometry modification. Both users view the world from a first-person perspective. The virtual world coordinate system maps to the physical world coordinate system for both users; the world coordinate system is established using markers with the OpenCV marker detector (https://sites.google.com/site/playwithopencv/home/markerdetect). Tracking of the phones' positions and orientations is also done with OpenCV markers. The gyroscope compensates the tracking whenever a phone cannot find a marker.
  • Explanation of the picture: the triangle on the left phone's screen represents the position and orientation of the right phone.


ViewBox.JPG

Status

Implemented

  • TCP connection between the two phones, with periodic updates of the world state (including the status of dynamic objects and of the other user)
  • Changing the viewing direction with the gyroscope
  • Scene built with OpenSceneGraph (OSG) for Android phones
  • Users can act on dynamic objects (such as opening a door), and the other user sees the change
  • Users can press buttons to move their position
  • Made a C-socket-based version: data no longer has to be handed to Java and sent through a Java socket; the transfer is done entirely in native C code. Used a C serialization library called "tpl" (http://tpl.sourceforge.net/index.html) to pack data for networking.
  • Wrote a function that draws a 3D line (pipe) by moving the camera and connecting the desired camera positions
  • Both users can see a line drawn by either one of them, as long as it is on their screen
  • After a line is drawn, both users can rotate it by a fixed angle, and the other user sees the effect at the same time (only rotation is implemented so far, but translation and scale can be added by changing the transformation matrix)
  • Discarded the "tpl" serialization library because it caused problems after transferring data over the network; I now use my own transfer protocol to carry the different data types and actions.
  • Both phones can be tracked by the tracking system, which is based on Kristian and Mads' work (I slightly modified their code)
  • Tried the OpenCV example for Android devices, but it crashes with OSG (even when doing the simplest image processing, such as converting each camera frame to grayscale)
  • Tried a different way to read the camera's preview data alongside OSG; it works now, but it does no image processing yet: the data is just read into memory and then garbage collected
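The custom transfer protocol mentioned above could be framed along these lines; this is a minimal C++ sketch, and the message types, field layout, and function names are assumptions for illustration, not the project's actual wire format:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// One framed message: 1-byte type tag, 4-byte payload length, payload bytes.
// (Hypothetical layout; the project's real protocol is not documented here.)
enum MsgType : uint8_t { MSG_USER_STATE = 1, MSG_DRAW_LINE = 2, MSG_ROTATE = 3 };

static std::vector<uint8_t> pack(MsgType type, const void* payload, uint32_t len) {
    std::vector<uint8_t> buf(1 + 4 + len);
    buf[0] = type;
    std::memcpy(&buf[1], &len, 4);      // same byte order assumed on both phones
    std::memcpy(&buf[5], payload, len);
    return buf;
}

static bool unpack(const std::vector<uint8_t>& buf, MsgType& type,
                   std::vector<uint8_t>& payload) {
    if (buf.size() < 5) return false;   // need at least tag + length
    type = static_cast<MsgType>(buf[0]);
    uint32_t len;
    std::memcpy(&len, &buf[1], 4);
    if (buf.size() < 5u + len) return false;
    payload.assign(buf.begin() + 5, buf.begin() + 5 + len);
    return true;
}
```

On the wire, such frames would simply be written to and read from the TCP C socket with send()/recv(); tagging each message with its type is what lets one connection carry the different data types and actions.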

  • TCP connection through C sockets
  • Exchange users' state periodically with a Java TimerTask
  • Use the OpenCV marker detector (https://sites.google.com/site/playwithopencv/home/markerdetect) to track each phone's position and orientation
  • Support basic geometry (boxes and lines) creation in OSG
  • Support geometry selection
  • Support geometry modifications (change color, rotation, translation, scale)
  • The gyroscope compensates orientation tracking when the marker is lost
  • Created models can be saved to the SD card and sent to a web server by HTTP POST (using the cURL library)
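One common way to realize the gyro compensation described above is a complementary filter: integrate the gyro rate every frame, and pull the estimate toward the marker pose whenever one is detected. A minimal one-axis sketch follows; the struct, the blend factor, and the one-axis simplification are all assumptions for illustration (the project fuses full 3D orientation):

```cpp
#include <cassert>
#include <cmath>

// One-axis complementary filter (illustrative only).
struct YawFilter {
    float yaw = 0.0f;  // current estimate, in degrees

    // Called every frame: dead-reckon from the gyro's angular rate.
    void onGyro(float rateDegPerSec, float dt) { yaw += rateDegPerSec * dt; }

    // Called when the marker is visible: blend toward the marker's yaw
    // to cancel accumulated gyro drift. alpha near 1 trusts the gyro more.
    void onMarker(float markerYaw, float alpha) {
        yaw = alpha * yaw + (1.0f - alpha) * markerYaw;
    }
};
```

With this scheme the gyro keeps the view responsive between marker detections, and each marker sighting corrects the drift the integration accumulates.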

Notes

Miscellaneous

  • Tried an accelerometer-based 3D gesture recognition library written in Java, "Wiigee" (http://www.wiigee.org/), to map gestures to actions in OSG, but gesture recognition fails when running alongside OSG because of the high sampling frequency needed for the acceleration data (the hardware load causes it to crash)
  • Some research uses the phone's sensors and 3D gesture recognition to estimate the phone's 2D position in a room; the results seem acceptable at a coarse scale

To-do

  • Maybe use a PC as the server to allow more than two users to connect; the load is probably too heavy for one phone to act as the network server for many clients
  • Model modification support
  • Try whether the camera pose estimation algorithms can run alongside OSG; the concern is that the hardware load will force the application to stop


Software Developers:

  • James Lue


Project Advisors:

Instructions for building a simple example of an Android application with C++ native code

Set up Android SDK with Eclipse

1. Download the Android SDK from "http://developer.android.com/sdk/index.html" and extract it under /home/username

2. Download Eclipse from "http://www.eclipse.org/downloads/"; use "Eclipse IDE for Java Developers" or "Eclipse for Mobile Developers" (I use "IDE for Java Developers")

3. Install the Eclipse Android SDK plugin by following the steps in "http://developer.android.com/sdk/installing/installing-adt.html"; this will probably restart Eclipse

4. After Eclipse restarts, go to "Window -> Android SDK Manager" in Eclipse, select the API version, and download it.

Set up Android NDK

1. Download the Android NDK from "http://developer.android.com/tools/sdk/ndk/index.html" and extract it under /home/username

2. cd into /home/username and edit ".cshrc" (it is a hidden file; press Ctrl+H in the file browser in /home/username to show it). In .cshrc, add the following lines:

	# android
	setenv ANDROID_NDK /home/username/android-ndk-rxx #(depends on the folder name of your android ndk version)
	setenv PATH /home/username/android-ndk-rxx:$PATH
	setenv ANDROID_SDK /home/username/android-sdks #(might change, depends on the folder name of android sdk)
	setenv PATH /home/username/android-sdks:$PATH
	setenv PATH /home/username/android-sdks/platform-tools:$PATH #(this line may not be necessary)

3. open a terminal, cd into "/home/username/" and enter the command "source .cshrc"

Install CDT in Eclipse

1. In Eclipse, go to "Help -> Install New Software...", and in "Work with" enter "http://download.eclipse.org/tools/cdt/releases/indigo"; then click Next through the remaining steps to finish the installation and restart Eclipse

2. After Eclipse restarts, go to "File -> New -> Project..." and check whether the option "C/C++" shows up. If it does, CDT is installed.

Build a simple example for the NDK (modified from the sample "hellojni" in the Android NDK) by following these steps:

1. In Eclipse,"File -> New -> Other... -> Android -> Android Application Project", Set "Application Name" as "HelloJni"

2. copy and paste the following code to "HelloJni.java":

/*
 * Copyright (C) 2009 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.example.hellojni;
import android.app.Activity;
import android.widget.TextView;
import android.os.Bundle;
import nativeLib.NativeLib;
public class HelloJni extends Activity
{
    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState)
    {
	super.onCreate(savedInstanceState);
	/* Create a TextView and set its content.
	 * the text is retrieved by calling a native
	 * function.
	 */
	TextView  tv = new TextView(this);
	tv.setText( NativeLib.stringFromJNI() );
	setContentView(tv);
    }

}

3. Create a package named "nativeLib" in the project and a Java class named "NativeLib" in that package, then copy and paste the following code into "NativeLib.java":

package nativeLib;
public class NativeLib {
	static {
		System.loadLibrary("NativeLib");
	}
	public static native String  stringFromJNI();
}

4. Open a terminal and cd into /path_to_eclipse_workspace/HelloJni/bin/classes (if you can't find this path, compile your java code first)

5. In terminal, type the command "javah -jni nativeLib.NativeLib", then the file "nativeLib_NativeLib.h" will be generated in /path_to_eclipse_workspace/HelloJni/bin/classes

6. Create a folder named "jni" in the Android project and copy "nativeLib_NativeLib.h" into the jni folder

7. create a file named "Android.mk" in the jni folder under your project

8. copy and paste the following code to "Android.mk"

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE    := NativeLib
LOCAL_SRC_FILES := nativeLib.cpp
include $(BUILD_SHARED_LIBRARY)

9. create a file named "nativeLib.cpp" under jni folder in your project

10. copy and paste the following code to "nativeLib.cpp"

#include <string.h>
#include "nativeLib_NativeLib.h"
JNIEXPORT jstring JNICALL Java_nativeLib_NativeLib_stringFromJNI
  (JNIEnv *env, jclass)
{
	jstring js=env->NewStringUTF( "Hello from JNI !");
	return js;
}

The structure of the project will look like this:

AndroidInsF.jpg

11. open a terminal and cd into /path_to_eclipse_workspace/HelloJni and type the command "ndk-build"

12. Hit the "run" icon in Eclipse and run as Android Application

Compile OpenSceneGraph for Android devices

Basically you can use this web page: http://forum.openscenegraph.org/viewtopic.php?t=10076

Here are the steps (please finish the installation of Android SDK and NDK first):

Install OpenSceneGraph

1. create a folder for OSG(e.g. create a folder "OSGAndroid" under /home/username/, full path: /home/username/OSGAndroid)

2. under the folder "OSGAndroid", open a terminal and enter the command
svn co http://www.openscenegraph.org/svn/osg/OpenSceneGraph/tags/OpenSceneGraph-3.0.1
(this checks out OSG version 3.0.1; make sure you check out the OSG version you want)

3. create two folders, "build" and "osginstall", under the main OSG folder (e.g. /home/username/OSGAndroid/OpenSceneGraph-3.0.1)

4. download the 3rdparty package from this website: http://www2.ai2.upv.es/difusion/osgAndroid/3rdpartyAndroid.zip

5. unzip the 3rdparty package and put it into the main OSG folder

6. open a terminal and cd into the "build" folder

7. in the terminal, enter this:

	
cmake .. -DOSG_BUILD_PLATFORM_ANDROID=ON -DDYNAMIC_OPENTHREADS=OFF -DDYNAMIC_OPENSCENEGRAPH=OFF -DOSG_GL_DISPLAYLISTS_AVAILABLE=OFF -DOSG_GL_MATRICES_AVAILABLE=ON -DOSG_GL_VERTEX_FUNCS_AVAILABLE=ON -DOSG_GL_VERTEX_ARRAY_FUNCS_AVAILABLE=ON -DOSG_GL_FIXED_FUNCTION_AVAILABLE=ON -DOSG_CPP_EXCEPTIONS_AVAILABLE=OFF -DOSG_GL1_AVAILABLE=OFF -DOSG_GL2_AVAILABLE=OFF -DOSG_GL3_AVAILABLE=OFF -DOSG_GLES1_AVAILABLE=ON -DOSG_GLES2_AVAILABLE=OFF -DJ=4 -DCMAKE_INSTALL_PREFIX=/home/username/OSGAndroid/OpenSceneGraph-3.0.1/osginstall

Be aware of the last two parameters: -DJ is the number of CPU cores in your computer (make sure to set it to the right number), and CMAKE_INSTALL_PREFIX is the install path (adjust it to your own osginstall folder).

After configuration finishes, enter the command "make" in the terminal; when it completes, enter the command "make install" (make and make install will take some time).

8. Try out the simple example, which is modified from the official OSG example, using the steps for compiling native code above.

the repo for the example:

git@github.com:amidofu/Simple_OSG_example.git

Code Repository

git@github.com:amidofu/Multi-User_Android_3D_Drawing.git