AndroidAR

Revision as of 17:47, 2 May 2012

Introduction

This project uses two infrared trackers and a tracking target attached to an Android device. CalVR collects the data from the trackers in the form of a homogeneous 4x4 matrix, which is processed by a CalVR plugin that sends UDP packets to any client that sends a request packet. The matrix is sent as a comma-separated string and then unwrapped and parsed on the Android device, stripping out the position and axis column vectors.
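The unwrapping step could look like the sketch below, which parses the 16 comma-separated values into a 4x4 matrix and strips out the translation and axis columns. Row-major ordering of the values in the string is an assumption, not something the plugin's protocol guarantees.

```java
// Sketch: unwrap a tracker packet into position and axis vectors.
// Assumption: the plugin sends the 4x4 homogeneous matrix as 16
// comma-separated floats in row-major order.
public class TrackerPacket {
    public final float[] position = new float[3];   // translation (4th column)
    public final float[][] axes = new float[3][3];  // rotation columns x, y, z

    public static TrackerPacket parse(String csv) {
        String[] tok = csv.trim().split("\\s*,\\s*");
        if (tok.length != 16)
            throw new IllegalArgumentException("expected 16 values, got " + tok.length);
        float[][] m = new float[4][4];
        for (int i = 0; i < 16; i++)
            m[i / 4][i % 4] = Float.parseFloat(tok[i]);
        TrackerPacket p = new TrackerPacket();
        for (int row = 0; row < 3; row++) {
            p.position[row] = m[row][3];            // strip the position column
            for (int col = 0; col < 3; col++)
                p.axes[col][row] = m[row][col];     // strip the axis columns
        }
        return p;
    }
}
```

If the plugin actually sends the matrix column-major, only the index arithmetic in the parsing loop changes.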

On the Android device we are using the Min3D framework, which is based on a scene graph and has excellent Wavefront .obj file loading. When you first start the app, you are presented with a screen that lets you choose which model to show. After picking a model, it should appear on the screen along with some information text in the upper left-hand corner. Pressing the context menu button and then "Start tracking" enables the tracking feature and continuously updates the scene graph's camera based on what we receive from the CalVR plugin.
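A minimal sketch of the camera update step: given the position and axis column vectors from a tracker packet, derive the eye, target, and up vectors a scene-graph camera needs. The conventions here (camera faces down its local -Z axis, local Y is up) are OpenGL-style assumptions, and the code stays framework-agnostic rather than using Min3D's actual camera API.

```java
// Sketch: turn a tracked pose (position + three axis vectors) into the
// eye/target/up vectors for a scene-graph camera.
// Assumptions: axes[0..2] are the pose's local X, Y, Z axes, and the
// camera looks down -Z with Y as up.
public class CameraPose {
    public final float[] eye, target, up;

    public CameraPose(float[] position, float[][] axes) {
        eye = position.clone();
        up = axes[1].clone();          // local Y axis of the pose
        float[] zAxis = axes[2];       // camera faces the -Z direction
        target = new float[] {
            position[0] - zAxis[0],
            position[1] - zAxis[1],
            position[2] - zAxis[2]
        };
    }
}
```

On each received packet, these three vectors would be copied into the scene graph's camera before the next frame is drawn.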


Goals

Rendering on Android

Camera tracking

Head tracking

Interactivity on device

Rendering in Cave and on device

Rendering in Cave and stream to device


Developers

Kristian Theilgaard Hansen

Mads Pedersen


Pictures

[1] Mads holding the final tablet and tracking the chair

[2] Xoom 2 running the Virtual Reality app with IR Tracking

[3] Xoom 2 running the Virtual Reality app (http://tordenoglynild.dk/files/cse298/aar3.jpg)


Related work

Greg is working on doing all the tracking internally on the phone, using its sensors and camera(s).