ArtifactVis with Android Device
Contents
Project Summary
Usage Instructions
Screen Shots
Progress
Week 1
- Look for tutorials on OSG (Done in part)
- Creating boxes and spheres (Done)
- Displaying images from JPEG files (Working on it)
- Mouse control: move/rotate objects in 3D (Rotation done)
- Understand the concept of a scene graph (Done; see the sketch after this list)
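To illustrate the boxes/spheres and scene-graph items above, here is a minimal OSG sketch along the lines of what I tried (not my exact code): a box and a sphere as drawables under a single root group, viewed with the default mouse manipulator.

```cpp
// Minimal sketch: a scene graph with a box and a sphere under one root group.
#include <osg/Group>
#include <osg/Geode>
#include <osg/ShapeDrawable>
#include <osgViewer/Viewer>

int main()
{
    // Each ShapeDrawable wraps a built-in shape; the Geode is the leaf node.
    osg::ref_ptr<osg::ShapeDrawable> box =
        new osg::ShapeDrawable(new osg::Box(osg::Vec3(-2.0f, 0.0f, 0.0f), 1.0f));
    osg::ref_ptr<osg::ShapeDrawable> sphere =
        new osg::ShapeDrawable(new osg::Sphere(osg::Vec3(2.0f, 0.0f, 0.0f), 1.0f));

    osg::ref_ptr<osg::Geode> geode = new osg::Geode;
    geode->addDrawable(box.get());
    geode->addDrawable(sphere.get());

    // The root group is the top of the scene graph.
    osg::ref_ptr<osg::Group> root = new osg::Group;
    root->addChild(geode.get());

    osgViewer::Viewer viewer;
    viewer.setSceneData(root.get());
    return viewer.run();   // the default manipulator already allows mouse rotation
}
```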
Week 2
First: OpenSceneGraph
The resources that I used to learn OSG were the book "OpenSceneGraph 3.0 Beginner's Guide" and the OpenSceneGraph website.
I read the following chapters:
Chapter 1: The Journey into OpenSceneGraph
Chapter 3: Creating Your First OSG Program
Chapter 4: Building Geometry Models
Chapter 5: Managing Scene Graph
Chapter 6: Creating Realistic Rendering Effects
Chapter 9: Interacting with Outside Elements
And some tutorials related to the following topics:
- Geometry
- 2D Text
- Transformation (Translation, Rotation, Scaling)
- Shapes (Box, Cone, Sphere, Capsule, Cylinder)
- Polygon mode
- 2D Texture (loading an image from an image file)
- Interaction with outside elements (keyboard and mouse)
I read through these topics, ran the examples on my PC, and made some changes to the code to understand what each method does; a small sketch combining a few of them follows.
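The sketch below combines the transformation and 2D texture topics (it is not the exact code from my tests, and the file name photo.jpg is only a placeholder): a textured quad under a MatrixTransform that rotates it 45 degrees around the Z axis.

```cpp
// Sketch: load an image as a 2D texture on a quad and rotate it with a transform.
#include <osg/Geometry>
#include <osg/Geode>
#include <osg/Texture2D>
#include <osg/MatrixTransform>
#include <osg/Math>
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

int main()
{
    // Build a unit quad with texture coordinates.
    osg::ref_ptr<osg::Geometry> quad = osg::createTexturedQuadGeometry(
        osg::Vec3(0.0f, 0.0f, 0.0f),      // corner
        osg::Vec3(1.0f, 0.0f, 0.0f),      // width vector
        osg::Vec3(0.0f, 0.0f, 1.0f));     // height vector

    // Load the image (placeholder file name) and attach it as texture unit 0.
    osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D;
    osg::ref_ptr<osg::Image> image = osgDB::readImageFile("photo.jpg");
    if (image.valid())
        texture->setImage(image.get());
    quad->getOrCreateStateSet()->setTextureAttributeAndModes(0, texture.get());

    osg::ref_ptr<osg::Geode> geode = new osg::Geode;
    geode->addDrawable(quad.get());

    // Transformation example: rotate the quad 45 degrees around the Z axis.
    osg::ref_ptr<osg::MatrixTransform> transform = new osg::MatrixTransform;
    transform->setMatrix(osg::Matrix::rotate(osg::DegreesToRadians(45.0),
                                             osg::Vec3(0.0f, 0.0f, 1.0f)));
    transform->addChild(geode.get());

    osgViewer::Viewer viewer;
    viewer.setSceneData(transform.get());
    return viewer.run();
}
```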
Second: CalVR
I read the CalVR website, and I will focus on it more on Monday.
Third: Android
I started with Android: I installed the development environment on my laptop, wrote the "Hello World" program, and ran it on my Android device. I also read about the following topics:
- Camera
- Kinds of sensors
- Activity
- Creating lists and buttons
Roadblocks I have encountered:
- Working with the OSG installation
- Writing a makefile
- Working with some Linux commands
- Understanding CalVR (though I have not given it much time yet)
- There were few simple OSG examples on the internet, so I used the "OpenSceneGraph 3.0 Beginner's Guide" book to practice some simple examples.
- I couldn't understand some mathematical equations in your slides.
Summary
- Look for tutorials on OSG and compile them on your PC in my lab: Done
- Creating boxes and spheres: Done
- Displaying images from JPEG files: Done (I used the image as a texture)
- Mouse control: move/rotate objects in 3D: Done (moving/rotating objects in 3D and picking objects; see the sketch after this list)
- Understand the concept of a scene graph: Done
- Go through the slides on this website: Done (Lectures 1, 2, 3, 4, 5, 6, and 10; I read them in general and will continue reading)
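For the picking mentioned above I used the standard OSG intersection utilities; the sketch below is roughly how such a pick handler looks (not my exact handler): it shoots a ray through the clicked window coordinates and prints the name of the first node hit.

```cpp
// Sketch of a mouse pick handler using osgUtil's intersection utilities.
#include <osgGA/GUIEventHandler>
#include <osgUtil/LineSegmentIntersector>
#include <osgUtil/IntersectionVisitor>
#include <osgViewer/View>
#include <iostream>

class PickHandler : public osgGA::GUIEventHandler
{
public:
    virtual bool handle(const osgGA::GUIEventAdapter& ea,
                        osgGA::GUIActionAdapter& aa)
    {
        // Only react to left-button releases.
        if (ea.getEventType() != osgGA::GUIEventAdapter::RELEASE ||
            ea.getButton() != osgGA::GUIEventAdapter::LEFT_MOUSE_BUTTON)
            return false;

        osgViewer::View* view = dynamic_cast<osgViewer::View*>(&aa);
        if (!view) return false;

        // Intersect a ray through the clicked window coordinates with the scene.
        osg::ref_ptr<osgUtil::LineSegmentIntersector> intersector =
            new osgUtil::LineSegmentIntersector(
                osgUtil::Intersector::WINDOW, ea.getX(), ea.getY());
        osgUtil::IntersectionVisitor iv(intersector.get());
        view->getCamera()->accept(iv);

        if (intersector->containsIntersections())
        {
            const osgUtil::LineSegmentIntersector::Intersection& hit =
                intersector->getFirstIntersection();
            if (!hit.nodePath.empty())
                std::cout << "Picked node: "
                          << hit.nodePath.back()->getName() << std::endl;
        }
        return false;
    }
};

// Usage: viewer.addEventHandler(new PickHandler);
```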
Week 3
Reading and understanding the code (Done about 60%)
Reading about socket programming (Done, but without any practical code; a small sketch of what I read about follows)
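As a rough illustration of the socket material (I have not written practical code for the project yet), here is a minimal POSIX TCP client sketch that sends one 4x4 matrix as 16 floats; the address 192.168.0.10 and port 12345 are placeholders.

```cpp
// Minimal POSIX TCP client sketch: connect to a server and send 16 floats.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

int main()
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0) { perror("socket"); return 1; }

    sockaddr_in server;
    std::memset(&server, 0, sizeof(server));
    server.sin_family = AF_INET;
    server.sin_port = htons(12345);                        // placeholder port
    inet_pton(AF_INET, "192.168.0.10", &server.sin_addr);  // placeholder address

    if (connect(fd, (sockaddr*)&server, sizeof(server)) < 0)
    { perror("connect"); close(fd); return 1; }

    // An identity matrix stands in for the head-position matrix, just for testing.
    float matrix[16] = { 1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1 };
    ssize_t sent = send(fd, matrix, sizeof(matrix), 0);
    std::printf("sent %zd bytes\n", sent);

    close(fd);
    return 0;
}
```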
TODO
- Load the excavation site model (VRML file) onto the tablet with OSG (see the sketch after this list)
- Write a function to set the camera position in OSG from the head node: for testing, transfer the 4x4 matrix of the head position over WiFi to the tablet; this will effectively copy the view on the CalVR system to the tablet
- Sense a touch on an artifact, then send the touched artifact's ID to the CalVR computer, which will send the associated photograph to the tablet; display the photograph on the tablet. This will allow the user to view pictures associated with artifacts on the screen
- Connect the tracking system to the tablet to allow moving the tablet around the StarCAVE and viewing the excavation site model from arbitrary perspectives on the tablet screen.
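For the first two TODO items, the sketch below shows roughly how they could look on the OSG side, assuming the VRML reader plugin is available: load the model with osgDB and drive the camera's view matrix from a 4x4 head matrix (hard-coded here; it would eventually arrive over WiFi). The file name site.wrl and the matrix values are placeholders.

```cpp
// Sketch: load the excavation site model and set the camera from a 4x4 matrix.
#include <osg/Matrixd>
#include <osg/Node>
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

int main()
{
    // osgDB picks the reader plugin from the file extension ("site.wrl" is a placeholder).
    osg::ref_ptr<osg::Node> site = osgDB::readNodeFile("site.wrl");
    if (!site) return 1;

    osgViewer::Viewer viewer;
    viewer.setSceneData(site.get());

    // Pretend these 16 values are the head-position matrix received over WiFi:
    // identity rotation with a translation along -Y, just for testing.
    double m[16] = { 1,0,0,0, 0,1,0,0, 0,0,1,0, 0,-10,0,1 };
    osg::Matrixd headMatrix(m);

    // Drive the camera directly (no manipulator): the view matrix is the
    // inverse of the head matrix in world coordinates.
    while (!viewer.done())
    {
        viewer.getCamera()->setViewMatrix(osg::Matrixd::inverse(headMatrix));
        viewer.frame();
    }
    return 0;
}
```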