From Immersive Visualization Lab Wiki
 
==Android Controller==

[[Image:camera.jpg]]
 
Jeanne Wang
  
 
==Objective==

Create an intuitive and novel approach to a tablet-based multi-touch and sensor-enabled controller for a real-time 3D visualization on a 2D platform. Using camera pose information relative to a 3D model displayed on a screen, we can display virtual camera shots in the 3D model space on the tablet.
  
 
----

Spring 2011:

* Created a basic Android application with some UI components and controls for them
* Added networking capabilities between a server (running in a thread in OpenCOVER) and a client (running in a custom camera activity on Android)
* Looking into getting camera pose information using ARToolKit
* Will then use the camera pose information to create a virtual camera with that orientation
  
====Android Basics:====

*Sandboxed in a linux environment, each application is actually a user
*Model View Controller setup
**Model - Content providers
**View - XML
**Controller - Activity, or Service
*UI Views

[[Image:mainUI.png]]

<pre>
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical" >
    <LinearLayout
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:padding="10dp">
        <SeekBar android:id="@+id/seekBar1" android:layout_marginLeft="10dip" android:layout_width="200dip" android:layout_height="wrap_content"></SeekBar>
        <TextView android:text="@string/hello"
            android:layout_height="wrap_content"
            android:layout_marginLeft="10dip"
            android:id="@+id/seekBarText"
            android:layout_width="wrap_content"
            android:layout_alignParentRight="true">
        </TextView>
    </LinearLayout>
    <RelativeLayout
        android:layout_width="fill_parent"
        android:layout_height="fill_parent">
        <Button
            android:layout_height="wrap_content"
            android:id="@+id/button2"
            android:layout_width="wrap_content"
            android:text="type some text"
            android:layout_below="@+id/button3"
            android:layout_alignRight="@+id/button3">
        </Button>
        <Button
            android:layout_height="wrap_content"
            android:id="@+id/button4"
            android:layout_width="wrap_content"
            android:text="camera"
            android:layout_below="@+id/button2">
        </Button>
        <Button
            android:layout_height="wrap_content"
            android:id="@+id/button1"
            android:layout_width="wrap_content"
            android:text="socket send"
            android:layout_below="@+id/button4"
            android:layout_alignLeft="@+id/button4"
            android:layout_alignRight="@+id/button4">
        </Button>
        <Button
            android:layout_height="wrap_content"
            android:id="@+id/button3"
            android:layout_width="wrap_content"
            android:text="touch screen"
            android:layout_alignParentLeft="true"></Button>
    </RelativeLayout>
    <ImageView
        android:id="@+id/photoResultView"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
    ></ImageView>
</LinearLayout>
</pre>
* UI Control Component
**Buttons
<pre>
Button button = (Button)findViewById(R.id.button1);
button.setOnClickListener(buttonOnClick);
</pre>
**Sliders
<pre>
SeekBar seekbar = (SeekBar)findViewById(R.id.seekBar1);
seekbar.setOnSeekBarChangeListener(sbar);
</pre>
**TextView
<pre>
TextView seekbartxt = (TextView)findViewById(R.id.seekBarText);
CharSequence t = "slider progress:" + progress;
seekbartxt.setText(t);
</pre>
**EditText
***In the Android manifest, make sure to enable the soft keyboard
<pre>
<activity android:name=".TypeText"
          android:windowSoftInputMode="stateAlwaysVisible|adjustResize">
</activity>
</pre>
<pre>
EditText edtView = (EditText)findViewById(R.id.EditText01);
edtView.setInputType(InputType.TYPE_CLASS_TEXT);
</pre>
**Toasts (timed popups)
<pre>
// v is the View passed into the enclosing onClick callback
int duration = Toast.LENGTH_SHORT;
CharSequence text = "button " + v.getId() + " pressed";
Toast t = Toast.makeText(v.getContext(), text, duration);
t.show();
</pre>
* Listeners
**OnClickListener
<pre>
private OnClickListener showCamera = new OnClickListener() {
    public void onClick(View v) {
        // do stuff here
    }
};
</pre>
**OnSeekBarChangeListener
<pre>
private SeekBar.OnSeekBarChangeListener sbar = new SeekBar.OnSeekBarChangeListener()
{
    public void onStopTrackingTouch(SeekBar seekBar) {
    }

    public void onStartTrackingTouch(SeekBar seekBar) {
    }

    public void onProgressChanged(SeekBar seekBar, int progress,
            boolean fromUser) {
    }
};
</pre>
**OnTouchListener
<pre>
LinearLayout screen = (LinearLayout)findViewById(R.id.touch);
screen.setOnTouchListener(new View.OnTouchListener() {
    public boolean onTouch(View v, MotionEvent e) {
        float x = e.getX();
        float y = e.getY();
        TextView text = (TextView)findViewById(R.id.touchtext);
        CharSequence t = "x:" + x + ", y:" + y;
        text.setText(t);
        return true;
    }
});
</pre>

* Start New Intent
<pre>
Intent myIntent = new Intent(v.getContext(), TouchScreen.class);
startActivityForResult(myIntent, 0);
</pre>

*Sockets
**In the Android manifest, be sure to enable internet usage
<pre>
<uses-permission android:name="android.permission.INTERNET" />
</pre>
<pre>
public void sendSocket(String sendStr, String textStr)
{
    try {
        //Socket s = new Socket("137.110.119.121",11011);  // TourCAVE
        Socket s = new Socket("137.110.118.26",3412);      // sessions
        //Socket s = new Socket("137.110.115.194",11011);  // HP laptop

        // outgoing stream redirect to socket
        OutputStream out = s.getOutputStream();

        PrintWriter output = new PrintWriter(out);
        output.println(sendStr);
        output.close();  // required to actually send the text

        Context context = getApplicationContext();
        Toast.makeText(context, textStr, Toast.LENGTH_SHORT).show();

        // Close connection
        s.close();
    } catch (UnknownHostException e) {
        Context context = getApplicationContext();
        Toast.makeText(context, "UnknownHostException!",
                Toast.LENGTH_SHORT).show();
        e.printStackTrace();
    } catch (IOException e) {
        Context context = getApplicationContext();
        Toast.makeText(context, "IOException! " + e,
                Toast.LENGTH_SHORT).show();
        e.printStackTrace();
    }
}
</pre>
**Using sockets to get the IP address
<pre>
Socket s = new Socket("137.110.115.194",11011);  // HP laptop
String myIPAddress = s.getLocalAddress().toString();
</pre>

* Camera
**In the Android manifest, be sure to include camera capabilities
<pre>
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
</pre>
**Default Camera
<pre>
private static int CAMERA_PIC_REQUEST = 10232;

Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1);
startActivityForResult(intent, CAMERA_PIC_REQUEST);

protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == CAMERA_PIC_REQUEST) {
        android.graphics.Bitmap thumbnail = (android.graphics.Bitmap) data.getExtras().get("data");
        ImageView image = (ImageView) findViewById(R.id.photoResultView);
        image.setImageBitmap(thumbnail);
    }
}
</pre>
**Custom Camera
<pre>
package jeanne.prime.tablet;

import java.io.IOException;

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

class Preview extends SurfaceView implements SurfaceHolder.Callback {
    private static final String TAG = "Preview";

    SurfaceHolder mHolder;
    public Camera camera;

    Preview(Context context) {
        super(context);

        // Install a SurfaceHolder.Callback so we get notified when the
        // underlying surface is created and destroyed.
        mHolder = getHolder();
        mHolder.addCallback(this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        // The Surface has been created; acquire the camera and tell it where
        // to draw.
        camera = Camera.open();
        try {
            camera.setPreviewDisplay(holder);

            camera.setPreviewCallback(new PreviewCallback() {
                public void onPreviewFrame(byte[] data, Camera arg1) {
                    Log.d(TAG, "onPreviewFrame - wrote bytes: " + data.length);
                    Preview.this.invalidate();
                }
            });
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        // Surface will be destroyed when we return, so stop the preview.
        // Because the CameraDevice object is not a shared resource, it's very
        // important to release it when the activity is paused.
        camera.stopPreview();
        camera.release();
        camera = null;
        Log.d(TAG, "surfaceDestroyed");
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        // Now that the size is known, set up the camera parameters and begin
        // the preview.
        Camera.Parameters parameters = camera.getParameters();
        parameters.setPreviewSize(w, h);
        parameters.setPictureSize(100, 60);
        camera.setParameters(parameters);
        camera.startPreview();
    }

    @Override
    public void draw(Canvas canvas) {
        super.draw(canvas);
        Paint p = new Paint(Color.RED);
        Log.d(TAG, "draw");
        canvas.drawText("PREVIEW", canvas.getWidth() / 2,
                canvas.getHeight() / 2, p);
    }
}
</pre>
<pre>
package jeanne.prime.tablet;

import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;
import java.net.UnknownHostException;

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.hardware.Camera.ShutterCallback;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.FrameLayout;
import android.widget.Toast;

/*
 * http://marakana.com/forums/android/examples/39.html
 * http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/CameraPreview.html
 * http://mobile.tutsplus.com/tutorials/android/android-sdk-quick-tip-launching-the-camera/
 */

public class CameraDemo extends Activity {
    private static final String TAG = "CameraDemo";
    Preview preview;
    Button buttonClick;
    Socket s;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.camera);

        preview = new Preview(this);
        ((FrameLayout) findViewById(R.id.preview)).addView(preview);

        buttonClick = (Button) findViewById(R.id.buttonClick);
        buttonClick.setOnClickListener(new OnClickListener() {
            public void onClick(View v) {
                //preview.camera.startPreview();
                preview.camera.takePicture(null, rawCallback, jpegCallback);
            }
        });

        Button back = (Button) findViewById(R.id.backbutton);
        back.setOnClickListener(new View.OnClickListener() {
            public void onClick(View view) {
                Intent intent = new Intent();
                setResult(Activity.RESULT_OK, intent);
                finish();
            }
        });

        Log.d(TAG, "onCreate'd");
    }

    public byte[] intToByteArray(int value) { // big endian
        return new byte[] {
                (byte) (value >>> 24),
                (byte) (value >>> 16),
                (byte) (value >>> 8),
                (byte) value };
    }

    public byte[] intToByteArray2(int value) { // little endian
        return new byte[] {
                (byte) value,
                (byte) (value >>> 8),
                (byte) (value >>> 16),
                (byte) (value >>> 24) };
    }

    public void sendPhoto(byte[] photo, String textStr) {
        try {
            // First connection: send the 4-byte little-endian photo size.
            s = new Socket("137.110.118.26", 3412); // sessions
            s.setTcpNoDelay(true);
            OutputStream out = s.getOutputStream();
            out.write(intToByteArray2(photo.length), 0, 4);
            out.close();

            // Second connection: send the photo bytes themselves.
            Socket s2 = new Socket("137.110.118.26", 3412); // sessions
            s2.setTcpNoDelay(true);
            OutputStream out2 = s2.getOutputStream();
            out2.write(photo);
            out2.close();

            Context context = getApplicationContext();
            Toast.makeText(context, textStr, Toast.LENGTH_SHORT).show();
            Log.d(TAG, textStr);
        } catch (UnknownHostException e) {
            Context context = getApplicationContext();
            Toast.makeText(context, "UnknownHostException!",
                    Toast.LENGTH_SHORT).show();
            Log.d(TAG, "UnknownHostException!");
            e.printStackTrace();
        } catch (IOException e) {
            Context context = getApplicationContext();
            Toast.makeText(context, "IOException! " + e,
                    Toast.LENGTH_SHORT).show();
            Log.d(TAG, "IOException! " + e);
            e.printStackTrace();
        }
    }

    ShutterCallback shutterCallback = new ShutterCallback() {
        public void onShutter() {
            Log.d(TAG, "onShutter'd");
        }
    };

    /** Handles data for raw picture */
    PictureCallback rawCallback = new PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
            camera.startPreview();
            if (data != null) {
                sendPhoto(data, "photosize:" + data.length);
            }
            Log.d(TAG, "onPictureTaken - raw");
        }
    };

    /** Handles data for jpeg picture */
    PictureCallback jpegCallback = new PictureCallback() {
        public void onPictureTaken(byte[] data, Camera camera) {
            if (data != null) {
                sendPhoto(data, "photosize:" + data.length);
                Log.d(TAG, "onPictureTaken - jpeg,size:" + data.length);
            }
        }
    };

    @Override
    public void onPause() {
        super.onPause();
        // Close the connection if one is still open.
        try {
            if (s != null) {
                s.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
</pre>

====Hardware + Emulator====
* [http://android-tricks.blogspot.com/2009/01/using-adp1-without-sim-card.html using-adp1-without-sim-card]
* [http://forum.xda-developers.com/showthread.php?t=452316 register g1]
* [http://www.blogsdna.com/1256/how-to-quickly-update-t1-mobile-g1-phone-firmware-manually-rc29.htm update g1 firmware]
 
====Network====
* server
<pre>
#include <iostream>
#include <cstdio>
#include <cstdlib>
#include "CVRSocket.h"
#include <jpeglib.h>
#include <jerror.h>
#include <jconfig.h>
#include <jmorecfg.h>

using namespace std;
using namespace cvr;

int main(){
    while(true){
        // First connection: receive the 4-byte photo size.
        CVRSocket server1=CVRSocket(LISTEN, "137.110.118.26", 3412,AF_INET,SOCK_STREAM);
        server1.setNoDelay(true);
        server1.setReuseAddress(true);
        server1.bind();
        server1.listen();
        server1.accept();
        int photoSize=0;
        bool received=server1.recv(&photoSize,sizeof(int));
        printf("photoSize:%d\n",photoSize);

        // Second connection: receive the photo itself.
        CVRSocket server=CVRSocket(LISTEN, "137.110.118.26", 3412,AF_INET,SOCK_STREAM);
        server.setNoDelay(true);
        server.setReuseAddress(true);
        server.bind();
        server.listen();
        server.accept();
        if (photoSize > 0 && photoSize < 10000000){
            unsigned char * buf = (unsigned char *) malloc(sizeof(unsigned char)*photoSize);
            received=server.recv(buf,photoSize);
            if (received){
                printf("success photo received\n");
            }
            else{
                printf("error photo not received\n");
            }
            for (int i=0;i<100;i++){
                cerr << buf[i] << ",";
            }
            cerr << "\n";
            FILE * pFile;
            pFile = fopen("pic2.jpg","wb");
            fwrite(buf, 1, photoSize, pFile);
            fclose(pFile);
            free(buf);
        }
    }
    return 0;
}
</pre>

* client
**See the Android socket code above, and also the camera send code
<pre>
#include "CVRSocket.h"
#include <iostream>
#include <cstdlib>

using namespace cvr;

int main(){
    CVRSocket client=CVRSocket(CONNECT, "137.110.118.26", 3412,AF_INET,SOCK_STREAM);
    client.setNoDelay(true);
    client.setReuseAddress(true);
    client.connect();
    int size=10;
    client.send(&size,sizeof(int));
    char * buf = (char *)malloc(sizeof(char)*size);
    for (int i=0;i<size;i++){
        buf[i]='a';
    }
    client.send(buf,size);
    free(buf);
    return 0;
}
</pre>

====ARToolkit====
*ARToolKit [http://www.hitl.washington.edu/artoolkit/]
*There is also a Java port [http://code.google.com/p/andar/]

Markers code from Daniel Tenedorio
* detect marker
<pre>
#include "Markers.h"

const char* Markers::hiroPatternName = "AR/patt.hiro";
const char* Markers::kanjiPatternName = "AR/patt.kanji";
const char* Markers::camParamName = "AR/camparams.dat";

Markers::Markers() : markersFound(0) {
    ARParam wparam;
    if (arParamLoad(camParamName, 1, &wparam) < 0) {
        setError("Camera parameter load error!");
        initialized = false;
        return;
    }

    // Insert previously calibrated focal length and principal point data into
    // the calibration matrix. Account for no lens distortion, since the
    // reprojection error has been reported to be low for the RGB camera.
    wparam.mat[0][0] = 594.21f;
    wparam.mat[1][1] = 591.04f;
    wparam.mat[0][2] = 339.5f;
    wparam.mat[1][2] = 242.7f;
    wparam.dist_factor[0] = 319;
    wparam.dist_factor[1] = 239;
    wparam.dist_factor[2] = 0;
    wparam.dist_factor[3] = 1;

    arInitCparam(&wparam);
    if ((hiroPatternID = arLoadPatt(hiroPatternName)) < 0 ||
        (kanjiPatternID = arLoadPatt(kanjiPatternName)) < 0) {
        setError("AR Marker pattern load error!");
        initialized = false;
        return;
    }
    initialized = true;
}

bool Markers::findMarkers(ARUint8* videoFrame) {
    static double markerCenter[2] = { 0, 0 };
    static double markerWidthMM = 203; // 20.3 cm
    static int thresh = 100;
    foundMarkers = false;
    ARMarkerInfo* markerInfo;
    if (!initialized) {
        setError("Marker detection failed to initialize");
        return false;
    }
    if (arDetectMarker(videoFrame, thresh, &markerInfo, &markersFound) < 0) {
        setError("Marker detection failed");
        return false;
    }

    // Try to find a visible marker.
    // cerr << markersFound << " markers found in the image" << endl;
    int hiroMarker = -1;
    int kanjiMarker = -1;
    for (int curMarker = 0; curMarker < markersFound; ++curMarker) {
        if (hiroPatternID == markerInfo[curMarker].id) {
            if (hiroMarker == -1) {
                hiroMarker = curMarker;
            } else if (markerInfo[hiroMarker].cf < markerInfo[curMarker].cf) {
                hiroMarker = curMarker;
            }
        }
        if (kanjiPatternID == markerInfo[curMarker].id) {
            if (kanjiMarker == -1) {
                kanjiMarker = curMarker;
            } else if (markerInfo[kanjiMarker].cf < markerInfo[curMarker].cf) {
                kanjiMarker = curMarker;
            }
        }
    }

    if (hiroMarker != -1) {
        usingMarker = hiroMarker;
    } else if (kanjiMarker != -1) {
        usingMarker = kanjiMarker;
    } else {
        setError("No visible markers found");
        return false;
    }
    double markerTrans[3][4];
    double invMarkerTrans[3][4];
    arGetTransMat(&markerInfo[usingMarker], markerCenter,
                  markerWidthMM, markerTrans);
    argConvGlpara(markerTrans, markerMatrix);
    if (arUtilMatInv(markerTrans, invMarkerTrans) == 0) {
        argConvGlpara(invMarkerTrans, invMarkerMatrix);
    } else {
        setError("Matrix inversion failed");
    }
    foundMarkers = true;
    return true;
}
</pre>
* pose
<pre>
bool Markers::getCameraPose(float trans[3], float dir[3]) {
    if (!initialized) {
        setError("Marker detection failed to initialize");
        return false;
    } else if (!foundMarkers) {
        setError("Failed to detect markers last frame");
        return false;
    }
    trans[0] = invMarkerMatrix[12];
    trans[1] = invMarkerMatrix[13];
    trans[2] = invMarkerMatrix[14];
    vec3<float> camDir;
    const double origin[] = { 0, 0, 0 };
    const double zfar[] = { 0, 0, -1000 };
    double camOrigin[3], camZFar[3];
    multMatrixVec<double>(invMarkerMatrix, origin, camOrigin);
    multMatrixVec<double>(invMarkerMatrix, zfar, camZFar);
    camDir.set(camOrigin[0] - camZFar[0],
               camOrigin[1] - camZFar[1],
               camOrigin[2] - camZFar[2]);
    camDir.normalize();
    dir[0] = camDir.x;
    dir[1] = camDir.y;
    dir[2] = camDir.z;
    return true;
}
</pre>

====OpenCOVER====
*display marker
<pre>
const char * filename = "/home/covise/covise/src/renderer/OpenCOVER/plugins/calit2/Boom/HiroPattern.png";

Geode* artmarker = createImageGeode(filename);
MatrixTransform* rotation = new MatrixTransform();
MatrixTransform* trans = new MatrixTransform();
Matrixd m1,m3;
m1.makeRotate(3.14/2,1,0,0);
rotation->setMatrix(m1);
m3.makeTranslate(0,100-drawCounter,0);
trans->setMatrix(m3);
trans->addChild(rotation);
rotation->addChild(artmarker);
//cover->getScene()->addChild(trans); // makes it unmovable
cover->getObjectsRoot()->addChild(trans);

/** Loads image file into geode; returns NULL if image file cannot be loaded */
Geode* Boom::createImageGeode(const char* filename)
{
  // Create OSG image:
  Image* image = osgDB::readImageFile(filename);
  Texture2D * imageTexture;

  // Create OSG texture:
  if (image)
  {
    cerr << "Image loaded\n";
    imageTexture = new Texture2D();
    imageTexture->setImage(image);
  }
  else
  {
    std::cerr << "Cannot load image file " << filename << std::endl;
    return NULL;
  }

  // Create OSG geode:
  Geode* imageGeode = new Geode();
  imageGeode->addDrawable(createImageGeometry(imageTexture));
  return imageGeode;
}

/** Used by createImageGeode() */
Geometry* Boom::createImageGeometry(Texture2D* imageTexture)
{
  const float WIDTH  = 300.0f;
  const float HEIGHT = 200.0f;
  Geometry* geom = new Geometry();

  // Create vertices:
  Vec3Array* vertices = new Vec3Array(4);
  (*vertices)[0].set(-WIDTH / 2.0, -HEIGHT / 2.0, 0); // bottom left
  (*vertices)[1].set( WIDTH / 2.0, -HEIGHT / 2.0, 0); // bottom right
  (*vertices)[2].set( WIDTH / 2.0,  HEIGHT / 2.0, 0); // top right
  (*vertices)[3].set(-WIDTH / 2.0,  HEIGHT / 2.0, 0); // top left
  geom->setVertexArray(vertices);

  // Create texture coordinates for image texture:
  Vec2Array* texcoords = new Vec2Array(4);
  (*texcoords)[0].set(0.0, 0.0);
  (*texcoords)[1].set(1.0, 0.0);
  (*texcoords)[2].set(1.0, 1.0);
  (*texcoords)[3].set(0.0, 1.0);
  geom->setTexCoordArray(0,texcoords);

  // Create normals:
  Vec3Array* normals = new Vec3Array(1);
  (*normals)[0].set(0.0f, 0.0f, 1.0f);
  geom->setNormalArray(normals);
  geom->setNormalBinding(Geometry::BIND_OVERALL);

  // Create colors:
  Vec4Array* colors = new Vec4Array(1);
  (*colors)[0].set(1.0, 1.0, 1.0, 1.0);
  geom->setColorArray(colors);
  geom->setColorBinding(Geometry::BIND_OVERALL);

  geom->addPrimitiveSet(new DrawArrays(PrimitiveSet::QUADS, 0, 4));

  Vec4 color(1.0,0.0,0.0,1.0);
  // Set texture parameters:
  StateSet* stateset = geom->getOrCreateStateSet();
  stateset->setMode(GL_LIGHTING, StateAttribute::OFF);  // make texture visible independent of lighting
  //stateset->setRenderingHint(StateSet::TRANSPARENT_BIN);  // only required for translucent images
  stateset->setTextureAttributeAndModes(0, imageTexture, StateAttribute::ON);
  Material* mat = new Material();
  mat->setColorMode(Material::AMBIENT_AND_DIFFUSE);
  mat->setDiffuse(Material::FRONT,color);
  mat->setSpecular(Material::FRONT,color);
  stateset->setAttributeAndModes(mat, StateAttribute::ON);

  cerr << "Geometry set\n";
  return geom;
}
</pre>

*display dataset
<pre>
const char * modelfile = "/home/jeanne/data/pdb/4HHBcart.wrl";

Node* model = createModelGeode(modelfile);
MatrixTransform* scale = new MatrixTransform();
Matrixd m2;
m2 = scale->getMatrix();
m2.makeScale(2,2,2);
scale->setMatrix(m2);
scale->addChild(model);
cover->getObjectsRoot()->addChild(scale);

/** Loads a VRML model into a node; returns NULL if the model file cannot be loaded */
Node* Boom::createModelGeode(const char* filename)
{
  osg::Node* modelNode = osgDB::readNodeFile(filename);
  if (modelNode == NULL)
  {
    cerr << "Boom: Error reading file " << filename << endl;
    return NULL;
  }
  else
  {
    return modelNode;
  }
}
</pre>

====Tips====
* An easy way to add import packages to your project is to press Ctrl-Shift-O (Cmd-Shift-O on Mac). This Eclipse shortcut identifies missing packages based on your code and adds them for you.
* To get CPU info on a particular machine, look in /proc/cpuinfo
* To get the IP address on a particular machine, call /sbin/ifconfig

====Other possibilities====
* Corona SDK [http://www.anscamobile.com/corona/]
* HTML5 DeviceMotionEvent [http://www.gauravmanek.com/blog/?p=33]
* PhoneGap [http://docs.phonegap.com/]

Latest revision as of 21:38, 4 June 2011

Contents

Android Controller

Camera.jpg

Jeanne Wang

Objective

Create an intuitive and novel approach to a tablet based multi-touch and sensor-enabled controller for a real-time 3D visualization on a 2D platform. Using camera pose information relative to a 3D model displayed on a screen, we can display virtual camera shots in the 3D model space on the tablet.


Spring 2011:

  • Created basic android application that has some ui components, and controls for them
  • Added networking capabilities between a server (running in a thread in OpenCover) and a client (running on custom camera activity in Android)
  • Am looking into getting camera pose information using ARToolkit
  • Will then use camera pose information to create a virtual camera from that orientation

Android Basics:

  • Sandboxed in a linux environment, each application is actually a user
  • Model View Controller setup
    • Model - Content providers
    • View - XML
    • Controller - Activity, or Service
  • UI Views

MainUI.png

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
	android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical" >
    <LinearLayout
    	android:layout_width="wrap_content"
    	android:layout_height="wrap_content"
    	android:padding="10dp">
        <SeekBar android:id="@+id/seekBar1" android:layout_marginLeft="10dip" android:layout_width="200dip" android:layout_height="wrap_content"></SeekBar>
	    <TextView android:text="@string/hello" 
			android:layout_height="wrap_content" 
			android:layout_marginLeft="10dip" 
			android:id="@+id/seekBarText" 
			android:layout_width="wrap_content" 
			android:layout_alignParentRight="true">
			</TextView>
	</LinearLayout>
	<RelativeLayout
		android:layout_width="fill_parent"
	    android:layout_height="fill_parent">
		<Button 
			android:layout_height="wrap_content" 
			android:id="@+id/button2" 
			android:layout_width="wrap_content" 
			android:text="type some text" 
			android:layout_below="@+id/button3" 
			android:layout_alignRight="@+id/button3">
		</Button>
		<Button 
			android:layout_height="wrap_content" 
			android:id="@+id/button4" 
			android:layout_width="wrap_content" 
			android:text="camera" 
			android:layout_below="@+id/button2">
		</Button>
		<Button 
		android:layout_height="wrap_content" 
		android:id="@+id/button1" 
		android:layout_width="wrap_content" 
		android:text="socket send" 
		android:layout_below="@+id/button4" 
		android:layout_alignLeft="@+id/button4" 
		android:layout_alignRight="@+id/button4">
		</Button>
		<Button 
		android:layout_height="wrap_content" 
		android:id="@+id/button3" 
		android:layout_width="wrap_content" 
		android:text="touch screen" 
		android:layout_alignParentLeft="true"></Button>
	</RelativeLayout>
	<ImageView
		android:id="@+id/photoResultView"
		android:layout_width="wrap_content" 
		android:layout_height="wrap_content" 
	></ImageView>
</LinearLayout>
  • UI Control Component
    • Buttons
Button button = (Button)findViewById(R.id.button1);
button.setOnClickListener(buttonOnClick);
    • Sliders
SeekBar seekbar=(SeekBar)findViewById(R.id.seekBar1);
seekbar.setOnSeekBarChangeListener(sbar);
    • TextView
TextView seekbartxt = (TextView)findViewById(R.id.seekBarText);
// inside onProgressChanged:
CharSequence t = "slider progress:" + progress;
seekbartxt.setText(t);
    • EditText
      • In the Android manifest, make sure to enable the soft keyboard for the activity
<activity android:name=".TypeText"
          android:windowSoftInputMode="stateAlwaysVisible|adjustResize">
</activity>
EditText edtView=(EditText)findViewById(R.id.EditText01);
edtView.setInputType(InputType.TYPE_CLASS_TEXT);
    • Toasts (timed popups), e.g. inside an OnClickListener's onClick(View v):
		    int duration = Toast.LENGTH_SHORT;
		    CharSequence text = "button " + v.getId() + " pressed";
		    Toast t = Toast.makeText(v.getContext(), text, duration);
		    t.show();
  • Listeners
    • OnClickListener
private OnClickListener showCamera = new OnClickListener() {
	    public void onClick(View v) {
	    	//do stuff here
	    }
};
    • OnSeekBarChangeListener
private SeekBar.OnSeekBarChangeListener sbar = new SeekBar.OnSeekBarChangeListener()
	{
		
		public void onStopTrackingTouch(SeekBar seekBar) {			
		}
		
		public void onStartTrackingTouch(SeekBar seekBar) {
		}
		
		public void onProgressChanged(SeekBar seekBar, int progress,
				boolean fromUser) {
		}
	};
    • OnTouchListener
LinearLayout screen = (LinearLayout)findViewById(R.id.touch);
screen.setOnTouchListener(new View.OnTouchListener() {
	public boolean onTouch(View v, MotionEvent e) {
		float x = e.getX();
		float y = e.getY();
		TextView text = (TextView)findViewById(R.id.touchtext);
		CharSequence t = "x:" + x + ", y:" + y;
		text.setText(t);
		return true;
	}
});
  • Start New Intent
Intent myIntent = new Intent(v.getContext(), TouchScreen.class);
startActivityForResult(myIntent, 0);
  • Sockets
    • In Android Manifest be sure to enable internet usage
<uses-permission android:name="android.permission.INTERNET" />
public void sendSocket(String sendStr, String textStr)
{
    try {
        //Socket s = new Socket("137.110.119.121",11011);   // TourCAVE
        Socket s = new Socket("137.110.118.26",3412); // sessions
        //Socket s = new Socket("137.110.115.194",11011);   // HP laptop

        // outgoing stream redirected to the socket
        OutputStream out = s.getOutputStream();
        PrintWriter output = new PrintWriter(out);
        output.println(sendStr);
        output.close();  // flushes the writer's buffer; without it the text is never sent

        Context context = getApplicationContext();
        Toast.makeText(context, textStr, Toast.LENGTH_SHORT).show();

        // Close connection
        s.close();
    } catch (UnknownHostException e) {
        Context context = getApplicationContext();
        Toast.makeText(context, "UnknownHostException!", Toast.LENGTH_SHORT).show();
        e.printStackTrace();
    } catch (IOException e) {
        Context context = getApplicationContext();
        Toast.makeText(context, "IOException! " + e, Toast.LENGTH_SHORT).show();
        e.printStackTrace();
    }
}
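The close() call above matters because PrintWriter buffers its output internally. A minimal plain-Java sketch demonstrating this, with a ByteArrayOutputStream standing in for the socket stream (the class name is hypothetical):

```java
import java.io.ByteArrayOutputStream;
import java.io.PrintWriter;

public class FlushDemo {
    public static void main(String[] args) {
        // Stand-in for the socket's OutputStream.
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        PrintWriter output = new PrintWriter(sink);

        output.println("hello cave");
        // PrintWriter buffers internally: nothing has reached the stream yet.
        System.out.println("before close: " + sink.size() + " bytes");

        output.close(); // flushes the buffer, then closes the stream
        System.out.println("after close: " + sink.size() + " bytes");
    }
}
```

Calling output.flush() instead of close() would also push the text out while leaving the socket usable.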
    • Using sockets to get IP Address
Socket s = new Socket("137.110.115.194",11011);   // HP laptop
String myIPAddress = s.getLocalAddress().toString();
  • Camera
    • In Android manifest be sure to include camera capabilities
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
    • Default Camera
private static int CAMERA_PIC_REQUEST = 10232;

	Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
	intent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1);
	startActivityForResult(intent, CAMERA_PIC_REQUEST);

	protected void onActivityResult(int requestCode, int resultCode, Intent data) {
		if (requestCode == CAMERA_PIC_REQUEST) {
			android.graphics.Bitmap thumbnail = (android.graphics.Bitmap) data.getExtras().get("data");
			ImageView image = (ImageView) findViewById(R.id.photoResultView);
			image.setImageBitmap(thumbnail);
		}
	}
    • Custom Camera

package jeanne.prime.tablet;


import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

class Preview extends SurfaceView implements SurfaceHolder.Callback {
	private static final String TAG = "Preview";

	SurfaceHolder mHolder;
	public Camera camera;

	Preview(Context context) {
		super(context);

		// Install a SurfaceHolder.Callback so we get notified when the
		// underlying surface is created and destroyed.
		mHolder = getHolder();
		mHolder.addCallback(this);
		mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
	}

	public void surfaceCreated(SurfaceHolder holder) {
		// The Surface has been created, acquire the camera and tell it where
		// to draw.
		camera = Camera.open();
		try {
			camera.setPreviewDisplay(holder);

			camera.setPreviewCallback(new PreviewCallback() {

				public void onPreviewFrame(byte[] data, Camera arg1) {
					// nothing is written here; just log the frame size
					Log.d(TAG, "onPreviewFrame - got bytes: " + data.length);
					Preview.this.invalidate();
				}
			});
		} catch (IOException e) {
			e.printStackTrace();
		}
	}

	public void surfaceDestroyed(SurfaceHolder holder) {
		// Surface will be destroyed when we return, so stop the preview.
		// Because the CameraDevice object is not a shared resource, it's very
		// important to release it when the activity is paused.
		camera.stopPreview();
		camera.release();
		camera = null;
		Log.d(TAG,"surfaceDestroyed");
	}

	public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
		// Now that the size is known, set up the camera parameters and begin
		// the preview.
		Camera.Parameters parameters = camera.getParameters();
		parameters.setPreviewSize(w, h);
		parameters.setPictureSize(100, 60);
		camera.setParameters(parameters);
		camera.startPreview();
	}

	@Override
	public void draw(Canvas canvas) {
		super.draw(canvas);
		Paint p = new Paint(Color.RED);
		Log.d(TAG, "draw");
		canvas.drawText("PREVIEW", canvas.getWidth() / 2,
				canvas.getHeight() / 2, p);
	}
}
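One caveat with surfaceChanged above: passing the raw surface width and height to setPreviewSize can fail on devices that only support a fixed list of preview sizes (Camera.Parameters.getSupportedPreviewSizes). The selection logic can be sketched in plain Java; the Size class here is a hypothetical stand-in for android.hardware.Camera.Size:

```java
import java.util.Arrays;
import java.util.List;

public class PreviewSizePicker {
    // Minimal stand-in for android.hardware.Camera.Size.
    static class Size {
        final int width, height;
        Size(int w, int h) { width = w; height = h; }
    }

    /** Pick the supported size whose area is closest to the requested one. */
    static Size closest(List<Size> supported, int w, int h) {
        Size best = null;
        long bestDiff = Long.MAX_VALUE;
        for (Size s : supported) {
            long diff = Math.abs((long) s.width * s.height - (long) w * h);
            if (diff < bestDiff) { bestDiff = diff; best = s; }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Size> supported = Arrays.asList(
                new Size(640, 480), new Size(800, 600), new Size(1280, 720));
        Size s = closest(supported, 810, 590);
        System.out.println(s.width + "x" + s.height); // prints 800x600
    }
}
```

In the real Preview class the same loop would run over parameters.getSupportedPreviewSizes() before calling setPreviewSize.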

package jeanne.prime.tablet;

import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.PrintWriter;
import java.net.Socket;
import java.net.UnknownHostException;

import android.app.Activity;
import android.content.Context;
import android.content.Intent;
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.hardware.Camera.ShutterCallback;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.FrameLayout;
import android.widget.Toast;

/*
 * http://marakana.com/forums/android/examples/39.html
 * http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/CameraPreview.html 
 * http://mobile.tutsplus.com/tutorials/android/android-sdk-quick-tip-launching-the-camera/
 */

public class CameraDemo extends Activity {
	private static final String TAG = "CameraDemo";
	Preview preview;
	Button buttonClick;
	Socket s;

	/** Called when the activity is first created. */
	@Override
	public void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
		setContentView(R.layout.camera);

		preview = new Preview(this);
		((FrameLayout) findViewById(R.id.preview)).addView(preview);
		
		buttonClick = (Button) findViewById(R.id.buttonClick);
		buttonClick.setOnClickListener(new OnClickListener() {
			public void onClick(View v) {
				//preview.camera.startPreview();
				preview.camera.takePicture(null, rawCallback,
						jpegCallback);

			}
		});
		
		Button back = (Button) findViewById(R.id.backbutton);
        back.setOnClickListener(new View.OnClickListener() {
            public void onClick(View view) {
                Intent intent = new Intent();
                setResult(Activity.RESULT_OK, intent);
                finish();
            }

        });

		Log.d(TAG, "onCreate'd");
	}
	public byte[] intToByteArray(int value) { //big endian
        return new byte[] {
                (byte)(value >>> 24),
                (byte)(value >>> 16),
                (byte)(value >>> 8),
                (byte)value};
	}
	public byte[] intToByteArray2(int value) { //little endian
        return new byte[] {
        		(byte)value,
                (byte)(value >>> 8),
                (byte)(value >>> 16),
                (byte)(value >>> 24)
                };
	}
	
	public void sendPhoto(byte[] photo, String textStr)
	   {
	       try {
	       
	    	   s = new Socket("137.110.118.26",3412);// sessions

	    	   s.setTcpNoDelay(true);
	           OutputStream out = s.getOutputStream();
	           //BufferedOutputStream out = new BufferedOutputStream(o);

	           out.write(intToByteArray2(photo.length), 0, 4);
	           out.close();
	           
	           Socket s2 = new Socket("137.110.118.26",3412);// sessions

	    	   s2.setTcpNoDelay(true);
	           OutputStream out2 = s2.getOutputStream();
	           out2.write(photo);
	           out2.close();
	           
	           Context context = getApplicationContext();
	           Toast.makeText(context, textStr, Toast.LENGTH_SHORT).show();
	           Log.d(TAG,textStr);

	       } catch (UnknownHostException e) {
	    	   Context context = getApplicationContext();
	           Toast.makeText(context, "UnknownHostException!",
	        		   Toast.LENGTH_SHORT).show();
	           Log.d(TAG,"UnknownHostException!");
	           e.printStackTrace();
	       } catch (IOException e) {
	    	   Context context = getApplicationContext();
	           Toast.makeText(context, "IOException! "+e,
	        		   Toast.LENGTH_SHORT).show();
	           Log.d(TAG,"IOException! "+e);
	           e.printStackTrace();
	       }
	   }
	
	ShutterCallback shutterCallback = new ShutterCallback() {
		public void onShutter() {
			Log.d(TAG, "onShutter'd");
		}
	};

	/** Handles data for raw picture */
	PictureCallback rawCallback = new PictureCallback() {
		public void onPictureTaken(byte[] data, Camera camera) {
			camera.startPreview();
			if (data != null){
				sendPhoto(data, "photosize:"+data.length);
			}
			Log.d(TAG, "onPictureTaken - raw");
		}
	};

	/** Handles data for jpeg picture */
	PictureCallback jpegCallback = new PictureCallback() {
		public void onPictureTaken(byte[] data, Camera camera) {
			if (data != null){
				sendPhoto(data, "photosize:"+data.length);
				Log.d(TAG, "onPictureTaken - jpeg,size:"+data.length);
			}
			
		}
	};
	
	@Override
	public void onPause(){
		super.onPause();
		//Close connection
		try {
			if (s != null) s.close();
		} catch (IOException e) {
			e.printStackTrace();
		}
	}

}
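The intToByteArray helpers above pack an int big- and little-endian by hand; java.nio.ByteBuffer produces the same bytes and makes a handy cross-check (the class name here is hypothetical):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Arrays;

public class EndianDemo {
    public static byte[] intToByteArray(int value) { // big endian
        return new byte[] {
                (byte)(value >>> 24), (byte)(value >>> 16),
                (byte)(value >>> 8),  (byte)value };
    }

    public static byte[] intToByteArray2(int value) { // little endian
        return new byte[] {
                (byte)value,          (byte)(value >>> 8),
                (byte)(value >>> 16), (byte)(value >>> 24) };
    }

    public static void main(String[] args) {
        int n = 0x12345678;
        // ByteBuffer defaults to big-endian; switch the order for little-endian.
        byte[] be = ByteBuffer.allocate(4).putInt(n).array();
        byte[] le = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN)
                              .putInt(n).array();
        System.out.println(Arrays.equals(intToByteArray(n), be));  // true
        System.out.println(Arrays.equals(intToByteArray2(n), le)); // true
    }
}
```

Little-endian is what the C++ receiver below expects, since x86 hosts read the 4-byte length straight into an int.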

Hardware + Emulator

Network

  • server
#include <iostream>
#include <cstdio>   // fopen, fwrite, printf
#include <cstdlib>  // malloc, free
#include "CVRSocket.h"
#include <jpeglib.h>
#include <jerror.h>
#include <jconfig.h>
#include <jmorecfg.h>
using namespace std;
using namespace cvr;
int main(){
	
	while(true){
		CVRSocket server1=CVRSocket(LISTEN, "137.110.118.26", 3412,AF_INET,SOCK_STREAM);
		server1.setNoDelay(true);
		server1.setReuseAddress(true);
		server1.bind();
		server1.listen();
		server1.accept();
		int photoSize=0;
		bool received=server1.recv(&photoSize,sizeof(int));
		printf("photoSize:%d\n",photoSize);

		CVRSocket server=CVRSocket(LISTEN, "137.110.118.26", 3412,AF_INET,SOCK_STREAM);
		server.setNoDelay(true);
		server.setReuseAddress(true);
		server.bind();
		server.listen();
		server.accept();
		if (photoSize > 0 && photoSize < 10000000){
			unsigned char * buf = (unsigned char *) malloc(sizeof(unsigned char)*photoSize);
			bool received=server.recv(buf,photoSize);
			if (received){
				printf("success photo received\n");
			}
			else{
				printf("error photo not received\n");
			}
			for (int i=0;i<100;i++){
				cerr << buf[i] << ",";
				//printf("%d,",buf[i]);
			}
			cerr << "\n";
			FILE * pFile = fopen("pic2.jpg", "wb");
			if (pFile) {
				fwrite(buf, 1, photoSize, pFile);
				fclose(pFile);
			}
			free(buf);
		}
	}
	return 0;
}
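The scheme above uses two connections per photo: the first carries a 4-byte little-endian length, the second carries the photo bytes. The receive side can be sketched in plain Java, assuming for simplicity that a single connection carries header then payload (names and the size sanity limit are taken from the server above; the class itself is hypothetical):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class PhotoReceiver {
    /** Read a 4-byte little-endian length, then that many payload bytes. */
    public static byte[] receive(InputStream in) throws IOException {
        DataInputStream din = new DataInputStream(in);
        byte[] header = new byte[4];
        din.readFully(header);
        // Assemble the little-endian length (mask to avoid sign extension).
        int len = (header[0] & 0xFF)
                | (header[1] & 0xFF) << 8
                | (header[2] & 0xFF) << 16
                | (header[3] & 0xFF) << 24;
        if (len < 0 || len > 10000000)
            throw new IOException("implausible photo size: " + len);
        byte[] photo = new byte[len];
        din.readFully(photo); // loops until all bytes arrive
        return photo;
    }

    public static void main(String[] args) throws IOException {
        byte[] msg = { 3, 0, 0, 0, 'a', 'b', 'c' };
        byte[] photo = receive(new ByteArrayInputStream(msg));
        System.out.println(new String(photo)); // prints abc
    }
}
```

Note that readFully loops over short reads; a single recv on a raw socket gives no such guarantee, so the C++ side must be prepared to loop as well.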


  • client
    • See android socket code above, and also camera send code
#include "CVRSocket.h"
#include <iostream>
#include <cstdlib>  // malloc, free
using namespace cvr;

int main(){
        CVRSocket client=CVRSocket(CONNECT, "137.110.118.26", 3412,AF_INET,SOCK_STREAM);
        client.setNoDelay(true);
        client.setReuseAddress(true);
        client.connect();
        int size=10;
        client.send(&size,sizeof(int));
        char * buf = (char *)malloc(sizeof(char)*size);
        for (int i=0;i<size;i++){
                buf[i]='a';
        }
        client.send(buf,size);
        free(buf);
        return 0;
}

ARToolkit

  • ARToolkit [1]
  • There is also a Java port [2]

Markers Code from Daniel Tenedorio

  • detect marker
#include "Markers.h"

const char* Markers::hiroPatternName = "AR/patt.hiro";
const char* Markers::kanjiPatternName = "AR/patt.kanji";
const char* Markers::camParamName = "AR/camparams.dat";

Markers::Markers() : markersFound(0) {
    ARParam wparam;
    if (arParamLoad(camParamName, 1, &wparam) < 0) {
        setError("Camera parameter load error!");
        initialized = false;
        return;
    }

    // Insert previously calibrated focal length and principal point data into
    // the calibration matrix. Account for no lens distortion, since the
    // reprojection error has been reported to be low for the RGB camera.
    wparam.mat[0][0] = 594.21f;
    wparam.mat[1][1] = 591.04f;
    wparam.mat[0][2] = 339.5f;
    wparam.mat[1][2] = 242.7f;
    wparam.dist_factor[0] = 319;
    wparam.dist_factor[1] = 239;
    wparam.dist_factor[2] = 0;
    wparam.dist_factor[3] = 1;

    arInitCparam(&wparam);
    if ((hiroPatternID = arLoadPatt(hiroPatternName)) < 0 ||
        (kanjiPatternID = arLoadPatt(kanjiPatternName)) < 0) {
        setError("AR Marker pattern load error!");
        initialized = false;
        return;
    }
    initialized = true;
}

bool Markers::findMarkers(ARUint8* videoFrame) {
    static double markerCenter[2] = { 0, 0 };
    static double markerWidthMM = 203; // 20.3 cm
    static int thresh = 100;
    foundMarkers = false;
    ARMarkerInfo* markerInfo;
    if (!initialized) {
        setError("Marker detection failed to initialize");
        return false;
    }
    if (arDetectMarker(videoFrame, thresh, &markerInfo, &markersFound) < 0) {
        setError("Marker detection failed");
        return false;
    }

    // Try to find a visible marker.
    // cerr << markersFound << " markers found in the image" << endl;
    int hiroMarker = -1;
    int kanjiMarker = -1;
    for(int curMarker = 0; curMarker < markersFound; ++curMarker) {
        if (hiroPatternID == markerInfo[curMarker].id) {
            if (hiroMarker == -1) {
                hiroMarker = curMarker;
            } else if (markerInfo[hiroMarker].cf < markerInfo[curMarker].cf) {
                hiroMarker = curMarker;
            }
        }
        if (kanjiPatternID == markerInfo[curMarker].id) {
            if (kanjiMarker == -1) {
                kanjiMarker = curMarker;
            } else if (markerInfo[kanjiMarker].cf < markerInfo[curMarker].cf) {
                kanjiMarker = curMarker;
            }
        }
    }

    if (hiroMarker != -1) {
        usingMarker = hiroMarker;
    } else if (kanjiMarker != -1) {
        usingMarker = kanjiMarker;
    } else {
        setError("No visible markers found");
        return false;
    }
    double markerTrans[3][4];
    double invMarkerTrans[3][4];
    arGetTransMat(&markerInfo[usingMarker], markerCenter, 
                  markerWidthMM, markerTrans);
    argConvGlpara(markerTrans, markerMatrix);
    if (arUtilMatInv(markerTrans, invMarkerTrans) == 0) {
        argConvGlpara(invMarkerTrans, invMarkerMatrix);
    } else {
        setError("Matrix inversion failed");
    }
    foundMarkers = true;
    return true;
}
  • pose
bool Markers::getCameraPose(float trans[3], float dir[3]) {
    if (!initialized) {
        setError("Marker detection failed to initialize");
        return false;
    } else if (!foundMarkers) {
        setError("Failed to detect markers last frame");
        return false;
    }
    trans[0] = invMarkerMatrix[12];
    trans[1] = invMarkerMatrix[13];
    trans[2] = invMarkerMatrix[14];
    vec3<float> camDir;
    const double origin[] = { 0, 0, 0 };
    const double zfar[] = { 0, 0, -1000 };
    double camOrigin[3], camZFar[3];
    multMatrixVec<double>(invMarkerMatrix, origin, camOrigin);
    multMatrixVec<double>(invMarkerMatrix, zfar, camZFar);
    camDir.set(camOrigin[0] - camZFar[0], 
               camOrigin[1] - camZFar[1],
               camOrigin[2] - camZFar[2]);
    camDir.normalize();
    dir[0] = camDir.x;
    dir[1] = camDir.y;
    dir[2] = camDir.z;
    return true;
}
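The extraction above can be written compactly. With M the marker-to-camera transform from arGetTransMat, the camera pose in marker space comes from the inverse transform; a sketch of the math the code implements:

```latex
% Camera position: the inverse transform applied to the camera origin,
% i.e. the translation column of M^{-1} (invMarkerMatrix[12..14]).
\mathbf{t} = M^{-1}\,\mathbf{o}, \qquad \mathbf{o} = (0,0,0)^{\mathsf T}

% View direction: a far point on the camera's -z axis and the origin are
% both mapped into marker space, and their difference is normalized.
\mathbf{d} = \frac{M^{-1}\mathbf{o} - M^{-1}\mathbf{f}}
                  {\lVert M^{-1}\mathbf{o} - M^{-1}\mathbf{f} \rVert},
\qquad \mathbf{f} = (0,0,-1000)^{\mathsf T}
```

This matches getCameraPose: trans is the translation part of invMarkerMatrix, and camDir is the normalized camOrigin - camZFar difference.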


OpenCover

  • display marker
char * filename = "/home/covise/covise/src/renderer/OpenCOVER/plugins/calit2/Boom/HiroPattern.png";

    Geode* artmarker = createImageGeode(filename);
    MatrixTransform* rotation = new MatrixTransform();
    MatrixTransform* trans = new MatrixTransform();
    Matrixd m1,m3;
    m1.makeRotate(3.14/2,1,0,0);
    rotation->setMatrix(m1);
    m3.makeTranslate(0,100-drawCounter,0);
    trans->setMatrix(m3);
    trans->addChild(rotation);
    rotation->addChild(artmarker);
    //cover->getScene()->addChild(trans); //makes it unmovable
    cover->getObjectsRoot()->addChild(trans);

/** Loads image file into geode; returns NULL if image file cannot be loaded */
Geode* Boom::createImageGeode(const char* filename)
{
  // Load the image file into an OSG image:
  Image* image = osgDB::readImageFile(filename);
  Texture2D * imageTexture;

  // Create OSG texture:
  if (image)
  {
    cerr << "Image loaded\n";
    imageTexture = new Texture2D();
    imageTexture->setImage(image);
  }
  else
  {
    std::cerr << "Cannot load image file " << filename << std::endl;
    //delete image;
    return NULL;
  }

  // Create OSG geode:
  Geode* imageGeode = (Geode *) new Geode();
  imageGeode->addDrawable(createImageGeometry(imageTexture));
  return imageGeode;
}
/** Used by createImageGeode() */
Geometry* Boom::createImageGeometry(Texture2D* imageTexture)
{
  const float WIDTH  = 300.0f;
  const float HEIGHT = 200.0f;
  Geometry* geom = new Geometry();

  // Create vertices:
  Vec3Array* vertices = new Vec3Array(4);
  (*vertices)[0].set(-WIDTH / 2.0, -HEIGHT / 2.0, 0); // bottom left
  (*vertices)[1].set( WIDTH / 2.0, -HEIGHT / 2.0, 0); // bottom right
  (*vertices)[2].set( WIDTH / 2.0, HEIGHT / 2.0, 0); // top right
  (*vertices)[3].set(-WIDTH / 2.0, HEIGHT / 2.0, 0); // top left
  geom->setVertexArray(vertices);

  // Create texture coordinates for image texture:
  Vec2Array* texcoords = new Vec2Array(4);
  (*texcoords)[0].set(0.0, 0.0);
  (*texcoords)[1].set(1.0, 0.0);
  (*texcoords)[2].set(1.0, 1.0);
  (*texcoords)[3].set(0.0, 1.0);
  geom->setTexCoordArray(0,texcoords);

  // Create normals:
  Vec3Array* normals = new Vec3Array(1);
  (*normals)[0].set(0.0f, 0.0f, 1.0f);
  geom->setNormalArray(normals);
  geom->setNormalBinding(Geometry::BIND_OVERALL);

  // Create colors:
  Vec4Array* colors = new Vec4Array(1);
  (*colors)[0].set(1.0, 1.0, 1.0, 1.0);
  geom->setColorArray(colors);
  geom->setColorBinding(Geometry::BIND_OVERALL);

  geom->addPrimitiveSet(new DrawArrays(PrimitiveSet::QUADS, 0, 4));

  Vec4 color(1.0,0.0,0.0,1.0);
  // Set texture parameters:
  StateSet* stateset = geom->getOrCreateStateSet();
  stateset->setMode(GL_LIGHTING, StateAttribute::OFF);  // make texture visible independent of lighting
  //stateset->setRenderingHint(StateSet::TRANSPARENT_BIN);  // only required for translucent images
  stateset->setTextureAttributeAndModes(0, imageTexture, StateAttribute::ON);
  Material* mat = new Material();
  mat->setColorMode(Material::AMBIENT_AND_DIFFUSE);
  mat->setDiffuse(Material::FRONT,color);
  mat->setSpecular(Material::FRONT,color);
  stateset->setAttributeAndModes(mat, StateAttribute::ON);

  cerr << "Geometry set\n";
  return geom;
}

  • display dataset
char * modelfile = "/home/jeanne/data/pdb/4HHBcart.wrl";
    Node* model = createModelGeode(modelfile);
    MatrixTransform* scale = new MatrixTransform();
    Matrixd m2;
    m2.makeScale(2,2,2);
    scale->setMatrix(m2);
    scale->addChild(model);
    cover->getObjectsRoot()->addChild(scale);

/** Loads a VRML model into a node; returns NULL if the model file cannot be loaded */
Node* Boom::createModelGeode(const char* filename)
{
  osg::Node* modelNode = osgDB::readNodeFile(filename);
  if(modelNode==NULL)
  {
      cerr << "Boom: Error reading file " << filename << endl;
      return NULL;
  }
  else
  {
    return modelNode;
  }
}

Tips

  • Tip: An easy way to add missing import statements to your project is to press Ctrl-Shift-O (Cmd-Shift-O on Mac). This Eclipse shortcut identifies missing packages based on your code and adds the imports for you.
  • To get CPU info on a Linux machine, look in /proc/cpuinfo
  • To get the IP address of a machine, run /sbin/ifconfig

Other possibilities

  • Corona SDK [[3]]
  • HTML5 DeviceMotionEvent [[4]]
  • PhoneGap [[5]]