Augmented Reality – II

In my last post I showed how to get position and orientation updates. In this short post (it's quite a simple topic) I'll show how to integrate that with the camera preview.
Showing the camera preview in Android is quite easy: just create a class that extends SurfaceView and implements the SurfaceHolder.Callback methods:
[code lang="java"]
package com.fuzzion.argine.viewer;

import java.io.IOException;

import android.content.Context;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

    private Camera camera;
    private boolean running = false;

    public CameraView(Context context) {
        super(context);
        SurfaceHolder holder = getHolder();
        holder.addCallback(this);
        // Required on pre-3.0 devices so the camera can push frames to the surface.
        holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        setFocusable(true);
        requestFocus();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // The surface size or format changed: restart the preview on it.
        if (running) {
            camera.stopPreview();
        }
        try {
            camera.setPreviewDisplay(holder);
        } catch (IOException e) {
            // Swallowed for brevity; a real app should log this and bail out.
        }
        camera.startPreview();
        running = true;
        // Register ourselves to receive the raw preview frames.
        camera.setPreviewCallback(this);
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        try {
            if (camera != null) {
                // Clear the callback before releasing to avoid late frame deliveries.
                camera.setPreviewCallback(null);
                camera.stopPreview();
                running = false;
                camera.release();
                camera = null;
            }
        } catch (Exception e) {
            // Ignored: the camera is going away in any case.
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // Raw preview frame (NV21 by default); unused for now.
    }
}
[/code]
It's really simple: just the startPreview() / stopPreview() calls (and don't forget to declare the android.permission.CAMERA permission in the manifest). To overlay our previous view on top of the camera preview, we only have to do this in our main Activity:
[code lang="java"]
// The camera preview becomes the Activity's content view...
cv = new CameraView(this);
tv = new TestView(this);
setContentView(cv);
// ...and the transparent viewer from the previous post is stacked on top of it.
addContentView(tv, new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));
[/code]
We're adding the camera preview and, on top of it, our previous viewer (with the background removed and the arrow color changed to white) so we can see both views at the same time.
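If you're wondering what "removing the background" amounts to in practice, here's a minimal, hypothetical overlay view (the real TestView comes from the previous post, so the details differ): a plain View with no background drawable stays fully transparent, and whatever you paint in onDraw() appears over the camera image.
[code lang="java"]
package com.fuzzion.argine.viewer;

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.View;

// Hypothetical stand-in for TestView, just to illustrate the transparent overlay.
public class OverlayView extends View {

    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public OverlayView(Context context) {
        super(context);
        // No background is set, so the view stays transparent and the
        // camera preview underneath remains visible.
        paint.setColor(Color.WHITE);
        paint.setStrokeWidth(4);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        // Draw a simple white cross; the real view draws the orientation arrow.
        int cx = getWidth() / 2;
        int cy = getHeight() / 2;
        canvas.drawLine(cx, cy - 40, cx, cy + 40, paint);
        canvas.drawLine(cx - 40, cy, cx + 40, cy, paint);
    }
}
[/code]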
I've also implemented the PreviewCallback interface to receive callbacks with the raw camera frames (byte[]). It might come in handy some day.
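As a sketch of what could eventually be done with those frames (my own example, not part of the engine yet): on most devices the default preview format is NV21, where the first width × height bytes form the luminance plane, so something like the average brightness can be computed directly from the array:
[code lang="java"]
// Hypothetical example: assumes the default NV21 preview format, where the
// first width * height bytes are the Y (luminance) plane, one byte per pixel.
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // In a real app, cache the preview size instead of querying it every frame.
    Camera.Size size = camera.getParameters().getPreviewSize();
    int pixels = size.width * size.height;
    long sum = 0;
    for (int i = 0; i < pixels; i++) {
        sum += data[i] & 0xFF; // bytes are signed in Java, so mask to 0..255
    }
    int averageLuminance = (int) (sum / pixels);
    // averageLuminance could later drive simple brightness-based heuristics.
}
[/code]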

Augmented Reality – I

Ever since I got my first Android phone I've been curious about augmented reality. I have to say, compared to J2ME, Android is a lot more powerful, and you're constantly surprised by how easy it is to achieve things that in J2ME are near impossible or rather complicated. It also seems that AR has become some kind of teenage fashion: now it's really cool to do things in AR instead of just showing a map with POIs. Let's have the POIs floating around the user; even if it's more confusing than showing a map, hey, it's cooler.
Anyway, I had to try it for myself; otherwise I couldn't live up to the cliché of owning an Android phone and running some AR tests. I'll introduce the engine I developed over a series of articles, explaining a few parts of it, and if anyone is interested I can provide access to the SVN server where the code is stored (send me an email at raimon.rafols [a] gmail.com). You can use this code for any non-commercial project you want, but you need the written permission of the author for commercial usage.
One of the basic building blocks of an AR engine is knowing the exact location and orientation of the device; otherwise it won't be possible to show the right POIs in the right position on the screen. Here is a code snippet of the orientation provider:
[code lang="java"]
package com.fuzzion.argine.hal.android;

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

import com.fuzzion.argine.engine.OrientationListener;
import com.fuzzion.argine.hal.OrientationProvider;

public class AndroidOrientationProvider implements OrientationProvider, SensorEventListener {

    private SensorManager sManager;
    private final List<OrientationListener> listeners;

    public AndroidOrientationProvider() {
        listeners = Collections.synchronizedList(new ArrayList<OrientationListener>());
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    @Override
    public void onSensorChanged(SensorEvent event) {
        float[] values = event.values;
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                // Not used yet.
                break;
            case Sensor.TYPE_ORIENTATION:
                // Iterating a synchronized list still needs manual locking.
                synchronized (listeners) {
                    for (OrientationListener listener : listeners) {
                        listener.directionChanged(values);
                    }
                }
                break;
        }
    }

    @Override
    public void registerOrientationListener(OrientationListener listener) {
        listeners.add(listener);
    }

    @Override
    public void removeOrientationListener(OrientationListener listener) {
        listeners.remove(listener);
    }

    public void start() {
        Activity act = AndroidPlatform.getActivity();
        sManager = (SensorManager) act.getSystemService(Context.SENSOR_SERVICE);
        // TYPE_ORIENTATION delivers azimuth/pitch/roll in degrees.
        sManager.registerListener(this, sManager.getDefaultSensor(Sensor.TYPE_ORIENTATION),
                SensorManager.SENSOR_DELAY_FASTEST);
    }

    public void stop() {
        if (sManager != null) {
            sManager.unregisterListener(this);
        }
    }
}
[/code]
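To give an idea of how this plugs into the rest of the engine, here's a minimal usage sketch (assuming OrientationListener only declares the directionChanged(float[]) method used above):
[code lang="java"]
// Hypothetical usage; assumes OrientationListener declares directionChanged(float[]).
AndroidOrientationProvider provider = new AndroidOrientationProvider();
provider.registerOrientationListener(new OrientationListener() {
    @Override
    public void directionChanged(float[] values) {
        // TYPE_ORIENTATION reports degrees: [0] azimuth, [1] pitch, [2] roll.
        float azimuth = values[0];
        // Rotate the rendered POIs by -azimuth so they track the device heading.
    }
});
provider.start(); // typically from Activity.onResume()
// ...
provider.stop();  // and from Activity.onPause(), to stop draining the battery
[/code]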
In the next entry I'll explain how to integrate all the parts (camera, orientation and position). Meanwhile, you can check the code in SVN.
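The position half isn't shown above; as a hedged sketch of how it could be obtained (the engine's actual position provider may look quite different), the standard Android way is LocationManager. Here act is the Activity, as in start() above, and the ACCESS_FINE_LOCATION permission must be declared in the manifest:
[code lang="java"]
// Hypothetical sketch, not the engine's real position provider.
// Needs: import android.location.*; import android.os.Bundle;
LocationManager lm = (LocationManager) act.getSystemService(Context.LOCATION_SERVICE);
lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 1, new LocationListener() {
    @Override
    public void onLocationChanged(Location location) {
        double lat = location.getLatitude();
        double lon = location.getLongitude();
        // Combined with the orientation, this is what places POIs around the user.
    }
    @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
    @Override public void onProviderEnabled(String provider) {}
    @Override public void onProviderDisabled(String provider) {}
});
[/code]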