
OpenCV 2.1 on Android quickie with Haar object detection [w/ code]

Hi!
Long time no post… MIT is kicking my ass with work. But it was amazing to come back to so many comments from people eager to get OpenCV going mobile!
Anyway, I just wanted to share my work on object detection using OpenCV 2.1 on Android.

Although it seems like a trivial task, since you can just compile OpenCV 2.1 as a native lib and use JNI to access it, I actually haven't seen too many people claim to have done it nicely and also share code… (Ahem, computer-vision-software.com, share the knowledge!)
Anyway, this is a quickie so I'll be brief. I followed the android-opencv project instructions for compiling (using the Crystax NDK), and successfully ran their example CVCamera app on my device. A good starting point.
But the API they suggest is so cumbersome… it took me a while to figure out, and in the end I couldn't be bothered to rewrite the silly parts, so I just used it as is.
To save you some time, what I basically did was add a function to detect objects:

int Detector::detectAndDrawObjects(int idx, image_pool* pool) {
	vector<Rect> objects;
	const static Scalar colors[] = { CV_RGB(0,0,255),
		CV_RGB(0,128,255),
		CV_RGB(0,255,255),
		CV_RGB(0,255,0),
		CV_RGB(255,128,0),
		CV_RGB(255,255,0),
		CV_RGB(255,0,0),
		CV_RGB(255,0,255) };
	double scale = 2.0;

	Mat* _img = pool->getImage(idx);

	// downscale the frame to speed up detection
	Mat tmp;
	resize(*_img, tmp, Size(_img->cols/2.0, _img->rows/2.0));

	// rotate -90 degrees (the 1/scale also shrinks the content again) to
	// compensate for the portrait-oriented frames the device delivers; see below
	double angle = -90.0;
	Point2f src_center(tmp.rows/2.0, tmp.rows/2.0);
	Mat rot_mat = getRotationMatrix2D(src_center, angle, 1.0/scale);
	Mat dst;
	warpAffine(tmp, dst, rot_mat, Size(tmp.rows, tmp.cols));
	flip(dst, dst, 1);
	Mat img = dst;

	// the detector works on an equalized grayscale image
	Mat gray, smallImg;
	cvtColor(img, gray, CV_BGR2GRAY);
	smallImg = gray;
	equalizeHist(smallImg, smallImg);

	int minobjsize = 40;
	this->cascade.detectMultiScale(smallImg, objects,
						1.1, 2, 0
						|CV_HAAR_FIND_BIGGEST_OBJECT
						//|CV_HAAR_DO_ROUGH_SEARCH
						|CV_HAAR_SCALE_IMAGE
						,
						Size(minobjsize, minobjsize));

	// report the number of detections and the frame size on the frame itself
	stringstream ss; ss << objects.size() << " objects, " << smallImg.cols << "x" << smallImg.rows;
	putText(img, ss.str(), Point(20,20), FONT_HERSHEY_PLAIN, 1.0, Scalar(0,255,0), 2);

	// mark each detection with a circle and print its top-left corner
	int i = 0; scale = 1.0;
	for (vector<Rect>::const_iterator r = objects.begin(); r != objects.end(); r++, i++)
	{
		Point center;
		Scalar color = colors[i % 8];
		int radius;
		center.x = cvRound((r->x + r->width*0.5)*scale);
		center.y = cvRound((r->y + r->height*0.5)*scale);
		radius = cvRound((r->width + r->height)*0.25*scale);
		circle(img, center, radius, color, 3, 8, 0);
		stringstream ss1; ss1 << r->x << "," << r->y;
		putText(img, ss1.str(), Point(20,30), FONT_HERSHEY_PLAIN, 1.0, Scalar(0,255,0), 2);
	}

	// frame the whole area...
	rectangle(img, Point(0,0), Point(img.cols-1, img.rows-1), Scalar(0,255,0), 3);
	// ...and a [minobjsize]x[minobjsize] rect in the center
	rectangle(img, Point(img.cols/2.0 - minobjsize/2.0, img.rows/2.0 - minobjsize/2.0),
			  Point(img.cols/2.0 + minobjsize/2.0, img.rows/2.0 + minobjsize/2.0),
			  Scalar(0,255,0), 3);

	dst.copyTo(*_img);
	return objects.size();
}

Excuse my messy code; it's just a modification of facedetect.cpp from the OpenCV samples.
One extra move, though, was rotating the frame (the warpAffine op), because the silly Samsung Galaxy delivers frames in "portrait" rather than "landscape". Or rather, it's an android-opencv problem with how the bytes are delivered… but either way I had to deal with it. The rest is pretty standard stuff.
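By the way, if all you need is the plain 90-degree rotation (without the extra shrinking I folded into the warp above), a transpose-and-flip should be a cheaper way to get the same orientation. I didn't use this, so treat it as an untested sketch:

// untested sketch: rotate 90 degrees clockwise without a full affine warp
Mat rotated;
transpose(tmp, rotated);   // swap rows and columns
flip(rotated, rotated, 1); // mirror around the y-axis -> 90 deg clockwise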
So what's going on on the Java side? Nothing much… just a call to the JNI function

class DetectorProcessor implements NativeProcessor.PoolCallback {
	@Override
	public void process(int idx, image_pool pool, long timestamp,
			NativeProcessor nativeProcessor) {
		Log.i("Detector", "Detector process start");
		// "processor" is the SWIG-generated Detector instance (see below)
		int num = processor.detectAndDrawObjects(idx, pool);
		Log.i("Detector", "Detector process end, found " + num + " objects");
		// probably should do something with these objects now..
	}
}

To get it called in a timely fashion, add it to android-opencv's "Callback Stack":

LinkedList<PoolCallback> defaultcallbackstack = new LinkedList<PoolCallback>();
defaultcallbackstack.addFirst(new DetectorProcessor());
mPreview.addCallbackStack(defaultcallbackstack);

This will run the JNI call on every frame…
The JNI wrapper is generated by SWIG, following the android-opencv scheme:

/*
 * include the headers required by the generated cpp code
 */
%{
#include "Detector.h"
#include "image_pool.h"
using namespace cv;
%}

// import the android-cv.i file so that SWIG is aware of all that has been previously defined
// notice that it is an import, not an include....
%import "android-cv.i"

// make sure to import the image_pool, as it is referenced by the generated Detector java class
%typemap(javaimports) Detector "
import com.opencv.jni.image_pool; // the image_pool interface, for playing nice with android-opencv
"

// this is exactly as in "Detector.h"
class Detector {
public:
	Detector();
	virtual ~Detector();
	bool initCascade(const char* filename);
	int detectAndDrawObjects(int idx, image_pool* pool);
};
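The class declaration in the .i file mirrors the public interface of Detector.h. The actual header also has to hold the classifier itself, since the detection code uses this->cascade; roughly like this sketch of mine (the include path follows the OpenCV 2.1 layout):

// rough sketch of Detector.h - the cascade member is implied by
// this->cascade in detectAndDrawObjects()
#include <opencv/cv.h>
#include "image_pool.h"

class Detector {
public:
	Detector();
	virtual ~Detector();
	bool initCascade(const char* filename);
	int detectAndDrawObjects(int idx, image_pool* pool);
private:
	cv::CascadeClassifier cascade;
};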

Almost forgot: loading the classifier cascade!
This proved a bit tricky, since just dropping the XML into the "assets" doesn't let the native code reach it through the regular filesystem interface. So I used a little workaround: make a temp copy of the file, and then (once I have an accessible File object) load it into the cascade object using its absolute path:

try {
	// copy the cascade XML out of the APK assets into a real file,
	// so the native code can open it through the filesystem
	InputStream is = getAssets().open("cascade-haar-40.xml");
	File tempfile = File.createTempFile("detector", "");
	Log.i("Detector", "Tempfile: " + tempfile.getAbsolutePath());
	FileOutputStream fos = new FileOutputStream(tempfile);
	byte[] b = new byte[1024];
	int read = -1;
	while ((read = is.read(b, 0, 1024)) > 0) {
		Log.i("Detector", "read " + read);
		fos.write(b, 0, read);
	}
	fos.close(); is.close();

	boolean res = processor.initCascade(tempfile.getAbsolutePath());
	Log.i("Detector", "initCascade: " + res);
	tempfile.delete(); // no longer needed - the classifier is already in memory
} catch (IOException e) {
	e.printStackTrace();
}
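On the native side, initCascade() can be as small as a single call to CascadeClassifier::load(); a minimal sketch, assuming the cascade member from above:

// minimal sketch: load the Haar cascade XML from the temp file path;
// CascadeClassifier::load() returns false if the file is missing or malformed
bool Detector::initCascade(const char* filename) {
	return this->cascade.load(filename);
}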

So that's some simple object detection on Android, right? And it runs at a decent frame rate too (>10 FPS on a Samsung Galaxy S).
I'll try to upload video proof soon (taking a video of a video is not so simple :), and maybe the complete source.
Thanks for tuning in…
Roy.