Jul 01 2009

Augmented reality on the iPhone using NyARToolkit [w/ code]

Published at 1:37 pm under 3d, graphics, Mobile phones, opengl, programming, video

Hi,

I saw the stats for the blog a while ago and it seems that the augmented reality topic is hot! 400 clicks/day, that's awesome!

So I wanted to share with you my latest development in this field - cross-compiling the AR app to the iPhone. The job proved easier than I originally thought, although it took a while to get it working smoothly.

Basically, all I did was take NyARToolkit, compile it for the armv6 architecture, combine it with Norio Nomura's iPhone camera video feed code, slap on some simple OpenGL ES rendering, and bam - augmented reality on the iPhone.

Update: Apple officially supports camera video pixel buffers in iOS 4.x using AVFoundation; here's sample code from Apple's developer site.

This is how I did it...

I recommend you read my last post on this matter. I have some insights, however superficial, into working with the C++ implementation of NyARToolkit, which I also use here.

Getting NyARToolkit C++ to compile on iPhone

First of all, I needed to cross-compile NyARToolkit for the iPhone's CPU architecture (ARM), but this turned out to be a very simple task - it compiled right off the bat, no tweaking whatsoever.
But that's only the beginning, as iPhone apps are written in Objective-C rather than C++ (perhaps they can use C++ directly, but all the documentation is in Obj-C). So I needed to write an Obj-C wrapper around NyARTk to let my iPhone app interact with it.

I only needed a very small set of functions from NyARTk to get augmented reality - those that deal with marker detection. I ended up with a lean API:

@interface NyARToolkitWrapper : NSObject {
	bool wasInit;
}

-(void)initNyARTwithWidth:(int)width andHeight:(int)height;
-(bool)detectMarker:(float[])resultMat;
-(void)setNyARTBuffer:(Byte*)buf;
-(void)getProjectionMatrix:(float[])m;

@end

I also have some functions I used for debugging, and some non-optimized stages. The inner workings of the wrapper are not very interesting (you can see them in the code yourself); they mainly invoke NyARSingleDetectMarker functions.

In the beginning - there was only marker detection

OK, to get AR basically what I need to do is:

  1. initialize NyARTk inner structs
  2. set NyARTk's RGBA buffer with each frame's pixels
  3. get the extrinsic parameters of the camera, and draw the OpenGL scene accordingly

This is for full-fledged AR, but let me start with a simpler case - detecting the marker in a single image read from a file. No OpenGL, no camera; just reading the file's pixel data and feeding it to NyARTk.

Now this is far simpler:

CGImageRef img = [[UIImage imageNamed:@"test_marker.png"] CGImage];
int width = CGImageGetWidth(img);
int height = CGImageGetHeight(img);
Byte* brushData = (Byte *) malloc(width * height * 4);
CGContextRef cgctx = CGBitmapContextCreate(brushData, width, height, 8, width * 4, CGImageGetColorSpace(img), kCGImageAlphaPremultipliedLast);
CGContextDrawImage(cgctx, CGRectMake(0, 0, (CGFloat)width, (CGFloat)height), img);
CGContextRelease(cgctx);

[nyartwrapper initNyARTwithWidth:width andHeight:height];
[nyartwrapper setNyARTBuffer:brushData];
[nyartwrapper detectMarker:ogl_camera_matrix];

First I read the image into a UIImage, then get its respective CGImage. But what I need are bytes, so I create a temporary CGBitmapContext, draw the image into it, and use the context's pixel data (allocated by me).

Adding the 3D rendering

This is nice, but nothing is shown on the screen, which sux. So the next step is to create an OpenGL scene and draw some 3D using the calibration we now have. To do this I used EAGLView from Apple's OpenGL ES docs.
This view sets up an environment for drawing a 3D scene, giving you a delegate to do the actual drawing while hiding all the peripheral code (frame buffers... and other creatures you wouldn't want to meet in a dark 3D alley).

All I needed to implement in my code were two functions defined in the protocol:

@protocol _DGraphicsViewDelegate<NSObject>

@required

// Draw with OpenGL ES
-(void)drawView:(_DGraphicsView*)view;

@optional
-(void)setupView:(_DGraphicsView*)view;

@end

'setupView' initializes the scene, and 'drawView' draws each frame. In setupView we have the viewport setup, lighting, texture buffer generation, etc. You can see all that in the code; it's not very interesting...

In drawView we draw the background and the 3D scene. Now this took some trickery. First I thought I'd take the easy route: make the 3D scene transparent, draw the video using a simple UIView of some kind, and overlay the 3D on top of it. I didn't manage to get that to work, so I took a different path (harder? I don't know): I decided to paint the background onto a 3D plane, inside the 3D scene itself, using textures. This is how I did it in all my AR apps on other devices.
Now, the camera video feed is 304x400 pixels, and OpenGL textures work best at power-of-2 sizes, so I created a 512x512 texture. But for now we're talking about a single frame.

const GLfloat spriteTexcoords[] = {0,0.625f,   0.46f,0.625f,   0,0,   0.46f,0,};
const GLfloat spriteVertices[] =  {0,0,0,   1,0,0,   0,1,0   ,1,1,0};

glMatrixMode(GL_PROJECTION);
glPushMatrix();
glLoadIdentity();
glOrthof(0, 1, 0, 1, -1000, 1);
glMatrixMode(GL_MODELVIEW);
glPushMatrix();
glLoadIdentity();
	
// Sets up pointers and enables states needed for using vertex arrays and textures
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, spriteVertices);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2, GL_FLOAT, 0, spriteTexcoords);	
	
glBindTexture(GL_TEXTURE_2D, spriteTexture);
glEnable(GL_TEXTURE_2D);
		
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
	
glDisableClientState(GL_VERTEX_ARRAY);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);

glMatrixMode(GL_PROJECTION);
glPopMatrix();
glMatrixMode(GL_MODELVIEW);
glPopMatrix();

Basically, I go into orthographic mode and draw a rectangle with the texture on it, nothing fancy.

Next up - drawing the perspective part of the scene, the part that aligns with the actual camera...

//Load the projection matrix (intrinsic parameters)
glMatrixMode(GL_PROJECTION);
glLoadMatrixf(ogl_projection_matrix);

//Load the "camera" matrix (extrinsic parameters)
glMatrixMode(GL_MODELVIEW);
glLoadMatrixf(ogl_camera_matrix);

glLightfv(GL_LIGHT0, GL_POSITION, lightPosition); 
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
	
glDisable(GL_TEXTURE_2D);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);

glPushMatrix();	

glScalef(kTeapotScale, kTeapotScale, kTeapotScale);

{
        static GLfloat spinZ = 0.0;
        glRotatef(spinZ, 0.0, 0.0, 1.0);
        glRotatef(90.0, 1.0, 0.0, 0.0);
        spinZ += 1.0;
}

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);    
glVertexPointer(3 ,GL_FLOAT, 0, teapot_vertices);
glNormalPointer(GL_FLOAT, 0, teapot_normals);
glEnable(GL_NORMALIZE);
	
for(int i = 0; i < num_teapot_indices; i += new_teapot_indicies[i] + 1)
{
        glDrawElements(GL_TRIANGLE_STRIP, new_teapot_indicies[i], GL_UNSIGNED_SHORT, &new_teapot_indicies[i+1]);
}
	 
glPopMatrix();

This part, too, I learned from Apple's OpenGL ES docs (find it here). I ended up with this:
Picture 5

Tying it together with the camera

This runs on the simulator, since the camera is not involved just yet. I used it to fix the lighting and such before moving to the device. But we're here to get it working on the device, so next I plugged in the code from Norio Nomura.
Some people have asked me to post up a working version of Nomura's code, so you can get it with the code for this app (scroll down). Nomura was kind enough to make it public under MIT license.

First, I set up a timer to fire at ~11 fps, and initialize the camera hook that grabs the frames from the internal buffers:

repeatingTimer = [NSTimer scheduledTimerWithTimeInterval:0.0909 target:self selector:@selector(load2DTexWithBytes:) userInfo:nil repeats:YES];

ctad = [[CameraTestAppDelegate alloc] init];
[ctad doInit];

And then I take the pixel data and use it for the background texture and the marker detection:

-(void)load2DTexWithBytes:(NSTimer*) timer {
	if([ctad getPixelData] != NULL) {
		CGSize s = [ctad getVideoSize];
		glBindTexture(GL_TEXTURE_2D, spriteTexture);
		glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, s.width, s.height, GL_BGRA, GL_UNSIGNED_BYTE, [ctad getPixelData]);

		if(![nyartwrapper wasInit]) {
			[nyartwrapper initNyARTwithWidth:s.width andHeight:s.height];
			[nyartwrapper getProjectionMatrix:ogl_projection_matrix];

			[nyartwrapper setNyARTBuffer:[ctad getPixelData]];
		}
		
		[nyartwrapper detectMarker:ogl_camera_matrix];
	}
}

All this happens 11 times per second, so it must be fast.

Video proof time...

Well, looks like we are pretty much done! Time for a video...

How did you get the phone to stand still so nicely?

An important issue... how to shoot the phone without holding it.
Well, I used a little piece of metal of the kind used to block unused PCI slots in a PC. In Hebrew we call these scrap-metal pieces "Flakch"s (don't try to pronounce this at home). I bent it in the middle to create a kind of "leg"; the ledge that holds the phone was already there.
metal iPhone stand

The code

As promised, here's the code (I omitted some files whose license is questionable).

That's all folks!
See you when I get this to work on the Android...
Roy.


103 Responses to “Augmented reality on the iPhone using NyARToolkit [w/ code]”

  1. micah on 07 Jul 2009 at 7:28 pm

    Hey man, great post. Was gonna try implementing it myself, but was wondering if you did this with the SDK or with the open toolchain? If you were using SDK can you post the .xcodeproj file? Thanks.

  2. Roy on 14 Jul 2009 at 9:32 am

    Hi micah

    I did use the SDK (v2.1) for this, but with a small hack to use the internal camera buffers to get the frames in real-time speeds (Norio Nomura's code).

    I can't post the xcodeproj file, and I think you don't really need it.
    All the code exists, so just add the .m, .mm and .cpp files to a new project in xcode and you're golden.

    One more thing: I hear (and see) people doing augmented reality on iPhone OS v3.x. So maybe Apple have released the camera APIs to the "public", and the hack is no longer needed. I haven't gotten around to trying it myself, but it's worth looking into.

    Good luck,
    Roy

  3. Daniele Salatti on 21 Jul 2009 at 9:28 am

    Hi!
    I'm trying to download the code, but it seems that the link is not working.
    In the mean time I'll try to follow step-by-step what you wrote to get it to work on my iPhone...
    By the way, any news on the iPhone OS v3.x camera APIs?

    Thanks,
    Daniele

  4. Alex on 21 Jul 2009 at 10:11 am

    Hey

    Very nice post. I also looked at the code, put it into Xcode and tried to compile it, but I get errors in NyARToolkitWrapper.mm. It doesn't seem to recognize all the C++ syntax. Am I doing something wrong?

    Thanks and keep up the good work

  5. Roy on 21 Jul 2009 at 10:14 am

    Hi

    The code is available via Google Code: http://code.google.com/p/morethantechnical/source/browse/#svn/trunk/NyARToolkit-iPhone
    Their site might be down...

    I haven't yet explored the camera APIs of 3.0, but I imagine it will not be too different.
    The only change to do is manage how to get the camera's video frame bytes, and the rest should stay the same.

    If you do get it to work with 3.0, please let me know

    Good luck
    Roy

  6. Annie Ok : tangent » interesting 7.5.09 on 27 Jul 2009 at 12:56 am

    [...] Augmented reality on the iPhone using NyARToolkit [w/ code] [...]

  7. Eddy on 08 Aug 2009 at 11:48 pm

    Hey Roy,
    great post.
    I've been trying to build your code in XCode with no luck. I took all files threw them into an xcode project but I seem to be missing 3DGraphicsView.h.
    Is this part of a framework that I need to include?

    Any advice appreciated.

    thanks,
    Eddy

  8. Roy on 09 Aug 2009 at 8:11 am

    Hi Eddy
    The file is indeed missing. I noted in the post that I removed files with a questionable license, and this file comes from Apple's user guides and has a strict license.
    You can find the file here (the link also appears in the post in "Adding the 3D rendering"). It's the same file; only the name was changed.

    good luck
    Roy

  9. kimmy on 21 Aug 2009 at 5:24 pm

    Hi, I just got the code. Thank you first. Do i need cross-compile the library? If that, could you tell me how to cross-compile the library?

  10. kimmy on 21 Aug 2009 at 7:27 pm

    Another thing is where is the file "teapot.h"?

  11. Roy on 22 Aug 2009 at 12:21 pm

    Hi kimmy

    First, you are cross-compiling the NyARToolkit code if you do as instructed in the post (add all the C++ code to the XCode project).
    Second, the teapot.h file comes from Apple's documentation. (http://developer.apple.com/iphone/library/samplecode/GLGravity/listing6.html)

    good luck
    Roy

  12. Ambroz on 24 Aug 2009 at 4:01 pm

    I have a problem with a missing file I suppose as a get the following error: _DGraphicsView undeclared.

    Any help appreciated

  13. Ambroz on 24 Aug 2009 at 6:36 pm

    Has anyone managed to create a functional xCode project? Please post it. I just get too many errors if I import posted classes.

  14. kimmy on 24 Aug 2009 at 9:05 pm

    Thank you for your fast responding!
    Yes, I should include all the c++ files and it pass compiling even if it reports warning: no rule to process file '$(PROJECT_DIR)/Classes/Classes/NyARToolkit/forLinux/libNyARToolkit/makefile' of type sourcecode.make for architecture armv6

    Could I ask another question on the view" _DGraphicsView" which you defined in the program? Does it point to the EAGLView? I replace the 3DGraphicsView.h with the EAGLView.h, add your protocol into the EAGLView.h and delete its own drawview and setupview functions. Am i right in doing this? I cannot wait to see the teapot now! Thank you

  15. kimmy on 24 Aug 2009 at 11:18 pm

    Hi I found the file used for implement the missed view. It is in the project GLGravity. I replace my view file with that file, but I still get nothing in my screen. It gives me a error "501 error" when i first draw the view.

  16. Roy on 24 Aug 2009 at 11:34 pm

    Hi

    kimmy, I don't know this "501 error", perhaps it has a description or something a little more informative?

    Ambroz, I cannot post the XCode project file as it has information I'm not allowed to disclose by law.

    It shouldn't be hard (I tried it myself) just add:
    - all the files to the project,
    - the PhotoLibrary private framework,
    - the 3DGraphicsView, which is exactly EAGLView from Apple's docs with the name changed.
    and you're good to go.

    Of course, it might take some tinkering around with the code, but it's doable and you'll learn a lot in the process.

    Good luck
    Roy.

  17. kimmy on 25 Aug 2009 at 1:32 am

    I am pretty sure the 501 error occurs at the following code.

    CGRect rect = view.bounds;//the following size is not right and generate an error
    glFrustumf(-size, size, -size / (rect.size.width / rect.size.height), size / (rect.size.width / rect.size.height), zNear, zFar);
    glViewport(0, 0, rect.size.width, rect.size.height);

    It is defined as GL_INVALID_VALUE. More information can be found at the following link:
    http://pyopengl.sourceforge.net/documentation/manual/glCopyTexImage2D.3G.html

  18. kimmy on 25 Aug 2009 at 2:12 am

    Could the problem be caused by the image size?:
    CGImageRef img = [[UIImage imageNamed:@"IMG_0020_small.png"] CGImage]; Because I don't have this image file so I replace it with a different one

  19. kimmy on 25 Aug 2009 at 5:11 am

    Hi

    I change the znear from -1000 to 0.1 and the 501 error not there any more but I cannot see any animation for the image I choose. Even if I use an ipod touch I think I still can get an basic animation without camera, right? Sorry for so many questions.

  20. Ambroz on 25 Aug 2009 at 9:21 am

    OK, I am now stuck at the only error left:

    extra qualification"NYArtToolKitCPP::NyARDoubleMatrix33::" on member createArray

    How to solve that one?

  21. Ambroz on 25 Aug 2009 at 10:11 am

    Another problem:

    duplicate symbol _spriteVertices in

    build/CameraTest.build/Release-iphonesimulator/CameraTest.build/Objects-normal/i386/EAGLView.o

    and

    build/CameraTest.build/Release-iphonesimulator/CameraTest.build/Objects-normal/i386/NyARToolkitCrossCompileAppDelegate.o

  22. Roy on 25 Aug 2009 at 10:23 am

    Hi again guys

    Ambroz, you can try changing the name of the duplicate symbol so the compiler won't scream at you.
    As for the NyARDoubleMatrix33, I don't know, you'll have to play around with the code to make it work.

    kimmy, I was able to get marker detection on a single image on the simulator (no camera needed).

    Anyway, I can't really provide insight into your problems because I didn't run into them myself, and I can't reproduce them on my setup.
    My advice is - don't give up. Keep trying to make it work and you'll succeed.
    I think both of you are very close to getting it to work.
    Try to work incrementally like I did: first only marker detection w/o graphics at all (print results to console), then a single picture marker detection with the teapot, only then integrate camera.

    Good luck
    Roy

  23. kimmy on 25 Aug 2009 at 12:42 pm

    Thank you Roy.

    Ambroz, for the problem ”NYArtToolKitCPP::NyARDoubleMatrix33::”, just delete the NYArtToolKitCPP::
    You may need delete the duplicate symbol in EAGLView file if you use it. You also need take a look at the view at the below link.
    http://developer.apple.com/iphone/library/samplecode/GLGravity/listing6.html

  24. kimmy on 25 Aug 2009 at 8:31 pm

    Hi Roy

    Could you tell me what version the code works on? I know the camera part does not work for iphone OS 3.0. How about NyArttookit?

  25. Roy on 26 Aug 2009 at 9:15 am

    Hi kimmy
    NyARToolkit will work on any version, since you are building it from source and the compiler makes sure it works properly.
    For the camera, iPhone OS 3.1 should allow superimposing over the live video feed from the camera, although I haven't gotten around to actually trying it (I will soon). Right now I have it working on 2.1 and 2.2.1, using the camera callback hack.

    Roy.

  26. Ambroz on 26 Aug 2009 at 11:40 am

    Roy, one more question. In NyARToolkitCrossCompileAppDelegate.m method ApplicationDidFinishLaunching you set a delegate to EAGLView instance, but I get an error that says: request for member 'delegate' in something not a structure of union.
    Have you defined the delegate yourself?

    EAGLView *glView = [[EAGLView alloc] initWithFrame:rect];

    glView.delegate = self; ????

  27. Ambroz on 26 Aug 2009 at 12:18 pm

    Well, I declared the @property (assign, nonatomic) id delegate; but app still crashes at glDeleteFramebuffersOES. It's so frustrating.

  28. Roy on 26 Aug 2009 at 1:04 pm

    Ambroz,
    Please use GLGravityView code from here

    The delegate is declared properly there (EAGLView is missing the delegate protocol declaration), it should work fine with the project.

    Roy.

  29. Ambroz on 26 Aug 2009 at 7:16 pm

    Thanks for all your help. I managed to get as far as kimmy did. Roy, could you post your image "IMG_0020_small.png", as my app threw exception after I replaced it by "hiro.png". Or just post the url where you got the picture. Thanks

  30. kimmy on 27 Aug 2009 at 2:23 am

    Hi

    The camera code's owner said it will not work on iPhone OS 3.0 or later versions. I cannot get a camera view with his code now.

    I changed the code a lot and finally can detect the marker according to console output information. I am not sure if I did this right because I cannot get the result the same as the picture you show in the blog. Can you get the result the same as that in your blog's picture just for image detection? I only can get a dark teapot rotating in front of a big "hiro" image in my ipod touch.

  31. Roy on 27 Aug 2009 at 10:22 am

    Good news kimmy, you are on the way...
    Your problem is with the lighting of the teapot? I think I mention something about it
    Indeed the lighting issue was complicated, it took a while until I got it right
    Make sure you're enabling GL_LIGHTING and GL_LIGHT0, the position of the light is correct, and the material of the teapot is defined and applied (all this is in the beginning of NyARToolkitCrossCompileAppDelegate.m file)

    Ambroz, good job!
    For the picture, I took it myself with the iPhone's camera.
    I suggest you print out a "hiro" marker yourself and attach it to a cardboard for easy manipulation.
    Anyway, here's the picture, you can resize it to be smaller (300x400).

    Good luck guys!
    Roy.

  32. Ambroz on 27 Aug 2009 at 11:08 am

    kimmy could you email me the camera classes for OS 3.0 (ambroz[dot]homar[at]gmail[dot]com)? I'd really love to see how it works on iPhone!

  33. kimmy on 27 Aug 2009 at 5:26 pm

    Hi Roy

    I use your picture and get a screen pretty close to yours, but the console shows that I fail to detect the image because it is still offset a little from the correct position. If possible, could you give me an email address so that I can send you a snapshot? Maybe you can give me some clue on how to adjust the position. If not, it is ok. Thank you again. It is indeed a fun learning experience, even if sometimes it is really frustrating.

    Hi Ambroz
    It is not for iphone os 3, but you can test it on iphone os 2 if you can use it with a camera.
    http://github.com/norio-nomura/iphonetest/tree/9713242dda6c6bc897da4bd639a1fdadc29b6fd7/CameraTest

  34. kimmy on 27 Aug 2009 at 6:13 pm

    Hi Roy,

    I got it!!

    Your code has no problem. The only problem is you do not mention we need a image file you just post yesterday. Now it works well even if I still have no idea about the data file, camera file, and code file and their relationship with the library. Maybe you can explain them on next post. That will be much useful for learning this library. Thank you again! Nice job!

  35. kimmy on 27 Aug 2009 at 10:31 pm

    Hi Roy,

    The image marker can be detected, but the teapot is still not at the center of the image. Why does it happen? What should I do so that I can adjust the position of the teapot?

  36. Jon on 01 Sep 2009 at 8:29 pm

    So what devices (iPhone v1, 3G, 3GS) and OS's (3.0, 3.1beta) have people gotten this to work on?

  37. Roy on 07 Sep 2009 at 9:28 am

    Hi Jon
    This works on any iPhone that runs OS 2.X (no support for 3.X yet, the camera private code needs to be hacked again).

    Roy.

  38. Fawad on 09 Sep 2009 at 6:52 am

    Hello,
    Thanks for a wonderful code. I am building the code iphone sdk 3.0
    But its throwing the following error. Kindly help me

    Command /Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/g++-4.2 failed with exit code 1

  39. Roy on 09 Sep 2009 at 8:57 am

    Hi Fawad

    This error does not supply any information as to what went wrong, only that the compiler wasn't able to compile the code.
    Please supply the full error (the part that says which file and line is erroneous)

    Roy.

  40. Fawad on 10 Sep 2009 at 4:25 am

    Hello Roy,
    I have solved that problem,
    But roy the app is still throwing the 501 error

  41. Fawad on 10 Sep 2009 at 4:34 am

    Hey Roy,
    the app is throwing Bad access exception here

    in this method
    void NyARRasterFilter_ARToolkitThreshold::convert32BitRgbx(const NyAR_BYTE_t* i_in, int* i_out, const TNyARIntSize* i_size)const

    in this file.
    NyARRasterFilter_ARToolkitThreshold.cpp

    when i commented the code in this block, it runs just fine, but its only displaying the hiro.png on full screen, not the camera

  42. Fawad on 10 Sep 2009 at 8:00 am

    Hello Roy,
    can u tell me that which is the default delegate file to be used in Mainwindow.xib?
    CameraTestDelegate or NyARToolKitDelegateApp?

  43. Fawad on 11 Sep 2009 at 5:04 am

    somebody please upload the project or individual files, I need help :(
    I will be very thankful.

  44. Roy on 14 Sep 2009 at 9:20 am

    The NyARToolKitDelegateApp is the main delegate for the app.

    Roy.

  45. Nico on 21 Sep 2009 at 8:52 pm

    Hi! has anybody got it working on iPhone OS 3.x??

  46. noflux on 08 Oct 2009 at 11:47 am

    Does any one has a working project on iPhone OS 3.x?

  47. Quintana on 30 Nov 2009 at 3:51 pm

    Hello Roy,

    I followed all the instructions of the comments above and I did compile
    your code successfully. However, every time I run it, it gives me a error “501 error” too.

    I discovered that the 501 error occurs in the setupView. To make sure, I put your setupView in the GLGravity project and the 501 error also occurs on that project.

    I'm running the code in the iPhone simulator 2.2, I want to ask what is the project behavior in the simulator. I was wondering if this code only runs on iPhone device 2.2..

    I also tried running the code on the device 3.1 but the same error occurred.. I do not know if it occurs because of any dependency on the camera.

  48. Wes on 20 Jan 2010 at 1:55 pm

    Hey Roy,

    First of all, thanks a lot for sharing your findings with us!

    I'm trying to compile your code, but I'm currently stuck on a single error.
    I've followed every 'step', imported your classes into a clean app.
    Then I created a 3DGraphicsView.h file based on GLGravityView.h/EAGLView.h, and added this code to it to declare the protocol:

    @protocol _DGraphicsViewDelegate
    @required
    // Draw with OpenGL ES
    -(void)drawView:(_DGraphicsView*)view;
    @optional
    -(void)setupView:(_DGraphicsView*)view;
    @end

    Now I keep running into this single error:
    "NyARToolkitCrossCompileAppDelegate.m:295: error: request for member 'delegate' in something not a structure or union"

    I'm quite stuck, and can't figure out what's wrong. I really need this to work, as it's part of my graduation project which I want to demo/proof of concept.
    Can anyone help me out please?

  49. Wes on 20 Jan 2010 at 2:06 pm

    Oh, FYI, the beforementioned error links to this line (NyARToolkitCrossCompileAppDelegate.m:295)

    glView.delegate = self;

  50. Wes on 22 Jan 2010 at 4:46 pm

    I think I managed to solve the issue, I didn't declare the delegate properly in the 3dGraphicsView header file or so it seemed.

    Now it compiles fine, yet I don't see anything when the app starts. No camera-feed, no nothing. Just a black screen with a gray statusbar which shouldn't even be there.
    The thing is, when I redirect the delegate file in Mainwindow.xib back to CameraTestAppDelegate I DO get a camera feed.

    Also, you mention some posts before that the default delegate file should be NyARToolKitDelegateApp, however I don't ee such a file in your source code. There's only a NyARToolKitCrossCompileAppDelegate , is this the file you mention or am I missing some files?

    Sorry to bother with all these questions, it's just rather important for me to get a demo of this working for my exam next week ;)

  51. Ray Lee on 28 Jan 2010 at 5:38 pm

    Dear Roy,

    I think your work is amazing, however I don't know how to implement it, actually I just need the marker detect part, so what should I do??

    Ray

  52. FlimFlam on 28 Jan 2010 at 7:52 pm

    Brilliant stuff Roy - but I've been really struggling today to get this implemented. And it's get to be in my 3dGraphicsView where I am going wrong. Am I right in using GLGravityView.h / .m - I've changed there names to 3dGraphicsView to avoid conflict and then added the protocol at the end of the header file?

    I've learnt a lot implementing this so far but seem to be falling at the final hurdle.

  53. Roy on 31 Jan 2010 at 4:13 pm

    Keep in mind that when you rename the class to 3DGraphicsView - XCode will automatically change it to _DGraphicsView, since you can't have a number as the first letter of a class. Perhaps this is the problem.

    Roy.

  54. Alex on 08 Feb 2010 at 12:14 am

    Hello,

    i tried to compile the NyArtToolkit and all your code, but i got only Warnings and 135 Errors, it seem like the cross-compiling doesn't work. Can you please describe how to compile the NyArtToolkit for the Iphone?

  55. FlimFlam on 17 Feb 2010 at 4:42 pm

    I got this working! My issue's were just my lack of knowledge about delegates and how to implement them correctly, but worked through it and figured it out.

    Now...does anyone know how to get my own marker in there? I've created the marker, changed all references to the old patt.hiro to my patt.metro marker and yet it still only looks for the patt.hiro marker?! Anyone got any nuggets of info that might help me out?

    Cheers

  56. Fastrak on 19 Feb 2010 at 6:59 am

    FlimFlam,

    Can you please elaborate on exactly what you did?

    I think I am stuck at the same point you were,

    glView.delegate = self;
    glView.animationInterval = 1.0 / kRenderingFrequency;

    are both giving errors about not a structure or union

  57. cathelper on 25 Feb 2010 at 8:08 pm

    I'm having the same problem as FlimFlam. I can't change delegates at runtime. Googled it everywhere , i'll keep on the net trying to look for an answer, but if you can share some light i'l appreciate it a lot.

  58. John on 25 Feb 2010 at 8:53 pm

    Does anyone have a complete example project source? Not just the classes etc..

  59. FlimFlam on 08 Mar 2010 at 3:48 pm

    The best advice I can give is to head here for some brilliant tutorials on opengl on the iphone. Read the tutorials, download the source files and you'll quickly figure out what all this is about and solve the issues you are having:

    http://iphonedevelopment.blogspot.com/2009/05/opengl-es-from-ground-up-table-of.html

  60. Alex on 16 Mar 2010 at 6:07 pm

    Hi guys, if anyone got complete source please share it

  61. Juliano on 26 Mar 2010 at 3:34 am

    Hi Roy,

    I'm trying to compile the code, made all the right answers, but I'm getting an error

    "Unknown class NyARToolkitCrossCompileAppDelegate in Interface Builder"

  62. Gearge on 05 Apr 2010 at 9:01 pm

    I'm just trying to work through your tutorial on this brilliant website, just getting slightly confused where some of the coding goes, what files they go in, is there any chance u can help?

  63. Tarkrat on 09 Apr 2010 at 6:58 pm

    Some source files (specifically NyARToolkitWrapper.m) seem to be written in Objective-C++! It #include's cstdio, iostream, and fstream and contains try-catch blocks (instead of the #try-#catch blocks in Objective-C), and Xcode is moaning about all of this with compile-errors.
    I thought only Objective-C code could be compiled to run on the iPhone!
    What must I do? I'm very confused! Must I download another compiler?

  64. Roy on 10 Apr 2010 at 4:29 pm

    XCode can compile c++ just fine for the iPhone, there's nothing wrong with the compiler. However it could be outdated, you can try and update to a new version of the iPhone SDK.
    For mixing objective-C and C++ you must create an ".mm" file.

  65. Tarkrat on 10 Apr 2010 at 5:40 pm

    Oh I see, so it must be NyARToolkitWrapper.mm with two m's for Objective-C++ code.
    Now that file compiles, thanks!

  66. Gemma on 10 Apr 2010 at 6:31 pm

    Brilliant advice and source thanks ever so much. I have an error I can't seem to budge

    extra qualification 'NyARToolkitCPP::NyARDoubleMatrix33::' on member 'createArray'

    I have tinkered with the code, and not too sure what the problem might be. Any chance you might no reasons why or how i can get rid of it?

    Also I'm using iPhone OS 3.0 is this tutorial compatible with it?

    Hope to hear back :)
    Gemma

  67. Tarkrat on 10 Apr 2010 at 10:37 pm

    Suppose I've got my own marker image in a .png-file.
    How must I process this image for use with this program?
    Which file in the data directory is read by the program for recognition?

  68. Gemma on 11 Apr 2010 at 1:26 pm

    Hi there, what a great site this is!! It's fantastic. I just wondered if you could help me. I have 4 errors all to do with:

    extra qualification on a member, for example:

    error: extra qualification 'NyARToolkitCPP::NyARLabelingLabelStack::' on member 'NyARLabelingLabelStack'

    Also is this compatible on iPhone OS 3.1?

    Thanks ever so much for tutorial it's great! :)

    Gemma

  69. Gem on 11 Apr 2010 at 1:30 pm

    Hi there, I just wondered if you could help, I have 4 errors to do with

    extra qualification on member, for example:
    error: extra qualification 'NyARToolkitCPP::NyARLabelingLabelStack::' on member 'NyARLabelingLabelStack'

    Just wondered if you could help?
    Also, is this compatible with iPhone OS 3.1?

    Gem

  70. Tarkrat on 11 Apr 2010 at 4:50 pm

    Now another thing. I'm trying to see how I would adapt the code in NyARToolkitWrapper.mm to do multiple marker detection.
    I could construct multiple NyARSingleMarkerDetect objects - one for each marker I'd like to detect - but I think that would be inefficient.
    From other sources I've determined that the NyARToolkit also contains an NyARMarkerDetect class. Apparently this class contains a constructor that takes as its parameters an ARParams, an array of Codes, a marker width, number of Codes, and the raster buffer type.
    This class also appears in the source code for this project, but have a look at this class' declaration:

    class NyARDetectMarker
    {
    public:
        NyARDetectMarker(void);
        virtual ~NyARDetectMarker(void);
    };

    This class doesn't seem to do anything. What's going on here?
    How do I do multiple marker detection?

  71. Tarkrat on 13 Apr 2010 at 10:22 pm

    Hi Gemma,
    in which source file does this error occur?
    Did you try removing the "NyARToolkitCPP::"-part?

  72. Benjamin Blundell on 26 Apr 2010 at 6:32 pm

    OK, I can confirm I've got this working with a static image on my iPhone running OS 3.1.3. However, because the camera loophole was closed, the only authorised way to access the camera is the "UIGetScreenImage" method, which unfortunately grabs the entire screen, including the OpenGL layer we are drawing, so it is pretty useless. There may very well be a hack to get around this but I don't know it :(

  73. Jimmy on 10 May 2010 at 5:59 pm

    Hi,

    I'm compiling for OS 3.1.3, I now have 39 errors (started with about 130 errors) left to fix. It looks like 38 of the errors are a variation of the following:

    Extra qualification 'NyARToolkitCPP::NyARDoubleMatrix33::' on member 'createArray'

    In Xcode I am using C/C++ compiler version GCC 4.2. Any advice on how to remove this error would be great. I've googled the issue several times and I still can't fix it.

    The other error is in NyARToolkitWrapper.mm:
    colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);

    I'm receiving a 'kCGColorSpaceGenericRGB is unavailable' error. Yes, I do have the CoreGraphics.framework included in the project.

    Any help would be greatly appreciated.

  74. Jimmy on 11 May 2010 at 7:50 am

    Hi, I'm still fighting with this. I've got it down to 7 errors now. This is one of the errors.

    In

    glView.delegate = self;
    glView.animationInterval = 1.0 / kRenderingFrequency;

    // I receive the error "request for 'blah' in something not a structure or union" for both of those lines.

    The other errors are:

    Extra qualification 'NyARToolkitCPP::NyARDoubleMatrix33::' on member 'createArray'

    The other error is in NyARToolkitWrapper.mm:
    colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);

    I'm receiving a 'kCGColorSpaceGenericRGB is unavailable' error.

    I'll keep fighting with this but any suggestions would be greatly appreciated.

  75. Jimmy on 11 May 2010 at 9:24 am

    Okay, down to a couple of errors:

    1. Duplicate symbol _spriteVertices (referring to the 3DGraphicsView & NyARToolKitCrossCompileAppDelegate).

    2. glView.delegate = self; (error in NyARToolKitCrossCompileAppDelegate.m)

    Almost there. Again, any help would be appreciated.

  76. pablo on 17 May 2010 at 3:44 pm

    Hi, could you upload the whole project? I can't make it work properly; I've got tons of errors like the ones described above. Thanks.

  77. Rob on 20 May 2010 at 2:15 pm

    I have a ridiculous question.

    How do I start? I've created a project with all the files from the NyARToolkit-iPhone folder.
    This got me 136 errors and a feeling of being too dumb to figure it out by myself.

    The first error is that it cannot find 3DGraphicsView.h.
    Like Duke would say, "Where is it?"

  78. Yjnn on 20 May 2010 at 11:50 pm

    Where can I get the file 3DGraphicsView.h/m from?

  79. Roy on 21 May 2010 at 9:28 am

    Hi Rob
    As I mentioned in the article, the 3DGraphicsView is a copy of Apple Docs' EAGLView class, here.
    The rest of the errors you'll have to figure out on your own, but there are many answers already in the comments...
    Roy.

  80. Roy on 21 May 2010 at 9:29 am

    Use this and change the name to 3DGraphicsView.

    Roy.

  81. Zander Cage on 14 Jun 2010 at 10:25 am

    Hi,
    I had 135 errors initially. I brought it down to 1 error which I cannot understand. Can somebody please help?
    This is the error:

    Line Location Tool:0: Command /Developer/Platforms/iPhoneSimulator.platform/Developer/usr/bin/g++-4.0 failed with exit code 1

  82. Zander Cage on 14 Jun 2010 at 12:17 pm

    Hi,
    I am getting an error because of this line:
    glView.animationInterval = 1.0 / kRenderingFrequency;

    error: request for member 'animationInterval' in something not a structure or union.

  83. Mark on 15 Jun 2010 at 8:55 am

    Thanks for the great work and generous post Roy. However, I can't even get past square one when it comes to compiling the CPP. It sounds like a lot of other people are also getting stuck at this early stage. Do you have an .xcodeproj that can just be loaded in and run/compiled? If not, perhaps you can discuss in more detail how "it just compiled off the bat! No tweaking done, what so ever." Thanks again!

  84. Emilia on 16 Jun 2010 at 10:34 pm

    Hi Roy!

    Great work you have here.
    I was wondering if you are planning to update this code to work with the new SDK 4?
    As it is now possible to get a real video stream from the iPhone, I would like to work on some AR for the iPhone using the new SDK.
    For the moment I have access to video frames and I also display the video in the window.
    Do you think it's possible to modify your code to take the video frame buffer as the input device?
    If so, which classes should I modify? CameraTestAppDelegate?

    Many thanks for sharing your work.

    Best regards.

    Emilia

  85. Greensource on 01 Jul 2010 at 9:59 am

    Hi folks!
    First, Roy, thanks for this article and the web site in general. And also congrats for MIT ;)

    Emilia, I'm currently working on AR on iOS too. I get video frames just like you, and now I'm trying to make Roy's sample code work with them ;)
    It's not totally "off the bat" (I don't know this expression, I'm French ^^) but I'm working on it.

    I'll leave you my email; if we're working on the same thing maybe we can help each other: g r e e n s o u r c e [at]gmail.com

    Regards,
    Pierre DUCHENE

  86. Max on 06 Jul 2010 at 5:05 am

    Hi Roy, brilliant tutorial for those who are interested in Augmented Reality, by far one of the few tutorial websites I can find on the net! I'm more interested in Android because I have an Android phone, and in the country where I stay the iPhone is still very rare. So I hope you could share the Android version if you have done one. :) Cheers!

  87. Domsou on 07 Jul 2010 at 12:59 pm

    @Zander Cage:

    replacing 'animationInterval' by 'animationFrameInterval' solved this error in my case.

  88. Roger on 13 Jul 2010 at 6:52 am

    Hi Roy, I managed to correct every error and compile correctly, but when I run it, the simulator displays the following message on the console:
    "Failed to make complete framebuffer object 0"

    The operation in question is in the "- (BOOL)createFramebuffer" function of the GLGravity library.

    If someone has run into the same thing, I hope you can guide me.
    Greetings.

  89. Roger on 15 Jul 2010 at 7:33 am

    Hello Roy
    After testing and research, I finally managed to correct what the console message "Failed to make complete framebuffer object 0" was complaining about.

    The solution was to look at an older version of GLGravityView and take its protocol call as an example; my mistake was in how I declared the protocol.

    Now my next step is to get it to work on my iPhone, as the console gives me a 501 error.

    In the list of comments on this post there is a solution for this problem, which I hope will work for me.

    Roy, thank you very much; this helped me a lot.

    Sorry for my English.

  90. Nish on 21 Aug 2010 at 9:55 pm

    Hello,

    Can anybody send the full source code of this application please ?
    Email : xpert.developer@gmail.com

    Thanks.

  91. ALE on 26 Sep 2010 at 1:53 pm

    Hi Roy, I don't understand how to do the cross-compiling.
    I have created a View-Based Application project for iOS in Xcode and imported your files using drag and drop. Then I corrected all the errors about missing files (EAGLView.h, EAGLView.m and Teapot.h), but when I build the application, Xcode returns warnings (warning: no rule to process file '$(PROJECT_DIR)/Classes/NyARToolkit/forLinux/libNyARToolkit/makefile' of type sourcecode.make for architecture i386) and one error (duplicate symbol _spriteVertices in /Users/ale/NyARToolkit-iPhone/build/NyARToolkit-iPhone.build/Debug-iphonesimulator/NyARToolkit-iPhone.build/Objects-normal/i386/EAGLView.o and /Users/ale/NyARToolkit-iPhone/build/NyARToolkit-iPhone.build/Debug-iphonesimulator/NyARToolkit-iPhone.build/Objects-normal/i386/NyARToolkitCrossCompileAppDelegate.o)...
    Does this depend on wrong cross-compiling? How do I resolve it? It's very important for me!
    Thank you, and sorry for my English!

  92. fjordan on 04 Oct 2010 at 4:09 pm

    Hi

    I can't make it work... Can anybody send me the full source code of this application? Thank you very much

    fjordan@gpm.es

    Thank you

  93. jiigi on 12 Oct 2010 at 6:22 am

    Can anybody send the full source code of this application for iPhone please?

    jiigi@naver.com

  94. victorfdez on 14 Oct 2010 at 11:48 am

    Hi! My project compiles fine and it starts running, but it shows a black screen. The problem seems to be that I'm using a newer version of the EAGLView than the one used in this project. Can anyone reply with a link to the right version, or send me the files, or paste the code here, or whatever?

    Thank you very much in advance.

  95. Rehman on 18 Oct 2010 at 7:01 am

    I want to compile the code. For this I downloaded the NyARToolkit C++ and wrote the code in this article. But when I compile, it gives me hundreds of errors.
    Now I am confused: what is cross-compiling, how do I cross-compile NyARToolkit for iPhone's CPU architecture (ARM), and how can we configure the NyARToolkit library with Xcode to work properly?

    Thank you

  96. KoPanda on 01 Nov 2010 at 10:54 am

    Hi Roy,

    Thank you for your great tutorial. I managed to compile the app but get "Failed to make complete framebuffer object 0" as Roger described. I tried to get the GLGravity code to see how to correctly declare the protocol, but the version on Apple's site has been updated. Could you or Roger share how to do this? I found a file here and tried to copy it into my code but still get the same error. I see there is an initWithFrame() function; if I put this into my 3DGraphicsView.m the framebuffer error disappears, but it's still a blank screen. More guidance on this would be highly appreciated.

    Thank you for your work and everybody's comments. It's very useful.

  97. KoPanda on 03 Nov 2010 at 6:08 am

    I can eventually run the app without the framebuffer error and the 501 error. I simply copied initWithCoder() into initWithFrame(), and fixed the 501 error with Kimmy's suggestion. The remaining part is the camera feed. I read through the comments again and figured out that the CameraTestAppDelegate does not work on iOS 3 and later. Is there any version I can use? I'm developing with SDK 4.

    Thank you!

  98. [...] Arkit NyARRToolkit for iPhone (image recognition and drawin objects) iphone ar kit iphone AR blog with [...]

  99. Aditya M on 20 Jul 2011 at 9:49 pm

    Hi there,

    Thank you for a great tutorial. I have a general question about finding the intrinsic parameters, not limited to this tutorial. When you try to calibrate the camera, how do you account for any variations in focal length that happen as the camera's focus adjusts? Or does the marker need to always be nearly a constant distance away? I've been combing the Internet for an answer to this but so far no one seems to have mentioned it.

  100. Ashley on 29 Oct 2011 at 3:50 pm

    Hi Roy

    I have a few problems:
    1) Where do I find the file NyAR_core.h?
    2) I am facing this error message: "expected ')' before '_DGraphicsView'"
    3) Where do I add this:
    @protocol _DGraphicsViewDelegate

    @required

    // Draw with OpenGL ES
    -(void)drawView:(_DGraphicsView*)view;

    @optional
    -(void)setupView:(_DGraphicsView*)view;

    @end

  101. Natalia B. on 17 Mar 2012 at 10:50 pm

    This is an amazing tutorial/explanation. I hope to get to it soon and then I can give some more feedback, but it has a great explanation. Thank you

  102. Production Plan & Research | mullereina on 26 Feb 2013 at 11:49 pm

    [...] experiments at trying to implement obj-c code for our mobile AR application.  I tried using this tutorial that uses an AR tool kit with an obj-c wrapper to run on iPhone.  However I was unsuccessful and [...]

  103. Nagender on 21 May 2013 at 1:27 pm

    Hi guys, if anyone got complete source please share it on my mail id kumarnagender09@gmail.com
