
iPhoneOS 3.1 will not allow marker-based AR

I had very high hopes for iPhoneOS 3.1 in the AR arena. With all the hype around it, I naturally assumed that with 3.1, developers would be able to bring marker-detection AR to the App Store – that is, using legal, published APIs. But looking through 3.1’s APIs, I wasn’t able to find anything that allows this.
Not all AR is banned, though. AR apps like Layar remain very much possible, as they rely on the compass and GPS to create the AR effect. These don’t require processing the live video feed from the camera – only overlaying data on top of it. That can be done easily with the new cameraOverlayView property of UIImagePickerController: create a transparent view containing the data you want to show, and it will be overlaid on the camera preview.
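As a rough sketch of the overlay approach (the overlay here is just an empty transparent view – in a real app you would draw your data into it):

```objc
// Present the camera with a transparent overlay on top of the live preview.
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO;            // hide the default shutter UI

// Any transparent UIView works here – draw your labels/graphics in it.
UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
overlay.backgroundColor = [UIColor clearColor];
overlay.opaque = NO;

picker.cameraOverlayView = overlay;         // the new 3.1 property
[self presentModalViewController:picker animated:YES];
```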
Sadly, to get marker-detection abilities, developers must still hack the system (rerouting the camera callback) or use very slow methods (the private UIGetScreenImage function). I can only hope Apple will see the potential of letting developers manipulate the live video feed.
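For reference, the slow route looks roughly like this – note that UIGetScreenImage is a private, undocumented function (the declaration below is the commonly used one), and relying on it risks App Store rejection:

```objc
// UIGetScreenImage is private – we declare it ourselves.
// Using it will normally get an app rejected from the App Store.
CGImageRef UIGetScreenImage(void);

- (UIImage *)grabScreenFrame {
    CGImageRef screen = UIGetScreenImage();   // captures the whole screen
    UIImage *frame = [UIImage imageWithCGImage:screen];
    CGImageRelease(screen);
    return frame;   // slow, and includes anything drawn over the video
}
```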

20 replies on “iPhoneOS 3.1 will not allow marker-based AR”

Can you tell me how to use UIGetScreenImage to do the marker detection?? I need to implement it on iPhone OS 3.1, thx.

Hi Ray
Unfortunately, using UIGetScreenImage will not help you in AR, because it will also capture any augmentations you might put on top of the video.
The key is to get the clean pixels of the incoming video, process them to find the marker (using NyARToolkit), and then augment (with OpenGL ES).
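In outline, that per-frame pipeline looks something like this – detectMarker and drawAugmentation are hypothetical placeholders for the NyARToolkit detection call and your own OpenGL ES rendering, not real API names:

```objc
// Hypothetical per-frame loop – function names are placeholders.
- (void)processFrame:(CGImageRef)cleanFrame {
    // 1. Start from clean camera pixels (no overlays baked in).
    MarkerResult result = detectMarker(cleanFrame);   // NyARToolkit-style detection

    // 2. If a marker was found, use its pose to draw the augmentation on top.
    if (result.found) {
        drawAugmentation(result.transform);           // OpenGL ES rendering
    }
}
```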
From what I heard, ARToolKit have a working version of their framework on iPhone OS 3.x, including live video detection.
To anyone that bought the ARToolKit framework – can you please shed some light on how they extract the video frames? I know a great deal of people would love to get this ability.
