If you haven't yet tapped into the phenomenon (well, by now it's a phenomenon) that is Kutiman's "Thru You" project - don't walk, run and do it now.
I'm a long-time fan of Kutiman's work; his last CD ("Kutiman") plays on repeat in my company-leased car, and I have been listening closely for some of his new beats.
I must say he totally surprised me. The music is awesome, but that was to be expected. He surprised me because he single-handedly created a new concept - Social Music. What he did is a natural development of music in the Web 2.0 spirit: take the enormous amount of "data" lying around freely on the internet, and bring it together to create something new.
Anyway, enjoy his work, it's truly inspiring.
As my search for the best platform to roll out my new face detection concept continues, I decided to give the ol' Qt framework a go.
I like Qt. It's cross-platform, has a clear and nice API, is straightforward, and reminds me somewhat of Apple's Cocoa.
My intention is to get some serious face detection going on mobile devices. So that means either the iPhone, which so far did a crummy job performance-wise, or some other mobile device, preferably Linux-based.
This led me to the decision to go with Qt. I believe you can get it to work on any Linux-ish platform (LiMo, Moblin, Android), and since Nokia bought Trolltech - it's gonna work on Nokia phones soon, awesome!
Let's get to the details, shall we?
Continue reading "Qt & OpenCV combined for face detecting QWidgets"
Morning traffic jams can really bum you out on some days, and most people try to avoid them. But actually avoiding them takes a bit of practice and a lot of time - it took me a good 3 years to perfect my morning route to work. I think I am now able to shave off between 5 and 20 minutes of traffic every day, depending on how crowded the roads are.
On a "light" day these tips aren't worth much, but on a crowded day (like today) they can really save time and gas. Also, since this is my route to work, and I live in Israel and work in Ramat Ha'hayal, Tel-Aviv, the example images are in Hebrew and will mostly help people who need to get to Habarzel street. But the tips are genuine, and can help anyone optimize their morning route.
OK, so on to the point.
Continue reading "Life changing traffic tips"
Recently I was working on an iPhone app at work, for demo purposes, but my company cheaped out on the Apple iPhone Developer registration.
More accurately, the process of binding a multi-million-$-a-year company with Apple Inc. takes a very long time, so we took the back door and just pwned our test iPhone devices (firmware 2.1).
"How hard can it possibly be to install apps on a jailbroken iPhone?" we thought. Well, as it turns out, it's pretty difficult, especially for Mac first-timers like myself.
In the end, I overcame this obstacle - but not before compiling a compiler, installing a gazillion support apps, compiling my app with at least 6 different compilers, doing it on WinXP, Ubuntu, on the iPhone itself, and on the Mac.
So I thought, why not share with you the one way that actually produced a working result.
Continue reading "So you're trying to get your homemade app on your pwned iPhone"
I had a little project at work recently, that involved creating movie clips using AviSynth.
And I was appalled by the shabbiness of the transition plugins freely available for AviSynth - they always reminded me of 80s-style video editing...
So I set out to integrate AviSynth with OpenGL to create a nice 3D transition effect for our movie clips.
I had 2 major bases to cover:
- AviSynth plugin API
- OpenGL rendering
The AviSynth API is not so well documented, but they have very good ground-up examples on how to build a plugin yourself. Here is the one I used, which basically does nothing but copy the input frame to the output frame.
OpenGL, on the other hand, is very well documented and "tutorialed". I based my code on this example from NeHe.
So basically what I wanted to achieve is:
- Read input frame (AviSynth)
- Paint frame as texture over 3D model (OpenGL)
- Draw rendered 3D image to output frame (OpenGL+AviSynth)
Reading the frame is pretty straightforward. Frames come encoded as 24-bit RGB, with a little twist: the row size in bytes is not width*3 as you'd expect it to be; instead, AviSynth uses a parameter called "Pitch" to determine the row size in bytes (the extra bytes at the end of each row are alignment padding).
Update (14/9/09): source is now available in the repo: browse download
Continue reading "OpenGL for AviSynth [Update: now w/code]"
The Israeli hi-tech market inspired me...
I recently had to build a demo client that shows short video messages for Ubuntu environment.
After checking out GTK+ I decided to go with the more natively OOP Qt toolbox (GTKmm didn't look right to me), and I think I made the right choice.
So anyway, I have my video files encoded in some unknown format, and I need my program to show them in some widget. I went around looking for an existing example, but I couldn't find anything concrete - except for a good tip here that led me here, to an example of using ffmpeg's libavformat and libavcodec, but no end-to-end example including the Qt code.
The ffmpeg example was simple enough to just copy-paste into my project, but the whole business of painting over the widget's canvas was not covered. Turns out painting video is not as simple as overriding paintEvent()...
Firstly, you need a separate thread for grabbing frames from the video file, because you don't want the GUI event thread doing that.
That makes sense, but when the frame-grabbing thread (I called it VideoThread) actually grabbed a frame and put it somewhere in memory, I needed to tell the GUI thread to take those buffered pixels and paint them over the widget's canvas.
This is the moment where I praise Qt's excellent Signals/Slots mechanism. So I'll have my VideoThread emit a signal notifying some external entity that a new frame is in the buffer.
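Stripped of the Qt specifics, this handoff is a classic producer/consumer pattern. Here is a standalone C++ sketch of it (all names made up for illustration) using a condition variable where, in real Qt code, the VideoThread would emit a signal and the widget's slot would run on the GUI thread via a queued connection:

```cpp
#include <condition_variable>
#include <cstdint>
#include <mutex>
#include <thread>
#include <vector>

// Shared state between the frame-grabbing thread and the "GUI" side.
struct FrameChannel {
    std::mutex m;
    std::condition_variable cv;
    std::vector<uint8_t> frame;   // the buffered pixels
    bool frameReady = false;
    bool done = false;
};

// The VideoThread analog: "decode" frames into the shared buffer and
// notify the consumer (in Qt this notification would be an emitted signal).
void videoThread(FrameChannel& ch, int frameCount)
{
    for (int i = 0; i < frameCount; ++i) {
        std::vector<uint8_t> pixels(4, uint8_t(i)); // pretend-decoded frame
        {
            std::lock_guard<std::mutex> lock(ch.m);
            ch.frame = std::move(pixels);
            ch.frameReady = true;                   // the "emit"
        }
        ch.cv.notify_one();
        // Wait until the consumer took the frame, so we don't overwrite it.
        std::unique_lock<std::mutex> lock(ch.m);
        ch.cv.wait(lock, [&] { return !ch.frameReady; });
    }
    {
        std::lock_guard<std::mutex> lock(ch.m);
        ch.done = true;
    }
    ch.cv.notify_one();
}

// The GUI-thread analog: wake up on each new frame and "paint" it
// (here we just count frames). Returns the number of frames painted.
int guiThread(FrameChannel& ch)
{
    int painted = 0;
    for (;;) {
        std::unique_lock<std::mutex> lock(ch.m);
        ch.cv.wait(lock, [&] { return ch.frameReady || ch.done; });
        if (ch.frameReady) {
            ++painted;                // the "slot": paint the buffered frame
            ch.frameReady = false;
            lock.unlock();
            ch.cv.notify_one();       // let the producer grab the next frame
        } else {
            return painted;
        }
    }
}
```

Qt's signal/slot mechanism hides all of this plumbing: emitting a signal from a worker thread to a slot on the GUI thread queues the call safely across threads, which is exactly why it fits this problem so well.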
Here's a little code:
Continue reading "Showing video with Qt toolbox and ffmpeg libraries"
I was recently building a simple GUI in .NET to operate an algorithm as part of a school project, and I encountered a weird problem using BackgroundWorkers. I spent a lot of time debugging it, mainly because the code seemed to be perfect (which was true) but the run-time behavior was so strange...
Anyway, to make my algorithm as weakly coupled as possible, I decided not to use 'BackgroundWorker.ReportProgress', because then my algorithm would have to know what a BackgroundWorker is...
I decided to fire my own event whenever I wanted to report on the algorithm's progress (the algorithm is rather lengthy). So I defined my delegate and event inside my one-function class that runs the algorithm:
Continue reading "The strange case of the BackgroundWorker and the disappearing exception"