Categories
programming

OpenGL for AviSynth [Update: now w/code]

Hi
I had a little project at work recently that involved creating movie clips using AviSynth.
And I was appalled by the shabbiness of the transition plugins freely available for AviSynth; they always reminded me of 80s-style video editing…
So I set out to integrate AviSynth with OpenGL to create a nice 3D transition effect for our movie clips.
I had 2 major bases to cover:

  • AviSynth plugin API
  • OpenGL rendering

The AviSynth API is not so well documented, but there are very good ground-up examples of how to write a plugin yourself. Here is the one I used; it basically does nothing but copy the input frame to the output frame.
OpenGL, on the other hand, is very well documented and “tutorialed”. I based my code on this example from NeHe.
So basically, what I wanted to achieve is:

  1. Read input frame (AviSynth)
  2. Paint frame as texture over 3D model (OpenGL)
  3. Draw rendered 3D image to output frame (OpenGL+AviSynth)

Reading the frame is pretty straightforward. Frames come encoded as 24-bit RGB, with a little twist: the row size in bytes is not width*3 as you’d expect it to be; AviSynth uses a parameter called “Pitch” to determine the row size in bytes, since rows are padded for alignment.
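In the spirit of the SDK’s “do nothing” sample, a pitch-aware copy looks roughly like this (a sketch assuming a GenericVideoFilter subclass as in that sample; MyFilter is a placeholder name):

```cpp
#include <cstring>      // memcpy
#include "avisynth.h"   // AviSynth plugin SDK

// Copy the input frame to the output frame, stepping by pitch rather than
// by width*3, since each row may be padded for alignment.
PVideoFrame __stdcall MyFilter::GetFrame(int n, IScriptEnvironment* env) {
    PVideoFrame src = child->GetFrame(n, env);
    PVideoFrame dst = env->NewVideoFrame(vi);

    const unsigned char* srcp = src->GetReadPtr();
    unsigned char* dstp = dst->GetWritePtr();
    const int srcPitch = src->GetPitch();   // bytes per row, >= width*3
    const int dstPitch = dst->GetPitch();   // may differ from the source's
    const int rowSize  = src->GetRowSize(); // width*3 for RGB24
    const int height   = src->GetHeight();

    for (int y = 0; y < height; ++y)
        memcpy(dstp + y * dstPitch, srcp + y * srcPitch, rowSize);

    return dst;
}
```

(The API’s env->BitBlt() does the same pitch-aware copy for you.)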
Update (14/9/09): the source is now available in the repo: browse / download
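For the OpenGL half (steps 2 and 3), the core idea is: upload the frame as a texture, draw the scene, read the pixels back. Here’s a condensed sketch of that idea, not the actual repo code; it assumes a GL context and a width×height viewport already exist:

```cpp
#include <GL/gl.h>

// Texture the decoded frame onto geometry and read the render back out.
// 'rgb' is the input frame, 'out' receives the result; both are tightly
// packed 24-bit rows. A real transition would animate a 3D model here
// instead of drawing one full-screen quad.
void renderFrame(const unsigned char* rgb, unsigned char* out,
                 int width, int height) {
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // 24-bit rows aren't 4-byte aligned
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, rgb);         // step 2: frame -> texture

    glEnable(GL_TEXTURE_2D);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glBegin(GL_QUADS);                                   // full-screen quad
    glTexCoord2f(0, 0); glVertex2f(-1, -1);
    glTexCoord2f(1, 0); glVertex2f( 1, -1);
    glTexCoord2f(1, 1); glVertex2f( 1,  1);
    glTexCoord2f(0, 1); glVertex2f(-1,  1);
    glEnd();

    glPixelStorei(GL_PACK_ALIGNMENT, 1);
    glReadPixels(0, 0, width, height,
                 GL_RGB, GL_UNSIGNED_BYTE, out);         // step 3: read back
    glDeleteTextures(1, &tex);
}
```

(One detail to watch: AviSynth’s RGB24 is actually BGR byte order, stored bottom-up; conveniently, glReadPixels also returns rows bottom-up.)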

Categories
ffmpeg graphics gui linux programming qt video

Showing video with Qt toolbox and ffmpeg libraries

I recently had to build a demo client for an Ubuntu environment that shows short video messages.
After checking out GTK+, I decided to go with the more natively OOP Qt toolkit (GTKmm didn’t look right to me), and I think I made the right choice.
So anyway, I have my video files encoded in some unknown format, and I need my program to show them in some widget. I went around looking for an existing example, but I couldn’t find anything concrete, except for a good tip here that led me here, to an example of using ffmpeg’s libavformat and libavcodec, but with no end-to-end example including the Qt code.
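The gist of that decode loop looks like this. I’m writing it against today’s ffmpeg API rather than the 2009-era calls in that example, and decodeVideo is my own name, but the shape is the same: open the file, find the video stream, pull packets, decode frames (error paths abbreviated):

```cpp
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}

// Open a video file and decode every frame of its best video stream.
bool decodeVideo(const char* path) {
    AVFormatContext* fmt = nullptr;
    if (avformat_open_input(&fmt, path, nullptr, nullptr) < 0) return false;
    if (avformat_find_stream_info(fmt, nullptr) < 0) return false;

    int vid = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, nullptr, 0);
    if (vid < 0) return false;

    const AVCodec* dec = avcodec_find_decoder(fmt->streams[vid]->codecpar->codec_id);
    AVCodecContext* ctx = avcodec_alloc_context3(dec);
    avcodec_parameters_to_context(ctx, fmt->streams[vid]->codecpar);
    avcodec_open2(ctx, dec, nullptr);

    AVPacket* pkt = av_packet_alloc();
    AVFrame* frame = av_frame_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == vid && avcodec_send_packet(ctx, pkt) == 0) {
            while (avcodec_receive_frame(ctx, frame) == 0) {
                // frame->data / frame->linesize now hold a decoded picture;
                // convert it (libswscale) and hand it to the GUI thread
            }
        }
        av_packet_unref(pkt);
    }

    av_frame_free(&frame);
    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    avformat_close_input(&fmt);
    return true;
}
```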
The ffmpeg example was simple enough to just copy-paste into my project, but the whole business of painting onto the widget’s canvas was not covered. It turns out painting video is not as simple as overriding paintEvent()…
Firstly, you need a separate thread for grabbing frames from the video file, because you can’t have the GUI event thread do that.
That makes sense, but when the frame-grabbing thread (which I called VideoThread) actually grabbed a frame and put it somewhere in memory, I needed to tell the GUI thread to take those buffered pixels and paint them over the widget’s canvas.
This is the moment where I praise Qt’s excellent signals/slots mechanism: I have my VideoThread emit a signal notifying some external entity that a new frame is in the buffer.
Here’s a little code — a minimal sketch of the structure (the signal and slot names are illustrative):
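```cpp
#include <QThread>
#include <QWidget>
#include <QImage>
#include <QPainter>
#include <QPaintEvent>

// Grabs frames off the GUI thread and hands them over via a signal.
class VideoThread : public QThread {
    Q_OBJECT
signals:
    void frameReady(const QImage& frame);    // emitted from this thread
protected:
    void run() {
        // ffmpeg decode loop goes here (see the sketch above); after
        // converting each decoded frame to RGB:
        //   emit frameReady(image);
    }
};

// Paints the latest frame; the slot runs on the GUI thread because the
// cross-thread signal/slot connection is queued automatically.
class VideoWidget : public QWidget {
    Q_OBJECT
public slots:
    void onFrame(const QImage& frame) {
        m_frame = frame;
        update();                            // schedule a repaint
    }
protected:
    void paintEvent(QPaintEvent*) {
        QPainter p(this);
        if (!m_frame.isNull())
            p.drawImage(rect(), m_frame);    // scale the frame to the widget
    }
private:
    QImage m_frame;
};

// Wiring: connect(&videoThread, SIGNAL(frameReady(QImage)),
//                 &videoWidget, SLOT(onFrame(QImage)));
```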

Categories
programming

The strange case of the BackgroundWorker and the disappearing exception

I was recently building a simple GUI in .NET to operate an algorithm as part of a school project, and I encountered a weird problem using BackgroundWorkers. I spent a lot of time debugging it, mainly because the code seemed to be perfect (which was true), but the run-time behavior was so strange…
Anyway, to keep my algorithm as loosely coupled as possible, I decided not to use BackgroundWorker.ReportProgress, because then my algorithm would have to know what a BackgroundWorker is…
Instead, I decided to fire my own event whenever I wanted to report on the algorithm’s progress (the run is rather lengthy). So I defined my delegate and event inside my one-function class that runs the algorithm:
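It looked roughly like this (an illustrative sketch; the names are mine, not the original project’s code):

```csharp
public class Algorithm
{
    // The algorithm reports progress through its own event, so it never has
    // to know that a BackgroundWorker (or anything else) is on the other end.
    public delegate void ProgressHandler(int percentDone);
    public event ProgressHandler ProgressChanged;

    public void Run()
    {
        for (int step = 0; step <= 100; step++)
        {
            // ... lengthy computation ...
            if (ProgressChanged != null)
                ProgressChanged(step);
        }
    }
}
```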