
Showing video with Qt toolbox and ffmpeg libraries

I recently had to build a demo client for the Ubuntu environment that shows short video messages.
After checking out GTK+, I decided to go with the more natively OOP Qt toolbox (GTKmm didn't look right to me), and I think I made the right choice.
So anyway, I have my video files encoded in some unknown format, and I need my program to show them in some widget. I went around looking for an existing example but couldn't find anything concrete, except for a good tip here that led me here, to an example of using ffmpeg's libavformat and libavcodec, but no end-to-end example that included the Qt code.
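For orientation, the decoding side boils down to roughly the following. This is only a sketch using the current libavformat/libavcodec function names (which may differ from what that tutorial used), with most error handling left out:

```cpp
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}

void decodeAllFrames(const char *path)
{
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, path, NULL, NULL) < 0)
        return;
    avformat_find_stream_info(fmt, NULL);

    // Find the first video stream and open a decoder for it.
    int video = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    if (video < 0)
        return;
    const AVCodec *codec =
        avcodec_find_decoder(fmt->streams[video]->codecpar->codec_id);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(ctx, fmt->streams[video]->codecpar);
    avcodec_open2(ctx, codec, NULL);

    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == video) {
            avcodec_send_packet(ctx, pkt);
            while (avcodec_receive_frame(ctx, frame) == 0) {
                // frame->data now holds the decoded picture (usually YUV);
                // convert it to RGB with libswscale before handing it to Qt.
            }
        }
        av_packet_unref(pkt);
    }

    av_frame_free(&frame);
    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    avformat_close_input(&fmt);
}
```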
The ffmpeg example was simple enough to just copy-paste into my project, but the actual painting onto the widget's canvas was not covered. It turns out that painting video is not as simple as overriding paintEvent()…
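The paintEvent() part itself does end up small; the real work is getting frames to it from the right thread at the right time. Something along these lines is enough on the widget side (a minimal sketch, where VideoWidget and setFrame() are placeholder names of mine and the frame is assumed to arrive as a ready-made QImage):

```cpp
#include <QWidget>
#include <QImage>
#include <QPainter>

class VideoWidget : public QWidget
{
    Q_OBJECT
public:
    explicit VideoWidget(QWidget *parent = 0) : QWidget(parent) {}

public slots:
    // Must be called from the GUI thread; stores the frame and schedules a repaint.
    void setFrame(const QImage &frame)
    {
        m_frame = frame;   // QImage is implicitly shared, so this is cheap
        update();          // never paint directly; let the event loop call paintEvent()
    }

protected:
    void paintEvent(QPaintEvent *)
    {
        QPainter painter(this);
        if (!m_frame.isNull())
            painter.drawImage(rect(), m_frame);  // stretches the frame to the widget size
    }

private:
    QImage m_frame;
};
```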
Firstly, you need a separate thread for grabbing frames from the video file, because you can't have the GUI event thread blocked doing that work.
That makes sense, but once the frame-grabbing thread (which I called VideoThread) has actually grabbed a frame and put it somewhere in memory, I need to tell the GUI thread to take those buffered pixels and paint them onto the widget's canvas.
This is where I get to praise Qt's excellent signals and slots mechanism: I'll have my VideoThread emit a signal notifying some external entity that a new frame is waiting in the buffer.
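Before the real code below, here is the shape of that thread in compressed form (a sketch with placeholder names; decodeNextFrame() stands in for the libavformat/libavcodec loop, and frame-rate pacing is ignored):

```cpp
#include <QThread>
#include <QImage>

class VideoThread : public QThread
{
    Q_OBJECT

signals:
    // Emitted from the worker thread each time a decoded frame is ready.
    void frameReady(const QImage &frame);

protected:
    void run()
    {
        for (;;) {
            QImage frame = decodeNextFrame();  // read packet, decode, convert to RGB
            if (frame.isNull())
                break;                         // end of stream or decode error
            emit frameReady(frame);
        }
    }

private:
    QImage decodeNextFrame();  // hypothetical helper wrapping the ffmpeg calls, not shown
};
```

Wiring the two together is a single connect(); because the signal is emitted from another thread, Qt queues it and setFrame() runs in the GUI thread, where painting is allowed:

```cpp
// e.g. in the main window's constructor
connect(videoThread, SIGNAL(frameReady(QImage)),
        videoWidget, SLOT(setFrame(QImage)));
videoThread->start();
```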
Here’s a little code: