Real-Time Video Capture with FFmpeg

Posted: June 11th, 2013 | 10 Comments

Working on a distributed team means that often the best way to share new results is via video captures of simulations. Previously I would do this by dumping uncompressed frames from OpenGL to disk, and then compressing with FFmpeg. I prefer this over tools like Fraps because it gives more control over compression quality, and has no watermarking or time limits.

The problem with this method is that saving uncompressed frames generates a large amount of data, which quickly fills up the write cache and slows down the whole system during capture. It also makes FFmpeg disk-bound on reads during encoding.
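A quick back-of-the-envelope calculation (assuming 4 bytes per RGBA pixel, as in the capture code below) shows why the raw stream overwhelms the disk:

```cpp
#include <cstdio>

// Cost of the old dump-to-disk approach: raw RGBA frames at 720p/60 Hz.
// 1280 * 720 pixels * 4 bytes (RGBA) = 3,686,400 bytes per frame,
// and at 60 frames/s that is ~221 MB/s of sustained disk writes.
long long rawBytesPerSecond(long long width, long long height, long long fps) {
    const long long bytesPerPixel = 4; // RGBA
    return width * height * bytesPerPixel * fps;
}
```

At roughly 221 MB/s, even a fast spinning disk falls behind within seconds, which is exactly the write-cache stall described above.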

Thankfully there is a better alternative: by opening a direct pipe between the app and FFmpeg you can avoid this disk IO entirely. I couldn't find a concise example of this on the web, so here's how to do it in a Win32 GLUT app.

At startup:

#include <stdio.h>

// start ffmpeg telling it to expect raw rgba 720p-60hz frames
// -i - tells it to read frames from stdin
const char* cmd = "ffmpeg -r 60 -f rawvideo -pix_fmt rgba -s 1280x720 -i - "
                  "-threads 0 -preset fast -y -pix_fmt yuv420p -crf 21 -vf vflip output.mp4";

// open pipe to ffmpeg's stdin in binary write mode
FILE* ffmpeg = _popen(cmd, "wb");

int* buffer = new int[width*height]; // one 4-byte RGBA pixel per int

After rendering each frame, grab back the framebuffer and send it straight to the encoder:

glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

fwrite(buffer, sizeof(int)*width*height, 1, ffmpeg);

When you're done, just close the stream so FFmpeg can finish writing the file:

_pclose(ffmpeg);
With these settings FFmpeg generates a nice H.264 compressed mp4 file, and almost manages to keep up with my real-time simulations.
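For reference, the steps above can be combined into one small self-contained sketch. A synthetic gradient frame stands in here for the glReadPixels call, and the shell command is a parameter so the function isn't tied to a particular FFmpeg invocation (in the real app it would be the command string shown at the top). POSIX popen/pclose are used; on Windows substitute _popen with mode "wb" and _pclose:

```cpp
#include <cstdio>

// Write n synthetic RGBA frames down a pipe opened with the given shell
// command. In the real app the pixels would come from glReadPixels instead.
bool capture(const char* cmd, int width, int height, int frames) {
    FILE* pipe = popen(cmd, "w"); // _popen(cmd, "wb") on Windows
    if (!pipe) return false;

    unsigned char* buffer = new unsigned char[width * height * 4];
    for (int f = 0; f < frames; ++f) {
        // fill with a moving gradient so the test video isn't blank
        for (int i = 0; i < width * height; ++i) {
            buffer[i * 4 + 0] = (unsigned char)(i + f); // R
            buffer[i * 4 + 1] = (unsigned char)(i >> 8); // G
            buffer[i * 4 + 2] = 128;                     // B
            buffer[i * 4 + 3] = 255;                     // A
        }
        fwrite(buffer, width * height * 4, 1, pipe);
    }
    delete[] buffer;

    // closing the pipe signals end-of-stream, letting the encoder finalize
    return pclose(pipe) == 0;
}
```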

This has vastly improved my workflow, so I hope someone else finds it useful.

Update: Added -pix_fmt yuv420p to the output params to generate files compatible with Windows Media Player and Quicktime.

Update: For OSX / Linux, change:

FILE* ffmpeg = _popen(cmd, "wb");

to:

FILE* ffmpeg = popen(cmd, "w");
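One way to keep both platforms in a single source file is a small #ifdef wrapper (a sketch of mine, not from the original post; the names open_encoder/close_encoder are made up for illustration):

```cpp
#include <cstdio>

// Open a write pipe to the encoder, hiding the platform difference.
// Windows _popen takes "wb" for a binary-mode pipe; POSIX popen has no
// binary flag, so plain "w" is correct there.
FILE* open_encoder(const char* cmd) {
#ifdef _WIN32
    return _popen(cmd, "wb");
#else
    return popen(cmd, "w");
#endif
}

// Close the pipe; returns the child's exit status (0 on success).
int close_encoder(FILE* pipe) {
#ifdef _WIN32
    return _pclose(pipe);
#else
    return pclose(pipe);
#endif
}
```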


10 Comments on “Real-Time Video Capture with FFmpeg”

  1. Anvesh Manne said at 2:08 pm on July 8th, 2013:

     Once I am done writing all the frames, how do I tell the ffmpeg program that I have reached the end of my frames?


  2. mmack said at 10:14 pm on July 8th, 2013:

    Hi Anvesh, you just need to call _pclose() on the stream. I updated the post to include this step.


  3. Mads Ravn said at 6:38 pm on July 28th, 2013:

    Video capture with FFmpeg can also be done by hooking onto the X server with FFmpeg.

    It is explained here

  4. GL: How to capture OpenGL to video on the fly | Here be dragons said at 6:33 am on January 29th, 2015:

    […] However, there is a way to save both time and space a lot with a minimal code change. Please see Real-Time Video Capture with FFmpeg by Miles Macklin for the […]

  5. Dries said at 10:31 am on February 12th, 2015:

     Thanks for this post. I do have a question. I want to do this with multiple inputs (one for video like you and one for audio). Both should be written in the same way with fwrite() because both streams are generated in real time.

    Do you know how I can do this?

  6. Murray said at 5:32 pm on July 29th, 2015:

    Thanks for this post, Miles. I was looking for something very similar to this, however it involved streaming the OpenGL content over the network, rather than writing to a local file. In case you or anyone else is interested, here's the ffmpeg command line options that I've used:

    ffmpeg -r 30 -f rawvideo -pix_fmt rgba -s 1280x720 -i - -threads 2 -preset ultrafast -tune zerolatency -y -vcodec libx264 -maxrate 3000k -bufsize 3000k -pix_fmt yuv420p -vf vflip -f mpegts udp://

    Thanks again,

  7. Frank Gennari said at 6:30 am on October 27th, 2015:

     I downloaded ffmpeg and added this code to my 3DWorld project, and it works! It's slower than Fraps (I get between 28 and 45 FPS at 1080p) but there is no time limit. Thanks!

  8. Opeen Geellover said at 4:09 pm on February 21st, 2016:

    THIS DID IT FOR ME! YES! THANK YOU! I want to generate mp4's with Libav directly though. This should be straightforward, shouldn't it? I don't want to use a command line interface - because the actual Libav library doesn't require a command line interface, does it? I want an OpenGL animation that converts directly to a movie file, without any kind of intermediate console or visible command line interface.

  9. Jan said at 9:37 pm on April 29th, 2016:

     Thank you very much. For Linux/Mac OS X, change:
     FILE* ffmpeg = _popen(cmd, "wb");
     to:
     FILE* ffmpeg = popen(cmd, "w");

     (note the missing b), and use pclose(ffmpeg) instead of _pclose(ffmpeg). My ffmpeg settings:

     std::stringstream cmd;
     cmd << "/usr/bin/ffmpeg -f rawvideo -pix_fmt rgba "
         << "-s " << width << "x" << height << " "
         << "-i - -y -threads 0 "
         << "-preset veryfast " // encoding speed: ultrafast, superfast, veryfast, faster, fast, medium, slow, slower, veryslow
         << "-c:v libx264 -crf 18 " // codec & quality; logarithmic range 0 (lossless) to 51 (worst), default 23
         << "-vf vflip -framerate 30 "
         << filename;

  10. Frank Gennari said at 5:50 am on September 27th, 2016:

    Does anyone know how to record audio from OpenAL along with the video? I would assume the audio needs to be captured from somewhere other than stdin.
