SIGGRAPH slides

Posted: July 25th, 2013

Slides for my SIGGRAPH presentation of Position Based Fluids are available here:

http://mmacklin.com/pbf_slides.pdf

During the presentation I showed videos of some more recent results including two-way coupling of fluids with clothing and rigid bodies. They're embedded below:

Overall it has been a great SIGGRAPH; I met tons of new people who provided lots of inspiration for new research ideas. Thanks!



Position Based Fluids

Posted: April 24th, 2013

Position Based Fluids (PBF) is the title of our paper that has been accepted for presentation at SIGGRAPH 2013. I've set up a project page where you can download the paper and all the related content here:

http://blog.mmacklin.com/publications

I have continued working on the technique since the submission, mainly improving the rendering and adding features like spray and foam (based on the excellent paper from the University of Freiburg: Unified Spray, Foam and Bubbles for Particle-Based Fluids). You can see the results in action below, but I recommend checking out the project page and downloading the videos; they look great at full resolution and 60Hz.



Blackbody Rendering

Posted: December 29th, 2010

In between bouts of festive over-eating I added support for blackbody emission to my fluid simulator and thought I'd describe what was involved.

Briefly, a blackbody is an idealised substance that gives off light when heated. Planck's formula describes the intensity of light per wavelength, with units W·sr⁻¹·m⁻²·m⁻¹, for a given temperature in kelvin.
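
For reference, Planck's law in this per-wavelength form is:

B_\lambda(\lambda, T) = \frac{2hc^2}{\lambda^5} \cdot \frac{1}{e^{hc / (\lambda k_B T)} - 1}

where h is Planck's constant, c is the speed of light, k_B is Boltzmann's constant, \lambda is the wavelength in meters and T is the temperature in kelvin.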

Radiance has units W·sr⁻¹·m⁻², so we need a way to convert the wavelength-dependent power distribution given by Planck's formula to a radiance value in RGB that we can use in our shader / ray-tracer.

The typical way to do this is as follows (a C++ sketch of the first two steps follows the list):

  1. Integrate Planck's formula against the CIE XYZ colour matching functions (available as part of PBRT in 1nm increments)
  2. Convert from XYZ to linear sRGB (do not perform gamma correction yet)
  3. Render as normal
  4. Perform tone-mapping / gamma correction
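
Here is a rough C++ sketch of steps 1 and 2. The cieX/cieY/cieZ tables stand in for the 1nm-sampled CIE matching functions that ship with PBRT; the table names and the 360-830nm sample range are my assumptions, so adjust them to however you store the data:

#include <cmath>

// Planck's law: spectral radiance in W·sr^-1·m^-2·m^-1,
// lambda in meters, T in kelvin
double Planck(double lambda, double T)
{
    const double h  = 6.62607015e-34;  // Planck constant (J·s)
    const double c  = 2.99792458e8;    // speed of light (m/s)
    const double kB = 1.380649e-23;    // Boltzmann constant (J/K)

    return (2.0*h*c*c)/(std::pow(lambda, 5.0)*
                        (std::exp(h*c/(lambda*kB*T)) - 1.0));
}

// 1nm-sampled CIE matching functions over 360-830nm (471 samples),
// e.g. as shipped with PBRT -- assumed to be defined elsewhere
extern const double cieX[471], cieY[471], cieZ[471];

void BlackbodyToLinearSRGB(double T, double rgb[3])
{
    double X = 0.0, Y = 0.0, Z = 0.0;
    const double dLambda = 1.0e-9; // 1nm step, in meters

    // step 1: integrate Planck's formula against the matching functions
    for (int i=0; i < 471; ++i)
    {
        double lambda = (360.0 + i)*1.0e-9;
        double B = Planck(lambda, T);

        X += B*cieX[i]*dLambda;
        Y += B*cieY[i]*dLambda;
        Z += B*cieZ[i]*dLambda;
    }

    // step 2: XYZ -> linear sRGB (D65 white point), no gamma yet
    rgb[0] =  3.2406*X - 1.5372*Y - 0.4986*Z;
    rgb[1] = -0.9689*X + 1.8758*Y + 0.0415*Z;
    rgb[2] =  0.0557*X - 0.2040*Y + 1.0570*Z;
}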

We are throwing away spectral information by projecting into XYZ, but a quick dimensional analysis shows that we at least end up with the correct units: because the integration is with respect to wavelength, measured in meters, the extra m⁻¹ is removed.
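
In symbols, for the X component (the matching function \bar{x}(\lambda) is dimensionless):

X = \int B_\lambda(\lambda, T)\, \bar{x}(\lambda)\, d\lambda

so the W·sr⁻¹·m⁻²·m⁻¹ from Planck's formula picks up a factor of m from d\lambda, leaving W·sr⁻¹·m⁻², which is radiance.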

I was going to write more about the colour conversion process, but I didn't want to add to the confusion out there by accidentally misusing terminology. Instead, here are a couple of papers describing the Spectrum->RGB and RGB->Spectrum conversions; questions about these come up all the time on various forums, and I think these two papers do a good job of providing background and clarifying the process:

And some more general colour space links:

Here is a small sample of linear sRGB radiance values for different Blackbody temperatures:

1000K: 1.81e-02, 1.56e-04, 1.56e-04
2000K: 1.71e+03, 4.39e+02, 4.39e+02
4000K: 5.23e+05, 3.42e+05, 3.42e+05
8000K: 9.22e+06, 9.65e+06, 9.65e+06

It's clear from the range of values that we need some sort of exposure control and tone-mapping. I simply picked a temperature in the upper end of my range (around 3000K) and scaled intensities relative to it, before applying Reinhard tone mapping and gamma correction. You can also perform a more advanced mapping that takes the adaptation of the human visual system into account, as described in Physically Based Modeling and Animation of Fire.
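
Here's a minimal sketch of that mapping; the exposure value is whatever hand-tuned scale brings your chosen key temperature to a sensible brightness, not something derived here:

#include <cmath>

// exposure scale, Reinhard operator, then gamma encode
float ToneMap(float radiance, float exposure)
{
    float scaled = radiance*exposure;       // exposure control
    float mapped = scaled/(1.0f + scaled);  // Reinhard: L/(1+L)
    return powf(mapped, 1.0f/2.2f);         // gamma correction
}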

Again, the hardest part was setting up the simulation parameters to get the look you want; here's one I spent at least four days tweaking:

Simulation time is ~30s a frame (10 substeps) on a 128^3 grid tracking temperature, fuel, smoke and velocity. Most of that time is spent in the tri-cubic interpolation during advection; I've been meaning to try MacCormack advection to see if it's a net win.
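
For reference, here's a sketch of the MacCormack scheme (Selle et al.) built on an existing semi-Lagrangian step. SemiLagrangian() is a hypothetical stand-in for whatever single advection step the simulator already has, and grids are flat float arrays:

#include <vector>

typedef std::vector<float> Grid;

// assumed to exist: one semi-Lagrangian advection step of the field
Grid SemiLagrangian(const Grid& phi, float dt);

Grid MacCormackAdvect(const Grid& phi, float dt)
{
    Grid phiHat  = SemiLagrangian(phi, dt);     // advect forward
    Grid phiBack = SemiLagrangian(phiHat, -dt); // trace the result back

    Grid result(phi.size());
    for (size_t i=0; i < phi.size(); ++i)
    {
        // second-order correction using half the round-trip error;
        // a production version also clamps to the values sampled by
        // the forward step to avoid introducing new extrema
        result[i] = phiHat[i] + 0.5f*(phi[i] - phiBack[i]);
    }
    return result;
}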

There are some pretty obvious artifacts due to the tri-linear interpolation on the GPU; a higher resolution grid, or manually performing tri-cubic filtering in the shader, would help.

Inspired by Kevin Beason's work-in-progress videos, I put together a collection of my own failed tests, which I think are quite amusing:



Adventures in Fluid Simulation

Posted: November 1st, 2010

I have to admit to being simultaneously fascinated and slightly intimidated by the fluid simulation crowd. I've been watching the videos on Ron Fedkiw's page for years and am still in awe of his results, which sometimes seem little short of magic.

Recently I resolved to write my first fluid simulator and purchased a copy of Fluid Simulation for Computer Graphics by Robert Bridson.

Like a lot of developers, my first exposure to the subject was Jos Stam's Stable Fluids paper and his more accessible Fluid Dynamics for Games presentation. While the ideas are undeniably great, I never came away feeling like I truly understood the concepts or the mathematics behind them.

I'm happy to report that Bridson's book has helped change that. It includes a review of vector calculus in the appendix, presented in a wonderfully straightforward and concise manner; Bridson takes almost nothing for granted and gives lots of real-world examples, which helps with some of the less intuitive concepts.

I'm planning a bigger post on the subject but I thought I'd write a quick update with my progress so far.

I started out with a 2D simulation similar to Stam's demos. Having a 2D implementation that you're confident in is really useful when you want to quickly try out different techniques, and for sanity checking results when things go wrong in 3D (and they will).

Before you write the 3D sim, though, you need a way of visualising the data. I spent quite a while on this and implemented a single-scattering model using brute-force ray-marching on the GPU.

I did some tests with a procedural pyroclastic cloud model, which you can see below. This runs at around 25ms on my MacBook Pro (NVIDIA 320M), but you can dial the sample counts up and down to suit:

Here's a simplified GLSL snippet of the volume rendering shader. It's not at all optimised, apart from some branches to skip over empty space and an assumption that absorption varies linearly with density:

uniform sampler3D g_densityTex;
uniform vec3 g_lightPos;
uniform vec3 g_lightIntensity;
uniform vec3 g_eyePos;
uniform float g_absorption;

void main()
{
    // diagonal of the cube
    const float maxDist = sqrt(3.0);

    const int numSamples = 128;
    const float scale = maxDist/float(numSamples);

    const int numLightSamples = 32;
    const float lscale = maxDist / float(numLightSamples);

    // assume all coordinates are in texture space
    vec3 pos = gl_TexCoord[0].xyz;
    vec3 eyeDir = normalize(pos-g_eyePos)*scale;

    // transmittance
    float T = 1.0;
    // in-scattered radiance
    vec3 Lo = vec3(0.0);

    for (int i=0; i < numSamples; ++i)
    {
        // sample density
        float density = texture3D(g_densityTex, pos).x;

        // skip empty space
        if (density > 0.0)
        {
            // attenuate ray-throughput
            T *= 1.0-density*scale*g_absorption;
            if (T <= 0.01)
                break;

            // point light dir in texture space
            vec3 lightDir = normalize(g_lightPos-pos)*lscale;

            // sample light
            float Tl = 1.0; // transmittance along light ray
            vec3 lpos = pos + lightDir;

            for (int s=0; s < numLightSamples; ++s)
            {
                float ld = texture3D(g_densityTex, lpos).x;
                Tl *= 1.0-g_absorption*lscale*ld;

                if (Tl <= 0.01)
                    break;

                lpos += lightDir;
            }

            vec3 Li = g_lightIntensity*Tl;

            Lo += Li*T*density*scale;
        }

        pos += eyeDir;
    }

    gl_FragColor.xyz = Lo;
    gl_FragColor.w = 1.0-T;
}

I'm pretty sure there's a whole post on the ways this could be optimised, but I'll save that for next time. Also, this example shader doesn't have any wavelength-dependent variation: making your absorption coefficient different for each channel looks much more interesting, and having a different coefficient for your primary and shadow rays also helps. You can see this effect in the videos.

To create the cloud-like volume texture in OpenGL I use a displaced distance field like this (see the SIGGRAPH course for more details):

// create a volume texture with n^3 texels and base radius r
GLuint CreatePyroclasticVolume(int n, float r)
{
    GLuint texid;
    glGenTextures(1, &texid);

    GLenum target = GL_TEXTURE_3D;
    GLenum filter = GL_LINEAR;
    GLenum address = GL_CLAMP_TO_BORDER;

    glBindTexture(target, texid);

    glTexParameteri(target, GL_TEXTURE_MAG_FILTER, filter);
    glTexParameteri(target, GL_TEXTURE_MIN_FILTER, filter);

    glTexParameteri(target, GL_TEXTURE_WRAP_S, address);
    glTexParameteri(target, GL_TEXTURE_WRAP_T, address);
    glTexParameteri(target, GL_TEXTURE_WRAP_R, address);

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    unsigned char *data = new unsigned char[n*n*n];
    unsigned char *ptr = data;

    float frequency = 3.0f / n;
    float center = n / 2.0f + 0.5f;

    for(int x=0; x < n; x++)
    {
        for (int y=0; y < n; ++y)
        {
            for (int z=0; z < n; ++z)
            {
                float dx = center-x;
                float dy = center-y;
                float dz = center-z;

                float off = fabsf(Perlin3D(x*frequency,
                               y*frequency,
                               z*frequency,
                               5,
                               0.5f));

                float d = sqrtf(dx*dx+dy*dy+dz*dz)/(n);

                *ptr++ = ((d-off) < r)?255:0;
            }
        }
    }

    // upload
    glTexImage3D(target,
                 0,
                 GL_LUMINANCE,
                 n,
                 n,
                 n,
                 0,
                 GL_LUMINANCE,
                 GL_UNSIGNED_BYTE,
                 data);

    delete[] data;

    return texid;
}

For an excellent introduction to volume rendering, see the SIGGRAPH 2010 course Volumetric Methods in Visual Effects, and Kyle Hayward's Volume Rendering 101 for some GPU specifics.

Once I had the visualisation in place, porting the fluid simulation to 3D was actually not too difficult. I spent most of my time tweaking the initial conditions to get the smoke to behave in a way that looks interesting; you can see one of my more successful simulations below:

Currently the simulation runs entirely on the CPU, using a 128^3 grid with monotonic tri-cubic interpolation and vorticity confinement as described in Visual Simulation of Smoke by Fedkiw et al. I'm fairly happy with the result, but perhaps I have the vorticity confinement cranked a little high.
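
For reference, the confinement force from that paper adds back the small-scale rotational motion lost to numerical dissipation:

\boldsymbol{\omega} = \nabla \times \mathbf{u}, \qquad \mathbf{N} = \frac{\nabla |\boldsymbol{\omega}|}{\| \nabla |\boldsymbol{\omega}| \|}, \qquad \mathbf{f}_{conf} = \epsilon h \, (\mathbf{N} \times \boldsymbol{\omega})

where \epsilon is the confinement strength (the parameter I suspect is cranked too high here) and h is the grid spacing.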

Nothing is optimised, so it's running at about 1.2s a frame on my 2.66GHz Core 2 MacBook.

Future work is to port the simulation to OpenCL and implement some more advanced features. Specifically, I'm interested in A Vortex Particle Method for Smoke, Water and Explosions, which Kevin Beason describes on his fluid page (with some great videos).

On a personal note, I resigned from LucasArts a couple of weeks ago and am looking forward to some time off back in New Zealand with my family and friends. Just in time for the Kiwi summer!

Links

GPU Gems - Fluid Simulation on the GPU
GPU Gems 3 - Real-Time Rendering and Simulation of 3D Fluids
Fluid Simulation For Computer Graphics: A Tutorial in Grid Based and Particle Based Methods
