Tag: photon mapping

  • How to make a Photon Mapper : Photons everywhere

    Bleed Color and Caustics.

    Hello,

    It has been a long time since I last posted on this blog, and I am sorry about that.
    I will try to diversify my blog: I’ll talk about C++, rendering (always <3), and Qt, a framework I love.

    So, in this last part, we will talk about photons.

    What exactly are photons?

    A photon is a quantum of light. It carries light through a medium in a straight line; it is thanks to photons that we can see objects at all.

    How can we transform photons into a “visible value” like an RGB color?

    We saw that the eye only “sees” radiance!
    So we have to transform our photons into radiance.

    From the physics of photons

    We know that a single photon has energy:

    \displaystyle{E_{\lambda}=h\nu=\frac{hc}{\lambda}}

    where \lambda is the wavelength and E_{\lambda} is the energy in joules.
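    For instance, plugging a green photon at \lambda = 550 nm into the formula above gives:

    ```latex
    E_{550} = \frac{hc}{\lambda}
            = \frac{6.626\times10^{-34}\,\mathrm{J\,s}\;\times\;2.998\times10^{8}\,\mathrm{m\,s^{-1}}}{550\times10^{-9}\,\mathrm{m}}
            \approx 3.6\times10^{-19}\,\mathrm{J}
    ```

    A single physical photon carries a minuscule amount of energy, which is why a photon mapper traces packets of flux rather than individual physical photons.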
    Say we have n_\lambda photons, each of energy E_{\lambda}.
    We can then write the luminous energy:

    \displaystyle{Q_{\lambda}=n_{\lambda}E_{\lambda}}

    The luminous flux is simply the time derivative of the luminous energy:

    \displaystyle{\phi_{\lambda}=\frac{dQ_{\lambda}}{dt}}

    This is a good start, but our luminous flux is still a function of the wavelength, while the radiance expects a total luminous flux.
    So we integrate the per-wavelength flux over the visible spectrum to get the total flux:

    \displaystyle{\phi=\int_{380}^{750}d\phi_{\lambda}=\int_{380}^{750}\frac{\partial \phi_{\lambda}}{\partial\lambda}d\lambda}

    Now we can write the radiance:

    \displaystyle{L=\frac{d^2 \phi}{\cos(\theta) dAd\omega}=\frac{d^2\left(\int_{380}^{750}\frac{\partial \phi_{\lambda}}{\partial \lambda}d\lambda\right)}{\cos(\theta)dAd\omega}=\int_{380}^{750}\frac{d^3\phi_{\lambda}}{\cos(\theta)dAd\omega d\lambda}d\lambda}

    Using the rendering equation, we get two forms:

    \displaystyle{L^O=\int_{380}^{750}\int_{\Omega^+}f_r(\mathbf{x}, \omega_i,\omega_o,\lambda)\frac{d^3\phi_{\lambda}}{dAd\lambda}d\lambda}
    \displaystyle{L^O=\int_{\Omega^+}f_r(\mathbf{x}, \omega_i,\omega_o)\frac{d^2\phi}{dA}}

    The first one accounts for dispersion, while the second does not.
    In this post I am not going to use the first one, but I may write an article about it later.

    Let’s make our Photon Mapper

    What do we need ?

    We need a light which emits photons, so we add an emitPhotons function:

    /**
     * @brief      Interface for a light
     */
    class AbstractLight {
    public:
        AbstractLight(glm::vec3 const &flux);
    
        virtual glm::vec3 getIrradiance(glm::vec3 const &position, glm::vec3 const &normal) = 0;
    
        virtual void emitPhotons(std::size_t number) = 0;
    
        virtual ~AbstractLight() = default;
    
    protected:
        glm::vec3 mTotalFlux;
    };

    We also need a material which bounces photons :

    class AbstractMaterial {
    public:
        AbstractMaterial(float albedo);
        
        virtual glm::vec3 getReflectedRadiance(Ray const &ray, AbstractShape const &shape) = 0;
        
        virtual void bouncePhoton(Photon const &photon, AbstractShape const &shape) = 0;
        virtual ~AbstractMaterial() = default;
        
        float albedo;
    protected:
        virtual float brdf(glm::vec3 const &ingoing, glm::vec3 const &outgoing, glm::vec3 const &normal) = 0;
    };

    Obviously, we also need a structure for our photons. This structure should be able to store photons and compute irradiance at a given position.

    class AbstractPhotonMap {
    public:
        AbstractPhotonMap() = default;
    
        virtual glm::vec3 gatherIrradiance(glm::vec3 position, glm::vec3 normal, float radius) = 0;
        virtual void addPhoton(Photon const &photon) = 0;
        virtual void clear() = 0;
    
        virtual ~AbstractPhotonMap() = default;
    private:
    };

    How could we do this ?

    Emitting photons is really easy:

    void SpotLight::emitPhotons(std::size_t number) {
        Photon photon;
    
        photon.flux = mTotalFlux / (float)number;
        photon.position = mPosition;
    
        for(auto i(0u); i < number; ++i) {
            vec3 directionPhoton;
            do
                directionPhoton = Random::random.getSphereDirection();
            while(dot(directionPhoton, mDirection) < mCosCutoff);
    
            photon.direction = directionPhoton;
            tracePhoton(photon);
        }
    }

    We divide the total flux by the number of photons, compute a random direction inside the spot’s cone by rejection sampling, and then trace the photon.
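    The helper Random::random.getSphereDirection belongs to the renderer’s utility code; a minimal standalone sketch of such a sampler and of the rejection loop could look like this (Vec3, sampleSphereDirection and sampleConeDirection are illustrative names, not the real classes):

    ```cpp
    #include <cmath>
    #include <random>

    struct Vec3 { float x, y, z; };

    static float dot(Vec3 const &a, Vec3 const &b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Uniform direction on the unit sphere via the (cos-theta, phi) method.
    Vec3 sampleSphereDirection(std::mt19937 &rng) {
        std::uniform_real_distribution<float> uni(0.f, 1.f);
        float cosTheta = 1.f - 2.f * uni(rng);            // uniform in [-1, 1]
        float sinTheta = std::sqrt(1.f - cosTheta * cosTheta);
        float phi = 2.f * float(M_PI) * uni(rng);
        return {sinTheta * std::cos(phi), sinTheta * std::sin(phi), cosTheta};
    }

    // Rejection sampling as in emitPhotons: keep only directions inside the cone.
    Vec3 sampleConeDirection(std::mt19937 &rng, Vec3 const &axis, float cosCutoff) {
        Vec3 d;
        do
            d = sampleSphereDirection(rng);
        while (dot(d, axis) < cosCutoff);
        return d;
    }
    ```

    Rejection sampling is wasteful for narrow cones (most samples are rejected), but it keeps the distribution exactly uniform over the cone’s cap, which is what the spot light needs.
    
    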

    Bouncing a photon depends on your material :

    void UniformLambertianMaterial::bouncePhoton(const Photon &_photon, const AbstractShape &shape) {
        Photon photon = _photon;
    
        float xi = Random::random.xi();
        float d = brdf(vec3(), vec3(), vec3());
    
        if(photon.recursionDeep > 0) {
            // Photon is absorbed
            if(xi > d) {
                World::world.addPhoton(_photon);
                return;
            }
        }
    
        if(++photon.recursionDeep > MAX_BOUNCES)
            return;
    
        photon.flux *= color;
        photon.direction = Random::random.getHemisphereDirection(shape.getNormal(photon.position));
        tracePhoton(photon);
    }

    To respect conservation of energy, we play Russian roulette.
    Obviously, to conserve energy we also have to modify the direct lighting accordingly ^^.
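    A quick way to convince yourself that Russian roulette conserves energy in expectation: simulate many photons that survive a bounce with probability p and keep their full flux, as bouncePhoton does. The function below is a toy sketch with names of my own:

    ```cpp
    #include <random>

    // Each photon of flux `flux` survives with probability `p` (the
    // Russian-roulette test) and keeps its full, unscaled flux.
    double averageReflectedFlux(double flux, double p, int n, unsigned seed) {
        std::mt19937 rng(seed);
        std::uniform_real_distribution<double> xi(0.0, 1.0);
        double total = 0.0;
        for (int i = 0; i < n; ++i)
            if (xi(rng) < p)        // photon survives the roulette
                total += flux;      // flux is NOT scaled down...
        return total / n;           // ...so on average p * flux is reflected
    }
    ```

    With p equal to the albedo, the expected reflected flux is albedo × φ, exactly the fraction of energy the material should reflect; that is why surviving photons keep their full flux instead of being multiplied by the albedo at every bounce.
    
    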

    vec3 UniformLambertianMaterial::getReflectedRadiance(Ray const &ray, AbstractShape const &shape) {
        vec3 directLighting = getIrradianceFromDirectLighting(ray.origin, shape.getNormal(ray.origin));
        float f = brdf(vec3(), vec3(), vec3());
    
        return color * (1.f - f) * f * (directLighting + World::world.gatherIrradiance(ray.origin, shape.getNormal(ray.origin), 0.5f));
    }

    Finally, we need to compute the irradiance at a given position, which is simply:

    \displaystyle{E=\sum \frac{\phi}{\pi r^2}}

    So we could easily write :

    vec3 SimplePhotonMap::gatherIrradiance(glm::vec3 position, glm::vec3 normal, float radius) {
        float radiusSquare = radius * radius;
        vec3 irradiance;
        for(auto &photon : mPhotons)
            if(dot(photon.position - position, photon.position - position) < radiusSquare)
                if(dot(photon.direction, normal) < 0.0)
                    irradiance += photon.flux;
    
        return irradiance / ((float)M_PI * radiusSquare);
    }

    To have shadows, you can emit shadow photons (photons carrying negative flux) like this:

    void traceShadowPhoton(const Photon &_photon) {
        Photon photon = _photon;
        Ray ray(photon.position + photon.direction * RAY_EPSILON, photon.direction);
    
        photon.flux = -photon.flux;
    
        auto nearest = World::world.findNearest(ray);
    
        while(get<0>(nearest) != nullptr) {
            ray.origin += ray.direction * get<1>(nearest);
            photon.position = ray.origin;
    
            if(dot(ray.direction, get<0>(nearest)->getNormal(ray.origin)) < 0.f)
                World::world.addPhoton(photon);
    
            ray.origin += RAY_EPSILON * ray.direction;
    
            nearest = World::world.findNearest(ray);
        }
    }

    That’s all! If you have any questions, please let me know!

  • How to make a Photon Mapper : Whitted Ray Tracer

    Hello there,

    Today, we are going to see how to make the first part of our photon mapper. We are unfortunately not going to talk about photons, but only about normal ray tracing.

    Whitted Ray Tracer without shadows

    This is ugly!!! There are no shadows…

    This is intentional: shadows will be drawn by photon mapping with opposite flux. We will see that in the next article.

    Introduction

    In this chapter, we are going to see one implementation of a Whitted ray tracer.

    Direct Lighting

    The direct lighting equation can be expressed as:

    \displaystyle{L_o(\mathbf{x}, \vec{\omega_o}) = \sum_{i\in \{Lights\}}f_r(\mathbf{x}, \vec{\omega_i}, \vec{\omega_o})\frac{\phi_{Light}}{\Omega r^2}cos(\theta)}

    The main difficulty is to compute the solid angle \Omega .
    For a simple isotropic spot light, the solid angle can be computed as:

    \displaystyle{\Omega=\int_{0}^{angleCutOff}\int_{0}^{2\pi}sin(\theta)d\theta d\phi =-2\pi(cos(angleCutOff)-1)}

    with :

    1. \Omega the solid angle.
    2. \phi_{Light} the total flux carried by the light.
    3. cos(\theta) the attenuation due to projecting the light onto the lit surface area.
    4. angleCutOff \in [0; \pi].
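    As a sanity check of the formula above, here is a small sketch (the function name is mine): a cutoff of \pi/2 gives the hemisphere’s 2\pi steradians, and a cutoff of \pi gives the full sphere’s 4\pi:

    ```cpp
    #include <cmath>

    // Solid angle of an isotropic spot light: Omega = -2*pi*(cos(cutoff) - 1).
    double spotSolidAngle(double angleCutOff) {
        return -2.0 * M_PI * (std::cos(angleCutOff) - 1.0);
    }
    ```
    
    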

    Refraction and reflection

    Both are drawn by normal ray tracing.

    Architecture

    Now, we are going to see how our ray tracer works :

    Shapes :

    Shapes are the basis of a renderer: without shapes, there is nothing to render. Since we can have many different shapes, we use an object-oriented approach for them.

    
    #ifndef SHAPE_HPP
    #define SHAPE_HPP
    
    #include "ray.hpp"
    #include "material.hpp"
    
    /**
     * @brief      This class provides an interface to manage differents shape
     */
    class AbstractShape {
    public:
        AbstractShape() = default;
        AbstractShape(std::unique_ptr<AbstractMaterial> &&material);
    
        /**
     * @brief      Return the distance between the shape and the ray emitter
         *
         * @param      ray     
         *
     * @return     Negative value if not found, otherwise the distance between the object and the ray emitter
         */
        virtual float intersect(Ray const &ray) const = 0;
        virtual glm::vec3 getNormal(glm::vec3 const &position) const = 0;
    
        /**
         * @brief      Return the radiance returned by the material owned 
         *
         * @param      ray  
         *
         * @return     radiance
         */
        glm::vec3 getReflectedRadiance(Ray const &ray);
    
        virtual ~AbstractShape() = default;
    
    private:
        std::unique_ptr<AbstractMaterial> mMaterial = std::make_unique<UniformLambertianMaterial>(glm::vec3(1.0, 1.0, 1.0), 1.0); // default: a white diffuse material
    };
    

    Materials

    Each shape obviously has its own material. The material has to give us a BRDF and to reflect radiance.

    
    /**
     * @brief      This class describes a material
     */
    class AbstractMaterial {
    public:
        AbstractMaterial(float albedo);
    
        /**
         * @brief      Get the reflected radiance
         *
         * @param      ray    
     * @param      shape  Useful to get the normal, or UVs for texturing
         *
     * @return     the reflected radiance
         */
        virtual glm::vec3 getReflectedRadiance(Ray const &ray, AbstractShape const &shape) = 0;
    
        virtual void bouncePhoton(Photon const &photon, AbstractShape const &shape) = 0;
    
        virtual ~AbstractMaterial() = default;
    
        float albedo;
    protected:
        /**
     * @brief      Get the Bidirectional Reflectance Distribution Function
         *
         * @param      ingoing   
         * @param      outgoing  
         * @param      normal    
         *
         * @return     the brdf
         */
        virtual float brdf(glm::vec3 const &ingoing, glm::vec3 const &outgoing, glm::vec3 const &normal) = 0;
    };
    

    Storage Shapes

    To get a faster ray tracing algorithm, we could use a spatial structure such as a kd-tree, or a simpler one like this:

    
    /**
     * @brief      This class provide a structure to store shapes
     */
    class AbstractShapeStorage {
    public:    
        /**
         * @brief      Add a shape in structure
         *
         * @param      shape 
         */
        virtual void addShape(std::shared_ptr<AbstractShape> const &shape) = 0;
    
        /**
         * @brief      Get the nearest shape
         *
         * @param      ray   
         *
         * @return     a tuple with shape and distance. Shape could be null if no shape found
         */
        virtual std::tuple<std::shared_ptr<AbstractShape>, float> findNearest(Ray const &ray) = 0;
    
        virtual ~AbstractShapeStorage() = default;
    };
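    The simplest possible implementation of this interface is a linear scan over every shape. The sketch below is self-contained, so Sphere, Ray, SimpleShapeStorage and the math helpers are illustrative stand-ins rather than the renderer’s real classes:

    ```cpp
    #include <cmath>
    #include <limits>
    #include <memory>
    #include <tuple>
    #include <vector>

    struct Vec3 { float x, y, z; };
    struct Ray  { Vec3 origin, direction; };  // direction assumed normalized

    static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    struct Sphere {
        Vec3 center; float radius;
        // Returns a negative value when the ray misses, as in AbstractShape.
        float intersect(Ray const &ray) const {
            Vec3 oc = sub(ray.origin, center);
            float b = dot(oc, ray.direction);
            float c = dot(oc, oc) - radius * radius;
            float disc = b * b - c;
            if (disc < 0.f) return -1.f;
            float t = -b - std::sqrt(disc);   // nearest of the two roots
            return t > 0.f ? t : -1.f;
        }
    };

    // Linear scan over every shape: O(n) per ray, but a correct baseline.
    class SimpleShapeStorage {
    public:
        void addShape(std::shared_ptr<Sphere> const &shape) { mShapes.push_back(shape); }

        std::tuple<std::shared_ptr<Sphere>, float> findNearest(Ray const &ray) const {
            std::shared_ptr<Sphere> best;
            float bestDist = std::numeric_limits<float>::max();
            for (auto const &shape : mShapes) {
                float d = shape->intersect(ray);
                if (d > 0.f && d < bestDist) { bestDist = d; best = shape; }
            }
            return {best, best ? bestDist : -1.f};
        }

    private:
        std::vector<std::shared_ptr<Sphere>> mShapes;
    };
    ```

    A kd-tree or BVH would bring the per-ray cost down to roughly O(log n), which matters once the scene has more than a handful of shapes.
    
    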
    

    Algorithm

    The main algorithmic part lives on the material side. Below is a piece of code where I compute the reflected radiance for a lambertian material and for a mirror. Note that materials have access to the other shapes via the global variable world.

    
    float LambertianMaterial::brdf(const glm::vec3 &, const glm::vec3 &, glm::vec3 const &) {
         return albedo / M_PI;
    }
    
    vec3 UniformLambertianMaterial::getReflectedRadiance(Ray const &ray, AbstractShape const &shape) {
        vec3 directLighting = getIrradianceFromDirectLighting(ray, shape);
        float f = brdf(vec3(), vec3(), vec3());
    
        return color * f * (directLighting);
    }
    
    float MirrorMaterial::brdf(const vec3&, const vec3&, const vec3&) {
        return albedo;
    }
    
    vec3 MirrorMaterial::getReflectedRadiance(const Ray &ray, const AbstractShape &shape) {
        if(ray.recursionDeep >= MAX_BOUNCES)
            return vec3();
    
        Ray reflectedRay = getReflectedRay(ray, shape.getNormal(ray.origin + ray.direction * ray.distMax));
        auto nearest = World::world.findNearest(reflectedRay);
    
        if(get<0>(nearest) != nullptr) {
            reflectedRay.distMax = get<1>(nearest);
            return brdf(vec3(), vec3(), vec3()) * get<0>(nearest)->getReflectedRadiance(reflectedRay);
        }
    
        return vec3();
    }
    

    Lights

    Lighting is an essential feature of a renderer: it is thanks to lights that you can perceive relief. A light carries a flux, and irradiance is the flux received by a surface.

    So, our interface is :

    
    /**
     * @brief      Interface for a light
     */
    class AbstractLight {
    public:
        AbstractLight(glm::vec3 const &flux);
    
        /**
     * @brief      Compute the irradiance received by the projected area at position
         *
         * @param      position  surface's position
         * @param      normal    surface's normal
         *
         * @return     irradiance
         */
        virtual glm::vec3 getIrradiance(glm::vec3 const &position, glm::vec3 const &normal) = 0;
    
        virtual void emitPhotons(std::size_t number) = 0;
    
        virtual ~AbstractLight() = default;
    
    protected:
        glm::vec3 mTotalFlux;
    };
    

    Below is a piece of code computing that irradiance:

    
    vec3 SpotLight::getIrradiance(const vec3 &position, const vec3 &normal) {
        vec3 posToLight = mPosition - position;
        vec3 posToLightNormalized = normalize(posToLight);
    
        if(dot(-posToLightNormalized, mDirection) > mCosCutoff) {
            float solidAngle = - 2.f * M_PI * (mCosCutoff - 1);
            return lambertCosineLaw(posToLightNormalized, normal) * mTotalFlux /
                (solidAngle * dot(posToLight, posToLight));
        }
    
        return vec3();
    }
    

    Next time, we will see how to integrate photon mapping into our ray tracer. If you want the complete code, you can get it here:
    GitHub

    Bye my friends :).

  • How to make a Photon Mapper : Rendering Equation Debunked

    Introduction:

    Hi guys!

    I have not written anything here in a long time; I had exams, a professional C++/Qt mission, and too many things to do. Don’t worry, activity on this blog will start again soon.

    In computer graphics, I haven’t done anything spectacular, except a little ray tracer with photon mapping. I’m going to explain how to make a little photon mapper. First we’ll see it on the CPU side; afterwards we’ll try to implement it on the GPU side.

    We will see several points :

    1. Explanations about rendering equations
    2. A raytracer
    3. A photon mapper
    4. A GPU side with Photons Volume

    Perhaps, some chapters will be divided into several parts.

    Rendering Equation :

    Rendering_eq

    The rendering equation is :

    \displaystyle{L_o(\mathbf{x}, \vec{\omega_o}) = L_e(\mathbf{x}, \vec{\omega_o}) + \int_{\Omega}f_r(\mathbf{x}, \vec{\omega_i}, \vec{\omega_o})L_i(\mathbf{x}, \vec{\omega_i})(\vec{\omega_i}\cdot\vec{n}) d\omega_i =}
    \displaystyle{L_e(\mathbf{x}, \vec{\omega_o}) + \int_{\Omega}f_r(\mathbf{x}, \vec{\omega_i}, \vec{\omega_o})dE_i(\vec{\omega_i}) =}
    \displaystyle{L_e(\mathbf{x}, \vec{\omega_o}) + \int_{\Omega}f_r(\mathbf{x}, \vec{\omega_i}, \vec{\omega_o})\frac{d^2 \phi_i(\mathbf{x}, \vec{\omega_i})}{dA}}

    What is buried in this equation?

    1. L_o(\mathbf{x}, \vec{\omega_o}) is the outgoing radiance. It will be our pixel color in R, G, B space. It is expressed in W\cdot sr^{-1}\cdot m^{-2}.
    2. L_e(\mathbf{x}, \vec{\omega_o}) is the emitted radiance. It should be zero for almost all materials, except those that emit light (fluorescent materials or lights, for example).
    3. f_r(\mathbf{x}, \vec{\omega_i}, \vec{\omega_o}) is the Bidirectional Reflectance Distribution Function (aka BRDF). It defines how light is reflected at surface point \mathbf{x}. It is expressed in sr^{-1}. Its integral over \Omega should be less than or equal to one.
    4. L_i(\mathbf{x}, \vec{\omega_i}) is the incoming radiance, that is, the light received by \mathbf{x} directly or indirectly.
    5. (\vec{\omega_i}\cdot\vec{n}) is the Lambert cosine law.
    6. d\omega_i is the differential solid angle. It is expressed in sr.
    7. dE(\vec{\omega_i}) is the differential irradiance, the radiant flux (or power) received by a surface per unit area. It is expressed in W\cdot m^{-2}.
    8. \phi_i(\mathbf{x}, \vec{\omega_i}) is the radiant flux (or power) leaving \mathbf{x} in direction \vec{\omega_i}. It can be seen as one of our photons, stored per R, G, B channel, and is expressed in W.
    9. dA is the area receiving the radiant power.

    Manipulate rendering equation for an approximation

    This part is going to be a bit mathematical with “physical approximation :D”.

    Simplification of irradiance

    Zack Waters : Irradiance and flux

    \displaystyle{dE_i=\frac{d^2 \phi_i}{dA}\rightarrow E_i=\frac{d\phi_i}{dA}}

    Assuming the “floor” of the hemisphere is flat and that a photon hitting this floor is equivalent to a photon hitting \mathbf{x}, we have:

    \displaystyle{E_i \simeq \sum \frac{\phi_i}{\pi r^2}\rightarrow E \simeq \sum_i \sum \frac{\phi_i}{\pi r^2} = \sum \frac{\phi}{\pi r^2}}.

    So, to get a better approximation, you have to reduce the hemisphere’s radius and increase the number of photons.
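    A quick numeric check of this estimator (a standalone sketch; the function name is mine): 100 photons of 0.01 W each, gathered inside a disc of radius 0.1 m, give an irradiance of 1 W / (\pi \cdot 0.01 m^2) \approx 31.8 W\cdot m^{-2}:

    ```cpp
    #include <cmath>
    #include <vector>

    // Irradiance estimate E = (sum of gathered photon fluxes) / (pi * r^2).
    double estimateIrradiance(std::vector<double> const &fluxes, double radius) {
        double total = 0.0;
        for (double flux : fluxes)
            total += flux;
        return total / (M_PI * radius * radius);
    }
    ```
    
    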

    The rendering equation simplified

    \displaystyle{L_o(\mathbf{x}, \vec{\omega_o}) = L_e(\mathbf{x}, \vec{\omega_o}) + \int_{\Omega}f_r(\mathbf{x}, \vec{\omega_i}, \vec{\omega_o})L_i(\mathbf{x}, \vec{\omega_i})(\vec{\omega_i}\cdot\vec{n}) d\omega_i}
    \displaystyle{L_o(\mathbf{x}, \vec{\omega_o}) \simeq L_e(\mathbf{x}, \vec{\omega_o}) + \frac{1}{\pi r^2}\sum f_r(\mathbf{x}, \vec{\omega_{photon}}, \vec{\omega_o})\phi(\mathbf{x}, \vec{\omega_{photon}})}

    Direct Illumination

    Direct illumination needs to be accurate, so we will not use the simplified equation above but the real one. The main difficulty is to compute the solid angle. For example, for a point light with power \phi_L :

    \displaystyle{L_r = f_r(\mathbf{x}, \vec{\omega_i},\vec{\omega_o})(\vec{\omega_i}\cdot\vec{n})\frac{\phi_L}{4\pi r^2}}

    We recall that E = \frac{I}{r^2} = \frac{\phi_L}{\Omega r^2} with \Omega the solid angle.

    Photon tracing, reflection and refractions

    Photon tracing is easy. Imagine you have a 1000 W lamp and you want to share all of its power among 1 000 photons: each photon should then carry 1 W (we could say the colour of this photon is (1.0, 1.0, 1.0) in normalized RGB space). When a photon hits a surface, you store it in the photon map and reflect it using the BRDF :).

    If you didn’t get it all, don’t worry: in the upcoming implementation we are going to see how it works. We will probably not do any optimization, but it should be good :).

    Bye my friends 🙂 !