Author: Antoine MORRIER

  • Flux: Qt Quick with unidirectional data flow

    Hello !

    Today is an important day: it is the day of my first article about Qt.

    Have you ever heard about Model View Controller or Model View Delegate? Yes, obviously. But right now we are going to talk about something else (yes, I am funny, I know): Facebook, or rather a pattern which comes from Facebook.

    What are we going to talk about?

    We are going to talk about the Flux pattern. This pattern says the data flow should be unidirectional, as opposed to the Model View Delegate pattern.

    Model View Delegate multidirectional data flow (credit: Qt)

    Flux pattern representation

    Flux unidirectional data flow

    What are the advantages of using the Flux pattern?

    1. Signal propagation is easy and does not require any copy and paste.
    2. The code is easy to read.
    3. Coupling is low.

    What is the Action Creator?

    When a user wants to interact with the application, he wants to perform an “Action“. For example, he could want to add an entry to a todo list, so he could launch an Action(“Things to do”).
    The Action Creator is here to hand Actions to our dispatcher.

    What is the Dispatcher?

    A dispatcher takes an action and its arguments and dispatches it to all stores.

    What is the Store?

    A store is like a collection of data, but it also has logic buried inside it.

    What is the View?

    It shows all data.

    What are we going to see?

    We are going to see how to write a little application using Flux.
    Our application should look something like this:
    (screenshot of the counter application)

    Let’s code !

    First, we need to know what our application has to do.
    It must be able to add a counter, and to increment and decrement each counter. That makes exactly 3 actions. It is that simple! We declare them in an ActionType singleton:

    pragma Singleton
    import QtQuick 2.0
    
    Item {
        property string add: "add";
        property string inc: "inc";
        property string dec: "dec";
    }

    And the ActionCreator, which simply wraps the dispatcher calls:

    pragma Singleton
    import QtQuick 2.0
    
    Item {
        function add() {AppDispatcher.dispatch("add", {});}
        function inc(id) {AppDispatcher.dispatch("inc", {id:id});}
        function dec(id) {AppDispatcher.dispatch("dec", {id:id});}
    }
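
    Note that these files (and the CounterStore we will write later) are QML singletons: for pragma Singleton to work, they also have to be registered in a qmldir file placed next to them. Assuming the files are named after the types used in the rest of the article, it could look like this:

    singleton ActionType 1.0 ActionType.qml
    singleton ActionCreator 1.0 ActionCreator.qml
    singleton CounterStore 1.0 CounterStore.qml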
    

    Now, we are going to see what the AppDispatcher is.

    A dispatcher takes the action type and a message (an id, for example) as arguments, and dispatches the action to every listener.

    #include <QObject>
    #include <QJSValue>

    class Dispatcher : public QObject {
        Q_OBJECT
    public:
        Dispatcher() = default;
    
    public slots:
        Q_INVOKABLE void dispatch(QString action, QJSValue args) {
            emit dispatched(action, args);
        }
    
    signals:
        void dispatched(QString action, QJSValue args);
    };

    We expose this dispatcher to QML as a context property in main.cpp:

    #include <QApplication>
    #include <QQmlApplicationEngine>
    #include <QQmlContext>
    #include "dispatcher.h"
    
    int main(int argc, char *argv[])
    {
        QApplication app(argc, argv);
        Dispatcher dispatcher;
    
        QQmlApplicationEngine engine;
    
        engine.rootContext()->setContextProperty("AppDispatcher", QVariant::fromValue(&dispatcher));
    
        engine.load(QUrl(QStringLiteral("qrc:/main.qml")));
    
        return app.exec();
    }
    

    You could easily improve the dispatch behaviour; indeed, it would be safer to use a queue… But in this example, I just want to show the mechanism and avoid overcomplicating the app.
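
    For reference, here is a minimal sketch of what a queued dispatcher could look like. This QueuedDispatcher class is only an illustration, not part of the example project: it buffers actions so that a store dispatching a new action from within onDispatched cannot interleave two dispatches.

    #include <QObject>
    #include <QJSValue>
    #include <QQueue>
    #include <QPair>

    class QueuedDispatcher : public QObject {
        Q_OBJECT
    public:
        QueuedDispatcher() = default;

    public slots:
        void dispatch(QString action, QJSValue args) {
            // Enqueue instead of emitting right away.
            mQueue.enqueue(qMakePair(action, args));
            if (!mDispatching)
                processQueue();
        }

    signals:
        void dispatched(QString action, QJSValue args);

    private:
        void processQueue() {
            mDispatching = true;
            while (!mQueue.isEmpty()) {
                auto const pair = mQueue.dequeue();
                emit dispatched(pair.first, pair.second);
            }
            mDispatching = false;
        }

        QQueue<QPair<QString, QJSValue>> mQueue;
        bool mDispatching = false;
    };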

    Remember that the dispatcher dispatches through the stores.
    A store is a singleton which manages all objects of the same type. Our store manages all the counters of the app.

    A counter is an object with an id and a value. Knowing that, we easily get:

    pragma Singleton
    import QtQuick 2.0
    import "."
    
    Item {
        property alias model: listModel;
        property int nextId: 1;
    
        ListModel {
            id: listModel;
    
            ListElement {
                idModel: 0;
                value: 0;
            }
        }
    
        function getItemID(idModel) {
            for(var i = 0; i < model.count; ++i) {
                if(model.get(i).idModel == idModel)
                    return model.get(i);
            }
        }
    
        Connections {
            target: AppDispatcher;
    
            onDispatched: {
                if(action === ActionType.add)
                    model.append({idModel: nextId++, value:0});
    
                else if(action === ActionType.inc)
                    getItemID(args.id).value++;
    
                else if(action === ActionType.dec)
                    getItemID(args.id).value--;
            }
        }
    }
    

    In onDispatched, we check whether it is the right action and, if so, do the required work.

    Now we just need a view. As you saw before, we use a “model” to store all the data; it will be the same for the view / delegate. Even though we use Model View Delegate here, we keep a unidirectional data flow.

    The delegate explains how an item should be rendered.
    It is composed of 2 buttons (+ and -) and a value. First, here is a small reusable Button component:

    import QtQuick 2.0
    import QtQuick.Controls 1.4
    
    import "."
    
    Rectangle {
        property alias text: t.text;
        property int fontSize: 10;
        signal clicked;
    
    
        MouseArea {
            anchors.fill: parent;
            onClicked: parent.clicked();
        }
    
        Text {
            anchors.centerIn: parent;
            font.pointSize: fontSize;
            id:t;
        }
    }

    This Button is then used in the counter delegate itself:

    import QtQuick 2.0
    import "."
    
    Rectangle{
        width: text.width;
        height: text.height;
        color: palette.window;
    
        Text {
            id: text;
            text:value;
            font.pointSize: mainWindow.width < mainWindow.height ? mainWindow.width / 16: mainWindow.height / 16;
        }
    
        Button {
            anchors.left: text.right;
            anchors.verticalCenter: parent.verticalCenter;
    
            color: Qt.rgba(0.4, 0.7, 0.2, 1);
            width: mainWindow.width / 10;
            height: mainWindow.height / 10;
            text: "+";
            fontSize: mainWindow.width < mainWindow.height ? mainWindow.width / 20 : mainWindow.height / 20;
    
            onClicked: ActionCreator.inc(idModel);
        }
    
        Button {
            anchors.right: text.left;
            anchors.verticalCenter: parent.verticalCenter;
    
            color: Qt.rgba(0.7, 0.4, 0.2, 1);
            width: mainWindow.width / 10;
            height: mainWindow.height / 10;
            text: "-";
            fontSize: mainWindow.width < mainWindow.height ? mainWindow.width / 20 : mainWindow.height / 20;
    
            onClicked: ActionCreator.dec(idModel);
        }
    }
    

    Yeah, I know, there is some code duplication; it is not good…
    Now that we can render items, we should display many of them.
    Flux tells us the data comes from the Store, so let's implement what Flux says!

    import QtQuick 2.0
    import QtQuick.Controls 1.4
    import "."
    
    Rectangle {
        width: view.contentItem.childrenRect.width;
        height: view.contentItem.childrenRect.height;
    
        color: palette.window;
    
        ListView {
            id: view;
            anchors.fill: parent;
            model: CounterStore.model;
    
            spacing: 10;
    
            delegate: CounterItem{}
        }
    }

    Finally, main.qml ties everything together:

    import QtQuick 2.5
    import QtQuick.Controls 1.4
    import "."
    
    ApplicationWindow {
        id: mainWindow;
        visible: true
        width: 640
        height: 480
        title: qsTr("Hello World")
    
        SystemPalette {
            id: palette;
        }
    
    
        Button {
            id: buttonAdd;
            anchors.verticalCenter: parent.verticalCenter;
            width: parent.width / 5;
            height: parent.height;
            text: "Add";
            fontSize: 30;
            color: Qt.rgba(0.1, 0.3, 0.7, 1.0);
    
            onClicked: AppDispatcher.dispatch("add", {});
        }
    
        Rectangle {
            color: palette.window;
            anchors.right: parent.right;
            anchors.left: buttonAdd.right;
            anchors.top: parent.top;
            anchors.bottom: parent.bottom;
    
            CounterView {
                anchors.centerIn: parent;
            }
        }
    }
    

    This is the end. If you have any questions, please let me know!
    I hope you enjoyed it and learned something!

    References

    Flux by Facebook
    Quick Flux : Problems about MVC and introduction

    Thanks !

  • How to make a Photon Mapper : Photons everywhere

    Bleed Color and Caustics.

    Hello,

    It has been a long time since I posted anything on this blog; I am sorry about that.
    I will try to diversify it: I'll talk about C++, rendering (always <3), and Qt, a framework I love.

    So, in this last part, we will talk about photons.

    What exactly are photons?

    A photon is a quantum of light. It carries light energy in a straight line through the medium it travels in. Thanks to photons, we can see objects.

    How can we transform photons into a “visible value” like an RGB color?

    We saw that the eye only “sees” radiance!
    So we have to transform our photons into radiance.

    From the physics of photons

    We know that a single photon carries an energy of:

    \displaystyle{E_{\lambda}=h\nu=\frac{hc}{\lambda}}

    where \lambda is the wavelength and E_{\lambda} the energy in joules.
    Say we have n_\lambda photons of energy E_{\lambda} each.
    We can now write the luminous energy:

    \displaystyle{Q_{\lambda}=n_{\lambda}E_{\lambda}}

    The luminous flux is simply the time derivative of the luminous energy:

    \displaystyle{\phi_{\lambda}=\frac{dQ_{\lambda}}{dt}}

    The idea is great, but this luminous flux is a function of the wavelength, while the radiance expects a total luminous flux.
    So we take the total flux to be the integral, over all wavelengths of the visible spectrum, of the spectral luminous flux:

    \displaystyle{\phi=\int_{380}^{750}d\phi_{\lambda}=\int_{380}^{750}\frac{\partial \phi_{\lambda}}{\partial\lambda}d\lambda}

    Now we can write the radiance:

    \displaystyle{L=\frac{d^2 \phi}{\cos(\theta) dAd\omega}=\frac{d^2(\int_{380}^{750}\frac{\partial \phi_{\lambda}}{\partial \lambda}d\lambda)}{\cos(\theta)dAd\omega}=\int_{380}^{750}\frac{d^3\phi_{\lambda}}{\cos(\theta)dAd\omega d\lambda}d\lambda}

    Using the rendering equation, we get two forms :

    \displaystyle{L^O=\int_{380}^{750}\int_{\Omega^+}f_r(\mathbf{x}, \omega_i,\omega_o,\lambda)\frac{d^3\phi_{\lambda}}{dAd\lambda}d\lambda}
    \displaystyle{L^O=\int_{\Omega^+}f_r(\mathbf{x}, \omega_i,\omega_o)\frac{d^2\phi}{dA}}

    The first one takes dispersion into account, while the second one does not.
    In this post I am not going to use the first one, but I may write an article about it later.

    Let’s make our Photon Mapper

    What do we need?

    We need a light which emits photons, so we add an “emitPhotons” function:

    /**
     * @brief      Interface for a light
     */
    class AbstractLight {
    public:
        AbstractLight(glm::vec3 const &flux);
    
        virtual glm::vec3 getIrradiance(glm::vec3 const &position, glm::vec3 const &normal) = 0;
    
        virtual void emitPhotons(std::size_t number) = 0;
    
        virtual ~AbstractLight() = default;
    
    protected:
        glm::vec3 mTotalFlux;
    };

    We also need a material which bounces photons :

    class AbstractMaterial {
    public:
        AbstractMaterial(float albedo);
        
        virtual glm::vec3 getReflectedRadiance(Ray const &ray, AbstractShape const &shape) = 0;
        
        virtual void bouncePhoton(Photon const &photon, AbstractShape const &shape) = 0;
        virtual ~AbstractMaterial() = default;
        
        float albedo;
    protected:
        virtual float brdf(glm::vec3 const &ingoing, glm::vec3 const &outgoing, glm::vec3 const &normal) = 0;
    };

    Obviously, we also need a structure for our photons. This structure should be able to store photons and compute irradiance at a given position.

    class AbstractPhotonMap {
    public:
        AbstractPhotonMap() = default;
    
        virtual glm::vec3 gatherIrradiance(glm::vec3 position, glm::vec3 normal, float radius) = 0;
        virtual void addPhoton(Photon const &photon) = 0;
        virtual void clear() = 0;
    
        virtual ~AbstractPhotonMap() = default;
    private:
    };
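
    The Photon structure itself is never shown in this article. Judging from the fields used in the code below (flux, position, direction, recursionDeep), a plausible minimal definition, given here only as an assumption, would be:

    #include <glm/glm.hpp>

    // Assumed definition, reconstructed from how Photon is used in this article.
    struct Photon {
        glm::vec3 flux;              // power carried by the photon (per RGB channel)
        glm::vec3 position;          // last hit position
        glm::vec3 direction;         // propagation direction (normalized)
        unsigned recursionDeep = 0;  // number of bounces performed so far
    };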

    How do we do this?

    Emitting photons is really easy:

    void SpotLight::emitPhotons(std::size_t number) {
        Photon photon;
    
        photon.flux = mTotalFlux / (float)number;
        photon.position = mPosition;
    
        for(auto i(0u); i < number; ++i) {
            vec3 directionPhoton;
            do
                directionPhoton = Random::random.getSphereDirection();
            while(dot(directionPhoton, mDirection) < mCosCutoff);
    
            photon.direction = directionPhoton;
            tracePhoton(photon);
        }
    }

    We divide the total flux by the number of photons, draw a random direction inside the spot's cone (by rejection sampling), and then trace the photon.
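
    For completeness, here is one standard way to sample a uniform direction on the unit sphere, which is what a function like Random::random.getSphereDirection() could do. This sketch is an assumption, not the project's actual implementation:

    #include <cmath>
    #include <random>
    #include <glm/glm.hpp>

    // Uniformly sample a direction on the unit sphere.
    glm::vec3 uniformSphereDirection(std::mt19937 &rng) {
        std::uniform_real_distribution<float> uniform(0.f, 1.f);
        float z = 1.f - 2.f * uniform(rng);               // cos(theta), uniform in [-1, 1]
        float r = std::sqrt(std::max(0.f, 1.f - z * z));  // sin(theta)
        float phi = 2.f * 3.14159265f * uniform(rng);     // azimuth angle
        return glm::vec3(r * std::cos(phi), r * std::sin(phi), z);
    }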

    Bouncing a photon depends on your material :

    void UniformLambertianMaterial::bouncePhoton(const Photon &_photon, const AbstractShape &shape) {
        Photon photon = _photon;
    
        float xi = Random::random.xi();
        float d = brdf(vec3(), vec3(), vec3());
    
        if(photon.recursionDeep > 0) {
            // Photon is absorbed
            if(xi > d) {
                World::world.addPhoton(_photon);
                return;
            }
        }
    
        if(++photon.recursionDeep > MAX_BOUNCES)
            return;
    
        photon.flux *= color;
        photon.direction = Random::random.getHemisphereDirection(shape.getNormal(photon.position));
        tracePhoton(photon);
    }

    To respect conservation of energy, we play Russian roulette: a random number decides whether the photon is stored (absorbed) or bounced again.
    Obviously, to keep the energy balance correct, we have to modify the direct lighting as well ^^.

    vec3 UniformLambertianMaterial::getReflectedRadiance(Ray const &ray, AbstractShape const &shape) {
        vec3 directLighting = getIrradianceFromDirectLighting(ray.origin, shape.getNormal(ray.origin));
        float f = brdf(vec3(), vec3(), vec3());
    
        return color * (1.f - f) * f * (directLighting + World::world.gatherIrradiance(ray.origin, shape.getNormal(ray.origin), 0.5f));
    }

    Finally, we need to compute the irradiance at a given position.
    It is simply:

    \displaystyle{E=\sum_i \frac{\phi_i}{\pi r^2}}

    So we could easily write :

    vec3 SimplePhotonMap::gatherIrradiance(glm::vec3 position, glm::vec3 normal, float radius) {
        float radiusSquare = radius * radius;
        vec3 irradiance;
        for(auto &photon : mPhotons)
            if(dot(photon.position - position, photon.position - position) < radiusSquare)
                if(dot(photon.direction, normal) < 0.0)
                    irradiance += photon.flux;
    
        return irradiance / ((float)M_PI * radiusSquare);
    }

    To have shadows, you could emit shadow photons like this :

    void traceShadowPhoton(const Photon &_photon) {
        Photon photon = _photon;
        Ray ray(photon.position + photon.direction * RAY_EPSILON, photon.direction);
    
        photon.flux = -photon.flux;
    
        auto nearest = World::world.findNearest(ray);
    
        while(get<0>(nearest) != nullptr) {
            ray.origin += ray.direction * get<1>(nearest);
            photon.position = ray.origin;
    
            if(dot(ray.direction, get<0>(nearest)->getNormal(ray.origin)) < 0.f)
                World::world.addPhoton(photon);
    
            ray.origin += RAY_EPSILON * ray.direction;
    
            nearest = World::world.findNearest(ray);
        }
    }

    That's all! If you have any questions, please let me know!

  • How to make a Photon Mapper : Whitted Ray Tracer

    Hello there,

    Today, we are going to see how to make the first part of our photon mapper. We are unfortunately not going to talk about photons, but only about normal ray tracing.

    Whitted Ray Tracer without shadows

    This is ugly!!! There are no shadows…

    This is entirely intentional: shadows will be drawn by photon mapping with opposite flux. We will see that in the next article.

    Introduction

    In this chapter, we are going to see one implementation of a Whitted ray tracer.

    Direct Lighting

    The direct lighting equation can be expressed as:

    \displaystyle{L_o(\mathbf{x}, \vec{\omega_o}) = \sum_{i\in \{Lights\}}f_r(\mathbf{x}, \vec{\omega_i}, \vec{\omega_o})\frac{\phi_{Light}}{\Omega r^2}\cos(\theta)}

    The main difficulty is computing the solid angle \Omega.
    For a simple isotropic spot light, it can be computed as follows (a small code sketch is given after the symbol list):

    \displaystyle{\Omega=\int_{0}^{2\pi}\int_{0}^{angleCutOff}\sin(\theta)d\theta d\phi =-2\pi(\cos(angleCutOff)-1)}

    with :

    1. \Omega the solid angle.
    2. \phi_{Light} the total flux carried by the light.
    3. \cos(\theta) the attenuation given by projecting the light area onto the lit surface area.
    4. angleCutOff \in [0; \pi].
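
    This formula translates directly into code. A tiny sketch, assuming angleCutOff is given in radians:

    #include <cmath>

    // Solid angle of an isotropic spot light with the given cutoff angle:
    // Omega = -2*pi*(cos(angleCutOff) - 1) = 2*pi*(1 - cos(angleCutOff))
    float spotSolidAngle(float angleCutOff) {
        return 2.f * static_cast<float>(M_PI) * (1.f - std::cos(angleCutOff));
    }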

    Refraction and reflection

    Both are handled by classic recursive ray tracing.

    Architecture

    Now, we are going to see how our ray tracer works :

    Shapes :

    Shapes are the basis of the renderer: without any shapes, you can't render anything. Since we can have many different kinds of shapes, we use an object-oriented approach for them.

    
    #ifndef SHAPE_HPP
    #define SHAPE_HPP
    
    #include "ray.hpp"
    #include "material.hpp"
    
    /**
     * @brief      This class provides an interface to manage differents shape
     */
    class AbstractShape {
    public:
        AbstractShape() = default;
        AbstractShape(std::unique_ptr<AbstractMaterial> &&material);
    
        /**
         * @brief      Return the distance between the shape and the ray emitter
         *
         * @param      ray     
         *
         * @return     Negative value if not found, otherwise the distance between the object and the ray emitter
         */
        virtual float intersect(Ray const &ray) const = 0;
        virtual glm::vec3 getNormal(glm::vec3 const &position) const = 0;
    
        /**
         * @brief      Return the radiance returned by the material owned 
         *
         * @param      ray  
         *
         * @return     radiance
         */
        glm::vec3 getReflectedRadiance(Ray const &ray);
    
        virtual ~AbstractShape() = default;
    
    private:
        std::unique_ptr<AbstractMaterial> mMaterial = std::make_unique<UniformLambertianMaterial>(glm::vec3(1.0, 1.0, 1.0), 1.0); // default material; the concrete type is assumed here
    };
    

    Materials

    Each shape obviously has its own material. The material has to give us a BRDF and can return reflected radiance.

    
    /**
     * @brief      This class describes a material
     */
    class AbstractMaterial {
    public:
        AbstractMaterial(float albedo);
    
        /**
         * @brief      Get the reflected radiance
         *
         * @param      ray    
         * @param      shape  Useful to get the normal, UV coordinates for texturing, etc.
         *
         * @return     the reflected radiance
         */
        virtual glm::vec3 getReflectedRadiance(Ray const &ray, AbstractShape const &shape) = 0;
    
        virtual void bouncePhoton(Photon const &photon, AbstractShape const &shape) = 0;
    
        virtual ~AbstractMaterial() = default;
    
        float albedo;
    protected:
        /**
         * @brief      Get the Bidirectional Reflectance Distribution Function
         *
         * @param      ingoing   
         * @param      outgoing  
         * @param      normal    
         *
         * @return     the brdf
         */
        virtual float brdf(glm::vec3 const &ingoing, glm::vec3 const &outgoing, glm::vec3 const &normal) = 0;
    };
    

    Shape storage

    To speed up the ray tracing algorithm, we could use a spatial structure such as a kd-tree, hidden behind an interface like this one:

    
    /**
     * @brief      This class provide a structure to store shapes
     */
    class AbstractShapeStorage {
    public:    
        /**
         * @brief      Add a shape in structure
         *
         * @param      shape 
         */
        virtual void addShape(std::shared_ptr<AbstractShape> const &shape) = 0;
    
        /**
         * @brief      Get the nearest shape
         *
         * @param      ray   
         *
         * @return     a tuple with shape and distance. Shape could be null if no shape found
         */
        virtual std::tuple<std::shared_ptr<AbstractShape>, float> findNearest(Ray const &ray) = 0;
    
        virtual ~AbstractShapeStorage() = default;
    };
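
    As an illustration, the simplest possible storage is a linear scan over all shapes, with no acceleration structure at all. The sketch below implements the interface above; the SimpleShapeStorage name is made up for this example:

    #include <limits>
    #include <memory>
    #include <tuple>
    #include <vector>

    // Naive storage: keep every shape in a vector and test all of them.
    class SimpleShapeStorage : public AbstractShapeStorage {
    public:
        void addShape(std::shared_ptr<AbstractShape> const &shape) override {
            mShapes.push_back(shape);
        }

        std::tuple<std::shared_ptr<AbstractShape>, float> findNearest(Ray const &ray) override {
            std::shared_ptr<AbstractShape> nearest;
            float nearestDistance = std::numeric_limits<float>::max();

            for (auto const &shape : mShapes) {
                float distance = shape->intersect(ray);
                // A negative distance means "no intersection".
                if (distance > 0.f && distance < nearestDistance) {
                    nearestDistance = distance;
                    nearest = shape;
                }
            }
            return std::make_tuple(nearest, nearestDistance);
        }

    private:
        std::vector<std::shared_ptr<AbstractShape>> mShapes;
    };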
    

    Algorithm

    The main part of the algorithm lives on the material side. Below is a piece of code where I compute the reflected radiance for a Lambertian material and a mirror. You can see that the materials have access to the other shapes through the global variable world.

    
    float LambertianMaterial::brdf(const glm::vec3 &, const glm::vec3 &, glm::vec3 const &) {
         return albedo / M_PI;
    }
    
    vec3 UniformLambertianMaterial::getReflectedRadiance(Ray const &ray, AbstractShape const &shape) {
        vec3 directLighting = getIrradianceFromDirectLighting(ray, shape);
        float f = brdf(vec3(), vec3(), vec3());
    
        return color * f * (directLighting);
    }
    
    float MirrorMaterial::brdf(const vec3&, const vec3&, const vec3&) {
        return albedo;
    }
    
    vec3 MirrorMaterial::getReflectedRadiance(const Ray &ray, const AbstractShape &shape) {
        if(ray.recursionDeep >= MAX_BOUNCES)
            return vec3();
    
        Ray reflectedRay = getReflectedRay(ray, shape.getNormal(ray.origin + ray.direction * ray.distMax));
        auto nearest = World::world.findNearest(reflectedRay);
    
        if(get<0>(nearest) != nullptr) {
            reflectedRay.distMax = get<1>(nearest);
            return brdf(vec3(), vec3(), vec3()) * get<0>(nearest)->getReflectedRadiance(reflectedRay);
        }
    
        return vec3();
    }
    

    Lights

    Lighting is an essential feature of a renderer: it's thanks to lights that you can see relief. A light carries a flux; the irradiance is the flux received per unit area of a surface.

    So, our interface is :

    
    /**
     * @brief      Interface for a light
     */
    class AbstractLight {
    public:
        AbstractLight(glm::vec3 const &flux);
    
        /**
         * @brief      Compute the irradiance received by the projected area at the given position
         *
         * @param      position  surface's position
         * @param      normal    surface's normal
         *
         * @return     irradiance
         */
        virtual glm::vec3 getIrradiance(glm::vec3 const &position, glm::vec3 const &normal) = 0;
    
        virtual void emitPhotons(std::size_t number) = 0;
    
        virtual ~AbstractLight() = default;
    
    protected:
        glm::vec3 mTotalFlux;
    };
    

    Below is a piece of code that computes this irradiance:

    
    vec3 SpotLight::getIrradiance(const vec3 &position, const vec3 &normal) {
        vec3 posToLight = mPosition - position;
        vec3 posToLightNormalized = normalize(posToLight);
    
        if(dot(-posToLightNormalized, mDirection) > mCosCutoff) {
            float solidAngle = - 2.f * M_PI * (mCosCutoff - 1);
            return lambertCosineLaw(posToLightNormalized, normal) * mTotalFlux /
                (solidAngle * dot(posToLight, posToLight));
        }
    
        return vec3();
    }
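
    Two helpers used in these snippets, lambertCosineLaw and getIrradianceFromDirectLighting, are not shown in the article. Here is a hedged guess at what they do, based on how they are called; the signatures below and the World::world.lights() accessor are assumptions, not the project's actual code:

    #include <algorithm>
    #include <glm/glm.hpp>

    // Lambert's cosine law: cosine between the direction toward the light and
    // the surface normal, clamped to zero for back-facing surfaces (assumed).
    float lambertCosineLaw(glm::vec3 const &toLight, glm::vec3 const &normal) {
        return std::max(glm::dot(toLight, normal), 0.f);
    }

    // Sum the irradiance contributed by every light of the scene (assumed;
    // the real project may expose its lights differently).
    glm::vec3 getIrradianceFromDirectLighting(glm::vec3 const &position, glm::vec3 const &normal) {
        glm::vec3 irradiance(0.f);
        for (auto const &light : World::world.lights())
            irradiance += light->getIrradiance(position, normal);
        return irradiance;
    }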
    

    Next time, we will see how to integrate a photon mapper into our ray tracer. If you want the complete code, you can get it here:
    GitHub

    Bye my friends :).