Newton fractal

This tutorial shows how to render a rectangle covering the whole screen with a Newton fractal shader. If you are not familiar with the basics of *OGLplus*, it is recommended that you read the RGB triangle tutorial first for a more detailed introduction. You can also see the Basic usage with GLUT and GLEW example for a complete standalone program, including the code responsible for window initialization, OpenGL initialization and event handling.

The new things that this tutorial covers are:

- More complex fragment shader
- Uniform variables

For the full working code see the `oglplus/004_newton.cpp` file in the example directory.

First we include a helper header that in turn includes `GL3/gl3.h` or otherwise makes sure that the OpenGL symbols are defined.

#include <oglplus/gl.hpp>

The next line includes everything from *OGLplus*, except the image loaders and generators and geometric shape data generators.

#include <oglplus/all.hpp>

Now we include the header declaring the base class for *OGLplus* examples.

#include "example.hpp"

Just like the other examples that come with the library, this one is also implemented inside of the `oglplus` namespace.

namespace oglplus {

Alternatively, a `using` directive can be applied instead:

using namespace oglplus;

Doing this on the global scope may however cause name clashes with other libraries, especially in larger applications, so it may be a better idea to use fully qualified names or to apply the `using` directive only in local scopes.

As before, the example code is encapsulated in the `RectangleExample` class, which is derived from `oglplus::Example`.

class RectangleExample : public Example
{

Most of the member variables are the same as in the previous tutorials. There is an instance of the `oglplus::Context` class wrapping the current context functions, an `oglplus::VertexShader`, an `oglplus::FragmentShader` and an `oglplus::Program`, which define the custom functionality of the rendering pipeline, and an `oglplus::VertexArray` object managing the vertex data for the rendered rectangle.

private:
	// wrapper around the current OpenGL context
	Context gl;
	// Vertex shader
	VertexShader vs;
	// Fragment shader
	FragmentShader fs;
	// Program
	Program prog;
	// A vertex array object for the rendered rectangle
	VertexArray rectangle;

In this example we'll need just one `Buffer` object, to store the rectangle's vertex positions.

	// VBO for the rectangle's vertices, used also as the viewport coords
	Buffer verts;

The public interface consists of a constructor and a couple of member functions.

public:

The constructor takes no arguments and again sets the vertex shader source, which is the same as in the previous tutorials. It just passes the vertex positions down the pipeline without any significant transformations.

RectangleExample(void)
{

// Set the vertex shader source

vs.Source(" \
	#version 330\n \
	in vec2 Position; \
	out vec2 vertCoord; \
	void main(void) \
	{ \
		vertCoord = Position; \
		gl_Position = vec4(Position, 0.0, 1.0); \
	} \
");

The vertex shader is complete and can be compiled.

vs.Compile();

The fragment shader is similar to the one from the previous tutorials. It has one input variable, the vertex coordinate, which will be used in the Newton fractal computations.

fs.Source(" \
	#version 330\n \
	in vec2 vertCoord; \

This time there are two uniform variables that specify the colors of the gradient used for the colorization of the fractal.

uniform vec3 Color1, Color2; \

As before, the output of the shader is a vec4 representing the color of the fragment.

out vec4 fragColor; \

We are going to visualize the Newton fractal for the polynomial f(x) = x^3 - 1. The next function computes the value of this polynomial at the specified coordinate on the *complex* plane (we use a vec2 as a complex number, for both the argument and the result). The two components returned below follow from expanding (x + iy)^3 - 1 = (x^3 - 3xy^2 - 1) + i(3x^2y - y^3).

	vec2 f(vec2 n) \
	{ \
		return vec2( \
			n.x*n.x*n.x - 3.0*n.x*n.y*n.y - 1.0, \
			-n.y*n.y*n.y + 3.0*n.x*n.x*n.y \
		); \
	} \

We'll also need the values of the derivative of this polynomial (which is f'(x) = 3x^2) at specified complex coordinates, which is what the `df` function computes. Its components come from 3(x + iy)^2 = 3((x^2 - y^2) + 2ixy).

	vec2 df(vec2 n) \
	{ \
		return 3.0 * vec2( \
			n.x*n.x - n.y*n.y, \
			2.0 * n.x * n.y \
		); \
	} \

And we'll need to divide two complex numbers, hence the `cdiv` function:

	vec2 cdiv(vec2 a, vec2 b) \
	{ \
		float d = dot(b, b); \
		if(d == 0.0) return a; \
		else return vec2( \
			(a.x*b.x + a.y*b.y) / d, \
			(a.y*b.x - a.x*b.y) / d \
		); \
	} \

The Newton (or Newton-Raphson) method was originally devised for finding successively better approximations to the roots of real-valued functions. The iterative nature of the algorithm can also be used for fractal image rendering. There are various methods for the colorization of the final image, but the basic algorithm is usually the same:

- Assign a complex-valued coordinate Z to every pixel on a 2D surface.
- For every coordinate Z calculate a sequence of numbers such that:
  - Z0 = Z
  - Zi+1 = Zi - f(Zi)/f'(Zi)
- Stop when:
  - the distance between Zi+1 and Zi is smaller than a specified constant, or
  - the number of iterations has reached a specified maximum.

	void main(void) \
	{ \
		vec2 z = vertCoord; \
		int i, max = 128; \
		for(i = 0; i != max; ++i) \
		{ \
			vec2 zn = z - cdiv(f(z), df(z)); \
			if(distance(zn, z) < 0.00001) break; \
			z = zn; \
		} \
		fragColor = vec4( \

This example uses the number of iterations needed to reach a good enough approximation to create a color gradient between the `Color1` and `Color2` values specified as uniform variables above.

			mix( \
				Color1.rgb, \
				Color2.rgb, \
				float(i) / float(max) \
			), \
			1.0 \
		); \
	} \
");

We can now compile the fragment shader source code, attach both shaders to the shading program and try to link and use it.

fs.Compile();

// attach the shaders to the program
prog.AttachShader(vs);
prog.AttachShader(fs);

// link and use it
prog.Link();
prog.Use();

Now we can start to specify the data for the individual vertex attributes of the rectangle we're going to render. The first step as before is to *bind* the vertex array object managing the vertex data.

rectangle.Bind();

Again, since this is a simple example, the coordinates are hardcoded.

GLfloat rectangle_verts[8] = {
	-1.0f, -1.0f,
	-1.0f,  1.0f,
	 1.0f, -1.0f,
	 1.0f,  1.0f
};

We bind the VBO for vertex positions to the `ARRAY_BUFFER` target.

verts.Bind(Buffer::Target::Array);

The data are uploaded from the client's memory to the server's memory by using the `Buffer`'s `Data` static function.

Buffer::Data(Buffer::Target::Array, rectangle_verts);

Then we use an `oglplus::VertexArrayAttrib` object referencing the `Position` input variable in the `prog` program to tell OpenGL about the structure of the data in the currently bound VBO and to enable this vertex attribute.
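The attribute setup code itself is not shown in this extract. As a rough sketch of what it looks like in the library's examples (the `Setup` and `Enable` member calls are assumptions based on the OGLplus `VertexArrayAttrib` interface; treat the exact signatures as approximate and consult `oglplus/004_newton.cpp`):

VertexArrayAttrib vert_attr(prog, "Position");
// two floats per vertex, matching the vec2 Position shader input
vert_attr.Setup<Vec2f>();
vert_attr.Enable();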

That's it for the vertex data specification. Now we specify the values of the two uniform variables referenced by the fragment shader. The uniforms were both declared as `vec3` in the shader code (i.e. a vector of three float values). The classes that *OGLplus* provides for the manipulation of shader input variables are templated and allow only values of the specified type to be set. This provides additional compile-time type checking and helps avoid type mismatches between the C++ program and the GLSL shader code, which would result in runtime errors. There are several ways to set the value of a uniform variable. The `Uniform` specializations for simple types like `float`, `int`, `uint`, etc. (besides setting values of the specified scalar types) also allow setting the values of vectors and arrays of the same type. There are also specializations for the `Vector` and `Matrix` classes that can be used to set the values of vectors, matrices and arrays thereof, and they are more generic. This tutorial uses the specialization of `Uniform` for `Vec3f` and its `Set` member function to set the values of the 3D float vectors that store the colors.

Uniform<Vec3f>(prog, "Color1").Set(Vec3f(0.2f, 0.02f, 0.05f));

Uniform<Vec3f>(prog, "Color2").Set(Vec3f(1.0f, 0.95f, 0.98f));

As the last step of initialization we disable depth testing since we do not need it in this example:

gl.Disable(Capability::DepthTest);

}

The `Reshape` function gets called when the window is created and every time the window's size changes. Here we tell the GL that the rendering viewport has changed.

void Reshape(GLuint width, GLuint height)
{
	gl.Viewport(width, height);
}

This function redraws our scene and is basically the same as in the previous tutorials. (We don't clear any buffers here since it is not necessary.) We just tell the GL to draw a rectangle from the vertex data stored in the buffer objects tied to the currently bound VAO, which is still `rectangle`, because we didn't bind any other VAO since initialization. More precisely, we draw the rectangle composed of 4 vertices, starting at index 0 in the buffers.

void Render(double)
{
	gl.DrawArrays(PrimitiveType::TriangleStrip, 0, 4);
}

Now the `RectangleExample` class is complete:

};

The last thing in this example's source is the `makeExample` function. This function is called by a shared internal piece of code (that comes together with the examples) which does the initialization and event processing common to all the examples. `makeExample` creates an instance of our `RectangleExample` class. The common code then calls the event handler functions like `Reshape` and `Render` when appropriate. To see how this can be incorporated into a complete application, see the standalone examples and the related tutorials.

std::unique_ptr<Example> makeExample(const ExampleParams& /*params*/)
{
	return std::unique_ptr<Example>(new RectangleExample);
}

<matus.chochlik -at- fri.uniza.sk>

<chochlik -at -gmail.com>

Documentation generated on Mon Sep 22 2014 by Doxygen (version 1.8.6).