
Fragment shaders

A short introduction on how to make your game look awesomer.

Jonathan Ringstad / jwringstad@gmail.com

What are shaders?

A very short and rough overview.

What are shaders?

- A shader is basically a small program that you send to the GPU for execution.
- The GPU can execute the shader in a massively parallel fashion, speeding up execution enormously.
- The shader program is usually sent to the driver in source form (plaintext).
- The driver has a built-in compiler that compiles it to the native GPU architecture.
- The driver's compiler can report syntax errors, warnings, etc., like any compiler.
- The language shaders are written in is called GLSL. It is much like C.
- There are additional datatypes like vec2, vec3, vec4, sampler2D, etc.
- Shaders are usually quite short, since they only do one thing.
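
To give a first impression of what GLSL looks like, here are a few illustrative lines using those datatypes (this is just a fragment for flavor, not a complete shader):

vec2 uv = vec2(0.5, 0.5);            // a 2D coordinate
vec3 rgb = vec3(1.0, 0.5, 0.0);      // an RGB color
vec4 rgba = vec4(rgb, 1.0);          // an RGBA color, alpha appended
float red = rgba.r;                  // components are accessed as .r/.g/.b/.a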

Example: Desaturating an image

An example of a task that is well-suited to execution in a shader is desaturating an image before showing it on the screen. To desaturate the image, we use the formula (pseudo-code)

output_color = [
    (input_color.r + input_color.g + input_color.b)/3,
    (input_color.r + input_color.g + input_color.b)/3,
    (input_color.r + input_color.g + input_color.b)/3
]
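
Translated into an actual GLSL fragment shader, it could look roughly like this. This is a hedged sketch: the uniform name my_image and the 800x600 resolution are assumptions, and the building blocks used here (uniforms, textures, gl_FragCoord) are all explained later in this talk.

uniform sampler2D my_image; // the image to desaturate (assumed name)

void main(void){
    // Look up this pixel's color; assumes an 800x600 screen.
    vec4 input_color = texture2D(my_image,
                                 gl_FragCoord.xy / vec2(800.0, 600.0));
    float gray = (input_color.r + input_color.g + input_color.b) / 3.0;
    gl_FragColor = vec4(gray, gray, gray, input_color.a);
}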
Demo

The myth of the fast CPU...

Here are some examples of what you can do with shaders.

This talk focuses mostly on 2D, but it's worth mentioning that 3D games use shaders for absolutely everything.

From "The vanishing of Ethan Carter" (The Astronauts/Nordic Games)

But most modern 2D games use shaders as well. For instance for color-correction:

Image by Phil Holland, reduser.net

Chromatic aberration:

Image from tlc-systems.com, taken with a Panasonic DMC-FZ5

Vignette:

Image by Lucas Cobb Design

Noise:

Image by T. Roos, P. Myllymäki, J. Rissanen

Replacing colors:

Image by carriontrooper.deviantart.com

Blur:

Image by mynameismjp.wordpress.com

Motion blur:

Image from wikimedia.org, user fir0002

Bloom:

Image from "Swindle" by sizefivegames

Warping:

Image from BSNES with CRT emulation, game by Nintendo

Lighting, shading and shadowing in general:

Image from "The Swapper", Facepalm games

Procedural Animations:

[ demo ]

$ love08 bitquest.love godmode # requires love 0.8

Refractions & reflections:

[ demo ]

$ love refraction.love # requires love 0.9.1

Hardware & framework support

What hardware can I use fragment shaders on? What frameworks let me use them?

Hardware & platforms:

- All Android devices currently on the market
- All iOS devices currently on the market
- Desktop operating systems (Windows, Linux, OS X), as long as a driver is installed. On Linux, the FOSS drivers support them as well.
- Any browser that supports WebGL: Firefox, Opera, Chrom(e|ium), IE11 (kinda)
- Nintendo 3DS, all home consoles, PSP/Vita, ...

Frameworks:

Some frameworks provide their own language for writing shaders, or make their own GLSL "flavour". There are some good and some bad reasons to do this. One major reason: conversion to native shading languages on consoles.

Frameworks:

- SDL2 (somewhat difficult)
- SFML
- Processing
- libGDX
- Love2D
- Three.js
- Phaser
- Cocos2d
- Unity3D (through the custom "ShaderLab" language)
- MonoGame
- GameMaker
- Construct2
- Stencyl
- Blender Game Engine
- QtQuick

Frameworks:

You cannot use fragment shaders if you use one of these frameworks:

- MelonJS (uses canvas, not WebGL)
- PyGame (possible but difficult; based on SDL1)
- iio engine (uses canvas, not WebGL)
- GameSalad (doesn't expose them)
- jQuery (uses the DOM, not WebGL)

More theory. The pipeline.

How does GL/the GPU pipeline fundamentally work?

Everything the GPU renders is made from triangles.

When your 2D framework renders a sprite, it typically does so by drawing two triangles.

Triangles can be positioned in 2D or 3D; the GPU doesn't care.

Rasterization of a triangle

Rasterization of a line

Rasterization turns each triangle into "fragments": roughly, the pixels it covers. For every one of those generated fragments, your fragment shader gets to run and return a color.

We can switch out which shader to use before drawing an object. Hence every object in our scene can be drawn with a different shader.

This allows us to draw objects with very different properties/appearances.

But there is another common way: to apply the shader to the entire screen. This is commonly called post-processing.

This is achieved by first rendering the scene to a texture (possibly using many different shaders), and then rendering that texture onto the screen using the post-processing shader.

Before we finish this section off, a few more words about GLSL as a language:

- Basically C
- Supports vec2, vec3, vec4 datatypes
- Supports the sampler2D datatype, which is basically an image
- Requires you to write a main function that assigns the final color to gl_FragColor

Practical part

Let's write a basic shader.

Most basic: return a 50% gray color for each pixel.

void main(void){
    // Returns a 4-vector (0.5, 0.5, 0.5, 0.5)
    gl_FragColor = vec4(0.5);
}

Why not red?

void main(void){
    // Returns pure red
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}

Always having the same color is boring, so we can use uniform variables to change things up.

Uniforms are like global variables that the CPU sets to some fixed value before it kicks off the draw call. The GPU then sees that value in the shader.

Shaders cannot change the value of a uniform variable, and all pixels see the same value, always.

Simple example:

uniform float time;

void main(void){
    gl_FragColor = 
        vec4(sin(time), cos(time), -sin(time), 1);
}

[ EXAMPLE ]

What the CPU-side of this looks like depends on your framework.

LÖVE:

-- love autodetects the type you want to send.
shader:send("time", 2345)

libGDX:

// libGDX does not auto-detect -- the "f" stands for "float".
shaderprogram.setUniformf("time", 2345);

processing:

// processing auto-detects what you want to send.
shader.set("time", 2345.0);

Unity3D (C#):

// Unity does not auto-detect. "renderer" is the object's Renderer component.
renderer.material.SetFloat("time", 2345.0f);

SFML:

// SFML auto-detects.
shader.setParameter("time", 2345.0f);

But there is an elephant in the room:

If our shader gets executed for each pixel, how can we get different pixels to have different colors?

The same main() with the same input will always result in the same color!

Enter varyings.

- sorta like uniforms!
- they can vary from pixel to pixel!
- defined by extremes, the GPU interpolates!
- one useful built-in varying: gl_FragCoord

Let's try to use a varying:

void main(void){
    gl_FragColor = vec4(
        gl_FragCoord.x/400.0, // varying!
        gl_FragCoord.y/200.0, // varying!
        1.0, 
        1.0);
}

You can also define your own varyings, although we won't cover how to do that here. But here is a visual example of how you could use them:

varying float my_varying;
void main(void){
    gl_FragColor = vec4(my_varying,
                        my_varying,
                        my_varying,
                        1.0);
}

Let's apply that shader to this triangle.

But first, we'll have to set the varyings.

Define the values of the varyings for the extreme points:

The top corner is 0, the two bottom corners are 1.

Shaded result:

The GPU interpolated the value for us between the extreme points. All we did was return the varying value as a grayscale color, but it resulted in a gradient.

To understand the most important use for varyings, we need to learn another concept first.

Textures!

A texture is basically just a piece of memory that the shader can access at runtime.

The texture is accessed with 2D coordinates that go from 0 to 1.

"bad guy" by http://opengameart.org/users/hollander

How do we tell a shader we want to use a texture?

Simple! We just give the texture to the GPU as a uniform.

LÖVE:

shader:send("my_image", love.graphics.newImage("badguy.png"))

libGDX:

// libGDX does not auto-detect the type; the "i" stands for int --
// a texture is passed as the index of the texture unit it is bound to.
shader.setUniformi("my_image", context.textureBinder.bind(texture));

SFML:

sf::Texture texture;
texture.loadFromFile("badguy.png"); // sf::Texture has no file-loading constructor
shader.setParameter("my_image", texture);

Stupid texture example

Assume this 2x2 pixel texture is loaded by the CPU into my_image.

and we use this shader:

uniform sampler2D my_image;
void main(void){
    gl_FragColor = texture2D(my_image, vec2(0.0, 0.0));
}

This will give the following image:

Remember that (0, 0) is in the lower left corner.

Why is the whole triangle yellow?

    gl_FragColor = texture2D(my_image, vec2(0.0, 0.0));

We only ever access the texture at (0, 0); we don't vary the texture coordinate we are accessing...

(you can probably guess what comes next)

Combining textures and varyings

Now we can finally use textures in a meaningful way.

Assume we're rendering a triangle that covers the entire screen of size 1920x1080, and we use this shader:

uniform sampler2D my_image;
void main(void){
    gl_FragColor = texture2D(my_image, 
                             vec2(gl_FragCoord.x/1920.0, 
                                  gl_FragCoord.y/1080.0));
}

we get:

(note the flipped colors)

Now that we know how to use a shader to map a texture to a triangle, we can also do other stuff, e.g. update the texture each frame.

[ EXAMPLE ]
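
In a similar spirit, here is a hedged sketch that animates how the texture is sampled rather than updating the texture itself, combining the time uniform with texture lookups (the uniform names and the 1920x1080 size are assumptions):

uniform sampler2D my_image;
uniform float time; // seconds since start, sent by the CPU each frame (assumed)

void main(void){
    vec2 uv = gl_FragCoord.xy / vec2(1920.0, 1080.0); // assumed screen size
    uv.x = fract(uv.x + 0.1 * time); // scroll right, wrapping around at 1.0
    gl_FragColor = texture2D(my_image, uv);
}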

Full-screen shaders

So far we've been looking at rendering individual objects with a shader.

This has one major drawback:

A shader applied to some object can only affect that object; it cannot "blend" with the background.

A good-looking blur, for instance, is impossible to achieve this way.

The solution is simple:

As we've explained before, we can apply a shader to the whole screen by rendering the whole scene to a texture, and then rendering that texture to the screen with our shader applied.

Other times it simply makes sense to apply a shader to the whole screen, e.g. to desaturate or color-correct everything in one pass.

So how do we render to a texture?

Unfortunately this again depends on the framework...

LÖVE:

canvas = love.graphics.newCanvas(width, height) -- create canvas
love.graphics.setCanvas(canvas) -- set this canvas as render-target
... draw stuff ...
love.graphics.setCanvas() -- unset (now rendering to the screen again)
-- pass the canvas we've rendered into to the shader
myShader:send("my_canvas", canvas)
love.graphics.setShader(myShader) -- activate shader
love.graphics.draw(canvas) -- draw canvas object with shader applied
love.graphics.setShader() -- deactivate shader

SFML:

sf::RenderTexture rt; // create render-texture
sf::Sprite fullscreenSprite; // create a sprite that will cover the screen

rt.create(width, height); // actually allocate space
// tell the sprite to use the rt
fullscreenSprite.setTexture(rt.getTexture());
rt.clear(sf::Color(0, 0, 0)); // clear the render-texture to black
rt.draw(stuffYouWantToDraw); // draw something to the rendertexture
rt.display();

mainwindow.clear(sf::Color(0, 0, 0)); // clear mainwindow
// draw the fullscreen-sprite w/ rt to the screen, applying our shader
mainwindow.draw(fullscreenSprite, &myShader);
mainwindow.display();

etc. Check the code samples for more frameworks.

Shadertoy

Getting to work

Let's look at some nice shaders that you might want to use in your game.

Bloom

[ EXAMPLE ]
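
In place of the live demo, a minimal hedged sketch of a single-pass bloom. Real implementations usually do the bright-pass and blur in separate passes; the scene texture and resolution uniform names are assumptions used here and in the following sketches.

uniform sampler2D scene; // the frame rendered to a texture (assumed name)
uniform vec2 resolution; // screen size in pixels (assumed uniform)

void main(void){
    vec2 uv = gl_FragCoord.xy / resolution;
    vec4 base = texture2D(scene, uv);

    // Average a 5x5 neighborhood, keeping only the bright parts.
    vec4 glow = vec4(0.0);
    for(int x = -2; x <= 2; x++){
        for(int y = -2; y <= 2; y++){
            vec4 s = texture2D(scene, uv + vec2(x, y) / resolution);
            glow += max(s - 0.7, 0.0); // bright-pass threshold of 0.7
        }
    }
    glow /= 25.0;

    gl_FragColor = base + glow; // additive combine
}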

Color correct

[ EXAMPLE ]
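
Again no live demo on paper, but a hedged color-correction sketch (same assumed scene/resolution uniforms) that applies a per-channel gamma curve:

uniform sampler2D scene;
uniform vec2 resolution;

void main(void){
    vec2 uv = gl_FragCoord.xy / resolution;
    vec3 color = texture2D(scene, uv).rgb;

    // Per-channel gamma: exponents below 1.0 brighten, above 1.0 darken.
    color = pow(color, vec3(0.9, 1.0, 1.2)); // warms the image up slightly

    gl_FragColor = vec4(color, 1.0);
}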

Blur

[ EXAMPLE ]
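
A hedged sketch of the simplest possible blur, a 3x3 box filter; production blurs are usually separable Gaussians done in two passes (same assumed uniforms):

uniform sampler2D scene;
uniform vec2 resolution;

void main(void){
    vec2 uv = gl_FragCoord.xy / resolution;

    // Average the pixel and its eight neighbors.
    vec4 sum = vec4(0.0);
    for(int x = -1; x <= 1; x++){
        for(int y = -1; y <= 1; y++){
            sum += texture2D(scene, uv + vec2(x, y) / resolution);
        }
    }
    gl_FragColor = sum / 9.0;
}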

Colorize

[ EXAMPLE ]
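
A hedged colorize sketch that reuses the desaturation formula from the beginning of the talk and then multiplies by a tint color (uniform names assumed):

uniform sampler2D scene;
uniform vec2 resolution;
uniform vec3 tint; // e.g. vec3(1.0, 0.8, 0.5) for a sepia-like look (assumed)

void main(void){
    vec2 uv = gl_FragCoord.xy / resolution;
    vec3 color = texture2D(scene, uv).rgb;
    float gray = (color.r + color.g + color.b) / 3.0;
    gl_FragColor = vec4(gray * tint, 1.0); // desaturate, then tint
}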

Vignette

No example :(
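
To fill the gap, a hedged sketch of what one could look like: darken each pixel by its distance from the screen center (uniform names assumed as before):

uniform sampler2D scene;
uniform vec2 resolution;

void main(void){
    vec2 uv = gl_FragCoord.xy / resolution;
    vec4 color = texture2D(scene, uv);

    // 1.0 in the middle, fading towards 0.0 near the corners.
    float vignette = 1.0 - smoothstep(0.3, 0.8, distance(uv, vec2(0.5)));
    gl_FragColor = vec4(color.rgb * vignette, color.a);
}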

Chromatic aberration

[ EXAMPLE ]
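
And in place of this demo, a hedged sketch: sample the red and blue channels at slightly shifted coordinates relative to green (uniform names assumed):

uniform sampler2D scene;
uniform vec2 resolution;

void main(void){
    vec2 uv = gl_FragCoord.xy / resolution;
    vec2 shift = vec2(2.0, 0.0) / resolution; // a 2-pixel horizontal offset

    float r = texture2D(scene, uv + shift).r;
    float g = texture2D(scene, uv).g;
    float b = texture2D(scene, uv - shift).b;
    gl_FragColor = vec4(r, g, b, 1.0);
}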

Resources

- This presentation: http://sonengamejam.org/talks/
- TODO: Sample code/templates for this presentation
- LÖVE shader introduction
- SFML shader introduction
- libGDX shader introduction
- Processing shader introduction

Thanks for listening!

Questions? Ask me now, later, or send a mail to jwringstad@gmail.com