GitHub: Amadiro/fragmentshader-talk
Jonathan Ringstad / jwringstad@gmail.com
A very short and rough overview.
An example of a task that is well-suited to execution in a shader is desaturating an image before showing it on the screen. To desaturate the image, we average the three color channels (pseudo-code):
avg = (input_color.r + input_color.g + input_color.b) / 3
output_color = [avg, avg, avg]
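Uniforms, textures and coordinates are all covered later in this talk, but as a preview, here is a minimal sketch of that desaturation as a GLSL fragment shader. It assumes the image arrives in a uniform named my_image and that we render a 1920x1080 full-screen pass; both assumptions are illustrative.

uniform sampler2D my_image;

void main(void){
    // Sample the input image at this pixel (assuming a 1920x1080 screen).
    vec4 c = texture2D(my_image, vec2(gl_FragCoord.x/1920.0,
                                      gl_FragCoord.y/1080.0));
    // Average the three channels and use the result for all of them.
    float avg = (c.r + c.g + c.b) / 3.0;
    gl_FragColor = vec4(avg, avg, avg, c.a);
}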
The myth of the fast CPU...
This talk focuses mostly on 2D, but it's worth mentioning that 3D games use shaders for absolutely everything.
From "The vanishing of Ethan Carter" (The Astronauts/Nordic Games)But most modern 2D games use shaders as well. For instance for color-correction:
Image by Phil Holland, reduser.netChromatic abberation:
Image from tlc-systems.com, taken with a Panasonic DMC-FZ5Vignette:
Image by Lucas Cobb DesignNoise:
Image by T. Roos, P. Myllymäki, J. RissanenReplacing colors:
Image by carriontrooper.deviantart.comBlur:
Image by mynameismjp.wordpress.comMotion blur:
Image from wikimedia.org, user fir0002Bloom:
Image from "Swindle" by sizefivegamesWarping:
Image from BSNES with CRT emulation, game by NintendoLighting, shading and shadowing in general:
Image from "The Swapper", Facepalm gamesProcedural Animations:
$ love08 bitquest.love godmode # requires love 0.8
Refractions & reflections:
$ love refraction.love # requires love 0.9.1
What hardware can I use fragment shaders on? What frameworks let me use them?
Hardware & platforms:
- All Android devices that are currently on the market
- All iOS devices that are currently on the market
- Desktop operating systems (Windows, Linux, OS X), as long as a driver is installed. On Linux, the FOSS drivers support them as well
- Any browser that supports WebGL: Firefox, Opera, Chrom(e|ium), IE11 (kinda)
- Nintendo 3DS, all home consoles, PSP/Vita, ...

Some frameworks like to provide their own language that you write shaders in, or make their own GLSL "flavour". There are some good and some bad reasons to do this. One major reason: conversion to native shading languages on consoles.
You cannot use fragment shaders if you use one of these frameworks:
- MelonJS (uses canvas, not WebGL)
- PyGame (possible but difficult; based on SDL1)
- iio engine (uses canvas, not WebGL)
- GameSalad (doesn't expose them)
- jQuery (uses DOM, not WebGL)

Everything the GPU renders is made from triangles.
When your 2D framework renders a sprite, it typically renders it using two triangles.
Triangles can be positioned in 2D or 3D; the GPU doesn't care.
Rasterization of a triangle
Rasterization of a line
For every one of those generated fragments, your fragment shader gets to run and return a color.
We can switch out which shader to use before drawing an object. Hence every object in our scene can be drawn with a different shader.
This allows us to draw objects with very different properties/appearances.
But there is another common way: to apply the shader to the entire screen. This is commonly called post-processing.
This is achieved by first rendering the scene to a texture (possibly using many different shaders), and then rendering that texture onto the screen using the post-processing shader.
Before we finish this section off, a few more words about GLSL as a language:

- Basically C
- Supports vec2, vec3, vec4 datatypes
- Supports the sampler2D datatype, which is basically an image
- Requires you to write a main function that assigns the final color to gl_FragColor

Let's write a basic shader.
Most basic: return a 50% gray color for each pixel.
void main(void){
    // vec4(0.5) is the 4-vector (0.5, 0.5, 0.5, 0.5)
    gl_FragColor = vec4(0.5);
}
Why not red?
void main(void){
    // Returns pure red
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
Always having the same color is boring, so we can use uniform variables to change things up.
Uniforms are like global variables that are set by the CPU to some fixed value, before the CPU kicks off the draw call. The GPU then sees that value in the shader.
Shaders cannot change the value of a uniform variable, and all pixels see the same value, always.
Simple example:
uniform float time;

void main(void){
    // Color channels oscillate over time; negative values clamp to 0.
    gl_FragColor = vec4(sin(time), cos(time), -sin(time), 1.0);
}
What the CPU-side of this looks like depends on your framework.
LÖVE:
-- love autodetects the type you want to send.
shader:send("time", 2345)
libGDX:
// libGDX does not auto-detect -- the "f" stands for "float".
shaderprogram.setUniformf("time", 2345);
processing:
// processing auto-detects what you want to send.
shader.set("time", 2345);
Unity3D (C#):
// Unity does not auto-detect.
object.material.SetFloat("time", 2345.0F);
SFML:
// SFML auto-detects.
shader.setParameter("time", 2345.0f);
But there is an elephant in the room:
If our shader gets executed for each pixel, how can we get different pixels to have different colors?
The same main() with the same input will always result in the same color!
Enter varyings.
- Sorta like uniforms!
- They can vary from pixel to pixel!
- Defined by extremes; the GPU interpolates!
- One useful built-in varying: gl_FragCoord

Let's try to use a varying:
void main(void){
    gl_FragColor = vec4(
        gl_FragCoord.x/400.0, // varying!
        gl_FragCoord.y/200.0, // varying!
        1.0,
        1.0);
}
You can also define your own varyings, although we won't go over how to do that here. But here is a visual example of how you could use them:
varying float my_varying;

void main(void){
    gl_FragColor = vec4(my_varying, my_varying, my_varying, 1.0);
}
Let's apply that shader to this triangle.
But first, we'll have to set the varyings.
Define the values of the varyings for the extreme points:
The top corner is 0, the two bottom corners are 1.
Shaded result:
The GPU interpolated the value for us between the extreme points. All we did was return the varying value as a grayscale color, but it resulted in a gradient.
To understand the most important use for varyings, we need to learn another concept first.
A texture is basically just a piece of memory that the shader can access at runtime.
The texture is accessed with 2D coordinates that go from 0 to 1.
How do we tell a shader we want to use a texture?
Simple! We just give the texture to the GPU as a uniform.
LÖVE:
shader:send("my_image", love.graphics.newImage("badguy.png"))
libGDX:
// libGDX does not auto-detect the type -- the "i" stands for "int":
// we pass the texture unit that the texture is bound to.
shader.setUniformi("my_image", context.textureBinder.bind(texture));
SFML:
shader.setParameter("my_image", new sf::Texture("badguy.png"));
Assume this 2x2 pixel texture is loaded by the CPU into my_texture.
and we use this shader:
uniform sampler2D my_image;

void main(void){
    gl_FragColor = texture2D(my_image, vec2(0.0, 0.0));
}
This will give the following image:
Remember that (0, 0) is in the lower left corner.
Why is the whole triangle yellow?
gl_FragColor = texture2D(my_image, vec2(0.0, 0.0));
We only ever access the texture at (0, 0), we don't vary the texture coordinate we are accessing...
(you can probably guess what comes next)
Now we can finally use textures in a meaningful way.
Assume we're rendering a triangle that covers the entire screen of size 1920x1080, and we use this shader:
uniform sampler2D my_image;

void main(void){
    gl_FragColor = texture2D(my_image,
        vec2(gl_FragCoord.x/1920.0, gl_FragCoord.y/1080.0));
}
we get:
(note the flipped colors)
Now that we know how to use a shader to map a texture to a triangle, we can also do other stuff, e.g. update the texture each frame.
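For instance, we can combine the time uniform from earlier with the texture lookup to animate a warping effect, like the one shown at the start of the talk. A minimal sketch, reusing the 1920x1080 full-screen assumption (the wave constants are arbitrary):

uniform sampler2D my_image;
uniform float time;

void main(void){
    vec2 uv = vec2(gl_FragCoord.x/1920.0, gl_FragCoord.y/1080.0);
    // Push the x-coordinate around with a moving sine wave.
    uv.x += 0.02 * sin(uv.y * 40.0 + time);
    gl_FragColor = texture2D(my_image, uv);
}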
So far we've been looking at rendering individual objects with a shader.
This has one major drawback:
A shader applied to some object can only affect that object; it cannot "blend" with the background.
A good-looking blur for instance is impossible to achieve this way.
The solution is simple:
As we've explained before, we can apply a shader to the whole screen by just rendering the whole scene to a texture, and then rendering that texture to the screen, with our shader applied.
Other times it just makes sense to apply a shader to the whole screen, e.g. to desaturate or color-correct the whole screen in one swoop.
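As a sketch of what such a post-processing shader could look like, here is a naive version of the blur mentioned above: a 3x3 box blur. It assumes the scene has already been rendered into a texture bound to a uniform named scene, and that the screen size is passed in a uniform named resolution; both names are illustrative.

uniform sampler2D scene;  // the scene, rendered to a texture first
uniform vec2 resolution;  // screen size in pixels, set by the CPU

void main(void){
    vec2 uv = gl_FragCoord.xy / resolution;
    vec2 px = 1.0 / resolution; // size of one pixel in texture coordinates
    vec4 sum = vec4(0.0);
    // Average a 3x3 neighbourhood of pixels around the current one.
    for (int x = -1; x <= 1; x++) {
        for (int y = -1; y <= 1; y++) {
            sum += texture2D(scene, uv + vec2(float(x), float(y)) * px);
        }
    }
    gl_FragColor = sum / 9.0;
}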
So how do we render to a texture?
Unfortunately this again depends on the framework...
LÖVE:
canvas = love.graphics.newCanvas(width, height) -- create canvas
love.graphics.setCanvas(canvas) -- set this canvas as render-target
-- ... draw stuff ...
love.graphics.setCanvas() -- unset (now rendering to the screen again)
myShader:send("my_canvas", canvas) -- give the shader the canvas we've rendered to
love.graphics.setShader(myShader) -- activate shader
love.graphics.draw(canvas) -- draw canvas object with shader applied
love.graphics.setShader() -- deactivate shader
SFML:
sf::RenderTexture rt; // create render-texture
sf::Sprite fullscreenSprite; // create a sprite that will cover the screen
rt.create(width, height); // actually allocate space
fullscreenSprite.setTexture(rt.getTexture()); // tell the sprite to use the rt

rt.clear(sf::Color(0, 0, 0)); // clear the render-texture to black
rt.draw(stuffYouWantToDraw); // draw something to the rendertexture
rt.display();

mainwindow.clear(sf::Color(0, 0, 0)); // clear mainwindow
mainwindow.draw(fullscreenSprite); // draw the fullscreen-sprite w/ rt to screen
etc. Check the code samples for more frameworks.
Shadertoy
Let's look at some nice shaders that you might want to use in your game.
No example :(
Questions? Ask me now, later, or send a mail to jwringstad@gmail.com