The WebGL API is about building a graphics pipeline...
in an **imperative and stateful way**.
[CODE](https://github.com/gre/behind-asteroids/blob/master/src/effects.js)
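For instance, just getting one draw call on screen already takes a series of state-mutating calls. A minimal sketch in plain WebGL (buffer setup and error handling omitted; `canvas`, `vertexSource` and `fragmentSource` are assumed to exist):

```js
// Minimal sketch of the imperative, stateful WebGL calls behind a single draw.
const gl = canvas.getContext("webgl");

const vert = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vert, vertexSource);   // GLSL vertex shader source string
gl.compileShader(vert);

const frag = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(frag, fragmentSource); // GLSL fragment shader source string
gl.compileShader(frag);

const program = gl.createProgram();
gl.attachShader(program, vert);
gl.attachShader(program, frag);
gl.linkProgram(program);
gl.useProgram(program);                // mutates the global GL state

gl.drawArrays(gl.TRIANGLES, 0, 3);     // finally, one draw call
```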
But basically it's just this graph:
![](./images/ZQwHb4E.png)
*where each block is just a function!*
## Fragment Shader
Each pipeline brick is a function from pixels to colors...
## `{x,y} => {r,g,b,a}`
...implemented in a **Fragment Shader**.
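In GLSL, such a pixel-to-color function looks like this. A minimal sketch; the `resolution` uniform (the canvas size in pixels) is an assumption used to normalize the position:

```glsl
precision highp float;
uniform vec2 resolution; // canvas size in pixels, passed in by the caller

// {x, y} => {r, g, b, a}: each pixel's position becomes its color.
void main() {
  vec2 uv = gl_FragCoord.xy / resolution; // normalize position to [0, 1]
  gl_FragColor = vec4(uv.x, uv.y, 0.5, 1.0);
}
```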
I like to call this...
### [**functional rendering**](http://greweb.me/2013/11/functional-rendering/).
**Let's see how to draw in this paradigm...**
### Introducing gl-react
WebGL bindings for React to implement complex effects over content.
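A minimal sketch of what that looks like. The exact API has moved between gl-react versions; this is in the style of gl-react with the `gl-react-dom` Surface:

```js
import React from "react";
import { Shaders, Node, GLSL } from "gl-react";
import { Surface } from "gl-react-dom"; // the WebGL (DOM) backend

// Declare the fragment shader once; gl-react compiles and caches it.
const shaders = Shaders.create({
  gradient: {
    frag: GLSL`
precision highp float;
varying vec2 uv;      // pixel position, normalized to [0, 1]
uniform float blue;
void main() {
  gl_FragColor = vec4(uv.x, uv.y, blue, 1.0);
}`
  }
});

// A React component that renders one pipeline node (one fragment shader pass).
const Gradient = ({ blue }) => (
  <Node shader={shaders.gradient} uniforms={{ blue }} />
);

// The Surface is the canvas: the component tree under it describes the pipeline.
const App = () => (
  <Surface width={300} height={300}>
    <Gradient blue={0.8} />
  </Surface>
);
```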
## Shared node optimization
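Roughly: when the same node appears as an input to several effects in the tree, gl-react can render it to a framebuffer once and share the resulting texture instead of recomputing it per consumer. A minimal sketch, where `imageUrl`, `<Blur>`, `<Negative>`, `<Saturate>` and `<SideBySide>` are all hypothetical names made up for the example:

```js
// The same `blurred` element feeds two different effects; the shared node
// only needs to be rendered once per frame.
const blurred = <Blur factor={2}>{imageUrl}</Blur>;

const App = () => (
  <Surface width={600} height={300}>
    <SideBySide
      left={<Negative>{blurred}</Negative>}
      right={<Saturate factor={0.5}>{blurred}</Saturate>}
    />
  </Surface>
);
```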
### Capturing a frame
Via a *ref* you can `captureFrame()` to get an image out of the canvas.
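A minimal sketch, reusing the `Gradient` component from the sketch above. The exact return value of `captureFrame()` has changed across gl-react versions, so this assumes it can be treated as a Promise-compatible result:

```js
class CaptureDemo extends React.Component {
  onCapture = () => {
    // Older versions return the capture directly, newer ones a Promise;
    // Promise.resolve handles both in this sketch.
    Promise.resolve(this.surface.captureFrame()).then(capture => {
      console.log("captured frame:", capture);
    });
  };
  render() {
    return (
      <div>
        <Surface ref={ref => (this.surface = ref)} width={300} height={300}>
          <Gradient blue={0.8} />
        </Surface>
        <button onClick={this.onCapture}>Capture</button>
      </div>
    );
  }
}
```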
### `ndarray` support
**ndarray** is a library for defining multi-dimensional arrays, which is exactly what a texture is.
You can programmatically generate a Texture in JavaScript to use in `gl-react`.
![](./images/9.gif)
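A minimal sketch of generating such a texture and feeding it to a shader; the shader name `show` and the uniform name `t` are made up for the example, and gl-react is assumed to accept an ndarray wherever a texture uniform is expected:

```js
import React from "react";
import ndarray from "ndarray";
import { Shaders, Node, GLSL } from "gl-react";
import { Surface } from "gl-react-dom";

// A trivial shader that just displays the texture it is given.
const shaders = Shaders.create({
  show: {
    frag: GLSL`
precision highp float;
varying vec2 uv;
uniform sampler2D t;
void main() {
  gl_FragColor = texture2D(t, uv);
}`
  }
});

// Build a 64x64 RGBA texture procedurally in JavaScript.
const size = 64;
const data = new Uint8Array(size * size * 4);
for (let y = 0; y < size; y++) {
  for (let x = 0; x < size; x++) {
    const i = 4 * (y * size + x);
    data[i] = Math.round((x / size) * 255);     // red gradient
    data[i + 1] = Math.round((y / size) * 255); // green gradient
    data[i + 2] = 128;                          // constant blue
    data[i + 3] = 255;                          // fully opaque
  }
}
const texture = ndarray(data, [size, size, 4]);

const App = () => (
  <Surface width={300} height={300}>
    <Node shader={shaders.show} uniforms={{ t: texture }} />
  </Surface>
);
```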
## WebGL API
### hybrid paradigms
WebGL is a very low-level **imperative** API...
but fundamentally uses **functional** bricks.
## fragment shader
How to draw things in WebGL:
### `{x,y} => {r,g,b,a}`
### *"Functional Rendering"*
### gl-react-native
The same wrapper on top of **OpenGL** for React Native.
Write universal effects for both the Web and Native.
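A minimal sketch of the Native side: the effect components stay the same, only the Surface import changes (the exact package entry point and prop shape depend on the gl-react-native version):

```js
import React from "react";
import { Surface } from "gl-react-native"; // instead of "gl-react-dom"

// `Gradient` is the same component sketched earlier; nothing in it is
// platform-specific, so it runs unchanged on top of OpenGL here.
const App = () => (
  <Surface style={{ width: 300, height: 300 }}>
    <Gradient blue={0.8} />
  </Surface>
);
```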