I want to create a blur effect using a fragment shader in OpenGL ES 2.0. The algorithm I am interested in is simply an averaging blur - add all adjacent pixels to myself and divide by 9 to get the average.
Elaborating a bit more on what Matias said:
Yes. You render the image into a texture (best done using FBOs) and in the second (blur) pass you bind this texture and read from it. You cannot perform the render and blur passes in one step, as you cannot access the framebuffer you're currently rendering into. That would introduce data dependencies: your neighbours need not have their final color yet, or worse, their color might depend on yours.
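A minimal sketch of that first-pass setup, assuming an OpenGL ES 2.0 context and width/height holding the image size (variable names are my own; error handling is omitted):

    #include <GLES2/gl2.h>

    GLuint colorTex, fbo;

    /* Colour texture the first pass renders into and the blur pass reads from. */
    glGenTextures(1, &colorTex);
    glBindTexture(GL_TEXTURE_2D, colorTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);        /* allocate, no data */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    /* GL_CLAMP_TO_EDGE lets the texturing hardware handle the edge cases
       described below. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    /* FBO with that texture as its colour attachment. */
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, colorTex, 0);
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        /* handle incomplete framebuffer */
    }

    /* Pass 1: draw the scene while fbo is bound.
       Pass 2: glBindFramebuffer(GL_FRAMEBUFFER, 0), bind colorTex, and draw a
       fullscreen quad with the blur shader. */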
You get the current pixel's coordinates in the special fragment shader variable gl_FragCoord and use these as texture coordinates into the texture containing the previously rendered image, and likewise gl_FragCoord.x +/- 1 and gl_FragCoord.y +/- 1 for the neighbours. But like Matias said, you need to divide these values by the width and height of the image respectively, as texture coordinates are in [0,1]. By using GL_CLAMP_TO_EDGE as the wrapping mode for the texture, the edge cases are handled automatically by the texturing hardware. So at an edge you still get 9 values, but only 6 distinct ones (the other 3, the ones actually outside the image, are just duplicates of their inside neighbours).
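Putting that together, here is a sketch of the blur-pass fragment shader (GLSL ES 1.00), written as a C string the way it would be handed to glShaderSource; the uniform names u_image, u_resolution and u_texelSize are placeholders of my own:

    static const char *blur_frag_src =
        "precision mediump float;                                          \n"
        "uniform sampler2D u_image;     /* first-pass render, bound as texture */\n"
        "uniform vec2 u_resolution;     /* image width and height in pixels    */\n"
        "uniform vec2 u_texelSize;      /* 1.0/width, 1.0/height               */\n"
        "void main() {                                                     \n"
        "    /* gl_FragCoord is in pixels; divide to get [0,1] tex coords */\n"
        "    vec2 uv = gl_FragCoord.xy / u_resolution;                     \n"
        "    vec4 sum = vec4(0.0);                                         \n"
        "    for (int dy = -1; dy <= 1; ++dy)                              \n"
        "        for (int dx = -1; dx <= 1; ++dx)                          \n"
        "            sum += texture2D(u_image,                             \n"
        "                uv + vec2(float(dx), float(dy)) * u_texelSize);   \n"
        "    gl_FragColor = sum / 9.0; /* average of the 3x3 neighbourhood */\n"
        "}                                                                 \n";

The loops have constant bounds, so they stay within the loop restrictions of GLSL ES 1.00 and can be unrolled by the compiler.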
1) Yes, using an FBO is the way to go.
2) With math, if you are at pixel (x, y), then the neighbors are (x+1, y), (x, y+1), (x+1, y+1), (x-1, y), etc. Edge cases are handled with the wrap modes of the texture. Notice that since GL_TEXTURE_2D uses normalized coordinates, the offsets aren't 1, but 1 / width and 1 / height of the texture.
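For illustration, feeding those 1/width and 1/height offsets (and the first-pass texture) to the shader sketched above could look like this; blurProgram is assumed to be the compiled and linked blur program, and colorTex/width/height are as in the earlier setup sketch:

    glBindFramebuffer(GL_FRAMEBUFFER, 0);     /* blur pass renders to the screen */
    glUseProgram(blurProgram);

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, colorTex);   /* result of the first pass */
    glUniform1i(glGetUniformLocation(blurProgram, "u_image"), 0);

    glUniform2f(glGetUniformLocation(blurProgram, "u_resolution"),
                (GLfloat)width, (GLfloat)height);
    /* Neighbour offsets in normalized coordinates: 1/width and 1/height. */
    glUniform2f(glGetUniformLocation(blurProgram, "u_texelSize"),
                1.0f / (GLfloat)width, 1.0f / (GLfloat)height);

    /* ...then draw a fullscreen quad so the blur shader runs over every pixel. */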