fragment-shader

How exactly does OpenGL do perspective-correct linear interpolation?

我们两清 submitted on 2019-12-17 06:34:36
Question: If linear interpolation happens during the rasterization stage in the OpenGL pipeline, and the vertices have already been transformed to screen space, where does the depth information used for perspective-correct interpolation come from? Can anybody give a detailed description of how OpenGL goes from screen-space primitives to fragments with correctly interpolated values? Answer 1: The output of a vertex shader is a four-component vector, vec4 gl_Position. From Section 13.6 Coordinate
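The essential trick can be sketched outside of any shader (a simplified illustration, not the spec's exact rasterizer algorithm): the rasterizer interpolates each attribute divided by clip-space w, plus 1/w itself, linearly in screen space, then divides the two at each fragment.

```python
def perspective_correct_lerp(a0, w0, a1, w1, t):
    """Interpolate attribute a between two vertices with clip-space w values.
    a/w and 1/w are interpolated linearly in screen space, then divided."""
    a_over_w = (1 - t) * (a0 / w0) + t * (a1 / w1)
    one_over_w = (1 - t) * (1 / w0) + t * (1 / w1)
    return a_over_w / one_over_w

# With equal w the result matches plain linear interpolation:
print(perspective_correct_lerp(0.0, 1.0, 1.0, 1.0, 0.5))  # 0.5
# With unequal w the screen-space midpoint is pulled toward the nearer vertex:
print(perspective_correct_lerp(0.0, 1.0, 1.0, 4.0, 0.5))  # 0.2
```

This is why a texture on a tilted quad does not "swim": the division by the interpolated 1/w restores the non-linear variation of the attribute across the primitive.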

GLSL shader to boost the color of a texture

只愿长相守 submitted on 2019-12-13 11:11:25
Question: Is there any GLSL shader that can help me "boost" the color of a texture? How would I write such a shader? I would like to do something like the picture below: Answer 1: It seems they are mostly boosting saturation here, so that's what you'd need to do in your fragment shader. To boost saturation, you need to take your data from the RGB color space to the HSL (hue, saturation, lightness) space. You then need to do: hslColor = vec3(hslColor.x, hslColor.y * boost, hslColor.z); Of course to do that
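The same boost-the-saturation-channel idea can be prototyped on the CPU before porting it to GLSL. A minimal sketch using Python's standard colorsys module (which works in HLS, the same hue/lightness/saturation family the answer describes):

```python
import colorsys

def boost_saturation(rgb, boost):
    """Convert RGB -> HLS, scale the saturation channel, convert back.
    rgb is a (r, g, b) tuple of floats in [0, 1]; saturation is clamped."""
    h, l, s = colorsys.rgb_to_hls(*rgb)
    s = min(s * boost, 1.0)  # clamp so heavily saturated colors don't wrap
    return colorsys.hls_to_rgb(h, l, s)

# A washed-out red becomes a stronger red; approximately (0.7, 0.3, 0.3):
print(boost_saturation((0.6, 0.4, 0.4), 2.0))
```

In a fragment shader you would do the same per texel: sample the texture, convert RGB to HSL, multiply the saturation, convert back, and write the result.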

OpenGL ES diffuse shader not working

笑着哭i submitted on 2019-12-13 07:06:42
Question: I'm implementing simple ray tracing for spheres in a fragment shader, and I'm currently working on the function that computes the color of a diffusely shaded sphere. Here is the code for the function: vec3 shadeSphere(vec3 point, vec4 sphere, vec3 material) { vec3 color = vec3(1.,2.,3.); vec3 N = (point - sphere.xyz) / sphere.w; vec3 diffuse = max(dot(Ldir, N), 0.0); vec3 ambient = material/5; color = ambient + Lrgb * diffuse * max(0.0, N * Ldir); return color; } I'm getting errors on the two
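The likely type errors: dot() returns a scalar, so assigning it to a vec3 fails, and N * Ldir inside max(0.0, ...) is a componentwise vector product where a scalar is expected. A hedged sketch of the intended Lambertian shading, transliterated to Python (Ldir and Lrgb stand in for the shader's light-direction and light-color uniforms):

```python
def shade_sphere(point, sphere_center, sphere_radius, material, Ldir, Lrgb):
    """Lambertian diffuse plus a flat ambient term for a sphere hit point.
    All vector arguments are 3-tuples; Ldir must be normalized."""
    # Surface normal: direction from the sphere center, scaled by 1/radius.
    N = tuple((p - c) / sphere_radius for p, c in zip(point, sphere_center))
    # dot(N, Ldir) is a single scalar, clamped to zero for back-facing light.
    n_dot_l = max(sum(n * l for n, l in zip(N, Ldir)), 0.0)
    ambient = tuple(m / 5 for m in material)
    return tuple(a + lc * n_dot_l for a, lc in zip(ambient, Lrgb))

# Light shining along +z onto the top of a unit sphere at the origin:
print(shade_sphere((0, 0, 1), (0, 0, 0), 1.0, (0.5, 0.5, 0.5), (0, 0, 1), (1, 1, 1)))
```

In the GLSL version the fix is the same shape: make diffuse a float, and use that single clamped dot product rather than multiplying the vectors again.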

Will this sine approximation be faster than a built-in Cg sine function?

£可爱£侵袭症+ submitted on 2019-12-13 04:48:38
Question: I have some functions that are not really sines, but they are a lot quicker than conventional processing; they are simple parabola functions. Will this be faster on a graphics processor than the built-in sine function? float par(float xx) { // sine approximation half xd = (fmod(abs(xx), 2.4)) - 1.2; if (fmod(abs(xx), 4.8) > 2.4) { xd = (-xd*xd) + 2.88; } else { xd = xd*xd; } xd = -xd*0.694444444 + 1; if (xx < 0) { xd = -xd; } return xd; } Answer 1: MAIN ANSWER There is absolutely no way
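Before worrying about speed, it is worth checking what the function actually approximates. A direct Python port shows it produces a sine-shaped wave with period 4.8 rather than 2π, i.e. it tracks sin(x·π/2.4), with a worst-case error of roughly 0.056 (a quick empirical check, not a formal bound):

```python
import math

def par(xx):
    """Parabola-based sine approximation with period 4.8, so it tracks
    sin(x * pi / 2.4), not sin(x). Direct port of the Cg snippet."""
    xd = math.fmod(abs(xx), 2.4) - 1.2
    if math.fmod(abs(xx), 4.8) > 2.4:
        xd = -xd * xd + 2.88        # falling half of the wave
    else:
        xd = xd * xd                # rising half of the wave
    xd = -xd * 0.694444444 + 1.0    # 0.6944... is 1/1.44 = 1/1.2**2
    return -xd if xx < 0 else xd

# Sample densely and measure the worst deviation from the rescaled true sine:
err = max(abs(par(x / 100) - math.sin((x / 100) * math.pi / 2.4))
          for x in range(-1000, 1000))
print(err)
```

The peaks (par(1.2) = 1, par(3.6) = -1) are exact; the error concentrates on the shoulders of each half-wave, which is the usual behavior of a single-parabola sine approximation.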

Gradient multicolor metaballs with three.js and MarchingCubes

一个人想着一个人 submitted on 2019-12-13 04:04:44
Question: I am looking at how to implement something similar to this work: https://vimeo.com/9121195 , but with explicitly assigned colors for each metaball from an array of given colors. As far as I can see, this is done entirely on the shader side in that example, but I was wondering whether it could be implemented with three.js and MarchingCubes. By creating a new THREE.ShaderMaterial, I assume I can pass the positions of each metaball to the shader and then evaluate the distance of each vertex with
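The per-vertex evaluation the question proposes can be sketched independently of three.js (a hypothetical helper, assuming the metaballs use an inverse-square field): weight each ball's assigned color by its field contribution at the vertex and normalize, so the nearest/strongest ball dominates and colors blend smoothly where balls merge.

```python
def blend_metaball_color(vertex, balls):
    """balls: list of (center, strength, rgb) tuples. Returns the
    field-weighted average color at `vertex` (3-tuples of floats)."""
    weights, colors = [], []
    for center, strength, rgb in balls:
        d2 = sum((v - c) ** 2 for v, c in zip(vertex, center))
        weights.append(strength / max(d2, 1e-9))  # inverse-square falloff
        colors.append(rgb)
    total = sum(weights)
    return tuple(sum(w * c[i] for w, c in zip(weights, colors)) / total
                 for i in range(3))

# Midway between a red ball and a blue ball of equal strength -> even mix:
print(blend_metaball_color((0.5, 0, 0),
                           [((0, 0, 0), 1.0, (1, 0, 0)),
                            ((1, 0, 0), 1.0, (0, 0, 1))]))
```

In a ShaderMaterial the same loop would run per vertex (or per fragment) over uniform arrays of ball positions and colors.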

GLSL Shader - Coverflow Reflection of a 2D object

百般思念 submitted on 2019-12-13 00:04:12
Question: I want to write a shader that creates a reflection of an image similar to the ones used for coverflows. // Vertex Shader uniform highp mat4 u_modelViewMatrix; uniform highp mat4 u_projectionMatrix; attribute highp vec4 a_position; attribute lowp vec4 a_color; attribute highp vec2 a_texcoord; varying lowp vec4 v_color; varying highp vec2 v_texCoord; mat4 rot = mat4( -1.0, 0.0, 0.0, 0.0, 0.0, -1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0 ); void main() { gl_Position = (u
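Geometrically, a coverflow reflection is just the original quad mirrored about its bottom edge, drawn with an alpha that fades with distance from that edge. A minimal sketch of that math (hypothetical helpers, independent of the vertex shader above):

```python
def reflect_quad(quad, base_y):
    """Mirror a quad's vertices about the horizontal line y = base_y
    (the bottom edge of the original image)."""
    return [(x, 2 * base_y - y) for x, y in quad]

def reflection_alpha(v, strength=0.4):
    """Fade the reflection out with distance from the base edge; v is the
    vertical texture coordinate (0 at the base, 1 at the far edge)."""
    return max(0.0, strength * (1.0 - v))

# A unit quad sitting on y = 0 reflects to the strip below it:
print(reflect_quad([(0, 0), (1, 0), (1, 1), (0, 1)], 0))
```

In a shader pass you would draw the mirrored quad with the same texture (v flipped) and multiply the fragment alpha by reflection_alpha.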

Can GLSL be used instead of WebGL?

杀马特。学长 韩版系。学妹 submitted on 2019-12-12 17:27:50
Question: This may be a bit of a naive question, so please go easy on me. I was looking at shaders on shadertoy.com and I'm amazed at how small the GLSL code is for the 3D scenes. Digging deeper, I noticed that most of the shaders use a technique called ray marching. This technique makes it possible to avoid using vertices/triangles altogether and instead employ just the pixel shader and some math to create some pretty complex scenes. So I was wondering: why is it that 3D scenes often use triangle meshes with
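The ray-marching loop those Shadertoy scenes run per pixel is small enough to sketch in full. A minimal sphere-tracing example in Python (the sphere scene is an assumption for illustration; Shadertoy shaders do this in GLSL against far more elaborate distance fields):

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 3.0), radius=1.0):
    """Signed distance from point p to a sphere (the assumed test scene)."""
    return math.dist(p, center) - radius

def ray_march(origin, direction, sdf, max_steps=64, eps=1e-4, max_dist=100.0):
    """Sphere tracing: step along the ray by the scene's signed distance,
    which is always a safe step, until within eps of a surface or too far."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t          # hit: distance along the ray
        t += d
        if t > max_dist:
            break
    return None               # miss

# A ray from the origin straight down +z hits the unit sphere at t = 2:
print(ray_march((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), sphere_sdf))
```

The appeal is that the "scene description" is just a distance function, which is why whole worlds fit in a few dozen lines; the cost is evaluating that function many times per pixel, which is part of why meshes still dominate in games.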

How to create and use very large palette textures in OpenGL?

被刻印的时光 ゝ submitted on 2019-12-12 17:25:42
Question: Details: I have a GLSL fragment shader with a uniform texture, "u_MapTexture", with several thousand colors on it (a maximum of about 10k-15k unique RGB values). I also have a uniform palette texture ("u_paletteTexture") that is 16384 × 1 which I want to use to index the colors of u_MapTexture. My problem is that no matter what I try mathematically, I can't seem to properly map the colors from the first texture to the palette texture using the RGB values of the passed color. Any thoughts or
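One reason a purely arithmetic mapping keeps failing: 24-bit RGB has 2^24 possible values, while a 16384-wide texture can only address 2^14 slots, so no formula over arbitrary RGB triples can be collision-free. A workable scheme (a sketch, assuming the unique-color set is known up front) is to build the palette on the CPU and bake each texel's palette index instead of trying to derive it from the color in the shader:

```python
def build_palette(image_pixels):
    """image_pixels: list of (r, g, b) tuples in 0-255.
    Returns (palette, index_map) where index_map[rgb] -> palette slot."""
    palette, index_map = [], {}
    for rgb in image_pixels:
        if rgb not in index_map:       # first time we see this color
            index_map[rgb] = len(palette)
            palette.append(rgb)
    return palette, index_map

def encode_indices(image_pixels, index_map):
    """Replace each pixel with its palette index; this is what the shader
    would sample and use to address the 16384-wide palette texture."""
    return [index_map[rgb] for rgb in image_pixels]

pixels = [(10, 20, 30), (0, 0, 0), (10, 20, 30)]
palette, index_map = build_palette(pixels)
print(len(palette), encode_indices(pixels, index_map))  # 2 [0, 1, 0]
```

With 10k-15k unique colors this fits comfortably in 16384 slots, and the fragment shader reduces to a single dependent texture fetch instead of any RGB arithmetic.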

Unity 3D sprite shader (how do I limit max brightness to 1 with multiple lights hitting?)

送分小仙女□ submitted on 2019-12-12 10:09:23
Question: I am creating a video game in Unity. Every sprite is rendered with a Sprite Renderer using a Material that has the CornucopiaShader.shader. The problem: I want to limit the max brightness (or color) of the sprite so that it looks like a normal, unlit image of the sprite, regardless of how many point lights are hitting it, the intensity of those lights, or the ambient light in the Unity scene. When the intensity of the lights hitting the sprite is below that max brightness level I want it
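The requested behavior amounts to accumulating all light contributions first and clamping the total to 1 before multiplying by the sprite color, rather than clamping each light separately. A minimal sketch of that order of operations (illustrative Python, not the CornucopiaShader itself):

```python
def clamp_lit_color(base_rgb, light_contributions):
    """Sum per-light contributions per channel, then clamp the total to 1
    so stacked lights can never push the sprite past its unlit colors."""
    light = [0.0, 0.0, 0.0]
    for contrib in light_contributions:
        light = [l + c for l, c in zip(light, contrib)]
    light = [min(l, 1.0) for l in light]          # the brightness cap
    return tuple(b * l for b, l in zip(base_rgb, light))

# Three overlapping lights sum to 1.5x, but the clamp holds the sprite
# at exactly its source-texture colors:
print(clamp_lit_color((0.8, 0.6, 0.4), [(0.5,) * 3, (0.5,) * 3, (0.5,) * 3]))
```

Below the cap the sum passes through unchanged, which matches the asker's second requirement; in a Unity surface or fragment shader the same min() goes on the accumulated light term, not on each light.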

SKShader to create a parallax background

有些话、适合烂在心里 submitted on 2019-12-12 08:13:44
Question: A parallax background with a fixed camera is easy to do, but since I'm making a top-down-view 2D space exploration game, I figured that having a single SKSpriteNode filling the screen, being a child of my SKCameraNode, and using an SKShader to draw a parallax starfield would be easier. I went on Shadertoy and found this simple-looking shader. I adapted it successfully on Shadertoy to accept a vec2() for the velocity of the movement, which I want to pass as an SKAttribute so it can follow the
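The core of any such starfield shader is the per-layer offset math: each star layer scrolls its sampling coordinates by a fraction of the camera's motion, with smaller fractions for more distant layers. A sketch of that calculation (illustrative Python; in the SKShader this happens per fragment using the passed vec2):

```python
def parallax_offsets(camera_pos, layer_factors):
    """Per-layer UV offset for a parallax starfield. camera_pos is the
    camera's (x, y); each factor scales its motion (small = far away)."""
    cx, cy = camera_pos
    return [(cx * f, cy * f) for f in layer_factors]

# Camera moved (10, 4): the far layer crawls, the near layer almost keeps pace:
print(parallax_offsets((10.0, 4.0), [0.1, 0.5, 0.9]))
```

Passing the camera position (or accumulated velocity) as an SKAttribute and adding the scaled offset to each layer's noise coordinates reproduces the moving-camera parallax without any extra nodes.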