fragment-shader

Vertex shader vs Fragment Shader [duplicate]

梦想与她 submitted on 2019-11-28 15:14:24
This question already has an answer here: What are Vertex and Pixel shaders? (5 answers). I've read some tutorials regarding Cg, yet one thing is not quite clear to me. What exactly is the difference between vertex and fragment shaders? And in what situations is one better suited than the other? The Surrican: A fragment shader is the same as a pixel shader. One main difference is that a vertex shader can manipulate the attributes of vertices, which are the corner points of your polygons. The fragment shader, on the other hand, takes care of how the pixels between the vertices look. They are …
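
To make the division of labor concrete, here is a minimal, made-up GLSL pair (two separate shader objects, shown together):

    // --- Vertex shader: runs once per vertex; positions the corner
    // points and passes per-vertex data downstream. ---
    attribute vec3 a_position;
    attribute vec3 a_color;
    uniform mat4 u_mvpMatrix;
    varying vec3 v_color;

    void main() {
        gl_Position = u_mvpMatrix * vec4(a_position, 1.0);
        v_color = a_color; // interpolated across the triangle
    }

    // --- Fragment shader (separate shader object): runs once per
    // covered pixel; receives interpolated values and picks the color. ---
    precision mediump float;
    varying vec3 v_color;

    void main() {
        gl_FragColor = vec4(v_color, 1.0);
    }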

3D texture in WebGL/three.js using 2D texture workaround?

谁说我不能喝 submitted on 2019-11-28 14:04:58
I would like to use some 3D textures for objects that I'm rendering in WebGL. I'm currently using the following method in a fragment shader, as suggested on WebGL and OpenGL Differences:

    // tex is a texture with each slice of the cube placed horizontally across the texture.
    // texCoord is a 3d texture coord
    // size is the size of the cube in pixels.
    vec4 sampleAs3DTexture(sampler2D tex, vec3 texCoord, float size) {
        float sliceSize = 1.0 / size;                         // space of 1 slice
        float slicePixelSize = sliceSize / size;              // space of 1 pixel
        float sliceInnerSize = slicePixelSize * (size - 1.0); // space of …
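
The excerpt is cut off mid-function. For completeness, here is a sketch of how the function plausibly continues, following the slice-blending approach its comments describe (the variable names past the truncation point are my assumption):

    // Sketch: sample the two horizontal slices nearest to texCoord.z
    // and blend between them to approximate trilinear filtering.
    vec4 sampleAs3DTexture(sampler2D tex, vec3 texCoord, float size) {
        float sliceSize = 1.0 / size;                         // space of 1 slice
        float slicePixelSize = sliceSize / size;              // space of 1 pixel
        float sliceInnerSize = slicePixelSize * (size - 1.0); // space of size pixels
        float zSlice0 = min(floor(texCoord.z * size), size - 1.0);
        float zSlice1 = min(zSlice0 + 1.0, size - 1.0);
        float xOffset = slicePixelSize * 0.5 + texCoord.x * sliceInnerSize;
        float s0 = xOffset + zSlice0 * sliceSize;
        float s1 = xOffset + zSlice1 * sliceSize;
        vec4 slice0Color = texture2D(tex, vec2(s0, texCoord.y));
        vec4 slice1Color = texture2D(tex, vec2(s1, texCoord.y));
        float zOffset = mod(texCoord.z * size, 1.0);
        return mix(slice0Color, slice1Color, zOffset);
    }

Blending the two adjacent slices stands in for the filtering along the z axis that real hardware 3D textures would provide.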

OpenGL - How to access depth buffer values? - Or: gl_FragCoord.z vs. Rendering depth to texture

馋奶兔 submitted on 2019-11-28 06:04:47
I want to access the depth buffer value at the currently processed pixel in a pixel shader. How can we achieve this goal? Basically, there seem to be two options: render depth to a texture (how do we do this, and what is the tradeoff?), or use the value provided by gl_FragCoord.z (but is this the correct value?). On question 1: you can't directly read from the depth buffer in the fragment shader (unless there are recent extensions I'm not familiar with). You need to render to a Frame Buffer Object (FBO). Typical steps: create and bind an FBO, look up calls like glGenFramebuffers and …
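
A minimal sketch of the render-depth-to-texture setup (desktop OpenGL 3.x; the texture format choice and the width/height variables are assumptions):

    /* Sketch: create an FBO with a depth texture attachment. */
    GLuint depthTex, fbo;
    glGenTextures(1, &depthTex);
    glBindTexture(GL_TEXTURE_2D, depthTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, width, height,
                 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_2D, depthTex, 0);
    /* Check glCheckFramebufferStatus here. Render the depth pre-pass,
       then rebind the default framebuffer and bind depthTex as a
       sampler in the second pass. */
    glBindFramebuffer(GL_FRAMEBUFFER, 0);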

What does `precision mediump float` mean?

坚强是说给别人听的谎言 submitted on 2019-11-28 03:55:12
In the learningwebgl tutorial1 I've found an interesting line in the fragment shader: precision mediump float; I've found an article about it here, but I still can't understand what it means. And if I remove this line, nothing changes; everything is the same. So what does precision mediump float mean? HowDoIDoComputer: This determines how much precision the GPU uses when calculating floats. highp is high precision, and of course more intensive than mediump (medium precision) and lowp (low precision). Some systems do not support highp at all, which will cause code not to work at all on …
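
For illustration, the line sets the default precision for all floats in that shader stage, and individual declarations can override it; a made-up minimal fragment shader:

    // Minimal GLSL ES fragment shader illustrating precision qualifiers.
    precision mediump float;      // default precision for floats below

    uniform lowp vec4 uTint;      // per-declaration override: lowp is
                                  // usually enough for colors in [0, 1]
    varying highp vec2 vTexCoord; // highp for texture coordinates, where
                                  // mediump can cause visible artifacts
    uniform sampler2D uTexture;

    void main() {
        gl_FragColor = texture2D(uTexture, vTexCoord) * uTint;
    }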

Drawing a border on a 2d polygon with a fragment shader

倖福魔咒の submitted on 2019-11-27 18:12:02
I have some simple polygons (fewer than 20 vertices) rendering flat on a simple xy plane using GL_TRIANGLES and a flat color, for a 2D simulation. I would like to add a border of variable thickness and a different color to these polygons. I have something implemented using the same vertices and glLineWidth/GL_LINE_LOOP, which works, but it is another rendering pass and repeats all the vertex transforms. I think I should be able to do this in the fragment shader using gl_FragCoord and the vertex data and/or texture coordinates, but I'm not sure, and my naive attempts have been obviously incorrect. I …
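
For reference, the existing two-pass approach described above amounts to roughly the following (a sketch; the vertex-count and thickness variables are assumptions):

    /* Sketch of the two-pass approach: fill, then outline.
       Assumes the same vertex buffer is bound for both draws. */
    glDrawArrays(GL_TRIANGLES, 0, fillVertexCount);    /* pass 1: flat fill */

    glLineWidth(borderThickness);                      /* pass 2: border */
    glDrawArrays(GL_LINE_LOOP, 0, outlineVertexCount);

The cost the poster objects to is visible here: the vertex shader runs again over the outline vertices in the second draw call.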

Simulating palette swaps with OpenGL Shaders (in LibGDX)

 ̄綄美尐妖づ submitted on 2019-11-27 14:58:36
Question: I'm trying to use LibGDX to make a retro-style little game, and I'd like to let players choose the colors of several characters, so I thought about loading indexed PNG images and then updating the palettes programmatically... How wrong I was ^^U It seems that color palettes are something of the past, and it also seems that the best option for achieving a similar result is to use shaders. Here is an image explaining what I'm trying right now: My intention is to use 2 images. One of them, pixel …
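
The two-image idea is usually implemented as a palette lookup in the fragment shader: one texture stores a per-pixel index, and the second stores the actual colors. A minimal GLSL ES sketch (all uniform names here are made up):

    // Palette-swap sketch: the red channel of u_indexTexture encodes a
    // palette index; u_palette is an Nx1 lookup texture of real colors.
    precision mediump float;
    varying vec2 v_texCoords;
    uniform sampler2D u_indexTexture; // grayscale "indexed" character image
    uniform sampler2D u_palette;      // Nx1 strip of output colors

    void main() {
        float index = texture2D(u_indexTexture, v_texCoords).r;
        gl_FragColor = texture2D(u_palette, vec2(index, 0.5));
    }

Swapping palettes then only requires binding a different 1-pixel-tall palette texture; the character image never changes.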

Draw Quadratic Curve on GPU

走远了吗. submitted on 2019-11-27 08:57:22
My task is to render a quadratic Bezier curve (path) via Stage3D (Adobe Flash) technology, which has no extensions for that kind of drawing out of the box (while OpenGL does, as far as I know). Yes, there is Starling-Extension-Graphics, but it uses a simple method that divides a curve segment into many straight lines, which generates a great many triangles for my long curve path. So... there is an elegant technique for rendering resolution-independent shapes by Loop and Blinn. I've read the GPU Gems 3 article (gpugems3_ch25.html) and ported that fragment shader to AGAL2: Quadratic Curve Pixel Shader float4 QuadraticPS(float2 …
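
For context, the shader from that chapter evaluates the curve's implicit function u² − v per pixel and antialiases using screen-space derivatives. Below is a GLSL adaptation of the idea (the original listing is HLSL; this adaptation is my sketch, not the poster's AGAL2 port):

    // Loop/Blinn quadratic curve fragment shader (GPU Gems 3, ch. 25),
    // adapted to GLSL. p holds (u, v) curve coordinates assigned to the
    // triangle's vertices as (0,0), (0.5,0), (1,1).
    varying vec2 p;
    uniform vec4 color;

    void main() {
        // Gradients of f(u,v) = u^2 - v via the chain rule.
        vec2 px = dFdx(p);
        vec2 py = dFdy(p);
        float fx = (2.0 * p.x) * px.x - px.y;
        float fy = (2.0 * p.x) * py.x - py.y;

        // Approximate signed distance from the pixel to the curve.
        float sd = (p.x * p.x - p.y) / sqrt(fx * fx + fy * fy);

        // Map distance to coverage for antialiasing.
        float alpha = clamp(0.5 - sd, 0.0, 1.0);
        if (alpha <= 0.0) discard; // fully outside the curve
        gl_FragColor = vec4(color.rgb, color.a * alpha);
    }

Note that dFdx/dFdy require the OES_standard_derivatives extension on GLSL ES 1.0 targets; on AGAL the derivative opcodes play the same role.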

Fragment Shader - Average Luminosity

梦想与她 submitted on 2019-11-27 07:02:59
Question: Does anybody know how to find the average luminosity of a texture in a fragment shader? I have access to both RGB and YUV textures; the Y component in YUV is an array, and I want to get an average value from this array. Answer 1: I recently had to do this myself for input images and video frames that I had as OpenGL ES textures. I didn't go with generating mipmaps for these due to the fact that I was working with non-power-of-two textures, and you can't generate mipmaps for NPOT textures in OpenGL ES …
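
The per-pixel half of such a computation is typically a weighted luma conversion followed by successive downsampling passes to a 1x1 target; a sketch of the luma pass (the Rec. 709 weights are standard, everything else here is an assumption):

    // First pass of an average-luminance reduction: convert RGB to luma.
    // Repeated downsampling passes with bilinear filtering then shrink
    // the result toward a single averaged value.
    precision mediump float;
    varying vec2 v_texCoord;
    uniform sampler2D u_inputTexture;

    void main() {
        vec3 rgb = texture2D(u_inputTexture, v_texCoord).rgb;
        float luma = dot(rgb, vec3(0.2126, 0.7152, 0.0722)); // Rec. 709
        gl_FragColor = vec4(vec3(luma), 1.0);
    }

For a YUV source the conversion step disappears: the Y plane already is the luma, so only the downsampling chain is needed.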