fragment-shader

Shader wireframe of an object

百般思念 submitted on 2019-12-08 13:48:32
Question: I want to see a wireframe of an object without the diagonals. Currently I add lines according to the vertices, but the problem is that after I have several of those I experience a major performance degradation. The examples here are either too new for my version of Three.js or don't work (I commented there about it), so I want to try to implement a shader instead. I tried to use this shader: https://stackoverflow.com/a/31610464/4279201 but it breaks the shape into parts and I'm getting WebGL errors.
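
A hedged sketch of the barycentric-coordinate technique from the linked answer, adapted so quad diagonals can be hidden. It assumes the JavaScript side gives every vertex a vBarycentric attribute of (1,0,0), (0,1,0) or (0,0,1) per triangle corner; to hide a diagonal, set the component belonging to the vertex opposite that edge to 1.0 at both of the edge's endpoints, so the minimum below never reaches zero along it:

precision mediump float;
varying vec3 vBarycentric; // (1,0,0) / (0,1,0) / (0,0,1) per triangle corner

void main() {
    // the smallest component is the barycentric distance to the nearest edge
    float edge = min(min(vBarycentric.x, vBarycentric.y), vBarycentric.z);
    if (edge > 0.02) discard; // interior fragment: not part of the wireframe
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
}

Because the wireframe is drawn by the object's own fragments, no extra line geometry is added, which is where the performance degradation came from.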

How to modify/displace pixel position in a Cg fragment shader?

左心房为你撑大大i submitted on 2019-12-08 06:05:21
Question: Is it possible to modify pixel coordinates in a fragment (pixel) shader using Cg? I'm sure such functionality became available in 2nd/3rd-generation shaders, but I don't know in which profiles exactly, or how to do it. Answer 1: No, it is not possible. The only coordinate you can modify in a fragment shader is Z, going into the Z-buffer, and even that has performance implications, as it defeats some optimizations (like Hierarchical Z). The X and Y positions are set before the fragment shader is ever executed (in the rasterizer); typical rasterizers actually generate fragments in chunks of at least 2x2 pixels.

WebGL: How to bind an array of samplers

情到浓时终转凉″ submitted on 2019-12-08 02:12:22
Question: As mentioned here, it should be possible to "bind all the textures you need to a sampler array in the shader and then index it with a vertex attribute". How would I do the binding? Currently I bind my textures like so (if that's even correct in the first place; at least it works):

sampler[i] = gl.getUniformLocation(program, "u_sampler" + i);
...
for (var i = 0, len = textures.length; i < len; i++) {
    gl.activeTexture(gl.TEXTURE0 + i);
    gl.bindTexture(gl.TEXTURE_2D, textures[i]);
    gl.uniform1i(sampler[i], i);
}
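
As a hedged sketch of the sampler-array variant (the name u_samplers and the count of 4 are assumptions, not from the question), the whole array can be pointed at texture units with a single gl.uniform1iv call:

// GLSL (fragment shader): uniform sampler2D u_samplers[4];
// Caveat: WebGL 1 only allows indexing sampler arrays with constant
// expressions in fragment shaders, so "index it with a vertex attribute"
// usually turns into an if/else chain over the index in practice.

var loc = gl.getUniformLocation(program, "u_samplers");
for (var i = 0; i < textures.length; i++) {
    gl.activeTexture(gl.TEXTURE0 + i); // put each texture on its own unit
    gl.bindTexture(gl.TEXTURE_2D, textures[i]);
}
gl.uniform1iv(loc, [0, 1, 2, 3]); // element k of the array samples unit k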

Texture Mapping and Lighting Vertex Shader Error, Java OpenGL

孤街浪徒 submitted on 2019-12-07 14:07:36
Question: I am trying to map a texture onto a 3D cube and to write shaders so that it has both lighting and texture. I have tried writing a texture-only shader and it works. I have also tried a lighting-only shader with the cube set to a red color, and that light shader works too. But when I try to combine the two, I have a problem. I have provided my code below, but I am getting an error that the attached vertex shader is not compiled and the shader could not be linked. Vertex shader:

#version 330 core
layout(location
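
For reference, a hedged sketch (not the asker's code) of a minimal GLSL 330 pair that combines texturing with simple diffuse lighting; every name in it (aPos, aNormal, aTexCoord, uMVP, uTex, uLightDir) is an illustrative assumption. A frequent cause of the "vertex shader is not compiled / shader could not be linked" combination is an out variable in the vertex stage that does not match an in variable in the fragment stage by name and type, which this sketch keeps consistent:

// --- vertex shader ---
#version 330 core
layout(location = 0) in vec3 aPos;
layout(location = 1) in vec3 aNormal;
layout(location = 2) in vec2 aTexCoord;
uniform mat4 uMVP;
out vec3 vNormal;   // must match the fragment stage's "in" by name and type
out vec2 vTexCoord;
void main() {
    vNormal = aNormal;
    vTexCoord = aTexCoord;
    gl_Position = uMVP * vec4(aPos, 1.0);
}

// --- fragment shader ---
#version 330 core
in vec3 vNormal;
in vec2 vTexCoord;
uniform sampler2D uTex;
uniform vec3 uLightDir;  // normalized light direction, same space as vNormal
out vec4 fragColor;
void main() {
    float diffuse = max(dot(normalize(vNormal), uLightDir), 0.0);
    vec3 base = texture(uTex, vTexCoord).rgb;
    fragColor = vec4(base * (0.2 + 0.8 * diffuse), 1.0); // ambient + diffuse
}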

OpenGL - Fixed pipeline shader defaults (Mimic fixed pipeline with shaders)

梦想的初衷 submitted on 2019-12-07 08:59:04
Question: Can anyone provide me with shaders similar to the fixed-function pipeline? I mostly need the default fragment shader, because I found a similar vertex shader online, but if you have a pair, that would be fine! I want to use the fixed pipeline but have the flexibility of shaders, so I need similar shaders to be able to mimic the functionality of the fixed pipeline. Thank you very much! I'm new here, so if you need more information tell me :D This is what I would like to replicate:
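
As a hedged starting point rather than a complete answer, the pair below mimics the most common fixed-function configuration, one GL_MODULATE texture combined with the per-vertex color, using the legacy compatibility built-ins (gl_Vertex, gl_Color, gl_MultiTexCoord0, gl_ModelViewProjectionMatrix) that the fixed pipeline fed automatically; only uTexture0 is an invented name:

// --- vertex shader ---
#version 120
varying vec2 vTexCoord;
varying vec4 vColor;
void main() {
    vTexCoord = gl_MultiTexCoord0.st; // fixed-function texture unit 0
    vColor = gl_Color;                // fixed-function primary color
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// --- fragment shader ---
#version 120
uniform sampler2D uTexture0;
varying vec2 vTexCoord;
varying vec4 vColor;
void main() {
    // GL_MODULATE: texture color multiplied by the primary color
    gl_FragColor = texture2D(uTexture0, vTexCoord) * vColor;
}

A fuller fixed-function emulation (per-vertex lighting, fog, multiple texture stages) follows the same pattern but grows quickly.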

GLSL integration function

可紊 submitted on 2019-12-07 06:36:24
Question: Any recommendations on how to implement efficient integral functions, like SumX and SumY, in GLSL shaders?

SumX(u) = integral with respect to x = I(u0, y) + I(u1, y) + ... + I(uN, y), where u is the normalized x coordinate
SumY(v) = integral with respect to y = I(x, v0) + I(x, v1) + ... + I(x, vN), where v is the normalized y coordinate

For instance, the 5th pixel of the first line would be the sum of all five pixels on the first line, and the last pixel would be the sum of all previous pixels, including the last pixel itself.
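
A hedged brute-force sketch of SumX for a GLSL ES (WebGL-style) fragment shader, assuming the input is bound to u_image and is 256 texels wide; u_image, vUv and WIDTH are all assumptions. Each fragment walks its whole row and keeps the texels at or to the left of its own column. An efficient implementation would instead render log2(N) ping-pong passes, where pass k adds the texel 2^k steps away, but the single-pass version shows the idea:

precision mediump float;
uniform sampler2D u_image;
varying vec2 vUv;
const int WIDTH = 256; // GLSL ES 1.0 requires a compile-time loop bound

void main() {
    vec4 sum = vec4(0.0);
    for (int i = 0; i < WIDTH; i++) {
        float u = (float(i) + 0.5) / float(WIDTH); // center of texel i
        if (u <= vUv.x) sum += texture2D(u_image, vec2(u, vUv.y));
    }
    gl_FragColor = sum; // running sum up to and including this pixel
}

Note that sums above 1.0 are clamped in a standard 8-bit render target, so a float target (e.g. via OES_texture_float) is effectively required.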

WebGL: alternative to writing to gl_FragDepth

℡╲_俬逩灬. submitted on 2019-12-07 02:21:54
Question: In WebGL, is it possible to write to the fragment's depth value or control the fragment's depth value in some other way? As far as I could find, gl_FragDepth is not present in WebGL 1.x, but I am wondering if there is any other way (extensions, browser-specific support, etc.) to do it. What I want to achieve is to have a ray-traced object play along with other elements drawn using the usual model, view, projection. Answer 1: There is the extension EXT_frag_depth. Because it's an extension it might not be available everywhere, so check for it before relying on it.
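
A hedged sketch of the availability check plus a fragment shader that writes depth through the extension; the 0.5 stands in for whatever depth the ray tracer computes:

var ext = gl.getExtension('EXT_frag_depth');
if (!ext) {
    // extension missing: need a fallback (or require WebGL 2, see below)
}

var fs =
    '#extension GL_EXT_frag_depth : require\n' +
    'precision mediump float;\n' +
    'void main() {\n' +
    '    gl_FragColor = vec4(1.0);\n' +
    '    gl_FragDepthEXT = 0.5; // write the ray-traced hit depth here\n' +
    '}\n';

In WebGL 2 (GLSL ES 3.00), gl_FragDepth is part of the core language, so no extension is needed there.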

Why do I need to define a precision value in webgl shaders?

我与影子孤独终老i submitted on 2019-12-06 17:39:09
Question: I'm trying to get this tutorial to work but I ran into two issues, one of which is the following. When I run the code as is, I get an error in the fragment shader saying: THREE.WebGLShader: gl.getShaderInfoLog() ERROR: 0:2: '' : No precision specified for (float). So what I did was specify a precision for every float/vector I define, like so: varying highp vec3 vNormal. This eliminates the error, but I don't get why. I can't find any other example where precision values are added to variable declarations.
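
The reason: in GLSL ES, which WebGL uses, the fragment stage has no default precision for float (the vertex stage defaults to highp), so every float-based variable in a fragment shader must get a precision from somewhere. Instead of qualifying each declaration, a single default-precision statement at the top of the shader covers them all; a minimal sketch:

precision highp float; // default precision for every float/vec below
varying vec3 vNormal;  // no per-variable "highp" needed anymore

void main() {
    gl_FragColor = vec4(normalize(vNormal) * 0.5 + 0.5, 1.0);
}

mediump is the common choice on mobile, since highp fragment-shader support is optional in WebGL 1.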

How do vertex and fragment shaders communicate in OpenGL?

被刻印的时光 ゝ submitted on 2019-12-06 12:35:40
I really do not understand how the fragment shader works. I know that a vertex shader runs once per vertex and a fragment shader runs once per fragment. Since the fragment shader works per fragment rather than per vertex, how can the vertex shader send data to it? The number of vertices and the number of fragments are not equal, so how is it decided which fragments belong to which vertex? To make sense of this, you'll need to consider the whole render pipeline. The outputs of the vertex shader (besides the special output gl_Position) are passed along as "associated data" of the vertex to the next stages in the pipeline.
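
A minimal sketch of that "associated data" in GLSL 1.x terms (aPosition, aColor, vColor are illustrative names): the vertex shader writes a varying once per vertex, the rasterizer interpolates it across each triangle, and the fragment shader reads the interpolated value once per fragment:

// vertex shader: runs once per vertex
attribute vec3 aPosition;
attribute vec3 aColor;
varying vec3 vColor; // associated data handed to the rasterizer
void main() {
    vColor = aColor;
    gl_Position = vec4(aPosition, 1.0);
}

// fragment shader: runs once per fragment
precision mediump float;
varying vec3 vColor; // arrives interpolated across the triangle
void main() {
    gl_FragColor = vec4(vColor, 1.0);
}

So no fragment "belongs" to a single vertex; each fragment belongs to a primitive, and its inputs are blends of that primitive's vertex outputs.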