fragment-shader

How can I add a uniform width outline to WebGL shader drawn circles/ellipses (drawn using edge/distance antialiasing)

笑着哭i submitted on 2019-12-12 05:26:55
Question: I am drawing circles/ellipses in WebGL using a single quad and a fragment shader, in order to draw them in a resolution-independent manner (edge-distance anti-aliasing). Here is my fragment shader currently:

#extension GL_OES_standard_derivatives : enable
precision mediump float;
varying vec2 coord;
vec4 circleColor = vec4(1.0, 0.5, 0.0, 1.0);
vec4 outlineColor = vec4(0.0, 0.0, 0.0, 1.0);
uniform float strokeWidth;
float outerEdgeCenter = 0.5 - strokeWidth;
void main(void
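A common way to finish a shader like this is to place the outline between two smoothstep() edges, with fwidth() supplying the anti-aliasing band. The following is only a sketch under assumptions, not the poster's full code: coord is assumed to be the quad's UV remapped to [-0.5, 0.5], and strokeWidth is assumed to be in those same units.

```glsl
#extension GL_OES_standard_derivatives : enable
precision mediump float;

varying vec2 coord;          // assumed: quad UV remapped to [-0.5, 0.5]
uniform float strokeWidth;   // outline width, in the same UV units

vec4 circleColor  = vec4(1.0, 0.5, 0.0, 1.0);
vec4 outlineColor = vec4(0.0, 0.0, 0.0, 1.0);

void main(void) {
    float dist = length(coord);   // distance from the circle center
    float aa   = fwidth(dist);    // ~one screen pixel, in dist units

    // fill -> outline at (0.5 - strokeWidth), outline -> transparent at 0.5
    float toOutline = smoothstep(0.5 - strokeWidth - aa, 0.5 - strokeWidth, dist);
    float toOutside = smoothstep(0.5 - aa, 0.5, dist);

    vec4 color = mix(circleColor, outlineColor, toOutline);
    gl_FragColor = vec4(color.rgb, color.a * (1.0 - toOutside));
}
```

Because aa comes from fwidth(), both edges stay about one pixel soft at any zoom level; to make the stroke uniform in screen pixels rather than UV units, strokeWidth itself can additionally be scaled by fwidth(dist).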

OpenGL Reflection shader showing only grey

偶尔善良 submitted on 2019-12-12 05:25:15
Question: I'm very new to OpenGL. I've been working on setting up skyboxes and finally fixed them thanks to some help here, but now the reflection shader I've tried to set up by editing some I've found (so that a sphere will have a basic reflection effect based on the cube map of my skybox) shows no color but grey, as in the image: http://i.imgur.com/Th56Phg.png I'm having no luck figuring it out. Here is my shader code:

Vertex Shader
#version 330 core
attribute vec3 position;
attribute vec2
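A uniform grey result often means the cubemap is never actually being sampled: the samplerCube uniform was never set with glUniform1i, or the cube map texture is not bound to the expected texture unit when drawing. For reference, a minimal cubemap-reflection fragment shader looks something like the sketch below; the variable names (worldPos, worldNormal, cameraPos, skybox) are assumptions, not the poster's code.

```glsl
#version 330 core
// Sample the skybox cubemap along the reflected eye ray.
in vec3 worldPos;            // assumed: passed from the vertex shader
in vec3 worldNormal;         // assumed: world-space normal
uniform vec3 cameraPos;
uniform samplerCube skybox;
out vec4 fragColor;

void main() {
    vec3 I = normalize(worldPos - cameraPos);    // eye -> surface direction
    vec3 R = reflect(I, normalize(worldNormal)); // mirrored direction
    fragColor = vec4(texture(skybox, R).rgb, 1.0);
}
```

If a shader like this still renders grey, check that glActiveTexture/glBindTexture(GL_TEXTURE_CUBE_MAP, ...) is called before the draw and that the unit number matches the sampler uniform.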

Shader Texture Always Facing The Camera

浪子不回头ぞ submitted on 2019-12-11 19:19:10
Question: I have this basic shader, but I ran into some trouble that really bugs me! I'm applying the texture using the fragment shader; however I move or rotate the camera, though, the texture on the face will always face the camera (I've added a GIF image as an example). So, as you can see in the image above, the texture keeps facing the camera; also, if I move away or closer, it keeps its scale. I'm trying to achieve that I can move around and the texture will keep its position, rotation and
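This symptom (texture glued to the camera, constant on-screen scale) usually means the fragment shader is sampling with screen-space coordinates, e.g. something derived from gl_FragCoord or the projected position, instead of interpolated per-vertex UVs. A hedged sketch of the usual fix, with assumed attribute/uniform names, is to pass the mesh's own UVs through the vertex shader:

```glsl
// Vertex shader: forward the mesh's UVs so the texture is pinned to
// the surface rather than the screen. Names here are assumptions.
attribute vec3 position;
attribute vec2 uv;
uniform mat4 modelViewMatrix;
uniform mat4 projectionMatrix;
varying vec2 vUv;

void main() {
    vUv = uv;  // interpolate per-vertex UVs, not screen coordinates
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
```

The fragment shader would then use texture2D(map, vUv) rather than coordinates computed from gl_FragCoord and the resolution.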

function with bool return type in OpenGL ES shader using GPUImage

房东的猫 submitted on 2019-12-11 14:33:35
Question: I'm working on an iOS project using the GPUImage framework, and I cannot get my shader to compile. There's a function in my fragment shader:

const vec2 boundMin = vec2(0.0, 0.0);
const vec2 boundMax = vec2(1.0, 1.0);
bool inBounds (vec2 p) {
    return all(lessThan(boundMin, p)) && all(lessThan(p, boundMax));
}

Shader compile log:

ERROR: 0:1: '_Bool' : syntax error syntax error

When I replace all the calls to inBounds(vec2 p) with all(lessThan(boundMin, p)) && all(lessThan(p, boundMax)), it works
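The '_Bool' in the log is a strong hint about the likely cause: when the shader source is wrapped in a C macro (as with GPUImage's SHADER_STRING), it passes through the C preprocessor, and with stdbool.h in scope the token bool can be macro-expanded to _Bool, which GLSL then rejects. A workaround consistent with that reading is to avoid the bare bool token in the shader, for example:

```glsl
const vec2 boundMin = vec2(0.0, 0.0);
const vec2 boundMax = vec2(1.0, 1.0);

// Avoid the literal token "bool": inside a C macro such as
// SHADER_STRING the preprocessor may rewrite it to _Bool (stdbool.h).
// Returning a float (1.0 = inside, 0.0 = outside) sidesteps that.
float inBounds(vec2 p) {
    return float(all(lessThan(boundMin, p)) && all(lessThan(p, boundMax)));
}
```

Call sites then test `if (inBounds(p) > 0.5) { ... }`. Loading the shader from a file or plain string, so it never goes through the C preprocessor, is an alternative fix.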

OpenGL GL_POLYGON Not Functioning Properly

假如想象 submitted on 2019-12-11 13:38:11
Question: I have an OpenGL-related issue. Whenever I attempt to draw a simple polygon using four vertices from a vertex buffer... nothing happens. However, it will draw the shape in GL_TRIANGLES or GL_TRIANGLE_STRIP mode, albeit distorted. Am I doing something wrong? Relevant code:
Vertex array: http://i.imgur.com/nEcbw.png
GL_POLYGON: http://i.imgur.com/idfFT.png
GL_TRIANGLES: http://imgur.com/84ey3,idfFT,nEcbw#0
GL_TRIANGLE_STRIP: http://i.imgur.com/JU3Zl.png
Answer 1: I'm using a forward-compatible 3.2

Why is this tutorial example of a shader not displaying any geometry as it is supposed to?

二次信任 submitted on 2019-12-11 08:54:52
Question: This is an example from http://pyopengl.sourceforge.net/context/tutorials/shader_1.xhtml. It creates a VBO, binds it, and runs it with a shader, but somewhere along the way something is not working properly. :\

from OpenGLContext import testingcontext
BaseContext = testingcontext.getInteractive()
from OpenGL.GL import *
from OpenGL.arrays import vbo
from OpenGLContext.arrays import *
from OpenGL.GL import shaders

class TestContext( BaseContext ):
    def OnInit( self ):
        VERTEX_SHADER =

Combining Two Shaders into One Shader

僤鯓⒐⒋嵵緔 submitted on 2019-12-11 08:03:22
Question: Unity project. I want to combine these two shaders into one shader that has both of their functionality: one shader is for lighting, the other is for better rendering. How do I combine them?

Shader "Transparent/Cutout/Lit3dSprite" {
    Properties {
        _MainCol("Main Tint", Color) = (1,1,1,1)
        _MainTex("Main Texture", 2D) = "white" {}
        _Cutoff("Alpha cutoff", Range(0,1)) = 0.5
    }
    SubShader {
        Tags { "Queue" = "AlphaTest" "IgnoreProjector" = "True" "RenderType" = "TransparentCutout" "PreviewType" = "Plane" }

Three.js: Objects intersected and shader material

你。 submitted on 2019-12-11 07:18:23
Question: I have a scene with intersecting objects using Lambert material, like in this jsFiddle. Now I need/want to switch the material of that plane to a ShaderMaterial, and the plane turns into a background thing, like here. The question is: can I use different materials on intersecting objects and still preserve the intersection effect? Is this a Three.js limitation, or is this how shaders work? Or am I missing a parameter in the renderer/material? At the moment it is not an option to switch all my work to shader

Three.JS: Gaussian blur in GLSL shader

本秂侑毒 submitted on 2019-12-11 06:04:07
Question: I have this vert/frag shader pair, which uses vertex data and two textures. I am trying to apply a post blur effect, but am getting only rectangles after it.

vert:
attribute float type;
attribute float size;
attribute float phase;
attribute float increment;
uniform float time;
uniform vec2 resolution;
uniform sampler2D textureA;
uniform sampler2D textureB;
varying float t;
void main() {
    t = type;
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    if(t == 0.) {
        gl_PointSize = size * 0.8;
    }
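For reference, a post-process Gaussian blur is normally a separate full-screen pass that samples an already-rendered framebuffer texture (e.g. via Three.js's EffectComposer), not extra code inside the particle shader itself; blurring point sprites in place is one plausible way to end up with rectangles. Below is a hedged sketch of one pass of a standard separable 5-tap blur; the uniform names (tDiffuse, resolution, direction) are assumptions.

```glsl
// One pass of a separable Gaussian blur. Run once with
// direction = vec2(1.0, 0.0) and again with vec2(0.0, 1.0).
precision mediump float;
uniform sampler2D tDiffuse;  // assumed: the rendered scene texture
uniform vec2 resolution;
uniform vec2 direction;
varying vec2 vUv;

void main() {
    vec2 texel = direction / resolution;
    // 5-tap kernel with linear-sampling offsets; weights sum to ~1.0
    vec4 sum = texture2D(tDiffuse, vUv) * 0.227027;
    sum += texture2D(tDiffuse, vUv + texel * 1.3846) * 0.3162162;
    sum += texture2D(tDiffuse, vUv - texel * 1.3846) * 0.3162162;
    sum += texture2D(tDiffuse, vUv + texel * 3.2308) * 0.0702703;
    sum += texture2D(tDiffuse, vUv - texel * 3.2308) * 0.0702703;
    gl_FragColor = sum;
}
```

Two cheap passes along perpendicular axes are equivalent to one full 2D Gaussian kernel, which is why blurs are almost always written this way.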

Volumetric Fog Shader - Camera Issue

流过昼夜 submitted on 2019-12-11 06:03:59
Question: I am trying to build an infinite fog shader. The fog is applied to a 3D plane. For the moment I have a Z-depth fog, and I am encountering some issues. As you can see in the screenshot, there are two views. The green color is my 3D plane. The problem is the red line: this line seems to depend on my camera, which is not good, because when I rotate the camera the line is affected by the camera's position and rotation. I don't know where it comes from or how to make my fog limit not based
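A fog boundary that swings around with the camera is the classic symptom of fogging by view-space depth (the camera-relative Z value): rotating the camera changes every fragment's Z even though nothing moved. Fogging by radial world-space distance (or by world-space height, for a ground fog) keeps the limit fixed. A minimal sketch under assumed names (vWorldPos, cameraPos, fogNear, fogFar):

```glsl
// Fog based on world-space distance, so the fog boundary does not
// move when the camera rotates. Variable names are assumptions.
precision mediump float;
varying vec3 vWorldPos;      // world-space position from the vertex shader
uniform vec3 cameraPos;
uniform vec3 fogColor;
uniform float fogNear;
uniform float fogFar;

void main() {
    vec4 sceneColor = vec4(0.0, 0.5, 0.0, 1.0);  // placeholder plane color
    float dist = length(vWorldPos - cameraPos);  // radial distance, not view Z
    float fogAmount = clamp((dist - fogNear) / (fogFar - fogNear), 0.0, 1.0);
    gl_FragColor = mix(sceneColor, vec4(fogColor, 1.0), fogAmount);
}
```

Rotation changes a fragment's view-space Z but not length(vWorldPos - cameraPos), so with this formulation the fog limit stays put as the camera turns.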