pixel-shading

Blinn-Phong shading with NumPy

大兔子大兔子 submitted on 2019-12-24 03:23:49
Question: I am trying to implement Blinn-Phong shading in NumPy for educational purposes, but I have been stuck for several days debugging what the parameters are doing. My general idea was the following: since the equation is given per channel, I apply the model to each color channel to get the relative pixel intensities in that channel, then regroup the channels back together to form the whole image. My Lambertian coefficient does not seem to take light position changes into account, but it does
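A minimal per-pixel Blinn-Phong sketch in NumPy, under assumed conventions (the function name, array shapes, and coefficient defaults are hypothetical, not from the question): with broadcasting, the model applies to all three channels at once, so no per-channel loop and regrouping is needed, and the diffuse term must be recomputed and clamped for every pixel.

```python
import numpy as np

def blinn_phong(normals, points, light_pos, view_pos,
                base_color, light_color,
                ka=0.1, kd=0.7, ks=0.2, shininess=32.0):
    """Per-pixel Blinn-Phong. normals/points: (H, W, 3) arrays;
    base_color/light_color: (3,) RGB values in [0, 1]."""
    n = normals / np.linalg.norm(normals, axis=-1, keepdims=True)
    l = light_pos - points                       # light direction per pixel
    l = l / np.linalg.norm(l, axis=-1, keepdims=True)
    v = view_pos - points                        # view direction per pixel
    v = v / np.linalg.norm(v, axis=-1, keepdims=True)
    h = l + v                                    # Blinn half-vector
    h = h / np.linalg.norm(h, axis=-1, keepdims=True)
    # Lambertian term: depends on the light position and is clamped at 0.
    diff = np.clip(np.sum(n * l, axis=-1, keepdims=True), 0.0, None)
    spec = np.clip(np.sum(n * h, axis=-1, keepdims=True), 0.0, None) ** shininess
    # Broadcasting applies the same model to all three channels at once.
    return np.clip(ka * base_color
                   + kd * diff * base_color * light_color
                   + ks * spec * light_color, 0.0, 1.0)
```

A symptom like "the Lambertian coefficient ignores the light position" usually means `l` was computed once globally instead of per pixel, as above.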

How exactly does OpenGL do perspectively correct linear interpolation?

我们两清 submitted on 2019-12-17 06:34:36
Question: If linear interpolation happens during the rasterization stage in the OpenGL pipeline, and the vertices have already been transformed to screen space, where does the depth information used for perspective-correct interpolation come from? Can anybody give a detailed description of how OpenGL goes from screen-space primitives to fragments with correctly interpolated values? Answer 1: The output of a vertex shader is a four-component vector, vec4 gl_Position . From Section 13.6 Coordinate Transformations of the core GL 4.4 spec: Clip coordinates for a vertex result from shader execution, which yields a
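The missing depth information is the clip-space w of each vertex, which survives into rasterization. A sketch of the standard formula (the function and argument names are illustrative): the rasterizer linearly interpolates a/w and 1/w in screen space using screen-space barycentric weights, then divides, which undoes the non-linearity of the perspective division.

```python
def perspective_correct(lams, attrs, ws):
    """Interpolate a vertex attribute at a fragment.
    lams:  screen-space barycentric weights (sum to 1)
    attrs: per-vertex attribute values
    ws:    clip-space w of each vertex
    a/w and 1/w are linear in screen space; their ratio recovers
    the perspective-correct attribute value."""
    num = sum(l * a / w for l, a, w in zip(lams, attrs, ws))
    den = sum(l / w for l, w in zip(lams, ws))
    return num / den
```

At the screen-space midpoint of an edge whose far vertex has the larger w, the result is pulled toward the near vertex, exactly the behavior naive linear interpolation lacks; when all w are equal the formula reduces to plain linear interpolation.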

Lambertian Shader not working

柔情痞子 submitted on 2019-12-12 03:31:53
Question: I'm trying to make a Lambertian shader for my ray tracer, but am having trouble. The scene still seems to be flat shaded, just a little darker, such as in this picture. This is my shader class:

public class LambertianShader {
    public Colour diffuseColour;

    public LambertianShader(Colour diffuseColour) {
        this.diffuseColour = diffuseColour;
    }

    public Colour shade(Intersection intersection, Light light) {
        Vector3D lightDirection = light.location.subtract(intersection.point);
        lightDirection.normalise(
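"Flat shaded, just darker" usually points to an unnormalized light direction, a missing clamp, or an n·l that never varies per intersection. A plain-Python sketch of the intended computation (vector-as-tuple representation and function names are illustrative, not the asker's API):

```python
import math

def lambertian(colour, normal, point, light_pos, light_intensity=1.0):
    """Diffuse term: colour * intensity * max(0, n.l).
    Both vectors must be unit length, and the light direction l
    must be recomputed for every intersection point."""
    l = tuple(lp - p for lp, p in zip(light_pos, point))
    norm = math.sqrt(sum(c * c for c in l))
    l = tuple(c / norm for c in l)
    n_dot_l = max(0.0, sum(n * c for n, c in zip(normal, l)))
    return tuple(ch * light_intensity * n_dot_l for ch in colour)
```

If `normalise` mutates in place but its return value is discarded (or vice versa), n·l ends up scaled by the distance to the light, which washes out the angular variation that makes surfaces look curved.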

Android GLES20.glBlendEquation not working?

旧巷老猫 submitted on 2019-12-01 18:26:27
Question: I've been trying to make a 2.5D engine with depth and normal map textures for a few weeks now, not unlike what's used here Linky. After concluding that drawing a depth map in the fragment shader from a texture was impossible, because ES 2.0 is missing the gl_FragDepth variable, I found a tutorial for iOS where glBlendEquation with the GL_MIN/GL_MAX modes is used to "fake" depth buffering of the fragment into a framebuffer texture Linky. Unfortunately, GLES20.glBlendEquation makes the application crash on both my phones (SGS 1/2) with an UnsupportedOperationException. So I'm wondering if anyone has used this
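The trick being faked here is simple: a GL_MIN blend keeps, per pixel, the smallest depth written so far, which is exactly what a depth test does. A NumPy sketch of that idea (not GL code; names are illustrative) makes the intent concrete; on ES 2.0 devices the real equivalent requires the EXT_blend_minmax extension, which is worth querying before calling glBlendEquation.

```python
import numpy as np

def min_blend(depth_buffer, incoming_depth, mask):
    """Emulate glBlendEquation(GL_MIN) on a depth texture: each pixel
    covered by `mask` keeps the smaller of the stored and incoming
    depth, behaving like a per-pixel depth test."""
    return np.where(mask, np.minimum(depth_buffer, incoming_depth), depth_buffer)
```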
