shader

OpenGL ignores Quads and makes them Triangles

时光毁灭记忆、已成空白 posted on 2020-12-21 02:47:40
Question: This is the second time I'm making a game engine, but I'm a little stuck right now, since I cannot figure out why this is happening: no matter what object I send, OpenGL only draws a white triangle in the center of the screen, like this. I've even copied my old code from my last engine for the Renderer and the Camera objects, and it still acts the same, so I'm guessing it has something to do with the render script. Renderer:

    Renderer2D::Renderer2D(const Shader& shader)
    {
        this->shader = shader;
        this-
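
A hedged aside, since the post above is cut off: in a core OpenGL profile there is no GL_QUADS primitive, so a quad has to be submitted as two triangles, usually through an index buffer, and a single white triangle often means the draw call uses an index count of 3 instead of 6. A minimal sketch under that assumption (the ebo handle and index layout are hypothetical, not from the question's code):

    // Two triangles that together cover one quad (vertices 0..3).
    unsigned int ebo;
    unsigned int indices[6] = {
        0, 1, 2,   // first triangle
        2, 3, 0    // second triangle
    };
    glGenBuffers(1, &ebo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
    // Later, with the VAO and this element buffer bound:
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, 0);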

Cube to Sphere mapping (inverse function wanted)

时光怂恿深爱的人放手 posted on 2020-12-15 06:16:30
Question: I've come across a cube-to-sphere mapping function that provides a more uniform result than just normalizing the coordinates or other mapping methods. Unfortunately there is no unwrapping function. Source: http://mathproofs.blogspot.com/2005/07/mapping-cube-to-sphere.html

    vec3 spherify(vec3 v)
    {
        float x2 = v.x * v.x;
        float y2 = v.y * v.y;
        float z2 = v.z * v.z;
        vec3 s;
        s.x = v.x * sqrt(1.0 - y2 / 2.0 - z2 / 2.0 + y2 * z2 / 3.0);
        s.y = v.y * sqrt(1.0 - x2 / 2.0 - z2 / 2.0 + x2 * z2 / 3.0);
        s.z = v.z * sqrt(1.0 - x2 / 2.0 - y2 / 2.0 + x2 * y2 / 3.0); // scrape cuts off mid-line; restored by symmetry with s.x and s.y
        return s;
    }
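
A hedged sketch of an inverse (my own derivation, not from the linked page): restrict to points that came from the +Z cube face, where v.z == 1.0. The forward mapping then reduces to s.x^2 = x^2 * (1/2 - y^2/6) and s.y^2 = y^2 * (1/2 - x^2/6), which can be solved as a quadratic in x^2:

    // Inverse of spherify() for the +Z face only (v.z == 1.0).
    vec3 unspherify_posz(vec3 s)
    {
        float sx2 = s.x * s.x;
        float sy2 = s.y * s.y;
        float t = 3.0 + 2.0 * (sx2 - sy2);
        // x^2 is the smaller root of: a*a - t*a + 6.0*sx2 = 0
        float x2 = 0.5 * (t - sqrt(max(t * t - 24.0 * sx2, 0.0)));
        float y2 = x2 - 2.0 * (sx2 - sy2);
        return vec3(sign(s.x) * sqrt(max(x2, 0.0)),
                    sign(s.y) * sqrt(max(y2, 0.0)),
                    1.0);
    }

The other five faces follow by permuting the coordinates and flipping signs accordingly.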

Is it possible to declare a shader variable as both input and output?

喜夏-厌秋 posted on 2020-12-12 02:07:30
Question: I'm using both a vertex shader and a geometry shader. My vertex shader does nothing more than forward its input to the geometry shader.

    #version 330 core
    layout (location = 0) in uint xy;
    layout (location = 1) in uint znt;

    out uint out_xy;
    out uint out_znt;

    void main()
    {
        out_xy = xy;
        out_znt = znt;
    }

Is it possible to declare xy and znt as both an input and an output, so that I don't need to rename them? Answer 1: You cannot "declare" them that way, but you can use interface blocks, which can give
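
A hedged illustration of where that truncated answer is headed (the block name VertexData, the instance names, and the geometry-shader layout are my own choices): with a named interface block, the member names can stay xy and znt, because block members live in the block's scope rather than the global one.

    // Vertex shader: forward attributes through a named interface block.
    #version 330 core
    layout (location = 0) in uint xy;
    layout (location = 1) in uint znt;

    out VertexData {
        uint xy;   // no clash with the attribute 'xy': block scope
        uint znt;
    } vs_out;

    void main()
    {
        vs_out.xy = xy;
        vs_out.znt = znt;
    }

    // Geometry shader (separate file): matching block, same member names.
    #version 330 core
    layout (points) in;
    layout (points, max_vertices = 1) out;

    in VertexData {
        uint xy;
        uint znt;
    } gs_in[];

    void main()
    {
        // gs_in[0].xy and gs_in[0].znt carry the forwarded values.
        gl_Position = vec4(0.0);  // placeholder
        EmitVertex();
        EndPrimitive();
    }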

iPad GLSL. From within a fragment shader how do I get the surface - not vertex - normal

会有一股神秘感。 posted on 2020-12-10 06:33:43
Question: Is it possible to access the surface normal - the normal associated with the plane of a fragment - from within a fragment shader? Or perhaps this can be done in the vertex shader? Is all knowledge of the associated geometry lost when we go down the shader pipeline, or is there some clever way of recovering that information in either the vertex or fragment shader? Thanks in advance. Cheers, Doug (twitter: @dugla) Answer 1: You can get per-pixel normals interpolated from vertex normals by just using a
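
A hedged sketch of the standard flat-normal trick (my addition; the answer above is cut off before its own code): the screen-space derivatives of the interpolated position both lie in the triangle's plane, so their cross product is the face normal. On iPad-era OpenGL ES 2.0 this needs the GL_OES_standard_derivatives extension, and v_worldPos is an assumed varying:

    #extension GL_OES_standard_derivatives : enable
    precision highp float;

    varying vec3 v_worldPos;  // world-space position from the vertex shader

    void main()
    {
        // dFdx/dFdy: change of the position across one pixel in x and y.
        // Both vectors span the triangle's plane, so their cross product
        // is the flat surface normal, not an interpolated vertex normal.
        vec3 faceNormal = normalize(cross(dFdx(v_worldPos), dFdy(v_worldPos)));
        gl_FragColor = vec4(faceNormal * 0.5 + 0.5, 1.0);  // visualize it
    }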

iPad GLSL. From within a fragment shader how do I get the surface - not vertex - normal

本秂侑毒 提交于 2020-12-10 06:31:07
问题 Is it possible to access the surface normal - the normal associated with the plane of a fragment - from within a fragment shader? Or perhaps this can be done in the vertex shader? Is all knowledge of the associated geometry lost when we go down the shader pipeline or is there some clever way of recovering that information in either the vertex of fragment shader? Thanks in advance. Cheers, Doug twitter: @dugla 回答1: You can get per-pixel normals interpolated from vertex normales by just using a

what do texture2D().r and texture2D().a mean?

走远了吗. posted on 2020-12-10 00:57:53
Question: I am using OpenGL ES in Android programming. When I transform YUV (NV21) to RGB in a shader, like:

    vec3 yuv = vec3(
        (texture2D(u_TextureY, vTextureCoord).r - 0.0625),
        texture2D(u_TextureUV, vTextureCoord).a - 0.5,
        texture2D(u_TextureUV, vTextureCoord).r - 0.5
    );

then I'll get the YUV data separated out of u_TextureY and u_TextureUV. I know that the NV21 format is like: YYYYYY...UVUV... BUT how can I transform YUYV422 to RGB? So, my problem is what do "r" and "a" mean in texture2D(u_TextureY,
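
A hedged note, since the page cuts off here: when the Y plane is uploaded as a GL_LUMINANCE texture, its single byte is replicated into .r/.g/.b, and when the interleaved chroma plane is uploaded as GL_LUMINANCE_ALPHA, the first byte of each pair lands in .r and the second in .a. That is all the swizzles select. A full-conversion sketch, assuming those texture formats and BT.601 video-range coefficients:

    precision mediump float;
    varying vec2 vTextureCoord;
    uniform sampler2D u_TextureY;   // GL_LUMINANCE: Y byte -> .r
    uniform sampler2D u_TextureUV;  // GL_LUMINANCE_ALPHA: byte pair -> .r / .a

    void main()
    {
        vec3 yuv = vec3(
            texture2D(u_TextureY,  vTextureCoord).r - 0.0625,
            texture2D(u_TextureUV, vTextureCoord).a - 0.5,
            texture2D(u_TextureUV, vTextureCoord).r - 0.5);
        // BT.601 video-range YUV -> RGB; mat3 is column-major.
        vec3 rgb = mat3(1.164,  1.164, 1.164,
                        0.0,   -0.392, 2.017,
                        1.596, -0.813, 0.0) * yuv;
        gl_FragColor = vec4(rgb, 1.0);
    }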

Shader - Calculate depth relative to Object

◇◆丶佛笑我妖孽 posted on 2020-12-08 04:47:46
Question: I am trying to calculate depth relative to the object. Here is a good solution for retrieving depth relative to the camera: Depth as distance to camera plane in GLSL

    varying float distToCamera;

    void main()
    {
        vec4 cs_position = gl_ModelViewMatrix * gl_Vertex;
        distToCamera = -cs_position.z;
        gl_Position = gl_ProjectionMatrix * cs_position;
    }

With this example the depth is relative to the camera. But I would like to get the depth relative to the object. I would like the same depth and value if I am near
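
A hedged sketch of one way to get there (my guess at what the truncated question is after): subtract the camera-space depth of the object's own origin, so the value stays the same whether the whole object is near or far:

    varying float distToObject;

    void main()
    {
        vec4 cs_position = gl_ModelViewMatrix * gl_Vertex;
        // Camera-space depth of the object's origin, model-space (0,0,0).
        float originDepth = -(gl_ModelViewMatrix * vec4(0.0, 0.0, 0.0, 1.0)).z;
        // Depth of this vertex relative to the object's origin plane.
        distToObject = -cs_position.z - originDepth;
        gl_Position = gl_ProjectionMatrix * cs_position;
    }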