vertex-attributes

How do I make this simple OpenGL code (works in a “lenient” 3.3 and 4.2 profile) work in a strict 3.2 and 4.2 core profile?

Submitted by 一笑奈何 on 2019-12-05 05:39:14
I had some 3D code that I noticed wouldn't render in a strict core profile but rendered fine in a "normal" (not explicitly requested-as-core-only) profile context. To isolate the issue, I have written the smallest, simplest possible OpenGL program drawing just a triangle and a rectangle: I have posted that OpenGL program as a Gist here. With the useStrictCoreProfile variable set to false, the program outputs no error messages to the console and draws a quad and a triangle as per the above screenshot, both on an Intel HD with OpenGL 3.3 and on a GeForce with OpenGL 4.2. However, with useStrictCoreProfile…
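For context, the most common cause of this symptom is that a strict core profile has no default vertex array object: glVertexAttribPointer and the draw calls raise GL_INVALID_OPERATION unless a VAO is bound, whereas compatibility ("lenient") contexts let you work with VAO 0 implicitly. A minimal sketch of the required setup, assuming a GL 3.2+ core context and loader are already created (not the questioner's actual code):

```c
/* Sketch: in a strict core profile there is no default VAO, so a VAO
 * must be generated and bound before any vertex attribute setup or
 * draw call. Context creation and error handling are omitted. */
GLuint vao;
glGenVertexArrays(1, &vao);  /* create one vertex array object        */
glBindVertexArray(vao);      /* bind it BEFORE glVertexAttribPointer  */

GLuint vbo;
glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);

/* Now the draw call is legal in a core profile. */
glDrawArrays(GL_TRIANGLES, 0, 3);
```

The attribute pointer state is recorded in the bound VAO, so one VAO per mesh (or one shared VAO rebound before each draw) is the idiomatic core-profile pattern.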

Using GL_INT_2_10_10_10_REV in glVertexAttribPointer()

Submitted by 你。 on 2019-12-04 19:15:24
Can anybody tell me how exactly we use GL_INT_2_10_10_10_REV as the type parameter in glVertexAttribPointer()? I am trying to pass color values using this type. Also, what is the significance of the "REV" suffix in this type? Does it require any special treatment in the shaders? My code is as follows: GLuint red=1023, green=1023, blue=1023, alpha=3; GLuint val = 0; val = val | (alpha << 30); val = val | (blue << 20); val = val | (green << 10); val = val | (red << 0); GLuint test_data[] = {val, val, val, val}; loadshaders(); glBindAttribLocation(ps, 0, "tk_position"); glBindAttribLocation(ps, 1, "color");…

Can you tell if a vertex attribute is enabled from within a vertex shader?

Submitted by 江枫思渺然 on 2019-11-29 14:05:56
I was wondering if there was a way to tell whether a vertex attribute is enabled from within a vertex shader. I know that if the vertex attribute is disabled all the values will be treated as 0.0, so I could do a test like the following: if (attribute == 0) { // Do something different to normal. } else { // Use the attribute. } But this has the obvious problem for the case where the attribute is enabled and the value is just set to 0 (it will be treated as if it's disabled)! The other solution would be to just use a uniform variable that states whether or not to use the attribute, but I wondered if…
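As far as we know, GLSL has no built-in that exposes the enabled state of an attribute array to the shader, so the uniform-flag approach from the question is the usual answer; the application can query the state itself rather than tracking it manually. A hedged sketch of the CPU-side query, assuming a current GL context ("useAttribute" is a hypothetical uniform name, not from the question's code):

```c
/* Ask GL whether the attribute array at attribIndex is enabled, then
 * forward the answer to the shader as a uniform flag. Requires a
 * current context; error handling omitted for brevity. */
GLint enabled = 0;
glGetVertexAttribiv(attribIndex, GL_VERTEX_ATTRIB_ARRAY_ENABLED, &enabled);

glUniform1i(glGetUniformLocation(program, "useAttribute"), enabled);
```

Note that when an attribute array is disabled, the shader actually sees the current generic attribute value set via glVertexAttrib* (which defaults to (0, 0, 0, 1)), not necessarily 0.0 in every component.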
