Vertex shader error while passing mesh face index using glVertex4i

Backend · Unresolved · 3 answers · 1271 views
爱一瞬间的悲伤 2021-01-29 13:52

Initializing GL_List for processing.

glBegin(GL_POINTS);
for (i = 0; i < faceNum; i++)
{
    mesh->GetFaceNodes(i   // the snippet is cut off here in the original post

3 Answers
  • 2021-01-29 14:00

    The code in the question does not make any sense. The GLSL keyword varying was chosen because it reflects the property that the data will be different for each fragment, due to the automatic interpolation across the primitive. This happens only between the last programmable shader stage before the rasterizer and the fragment shader.

    In the beginning, there was only the vertex shader and the fragment shader. The VS would get attributes as input and output to varyings, which would be interpolated and become inputs to the FS.

    With the introduction of the Geometry Shader (core in GL 3.2 / GLSL 1.50, earlier available as an extension), this scheme did not make sense any more. The outputs of the VS are not interpolated any more, but become direct inputs of the GS. As a result, the keywords attribute and varying were deprecated (starting with GLSL 1.30) and replaced by the more general in / out scheme.

    As a result, a GS with varying cannot exist. You either use legacy GLSL which doesn't support Geometry Shaders, or you use a newer GLSL with in/out.
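    To illustrate the modern in/out interface the answer describes, here is a minimal GLSL 1.50 sketch of a vertex-plus-geometry pipeline (the variable names vsValue/gsValue are made up for this example, not taken from the question):

    ```glsl
    // -- vertex shader (GLSL 1.50) --
    #version 150
    in vec4 position;
    out float vsValue;                 // plain output: goes straight to the GS, not interpolated
    void main() { vsValue = position.w; gl_Position = position; }

    // -- geometry shader (GLSL 1.50) --
    #version 150
    layout(points) in;
    layout(points, max_vertices = 1) out;
    in  float vsValue[];               // array: one entry per input vertex
    out float gsValue;                 // this one IS interpolated for the fragment shader
    void main() {
        gl_Position = gl_in[0].gl_Position;
        gsValue = vsValue[0];
        EmitVertex();
        EndPrimitive();
    }
    ```

    Note that only the geometry shader's outputs are interpolated across the primitive; the vertex shader's outputs are consumed per-vertex by the GS.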

  • 2021-01-29 14:02

    Since the shader does not contain any version information, it is an OpenGL Shading Language 1.10 Specification shader.

    In GLSL 1.10, varying variables of type int are not allowed, and implicit casts from int to float are not supported. In GLSL 1.10 there are also no in and out variables; the keyword for interface variables is varying.

    Furthermore, the variable color is not defined in the fragment shader.

    varying float final_tofrag; // the name has to match the varying in the fragment shader
    
    void main( void )
    {
        final_tofrag = gl_Vertex.w;
    
        // [...]
    }
    
    varying float final_tofrag;
    
    void main( void )
    {
        if (final_tofrag < 0.0) // ?
             gl_FragData[0] = vec4(final_tofrag, final_tofrag, -gl_FragCoord.z, 0.0);
        else
             gl_FragData[0] = vec4(final_tofrag, final_tofrag, gl_FragCoord.z, 0.0);
    }
    

    I recommend checking whether the shader compilation succeeded and whether the program object linked successfully.

    Whether the compilation of a shader succeeded can be checked with glGetShaderiv and the parameter GL_COMPILE_STATUS. e.g.:

    #include <iostream>
    #include <vector>
    
    bool CompileStatus( GLuint shader )
    {
        GLint status = GL_TRUE;
        glGetShaderiv( shader, GL_COMPILE_STATUS, &status );
        if (status == GL_FALSE)
        {
            GLint logLen;
            glGetShaderiv( shader, GL_INFO_LOG_LENGTH, &logLen );
            std::vector< char >log( logLen );
            GLsizei written;
            glGetShaderInfoLog( shader, logLen, &written, log.data() );
            std::cout << "compile error:" << std::endl << log.data() << std::endl;
        }
        return status != GL_FALSE;
    }
    

    Whether the linking of a program succeeded can be checked with glGetProgramiv and the parameter GL_LINK_STATUS. e.g.:

    bool LinkStatus( GLuint program )
    {
        GLint status = GL_TRUE;
        glGetProgramiv( program, GL_LINK_STATUS, &status );
        if (status == GL_FALSE)
        {
            GLint logLen;
            glGetProgramiv( program, GL_INFO_LOG_LENGTH, &logLen );
            std::vector< char >log( logLen );
            GLsizei written;
            glGetProgramInfoLog( program, logLen, &written, log.data() );
            std::cout << "link error:" << std::endl << log.data() << std::endl;
        }
        return status != GL_FALSE;
    }
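    A hedged usage sketch of the two helpers above (assumes a current OpenGL context is already set up; CreateProgram is a made-up name, and error cleanup of the shader objects is abbreviated):

    ```cpp
    // Sketch only: requires a current OpenGL context, so it cannot run standalone.
    GLuint CreateProgram( const char* vsSource, const char* fsSource )
    {
        GLuint vs = glCreateShader( GL_VERTEX_SHADER );
        glShaderSource( vs, 1, &vsSource, nullptr );
        glCompileShader( vs );
        if ( !CompileStatus( vs ) ) return 0;   // prints the compile log on failure

        GLuint fs = glCreateShader( GL_FRAGMENT_SHADER );
        glShaderSource( fs, 1, &fsSource, nullptr );
        glCompileShader( fs );
        if ( !CompileStatus( fs ) ) return 0;

        GLuint prog = glCreateProgram();
        glAttachShader( prog, vs );
        glAttachShader( prog, fs );
        glLinkProgram( prog );
        return LinkStatus( prog ) ? prog : 0;   // prints the link log on failure
    }
    ```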
    
  • 2021-01-29 14:12

    It would help if you actually posted the compilation error in your question, otherwise we don't know what your error is.

    So, since I'm taking random guesses in the dark, I'll make a couple of guesses here.

    1. You are assigning a float to an integer, which might be giving you a conversion error.

        // this might now compile, but it will probably only ever give you
        // zero or one. Was that the intent?
        final = int(gl_Vertex.w);

    2. You are NOT writing to gl_Position within your vertex shader. If you don't write to that value, OpenGL cannot execute your vertex shader.

    3. In your fragment shader, you are checking the value color.z, but you have not declared color as a uniform, shader input, or const.

    4. Whilst this won't cause a compilation error, dividing final (an integer whose value is 1 or 0?) by an integer value of 100 or 1000 is only ever going to give you zero or one. Was the intention to use final as a float rather than an integer?

    5. You are mixing integers and floats within the vec4 declaration in your fragment shader. This might be causing the compiler to baulk.
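    Pulling those guesses together, a minimal GLSL 1.10 shader pair that avoids all of the above might look like this (a sketch only; the variable name faceIndex and the divisor are made up, not taken from the asker's code):

    ```glsl
    // Vertex shader: keep the value as a float, and write gl_Position.
    varying float faceIndex;           // float, so no int/float conversion issues

    void main( void )
    {
        faceIndex   = gl_Vertex.w;     // pass the index through as a float
        gl_Position = gl_ModelViewProjectionMatrix * vec4( gl_Vertex.xyz, 1.0 );
    }

    // Fragment shader: only read variables that are actually declared,
    // and use float literals inside the vec4 constructor.
    varying float faceIndex;

    void main( void )
    {
        gl_FragColor = vec4( faceIndex / 1000.0, 0.0, 0.0, 1.0 );
    }
    ```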

    Unfortunately, without access to the GLSL error log, there isn't going to be anything anyone can do to identify your problem beyond what I've listed above.
