Rendering issue with different computers

小蘑菇 2021-01-07 14:02

So I am making a sort of tower defense game. I shared a build with my friends so I could check whether everything performs as it should on another host.

And what actually happened is that everything rendered fine on my machine, but on my friends' computers some models drew incorrectly: a tree model ended up stacked onto one of my towers.

2 Answers
  • 2021-01-07 14:40

    After further debugging sessions with my friends and many tryouts, I managed to find the problem. It took me two solid days to figure out, and really it was just a silly mistake.

    // Bug: GetVerticesSize () returns the number of floats, not the number of vertices
    glDrawArrays(GL_TRIANGLES, 0, aVBO.GetVerticesSize());
    

    The call above does not pass the number of vertices in the buffer but the total number of floats stored there, so the count is three times too large. Adding a /3 solved it.

    So I assume that, since the vertex count was three times too large, the draw call read past the end of the VBO and "stole" data from other VBOs stored on the GPU (hence the tree model stuck to my tower).
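
    As an aside, here is a minimal sketch of the distinction; the VBO wrapper and its method names are hypothetical, reconstructed from the snippet above:

    // Hypothetical VBO wrapper (names invented) illustrating floats vs. vertices
    #include <cstddef>
    #include <vector>
    
    class VBO {
    public:
      // Total number of floats uploaded (what the buggy call used)
      std::size_t GetFloatCount () const { return m_data.size (); }
    
      // Number of vertices: 3 floats (x, y, z) per vertex (what glDrawArrays expects)
      std::size_t GetVertexCount () const { return m_data.size () / 3; }
    
    private:
      std::vector<float> m_data;
    };
    
    // Correct draw call:
    //   glDrawArrays (GL_TRIANGLES, 0, (GLsizei)vbo.GetVertexCount ());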

    What I can't figure out yet, though, and would like an answer on, is why everything rendered fine on my computer but not on other computers. As I state in my original question, a hint may be that my computer is a developer station while my friend's is not. If anyone is kind enough to explain why this effect doesn't reproduce on my machine, I will gladly accept their answer as the solution to my problem.

    Thank you

  • 2021-01-07 14:43

    The OpenGL specification does not define the exact behaviour that should occur when you issue a draw call with more vertices than your buffer stores. The reason this may work correctly on one machine and not on another comes down to implementation: each vendor is free to do whatever they want in this situation, so the rendering artifacts might show up on AMD hardware but not at all on nVIDIA or Intel. Making matters worse, there is actually no error state generated by a call to glDrawArrays (...) when it is asked to draw too many vertices. You definitely need to test your software on hardware sourced from multiple vendors to catch these sorts of errors; who manufactures the GPU in your computer, and the driver version, is just as important as the operating system and compiler.
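
    To make this concrete, here is a minimal sketch (buffer contents and sizes invented for illustration); the out-of-range draw completes without setting any error state:

    #include <cassert>
    
    // 3 vertices, 3 floats each = 9 floats total
    GLfloat verts [9] = { 0 };
    GLuint  vbo;
    
    glGenBuffers (1, &vbo);
    glBindBuffer (GL_ARRAY_BUFFER, vbo);
    glBufferData (GL_ARRAY_BUFFER, sizeof (verts), verts, GL_STATIC_DRAW);
    
    glDrawArrays (GL_TRIANGLES, 0, 9); // asks for 9 vertices; only 3 exist
    
    // No error is recorded -- what actually gets drawn is implementation-defined
    assert (glGetError () == GL_NO_ERROR);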

    Nevertheless, there are ways to catch these silly mistakes. gDEBugger is one, and there is also a newer OpenGL extension I will discuss below. I prefer the extension because, in my experience, in addition to deprecated API calls and errors (which gDEBugger can be configured to monitor), it can also warn you about inefficiently aligned data structures and various other portability and performance issues.

    I wanted to add some code I use to enable OpenGL Debug Output in my software, since this is an example of errant behaviour that does not actually generate an error you can catch with glGetError (...). Sometimes you can catch these mistakes with Debug Output (though, I just tested it and this is not one of those situations). You will need an OpenGL Debug Context for this to work (the process of setting one up is highly platform dependent), but it is a context flag just like forward/backward compatible and core (GLFW should make this easy for you).
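
    For instance, with GLFW 3 a debug context is requested through a single window hint set before window creation (a generic sketch, not my exact setup):

    // Request an OpenGL Debug Context before creating the window/context
    glfwWindowHint (GLFW_OPENGL_DEBUG_CONTEXT, GLFW_TRUE);
    
    GLFWwindow* window =
      glfwCreateWindow (800, 600, "Debug Context", NULL, NULL);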

    Automatic breakpoint macro for x86 platforms

    
    // Breakpoints that should ALWAYS trigger (EVEN IN RELEASE BUILDS) [x86]!
    #ifdef _MSC_VER
    // MSVC: requires <windows.h> (IsDebuggerPresent) and <intrin.h> (__debugbreak)
    # define eTB_CriticalBreakPoint() if (IsDebuggerPresent ()) __debugbreak ();
    #else
    // GCC/Clang: int $3 raises SIGTRAP on x86
    # define eTB_CriticalBreakPoint() asm (" int $3");
    #endif
    


    Enable OpenGL Debug Output (requires a Debug Context and a relatively recent driver, OpenGL 4.x era)

    
    // SUPER VERBOSE DEBUGGING!
    if (glDebugMessageControlARB != NULL) {
      glEnable                  (GL_DEBUG_OUTPUT_SYNCHRONOUS_ARB);
      glDebugMessageControlARB  (GL_DONT_CARE, GL_DONT_CARE, GL_DONT_CARE, 0, NULL, GL_TRUE);
      glDebugMessageCallbackARB ((GLDEBUGPROCARB)ETB_GL_ERROR_CALLBACK, NULL);
    }
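
    As a side note, on OpenGL 4.3 or newer the same functionality is available in core, so the ARB suffixes can be dropped; a minimal equivalent sketch:

    // Core-profile equivalent (OpenGL 4.3+)
    glEnable                 (GL_DEBUG_OUTPUT);
    glEnable                 (GL_DEBUG_OUTPUT_SYNCHRONOUS);
    glDebugMessageControl    (GL_DONT_CARE, GL_DONT_CARE, GL_DONT_CARE, 0, NULL, GL_TRUE);
    glDebugMessageCallback   ((GLDEBUGPROC)ETB_GL_ERROR_CALLBACK, NULL);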
    


    Some important utility functions to replace enumerant values with more meaningful text

    
    const char*
    ETB_GL_DEBUG_SOURCE_STR (GLenum source)
    {
      static const char* sources [] = {
        "API",   "Window System", "Shader Compiler", "Third Party", "Application",
        "Other", "Unknown"
      };
    
      // Clamp out-of-range enumerants to the last entry ("Unknown")
      int str_idx =
        min ( source - GL_DEBUG_SOURCE_API,
                sizeof (sources) / sizeof (const char *) - 1 );
    
      return sources [str_idx];
    }
    
    const char*
    ETB_GL_DEBUG_TYPE_STR (GLenum type)
    {
      static const char* types [] = {
        "Error",       "Deprecated Behavior", "Undefined Behavior", "Portability",
        "Performance", "Other",               "Unknown"
      };
    
      // Clamp out-of-range enumerants to the last entry ("Unknown")
      int str_idx =
        min ( type - GL_DEBUG_TYPE_ERROR,
                sizeof (types) / sizeof (const char *) - 1 );
    
      return types [str_idx];
    }
    
    const char*
    ETB_GL_DEBUG_SEVERITY_STR (GLenum severity)
    {
      static const char* severities [] = {
        "High", "Medium", "Low", "Unknown"
      };
    
      // Clamp out-of-range enumerants to the last entry ("Unknown")
      int str_idx =
        min ( severity - GL_DEBUG_SEVERITY_HIGH,
                sizeof (severities) / sizeof (const char *) - 1 );
    
      return severities [str_idx];
    }
    
    DWORD
    ETB_GL_DEBUG_SEVERITY_COLOR (GLenum severity)
    {
      static DWORD severities [] = {
        0xff0000ff, // High (Red)
        0xff00ffff, // Med  (Yellow)
        0xff00ff00, // Low  (Green)
        0xffffffff  // ???  (White)
      };
    
      // Clamp out-of-range enumerants to the last entry (White)
      int col_idx =
        min ( severity - GL_DEBUG_SEVERITY_HIGH,
                sizeof (severities) / sizeof (DWORD) - 1 );
    
      return severities [col_idx];
    }
    


    My Debug Output Callback (somewhat messy, because it prints each field in a different color in my software)

    
    void
    ETB_GL_ERROR_CALLBACK (GLenum        source,
                           GLenum        type,
                           GLuint        id,
                           GLenum        severity,
                           GLsizei       length,
                           const GLchar* message,
                           GLvoid*       userParam)
    {
      eTB_ColorPrintf (0xff00ffff, "OpenGL Error:\n");
      eTB_ColorPrintf (0xff808080, "=============\n");
    
      eTB_ColorPrintf (0xff6060ff, " Object ID: ");
      eTB_ColorPrintf (0xff0080ff, "%d\n", id);
    
      eTB_ColorPrintf (0xff60ff60, " Severity:  ");
      eTB_ColorPrintf ( ETB_GL_DEBUG_SEVERITY_COLOR   (severity),
                          "%s\n",
                            ETB_GL_DEBUG_SEVERITY_STR (severity) );
    
      eTB_ColorPrintf (0xffddff80, " Type:      ");
      eTB_ColorPrintf (0xffccaa80, "%s\n", ETB_GL_DEBUG_TYPE_STR     (type));
    
      eTB_ColorPrintf (0xffddff80, " Source:    ");
      eTB_ColorPrintf (0xffccaa80, "%s\n", ETB_GL_DEBUG_SOURCE_STR   (source));
    
      eTB_ColorPrintf (0xffff6060, " Message:   ");
      eTB_ColorPrintf (0xff0000ff, "%s\n\n", message);
    
      // Force the console to flush its contents before executing a breakpoint
      eTB_FlushConsole ();
    
      // Trigger a breakpoint in gDEBugger...
      glFinish ();
    
      // Trigger a breakpoint in traditional debuggers...
      eTB_CriticalBreakPoint ();
    }
    



    Since I could not actually get your situation to trigger a debug output event, I figured I would at least show an example of an event I was able to trigger. This is not an error that you can catch with glGetError (...), or an error at all for that matter. But it is certainly a draw call issue that you might be completely oblivious to for the duration of your project without using this extension :)

    OpenGL Error:
    =============
     Object ID: 102
     Severity:  Medium
     Type:      Performance
     Source:    API
     Message:   glDrawElements uses element index type 'GL_UNSIGNED_BYTE' that is not optimal for the current hardware configuration; consider using 'GL_UNSIGNED_SHORT' instead.
    