Question
Currently my function to convert the result of eglGetError() into a human-readable string looks like this:
#include <string>
#include <EGL/egl.h>

// to_hex_string() is a separate helper (not shown) that formats the code as hex.
std::string eglErrorString(EGLint error)
{
    switch(error)
    {
        case EGL_SUCCESS:             return "No error";
        case EGL_NOT_INITIALIZED:     return "EGL not initialized or failed to initialize";
        case EGL_BAD_ACCESS:          return "Resource inaccessible";
        case EGL_BAD_ALLOC:           return "Cannot allocate resources";
        case EGL_BAD_ATTRIBUTE:       return "Unrecognized attribute or attribute value";
        case EGL_BAD_CONTEXT:         return "Invalid EGL context";
        case EGL_BAD_CONFIG:          return "Invalid EGL frame buffer configuration";
        case EGL_BAD_CURRENT_SURFACE: return "Current surface is no longer valid";
        case EGL_BAD_DISPLAY:         return "Invalid EGL display";
        case EGL_BAD_SURFACE:         return "Invalid surface";
        case EGL_BAD_MATCH:           return "Inconsistent arguments";
        case EGL_BAD_PARAMETER:       return "Invalid argument";
        case EGL_BAD_NATIVE_PIXMAP:   return "Invalid native pixmap";
        case EGL_BAD_NATIVE_WINDOW:   return "Invalid native window";
        case EGL_CONTEXT_LOST:        return "Context lost";
    }
    return "Unknown error " + to_hex_string(int(error));
}
But for OpenGL itself there is gluErrorString(), which saves us from having to maintain the list of errors by hand. Is there something like gluErrorString() for EGL?
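For comparison, this is roughly how gluErrorString() is typically used with OpenGL (a sketch; the function name logGlErrors is just illustrative):

#include <cstdio>
#include <GL/gl.h>
#include <GL/glu.h>   // gluErrorString() lives in GLU, not in core OpenGL

void logGlErrors()
{
    // Drain the OpenGL error queue and print a readable message for each code.
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        std::fprintf(stderr, "GL error: %s\n",
                     reinterpret_cast<const char*>(gluErrorString(err)));
}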
Answer 1:
Definitely not as part of EGL itself. And I don't think it should be there. EGL is a low level window system interface. Producing user readable strings is really not its job.
You may think that having a function that produces a few strings is trivial and harmless. But since this would be user readable strings, you would almost have to think about internationalization. Meaning that you could get the strings in various languages, including ones that use different character sets. So what looked simple suddenly becomes much more complex. And at least in my opinion, supporting only English strings would be very arbitrary.
Even more importantly, translating error codes to strings is not conceptually part of an API that provides an interface to a window system. IMHO, these kinds of APIs should be minimal. In this case, it should provide exactly the functionality needed to interface with the window system, and nothing more.
Of course nobody stops you (or anybody else) from implementing a higher-level library that provides this kind of functionality. That's exactly what GLU (which is where gluErrorString() came from) was for OpenGL: it provided some commonly used functionality layered on top of OpenGL.
The use of past tense when talking about GLU in the previous paragraph was deliberate. It's built on top of OpenGL functionality from a previous millennium.
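In that spirit, such a helper can live in application code layered on top of EGL. A minimal sketch, assuming the eglErrorString() from the question and using a hypothetical checkEglError() name:

#include <EGL/egl.h>
#include <stdexcept>
#include <string>

std::string eglErrorString(EGLint error);  // the question's converter, defined elsewhere

// Hypothetical helper: throws with a readable message if the last EGL call failed.
void checkEglError(const std::string& what)
{
    EGLint error = eglGetError();
    if (error != EGL_SUCCESS)
        throw std::runtime_error(what + " failed: " + eglErrorString(error));
}

You would call it right after the EGL call you want to verify, e.g. checkEglError("eglCreateContext").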
Answer 2:
Macro magic can save you some typing:
// CASE_STR( EGL_SUCCESS ) expands to: case EGL_SUCCESS: return "EGL_SUCCESS";
#define CASE_STR( value ) case value: return #value;
const char* eglGetErrorString( EGLint error )
{
    switch( error )
    {
        CASE_STR( EGL_SUCCESS             )
        CASE_STR( EGL_NOT_INITIALIZED     )
        CASE_STR( EGL_BAD_ACCESS          )
        CASE_STR( EGL_BAD_ALLOC           )
        CASE_STR( EGL_BAD_ATTRIBUTE       )
        CASE_STR( EGL_BAD_CONTEXT         )
        CASE_STR( EGL_BAD_CONFIG          )
        CASE_STR( EGL_BAD_CURRENT_SURFACE )
        CASE_STR( EGL_BAD_DISPLAY         )
        CASE_STR( EGL_BAD_SURFACE         )
        CASE_STR( EGL_BAD_MATCH           )
        CASE_STR( EGL_BAD_PARAMETER       )
        CASE_STR( EGL_BAD_NATIVE_PIXMAP   )
        CASE_STR( EGL_BAD_NATIVE_WINDOW   )
        CASE_STR( EGL_CONTEXT_LOST        )
        default: return "Unknown";
    }
}
#undef CASE_STR
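A possible usage example (a sketch, not part of the answer): report why EGL initialization failed.

#include <EGL/egl.h>
#include <cstdio>

const char* eglGetErrorString(EGLint error);  // the macro-generated function above

int main()
{
    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    if (display == EGL_NO_DISPLAY || !eglInitialize(display, nullptr, nullptr))
    {
        // eglGetError() returns the code of the most recent EGL failure on this thread.
        std::fprintf(stderr, "EGL init failed: %s\n", eglGetErrorString(eglGetError()));
        return 1;
    }
    eglTerminate(display);
    return 0;
}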
Source: https://stackoverflow.com/questions/38127022/is-there-a-standard-way-to-query-egl-error-string