opengl-es-3.0

Why is glTexImage2D returning GL_INVALID_OPERATION on iOS?

爷,独闯天下 submitted on 2020-01-16 16:48:57
Question: I'm making the following call on iOS using OpenGL ES 3:

    glTexImage2D(GL_TEXTURE_2D,    // target
                 0,                // level
                 GL_RGBA,          // internalformat
                 1024,             // width
                 692,              // height
                 0,                // border
                 GL_RGBA,          // format
                 GL_UNSIGNED_BYTE, // type
                 NULL);            // data

However, it is returning GL_INVALID_OPERATION. There are a slew of reasons that GL_INVALID_OPERATION might be returned, but I can't spot any that are relevant to my situation. The weird thing is that if I just ignore the error, things seem to work anyway.
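A debugging note worth adding (my sketch, not from the original question): glGetError reports errors accumulated from any earlier call on the context, so the GL_INVALID_OPERATION may not come from this glTexImage2D at all, which would also explain why ignoring it appears harmless. A minimal sketch that drains the error queue first to localize the failing call; the header path assumes an iOS build:

    #include <OpenGLES/ES3/gl.h>  // iOS ES3 header (assumption: iOS build)
    #include <cstdio>

    // glGetError() reports errors from the whole context, not just the most
    // recent call. Drain any stale errors first, then make the call under
    // test, so the error we read definitely belongs to glTexImage2D.
    static void checkTexImage()
    {
        while (glGetError() != GL_NO_ERROR) {
            // discard errors left over from earlier calls
        }

        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 692, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);

        GLenum err = glGetError();
        if (err != GL_NO_ERROR)
            printf("glTexImage2D really failed: 0x%04x\n", err);
    }

The GL_RGBA / GL_RGBA / GL_UNSIGNED_BYTE triple itself is a legal unsized-format combination in ES 3.0, which makes a stale error from an earlier call the first thing to rule out.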

Why can't I use OpenGL ES 3.0 in Qt?

回眸只為那壹抹淺笑 submitted on 2020-01-04 02:19:05
Question: I set a QSurfaceFormat on my window, and this surface format has "3.0" set as its GL version number. The code:

    static QSurfaceFormat createSurfaceFormat() {
        QSurfaceFormat format;
        format.setSamples(4);
        format.setDepthBufferSize(24);
        format.setStencilBufferSize(8);
        format.setVersion(3, 0);
        return format;
    }

    int main(int argc, char *argv[]) {
        // ...
        QQmlApplicationEngine engine;
        engine.load(QUrl(QStringLiteral("qrc:/main.qml")));
        QWindow* window = (QWindow*) engine.rootObjects().first();
        window-
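One thing worth trying (a hedged sketch, not the poster's code): on some platforms a plain setVersion(3, 0) is interpreted against desktop OpenGL unless the renderable type is set to ES explicitly, and the format should be installed as the application default before any window exists. Assuming Qt 5.4 or later for QSurfaceFormat::setDefaultFormat:

    #include <QGuiApplication>
    #include <QQmlApplicationEngine>
    #include <QSurfaceFormat>
    #include <QUrl>

    int main(int argc, char *argv[])
    {
        QSurfaceFormat format;
        format.setRenderableType(QSurfaceFormat::OpenGLES); // request ES, not desktop GL
        format.setVersion(3, 0);
        format.setDepthBufferSize(24);
        format.setStencilBufferSize(8);
        format.setSamples(4);

        // Install as the default before any window exists so the QML window
        // created by the engine picks it up automatically.
        QSurfaceFormat::setDefaultFormat(format);

        QGuiApplication app(argc, argv);
        QQmlApplicationEngine engine;
        engine.load(QUrl(QStringLiteral("qrc:/main.qml")));
        return app.exec();
    }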

Unity plugin using OpenGL for Project Tango

主宰稳场 submitted on 2020-01-02 10:06:38
Question: I am developing an AR app using Unity for Project Tango. One of the things I am trying to accomplish is getting the frame image from the device while using the AR example provided with the SDK: https://github.com/googlesamples/tango-examples-unity The problem is that they are using IExperimentalTangoVideoOverlay, which doesn't return the frame buffer (the image is converted from YUV to RGB in the shader). I've registered to the OnExperimentalTangoImageAvailable event and called an
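For anyone who ends up converting the raw frame on the CPU rather than in the shader, the arithmetic itself is ordinary BT.601 YUV-to-RGB. A small sketch; treating the buffer as NV21-style YUV420 is an assumption, since Tango's delivered layout isn't shown here:

    #include <algorithm>
    #include <cstdint>

    // Integer BT.601 conversion of one YUV pixel to RGB. In NV21-style
    // YUV420, each 2x2 block of luma samples shares one (u, v) pair.
    static void yuvToRgb(int y, int u, int v,
                         uint8_t& r, uint8_t& g, uint8_t& b)
    {
        const int c = y - 16, d = u - 128, e = v - 128;
        r = (uint8_t)std::clamp((298 * c + 409 * e + 128) >> 8, 0, 255);
        g = (uint8_t)std::clamp((298 * c - 100 * d - 208 * e + 128) >> 8, 0, 255);
        b = (uint8_t)std::clamp((298 * c + 516 * d + 128) >> 8, 0, 255);
    }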

Android 4.3 PBO not working

泄露秘密 submitted on 2020-01-01 19:29:14
Question: I am using a PBO to take a screenshot. However, the resulting image is all black. It works perfectly fine without the PBO. Is there anything I need to take care of before doing this? I even tried rendering to an FBO and then using GLES30.glReadBuffer(GLES30.GL_COLOR_ATTACHMENT0), with no luck.

    public void SetupPBO() {
        GLES30.glGenBuffers(1, pbuffers, 0);
        GLES30.glBindBuffer(GLES30.GL_PIXEL_PACK_BUFFER, pbuffers[0]);
        int size = (int) this.mScreenHeight * (int) this.mScreenWidth * 4;
        GLES30.glBufferData
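A common cause of an all-black result with this pattern (an assumption, since the snippet is cut off before the readback): while GL_PIXEL_PACK_BUFFER is bound, the final argument of glReadPixels must be a byte offset into the PBO, not a client-side Buffer, and the pixels are then fetched by mapping the buffer. A native-side sketch of the complete sequence:

    #include <GLES3/gl3.h>
    #include <cstring>

    // Read the framebuffer into a bound PBO, then map the PBO to reach the
    // pixels. With GL_PIXEL_PACK_BUFFER bound, the last glReadPixels
    // argument is an offset into the buffer, not a CPU pointer.
    static bool readPixelsViaPbo(GLuint pbo, int width, int height, void* dst)
    {
        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0); // offset 0

        const GLsizeiptr size = (GLsizeiptr)width * height * 4;
        void* src = glMapBufferRange(GL_PIXEL_PACK_BUFFER, 0, size, GL_MAP_READ_BIT);
        if (src) {
            memcpy(dst, src, (size_t)size);
            glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
        }
        glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
        return src != NULL;
    }

On the Java side, an overload of GLES30.glReadPixels that takes an int offset was, if I recall correctly, only added in later API levels, which is one reason many Android answers drop to native code for PBO readback.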

Metal vertex shader draw points of a Texture

牧云@^-^@ submitted on 2019-12-24 11:01:30
Question: I want to execute a Metal (or OpenGL ES 3.0) shader that draws a Points primitive with blending. To do that, I need to pass all the pixel coordinates of the texture to the vertex shader as vertices, which computes the position of each vertex to be passed to the fragment shader. The fragment shader simply outputs the color for the point with blending enabled. My problem is whether there is an efficient way to pass the coordinates of the vertices to the vertex shader, since there would be too many vertices for 1920x1080
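One approach that avoids building a two-million-entry vertex buffer at all (my suggestion, not from the question): GLSL ES 3.00 provides gl_VertexID, so an attribute-less glDrawArrays of width × height points can reconstruct each pixel coordinate inside the vertex shader. A sketch, assuming the driver accepts draws with no enabled vertex arrays (ES 3.0 permits this):

    #include <GLES3/gl3.h>

    // Vertex shader that reconstructs a pixel coordinate from gl_VertexID,
    // so no per-vertex data needs to be uploaded at all.
    static const char* kVertexSrc = R"(#version 300 es
    uniform ivec2 uSize;  // texture dimensions, e.g. 1920x1080
    void main() {
        ivec2 pix = ivec2(gl_VertexID % uSize.x, gl_VertexID / uSize.x);
        vec2 ndc = (vec2(pix) + 0.5) / vec2(uSize) * 2.0 - 1.0; // pixel center -> NDC
        gl_Position = vec4(ndc, 0.0, 1.0);
        gl_PointSize = 1.0;  // ES requires writing this when drawing points
    })";

    // Draw one point per pixel with no vertex attributes enabled.
    static void drawAllPixels(int width, int height)
    {
        glDrawArrays(GL_POINTS, 0, width * height);
    }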

Determine the internal format of a given ASTC compressed image through its header?

ぃ、小莉子 submitted on 2019-12-24 03:19:36
Question: I am writing a WebGL-based HTML application that uses ASTC (Adaptive Scalable Texture Compression) compressed textures to be loaded onto my triangle. I would like to know whether there is a way to tell if the internal format of a compressed ASTC image (which in my case might be located on a remote web server) is "linear" or "sRGB encoded" by parsing the ASTC header. I could then use that internalFormat information to pass my ASTC texture to glCompressedTexImage2D(). In other
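For reference, the 16-byte .astc container header holds only a magic number, the block dimensions, and the image dimensions; as far as I know it carries no color-space flag, so linear vs. sRGB has to be communicated out-of-band (file naming, a manifest, etc.) and expressed through the internalformat passed to glCompressedTexImage2D. A parsing sketch under that assumption:

    #include <cstdint>
    #include <cstring>

    // The 16-byte .astc container header. Note there is no color-space
    // field: sRGB vs. linear is chosen by the internalformat you pass to
    // glCompressedTexImage2D (COMPRESSED_SRGB8_ALPHA8_ASTC_* vs.
    // COMPRESSED_RGBA_ASTC_*), not by anything in the file.
    struct AstcHeader {
        uint8_t magic[4];                      // 0x13 0xAB 0xA1 0x5C
        uint8_t blockDimX, blockDimY, blockDimZ;
        uint8_t xSize[3], ySize[3], zSize[3];  // 24-bit little-endian extents
    };

    static bool parseAstcHeader(const uint8_t* data, unsigned len, int& w, int& h)
    {
        AstcHeader hdr;
        if (len < sizeof(hdr)) return false;
        memcpy(&hdr, data, sizeof(hdr));
        const uint8_t kMagic[4] = {0x13, 0xAB, 0xA1, 0x5C};
        if (memcmp(hdr.magic, kMagic, 4) != 0) return false;
        w = hdr.xSize[0] | (hdr.xSize[1] << 8) | (hdr.xSize[2] << 16);
        h = hdr.ySize[0] | (hdr.ySize[1] << 8) | (hdr.ySize[2] << 16);
        return true;
    }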

OpenGL ES3 Shadow map problems

柔情痞子 submitted on 2019-12-22 01:36:10
Question: I am working on a C++ project for Android with OpenGL ES3, trying to implement a shadow map with a directional light. I understand the theory well but have never gotten it to render successfully. First I create the framebuffer which contains the depth map:

    glGenFramebuffers(1, &depthMapFBO);
    glBindFramebuffer(GL_FRAMEBUFFER, depthMapFBO);
    glGenTextures(1, &depthMap);
    glBindTexture(GL_TEXTURE_2D, depthMap);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, SHADOW_WIDTH, SHADOW_HEIGHT, 0, GL_DEPTH
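One ES3-specific pitfall worth flagging here (a guess at where the truncated call goes wrong): unlike desktop GL, OpenGL ES 3.0 does not accept the unsized GL_DEPTH_COMPONENT internal format in glTexImage2D; a sized depth format with a matching type is required. A sketch of a depth-only FBO setup that ES 3.0 accepts:

    #include <GLES3/gl3.h>

    // Depth-only FBO for a shadow map. ES 3.0 needs the sized internal
    // format GL_DEPTH_COMPONENT24 (or 16/32F) with a matching type; the
    // unsized GL_DEPTH_COMPONENT from desktop tutorials is rejected.
    static GLuint createShadowFbo(GLsizei w, GLsizei h, GLuint* depthMapOut)
    {
        GLuint fbo, depthMap;
        glGenFramebuffers(1, &fbo);
        glGenTextures(1, &depthMap);

        glBindTexture(GL_TEXTURE_2D, depthMap);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, w, h, 0,
                     GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                               GL_TEXTURE_2D, depthMap, 0);

        // No color attachment: disable color output so the FBO is complete.
        GLenum none = GL_NONE;
        glDrawBuffers(1, &none);
        glReadBuffer(GL_NONE);

        *depthMapOut = depthMap;
        return fbo;
    }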

When switching to GLSL 300, I get the following error

隐身守侯 submitted on 2019-12-21 04:42:06
Question: When I switch to OpenGL ES 3 with GLSL 300, I get the following error in my fragment shader: undeclared identifier gl_FragColor. When using GLSL 100, everything is fine.

Answer 1: Modern versions of GLSL handle fragment shader outputs simply by declaring them as out values; gl_FragColor is no longer supported, hence your error. Try this:

    out vec4 fragColor;

    void main()
    {
        fragColor = vec4(1.0, 0.0, 0.0, 1.0);
    }

Note that gl_FragDepth hasn't changed and is still available. For more information see
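Two details the quoted answer stops short of (my addition): a GLSL ES 3.00 shader must begin with a #version 300 es line, and fragment shaders additionally need a default float precision, or the out declaration itself will fail to compile. A complete minimal fragment shader, shown here as a C++ source string:

    // A complete minimal GLSL ES 3.00 fragment shader as a C++ source
    // string. Both the #version line and the default float precision
    // are mandatory.
    static const char* kFragSrc = R"(#version 300 es
    precision mediump float;
    out vec4 fragColor;
    void main() {
        fragColor = vec4(1.0, 0.0, 0.0, 1.0);
    })";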

Does the Android emulator support OpenGL ES 3.0?

只愿长相守 submitted on 2019-12-17 07:47:32
Question: I know that the emulator has supported OpenGL ES 2.0 as of SDK Tools 17 and Android 4.0.3, but that was introduced back in April 2012. Does the Android emulator support OpenGL ES 3.0, or are we still waiting on that? If not, does any other third-party emulator/simulator (e.g. Genymotion) support OpenGL ES 3.0?

Answer 1: The latest Android Emulator now supports OpenGL ES 3.0. To use OpenGL ES 3.0, your development machine needs a host GPU graphics card that supports OpenGL 3.2 or higher on
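Whatever the emulator advertises, it is easy to verify the context you actually received at runtime. A small native-side check (generic GLES code, not emulator-specific; logging assumes linking against Android's log library):

    #include <GLES3/gl3.h>
    #include <android/log.h>  // link with -llog

    // Log the version of the context actually created. On the emulator
    // this shows whether host GPU translation delivered ES 3.x or fell
    // back to ES 2.0.
    static void logGlesVersion()
    {
        const char* version = (const char*)glGetString(GL_VERSION); // "OpenGL ES 3.0 ..."
        GLint major = 0;
        glGetIntegerv(GL_MAJOR_VERSION, &major); // 3 on an ES 3.x context
        __android_log_print(ANDROID_LOG_INFO, "GLES",
                            "version=%s major=%d", version, major);
    }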

OpenGL ES error: undefined reference to 'glDispatchCompute'

时光怂恿深爱的人放手 submitted on 2019-12-13 04:11:50
Question: I am using OpenGL ES 3.1 in an Android app with native C++ code, so I need to build a C++ lib with Android support. I have used some OpenGL ES functions and they worked well, but when I tried to use glDispatchCompute, the linker gave the following error: undefined reference to 'glDispatchCompute'. Here is the call:

    glDispatchCompute(10, 1, 1);

Here are my includes:

    #include <string>
    #include <jni.h>
    #include <GLES3/gl31.h>
    #include <GLES/egl.h>
    #include <GLES/gl.h>
    #include <GLES3/gl3ext.h>
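The usual fix (an assumption from the symptom, since the build scripts aren't shown) is that the library is linked against libGLESv2 rather than libGLESv3; glDispatchCompute is an ES 3.1 entry point exported by libGLESv3.so, so adding GLESv3 to target_link_libraries in CMake (or LOCAL_LDLIBS in ndk-build) typically resolves it. Alternatively, the symbol can be resolved at runtime so the link line doesn't matter:

    #include <EGL/egl.h>
    #include <GLES3/gl31.h>

    // Resolve glDispatchCompute at runtime via EGL instead of at link
    // time. This sidesteps the undefined reference as long as the
    // device driver actually implements OpenGL ES 3.1.
    typedef void (*PFNDISPATCHCOMPUTE)(GLuint x, GLuint y, GLuint z);

    static void dispatchCompute(GLuint x, GLuint y, GLuint z)
    {
        static PFNDISPATCHCOMPUTE fn =
            (PFNDISPATCHCOMPUTE)eglGetProcAddress("glDispatchCompute");
        if (fn)
            fn(x, y, z);  // e.g. dispatchCompute(10, 1, 1) as in the question
    }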