opengl-es

OpenGL ES 3.0 shader compile error on Android device for 'in' and 'out' storage qualifiers

心不动则不痛 submitted on 2020-03-23 07:34:06

Question: I am updating my app to use OpenGL ES 3.0 to take advantage of transform feedback, but the shader won't compile. The error is:

06-27 17:29:43.299 18593-18627/com.harmonicprocesses.penelopefree E/MyGLRenderer: Could not compile shader 35633:
06-27 17:29:43.299 18593-18627/com.harmonicprocesses.penelopefree E/MyGLRenderer: ERROR: 0:1: 'in' : Syntax error: syntax error INTERNAL ERROR: no main() function! ERROR: 1 compilation errors. No code generated.

Here is the vertex shader code: private final …
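
This error usually means the driver compiled the source as GLSL ES 1.00, where 'in' and 'out' are not valid global storage qualifiers. A minimal sketch of a vertex shader string that declares "#version 300 es" as its very first characters (the directive must precede everything, including any leading newline); the attribute and uniform names below are placeholders, not the original code:

    // Hypothetical names; the key point is that "#version 300 es" is the first line of the source.
    private static final String VERTEX_SHADER_300 =
            "#version 300 es\n" +                          // must come before anything else
            "layout(location = 0) in vec4 aPosition;\n" +
            "uniform mat4 uMvpMatrix;\n" +
            "void main() {\n" +
            "    gl_Position = uMvpMatrix * aPosition;\n" +
            "}\n";

    int shader = GLES30.glCreateShader(GLES30.GL_VERTEX_SHADER);
    GLES30.glShaderSource(shader, VERTEX_SHADER_300);
    GLES30.glCompileShader(shader);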

Android MediaCodec appears to buffer H264 frames

我只是一个虾纸丫 submitted on 2020-03-20 05:00:58

Question: I'm manually reading an RTP/H264 stream and passing the H264 frames to the Android MediaCodec, using the "markerBit" as the frame boundary. The MediaCodec is tied to an OpenGL texture (SurfaceTexture). In general everything works fine, but the decoder appears to buffer frames: when I put a frame into the decoder it is not rendered to the texture immediately; only after I put 2-3 more frames in is the first frame rendered. I'm implementing against Android 4.4.4. private …
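
For reference, a hedged sketch of an input/output loop for this kind of setup (the method and variable names are assumptions; only the MediaCodec calls are real API). Several hardware H.264 decoders keep a few frames of internal latency, so draining every output buffer that is currently available, rather than expecting strict one-in/one-out behaviour, at least renders each frame as soon as the codec releases it:

    // Feeds one H.264 access unit and drains whatever output is currently ready.
    // The 2-3 frame delay itself is decoder pipelining and cannot always be avoided.
    void feedAndDrain(MediaCodec codec, byte[] accessUnit, long ptsUs) {
        int inIndex = codec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            ByteBuffer input = codec.getInputBuffers()[inIndex]; // API 19-style buffer access
            input.clear();
            input.put(accessUnit);
            codec.queueInputBuffer(inIndex, 0, accessUnit.length, ptsUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex;
        while ((outIndex = codec.dequeueOutputBuffer(info, 0)) >= 0) {
            codec.releaseOutputBuffer(outIndex, true); // render == true draws onto the SurfaceTexture
        }
    }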

Taking screenshot of WKWebview with hardware accelerated content

冷暖自知 submitted on 2020-03-17 11:41:24

Question: I am having serious trouble taking a screenshot of WKWebView content when there is hardware-accelerated content (some specific casino games running inside an iframe). So far I have used the standard way of taking a screenshot that everyone suggests:

    UIGraphicsBeginImageContextWithOptions(containerView.frame.size, true, 0.0)
    containerView.layer.render(in: UIGraphicsGetCurrentContext()!)
    //This line helps to fix view rendering for taking screenshot on older iOS devices
    containerView …
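
One thing worth noting: layer.render(in:) cannot capture layers that are composited out of process (WebGL and other hardware-accelerated content), which is why those areas come out blank. Since iOS 11, WKWebView can produce the image itself via takeSnapshot(with:completionHandler:). A minimal sketch, assuming webView is the WKWebView inside containerView and handleScreenshot is a hypothetical handler:

    import WebKit

    // Ask WebKit itself for the pixels instead of rendering the layer tree.
    let config = WKSnapshotConfiguration()
    config.rect = webView.bounds            // capture the currently visible content

    webView.takeSnapshot(with: config) { image, error in
        if let image = image {
            // Use the UIImage here (save it, or composite it over a containerView screenshot).
            handleScreenshot(image)         // hypothetical handler
        } else {
            print("Snapshot failed: \(String(describing: error))")
        }
    }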

sRGB Framebuffer on OpenGL ES 3.0

巧了我就是萌 submitted on 2020-03-03 02:59:16

Question: I am working on an OpenGL ES 3.0 Android project in Java. I need to implement gamma correction, and I've read that sRGB textures are supported in OpenGL ES 3.0, so my intention was to call glEnable(GL_FRAMEBUFFER_SRGB) before rendering into the default framebuffer. However, when I try to call GLES30.glEnable(GLES30.GL_FRAMEBUFFER_SRGB) it turns out there is no GLES30.GL_FRAMEBUFFER_SRGB, although there are constants for sRGB texture formats such as GLES30.GL_SRGB. So, my …
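
For what it's worth: GL_FRAMEBUFFER_SRGB is a desktop OpenGL toggle and does not exist in OpenGL ES 3.0. There, sRGB encoding happens automatically when the render target itself has an sRGB format (for example GL_SRGB8_ALPHA8), and whether the default framebuffer is sRGB depends on how the EGL surface was created. A portable fallback is to apply the gamma encoding manually at the end of the fragment shader; a minimal sketch with placeholder variable names:

    // Approximates sRGB encoding with gamma 1/2.2 in the fragment shader, since
    // ES 3.0 has no glEnable(GL_FRAMEBUFFER_SRGB) for the default framebuffer.
    private static final String FRAGMENT_SHADER_GAMMA =
            "#version 300 es\n" +
            "precision mediump float;\n" +
            "in vec2 vTexCoord;\n" +
            "uniform sampler2D uTexture;\n" +
            "out vec4 fragColor;\n" +
            "void main() {\n" +
            "    vec4 linearColor = texture(uTexture, vTexCoord);\n" +
            "    fragColor = vec4(pow(linearColor.rgb, vec3(1.0 / 2.2)), linearColor.a);\n" +
            "}\n";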

What are the cause(s) of input touch/display lag in android?

China☆狼群 submitted on 2020-02-23 04:52:00

Question: I have a very simple app that renders a square with OpenGL; touch input is read by the GLSurfaceView and the last position is exchanged with the rendering thread through a volatile variable. What I observe (and what has also been very well described in https://www.mail-archive.com/android-developers@googlegroups.com/msg235325.html) is that there is a delay (a lag) between the touch position and the display. When I activate the developer option that shows the touch position, I see that when moving rapidly the …
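
A minimal sketch of the setup being described, with hypothetical names (glSurfaceView, myRenderer, setSquarePosition): the touch position is handed to the GL thread with queueEvent() so the next frame always draws the most recent coordinates. Note that this does not remove the lag itself, which mostly comes from the display pipeline (touch sampling, vsync, and double/triple buffering typically add a couple of frames between finger and screen):

    @Override
    public boolean onTouchEvent(MotionEvent e) {
        final float x = e.getX();
        final float y = e.getY();
        // Runs on the GL thread before the next onDrawFrame(), replacing the volatile hand-off.
        glSurfaceView.queueEvent(new Runnable() {
            @Override public void run() {
                myRenderer.setSquarePosition(x, y);  // hypothetical renderer method
            }
        });
        return true;
    }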
