texture

LibGDX: Filtering a scaled TextureRegion

Anonymous (unverified), submitted 2019-12-03 02:24:01
Question: I have several objects with different textures for different states, so I am using a TextureAtlas made with TexturePacker and resizing the TextureRegion where I need it. I have to resize because I am trying to support both 720p and 1080p, and some of my objects are tiles or cursors that resize based on the width and height of the board; the board size can change in my game, even though the board always occupies the same percentage of the screen. With a Texture, I can just do this: texture.setFilter(TextureFilter.Linear, TextureFilter…
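The usual fix is to set the filter on the Texture that backs the atlas page rather than on the TextureRegion, since a region only references a rectangle inside a page. A minimal LibGDX sketch (the atlas path and region name are made-up placeholders):

```java
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.Texture.TextureFilter;
import com.badlogic.gdx.graphics.g2d.TextureAtlas;
import com.badlogic.gdx.graphics.g2d.TextureRegion;

public class AtlasFiltering {
    /** Applies linear filtering to every page texture backing the atlas. */
    public static TextureRegion loadFilteredRegion(String atlasPath, String regionName) {
        TextureAtlas atlas = new TextureAtlas(atlasPath);
        // A TextureRegion has no filter of its own; filtering is a property of
        // the Texture (atlas page) it points into, so set it on each page.
        for (Texture page : atlas.getTextures()) {
            page.setFilter(TextureFilter.Linear, TextureFilter.Linear);
        }
        return atlas.findRegion(regionName);
    }
}
```

Because all regions in a page share that page's filter, this applies to every region in the atlas at once.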

How to detect maximum texture resolution on iPhone?

Anonymous (unverified), submitted 2019-12-03 02:19:01
Question: I'm making a universal OpenGL-based app that should work on the iPod/iPhone 2G/3G/3GS/4 and iPad. To deliver the best possible graphics I need to switch between different texture resolutions based on what device is running it. For example, the iPhone 2G needs textures no larger than 1024x1024, while the iPhone 3GS can handle larger textures. So on the iPhone 3GS I want to load a texture atlas that's 2048x2048, while the iPhone 2G will get the downscaled 1024x1024 texture atlas. Is there a simple and safe way to detect the maximum texture resolution…

OpenGL 2 Texture Internal Formats GL_RGB8I, GL_RGB32UI, etc

Anonymous (unverified), submitted 2019-12-03 02:16:02
Question: I'm rewriting a large part of my texturing code. I would like to be able to specify certain internal formats: GL_RGB8I, GL_RGB8UI, GL_RGB16I, GL_RGB16UI, GL_RGB32I, and GL_RGB32UI. These tokens do not exist in OpenGL 2. When specifying these internal formats as arguments to glTexImage2D, the texturing fails (the texture appears as white). When checking for errors, I get [EDIT:] 1282 ("invalid operation"). I take this to mean that OpenGL is still using OpenGL 2 for glTexImage2D, and so the call is failing. Obviously, it will need to use a…
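For context, these integer internal formats were introduced with OpenGL 3.0 (or GL_EXT_texture_integer), and they also require an *_INTEGER external format (otherwise glTexImage2D fails with GL_INVALID_OPERATION, error 1282) and nearest filtering (otherwise the texture is incomplete and samples as if blank). A hedged sketch of a working upload, written with LWJGL's Java bindings purely for illustration; the asker's own binding and data are not shown in the excerpt:

```java
import java.nio.ByteBuffer;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL30.*;

public class IntegerTexture {
    /** Uploads a width x height GL_RGB32UI texture; requires a GL 3.0+ context. */
    public static int createRgb32uiTexture(int width, int height, ByteBuffer pixels) {
        int tex = glGenTextures();
        glBindTexture(GL_TEXTURE_2D, tex);
        // Integer textures cannot be linearly filtered; anything but NEAREST
        // leaves the texture incomplete.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        // The external format must be GL_RGB_INTEGER (not GL_RGB), or the call
        // fails with GL_INVALID_OPERATION. 'pixels' may be null to only
        // allocate storage.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32UI, width, height, 0,
                GL_RGB_INTEGER, GL_UNSIGNED_INT, pixels);
        return tex;
    }
}
```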

Get Maximum OpenGL ES 2.0 Texture Size Limit in Android

Anonymous (unverified), submitted 2019-12-03 02:15:02
Question: I am trying to get the maximum texture size limit in Android for OpenGL ES 2.0, but I've found that the following call only works while inside the OpenGL context; in other words, I must have a GL surface, a GL renderer, etc., which I don't want: int[] maxTextureSize = new int[1]; GLES20.glGetIntegerv(GLES20.GL_MAX_TEXTURE_SIZE, maxTextureSize, 0); So I came up with the following algorithm, which gives me the maximum texture size without having to create any surface or renderer. It works correctly, so my question is…
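One common way to do this query without any visible surface (not necessarily the asker's algorithm, which is cut off in the excerpt) is to spin up a throwaway EGL context with a 1x1 pbuffer, ask GL, and tear everything down again. A sketch using android.opengl.EGL14 (API 17+):

```java
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.opengl.GLES20;

public final class MaxTextureSize {
    /** Queries GL_MAX_TEXTURE_SIZE from a temporary off-screen ES 2.0 context. */
    public static int query() {
        EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        int[] version = new int[2];
        EGL14.eglInitialize(display, version, 0, version, 1);

        int[] configAttribs = {
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_SURFACE_TYPE, EGL14.EGL_PBUFFER_BIT,
                EGL14.EGL_NONE
        };
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        EGL14.eglChooseConfig(display, configAttribs, 0, configs, 0, 1, numConfigs, 0);

        int[] contextAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
        EGLContext context = EGL14.eglCreateContext(
                display, configs[0], EGL14.EGL_NO_CONTEXT, contextAttribs, 0);

        // A 1x1 pbuffer is enough; nothing is ever drawn to it.
        int[] surfaceAttribs = { EGL14.EGL_WIDTH, 1, EGL14.EGL_HEIGHT, 1, EGL14.EGL_NONE };
        EGLSurface surface = EGL14.eglCreatePbufferSurface(display, configs[0], surfaceAttribs, 0);
        EGL14.eglMakeCurrent(display, surface, surface, context);

        int[] maxTextureSize = new int[1];
        GLES20.glGetIntegerv(GLES20.GL_MAX_TEXTURE_SIZE, maxTextureSize, 0);

        // Tear the temporary context down again.
        EGL14.eglMakeCurrent(display, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE,
                EGL14.EGL_NO_CONTEXT);
        EGL14.eglDestroySurface(display, surface);
        EGL14.eglDestroyContext(display, context);
        EGL14.eglTerminate(display);
        return maxTextureSize[0];
    }
}
```

Note that different GPUs on the same phone model can report different limits, so this should be queried at runtime rather than hard-coded per device.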

Threejs Texture

Anonymous (unverified), submitted 2019-12-03 02:15:02
Question: When I render some geometry I see this warning in my console: THREE.WebGLRenderer: Texture is not power of two. Texture.minFilter should be set to THREE.NearestFilter or THREE.LinearFilter. I can't understand the reason, and the background of my canvas is completely black. Answer 1: The size of your texture is not a power of two (i.e. 16x16, 32x32, 64x64, ...). Set yourTexture.minFilter = THREE.LinearFilter to get rid of the warning.

WebGL - wait for texture to load

Anonymous (unverified), submitted 2019-12-03 02:13:02
Question: How do I test whether a WebGLTexture object is 'complete'? Currently I get this message: [WebGLRenderingContext]RENDER WARNING: texture bound to texture unit 0 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering or is not 'texture complete'. I get this warning because the render loop is trying to use the texture before its image has finished loading, so how do I fix that? Answer 1: The easiest way to fix that is to make a 1x1 texture at creation time: var tex = gl.createTexture(); gl.bindTexture(gl.TEXTURE_2D, tex); gl…

Android Video Player Using NDK, OpenGL ES, and FFmpeg

Anonymous (unverified), submitted 2019-12-03 02:03:01
Question: OK, so here is what I have so far. I have built FFmpeg on Android and am able to use it fine. I have been able to load a video into FFmpeg after passing the chosen filename from the Java side. To save on performance I am writing the video player in the NDK rather than passing frames from FFmpeg to Java through JNI. I want to send frames from the video to an OpenGL surface. I am having trouble figuring out how to get each frame of video and render it onto the OpenGL surface. I have been stuck trying to figure this out for a couple of weeks…

OpenGL font rendering using Freetype2

Anonymous (unverified), submitted 2019-12-03 01:58:03
Question: I'm trying to render a FreeType font using OpenGL, following the example posted at http://en.wikibooks.org/wiki/OpenGL_Programming/Modern_OpenGL_Tutorial_Text_Rendering_02. I've been able to generate a texture atlas from the font, create shaders, and create quads. Where I seem to get stuck is passing the texture to the shader and/or getting the correct UVs for my quads. I've been struggling for a good while now and really need the help. The following is the struct I use to create my texture atlas: struct FontCharacter { float advanceX;…
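A frequent cause of "the texture never reaches the shader" is not wiring the sampler uniform to the texture unit the atlas is bound to. The question's own code is C++, but the GL calls are the same; a small sketch expressed with LWJGL's Java bindings for illustration (the program handle, texture id, and uniform name "tex" are placeholders):

```java
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL13.*;
import static org.lwjgl.opengl.GL20.*;

public class AtlasBinding {
    /** Binds the glyph atlas to texture unit 0 and points the sampler at it. */
    public static void bindAtlas(int program, int atlasTextureId) {
        glUseProgram(program);
        // Bind the atlas to unit 0...
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, atlasTextureId);
        // ...and tell the sampler2D uniform to read from unit 0. Forgetting
        // this, or passing the texture id instead of the unit index, is a
        // classic source of blank glyphs.
        int samplerLocation = glGetUniformLocation(program, "tex");
        glUniform1i(samplerLocation, 0);
    }
}
```

For the UVs, a glyph's texture coordinates are simply its pixel rectangle inside the atlas divided by the atlas width and height.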

How to save SurfaceTexture as bitmap

Anonymous (unverified), submitted 2019-12-03 01:57:01
Question: When I decode a video to a surface, I want to save the frames I want as bitmap/JPEG files. I don't want to draw on the screen; I just want to save the content of the SurfaceTexture as an image file. Answer 1: You have to render the texture. If it were a normal texture, and you were using GLES 2 or later, you could attach it to an FBO and read directly from that. A SurfaceTexture is backed by an "external texture", which might be in a format that the GL driver doesn't support a full set of operations on, so you can't do that. You need to render it…
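Once the external texture has been rendered into an off-screen framebuffer (the part the answer describes), the read-back itself is short. A sketch of that last step, assuming the FBO is already bound and was drawn at the given width x height:

```java
import android.graphics.Bitmap;
import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class FrameCapture {
    /** Reads the currently bound framebuffer back into a Bitmap. */
    public static Bitmap readFrame(int width, int height) {
        ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
                .order(ByteOrder.LITTLE_ENDIAN);
        // Must run on the GL thread, after the frame has been drawn.
        GLES20.glReadPixels(0, 0, width, height,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);
        pixels.rewind();
        Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
        bitmap.copyPixelsFromBuffer(pixels);
        // Note: GL's origin is bottom-left, so the result is vertically
        // flipped relative to the video frame unless the render pass
        // already flipped it.
        return bitmap;
    }
}
```

From there, Bitmap.compress can write the frame out as a JPEG or PNG file.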

glsl sampler2DShadow and shadow2D clarification

Anonymous (unverified), submitted 2019-12-03 01:57:01
Question: Quick background of where I'm at (to make sure we're on the same page, and as a sanity check in case I'm missing/assuming something stupid). Goal: I want to render my scene with shadows, using deferred lighting and shadow maps. Struggle: finding clear and consistent documentation regarding how to use shadow2D and sampler2DShadow. Here's what I'm currently doing: in the fragment shader of my final rendering pass (the one that actually calculates final fragment values), I have the MVP matrices from the pass rendered from the light's point of view, the depth…
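One piece of state the sampler2DShadow / shadow2D documentation often glosses over: the depth texture must have its compare mode enabled, otherwise the shadow sampler's result is undefined. A sketch of that setup, written with LWJGL's Java bindings purely for illustration (the asker's binding and shadow-map size aren't stated in the excerpt):

```java
import java.nio.ByteBuffer;
import static org.lwjgl.opengl.GL11.*;
import static org.lwjgl.opengl.GL14.*;
import static org.lwjgl.opengl.GL30.*;

public class ShadowMapSetup {
    /** Creates a depth texture that a sampler2DShadow can be pointed at. */
    public static int createShadowMap(int size) {
        int depthTex = glGenTextures();
        glBindTexture(GL_TEXTURE_2D, depthTex);
        // Allocate depth storage only; the light pass fills it via an FBO.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, size, size, 0,
                GL_DEPTH_COMPONENT, GL_FLOAT, (ByteBuffer) null);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        // These two parameters are what make sampler2DShadow/shadow2D return a
        // 0..1 comparison result instead of a raw depth value.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_REF_TO_TEXTURE);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);
        return depthTex;
    }
}
```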