iOS: playing a frame-by-frame greyscale animation in a custom colour


Question


I have a 32-frame greyscale animation of a diamond exploding into pieces (i.e. 32 PNG images at 1024x1024).

My game uses 12 separate colours, so I need to be able to play the animation in any desired colour.

This, I believe, rules out any Apple frameworks, and also a lot of the public code for frame-by-frame animation on iOS.

What are my potential solution paths?

These are the best SO links I have found:

  • Faster iPhone PNG Animations
  • frame by frame animation
  • Is it possible using video as texture for GL in iOS?

That last one just shows it may be possible to load an image into a GL texture each frame (he is doing it from the camera, so if I have everything stored in memory, that should be even faster).

I can see these options (listed laziest first, most optimised last):

Option A: each frame (courtesy of CADisplayLink), load the relevant image from file into a texture, and display that texture.
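
Whichever option wins, the per-frame driver looks the same. A minimal sketch, assuming a CADisplayLink at the default refresh rate; the class and the renderFrame(_:) hook are my own names, not anything from a framework:

```swift
import UIKit

// Minimal per-frame driver shared by all the options below. CADisplayLink
// fires once per screen refresh; renderFrame(_:) is a hypothetical hook where
// each option's texture upload / draw call would go.
final class AnimationDriver {
    private var displayLink: CADisplayLink?
    private var frameIndex = 0
    private let frameCount = 32

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(tick))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func tick() {
        renderFrame(frameIndex)                     // option A/B/C/D work goes here
        frameIndex = (frameIndex + 1) % frameCount  // loop over the 32 frames
    }

    private func renderFrame(_ index: Int) {
        // Option A would hit the filesystem for "frame\(index).png" here,
        // once per refresh.
    }
}
```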

I'm pretty sure this is stupid, so on to option B.

Option B: preload all images into memory, then proceed as above, only loading from memory rather than from file.

I think this is going to be the ideal solution; can anyone give it the thumbs up or thumbs down?
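
For what it's worth, option B is little more than holding the decoded bytes and re-specifying one texture per tick. A minimal sketch, assuming an EAGLContext is already current and the frames were decoded up front into raw 8-bit luminance buffers (all names here are mine); note that 32 frames at 1024x1024x8-bit is already 32 MB of RAM:

```swift
import OpenGLES

// Option B sketch: decode once, then upload from RAM each tick, no file I/O.
final class PreloadedAnimation {
    private let frameBuffers: [Data]   // 32 buffers of 1024*1024 greyscale bytes
    private let size: GLsizei = 1024
    private var texture: GLuint = 0

    init(frameBuffers: [Data]) {
        self.frameBuffers = frameBuffers
        glGenTextures(1, &texture)
        glBindTexture(GLenum(GL_TEXTURE_2D), texture)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    }

    // Called from the display link: respecify the texture from memory.
    func upload(frame index: Int) {
        glBindTexture(GLenum(GL_TEXTURE_2D), texture)
        frameBuffers[index].withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
            glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_LUMINANCE, size, size, 0,
                         GLenum(GL_LUMINANCE), GLenum(GL_UNSIGNED_BYTE), raw.baseAddress)
        }
    }
}
```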

Option C: preload all of my PNGs into a single GL texture of the maximum size, creating a texture atlas. Each frame, set the texture coordinates to the rectangle in the atlas for that frame.

While this is potentially a perfect balance between coding efficiency and performance efficiency, the main problem here is losing resolution: on older iOS devices the maximum texture size is 1024x1024. If we cram 32 frames into this (for square frames this really means cramming 64, since the atlas divides into an 8x8 grid of 128x128 cells), we would be at 128x128 per frame. If the resulting animation is close to full screen on the iPad, this isn't going to hack it.
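
The per-frame texture-coordinate maths for option C is at least trivial. A sketch, assuming the 8x8 grid of 128x128 cells described above (64 slots, 32 used); whether row 0 is the top or the bottom depends on how the atlas was packed:

```swift
// Option C sketch: UV rectangle for frame `index` in an 8x8 atlas grid.
func atlasUVRect(forFrame index: Int) -> (u0: Float, v0: Float, u1: Float, v1: Float) {
    let cols = 8
    let cell: Float = 128.0 / 1024.0   // each cell spans 1/8 of the atlas
    let u0 = Float(index % cols) * cell
    let v0 = Float(index / cols) * cell
    return (u0, v0, u0 + cell, v0 + cell)
}
```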

Option D: instead of loading into a single GL texture, load into a bunch of textures. Moreover, we can squeeze four greyscale images into a single texture by using all four channels.

I baulk at the sheer amount of fiddly coding required here; my RSI starts to tingle even thinking about this approach.
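
The fiddly part of option D is mostly on the shader side, and it doubles as an answer to the 12-colour requirement. A sketch of an OpenGL ES 2.0 fragment shader, embedded as a Swift string (all uniform names are my own): the channel mask selects which of the four packed frames to read, and the tint uniform does the colouring. The same shader with a mask of (1,0,0,0) would also tint the single-channel textures from options B and C.

```swift
// Option D fragment shader sketch (OpenGL ES 2.0 GLSL in a Swift string).
// Four greyscale frames are packed into the R, G, B, A channels of one
// texture; u_channelMask is (1,0,0,0) for the first frame of the group,
// (0,1,0,0) for the second, and so on.
let packedFrameShader = """
precision mediump float;

uniform sampler2D u_frames;      // 4 greyscale frames packed into RGBA
uniform vec4      u_channelMask; // selects which channel holds this frame
uniform vec4      u_tint;        // one of the game's 12 colours
varying vec2      v_texCoord;

void main() {
    // Extract the current frame's intensity from its channel.
    float grey = dot(texture2D(u_frames, v_texCoord), u_channelMask);
    // Tint it; using the intensity as alpha lets the background show through.
    gl_FragColor = vec4(u_tint.rgb * grey, u_tint.a * grey);
}
"""
```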

I think I have answered my own question here, but if anyone has actually done this or can see the way through, please answer!


Answer 1:


If something higher-performance than (B) is needed, it looks like the key is glTexSubImage2D: http://www.opengl.org/sdk/docs/man/xhtml/glTexSubImage2D.xml

Rather than pulling across one frame at a time from memory, we could arrange, say, 16 512x512 8-bit greyscale frames contiguously in memory, send this across to GL as a single 1024x1024 32-bit RGBA texture, and then split it within GL using the above function.

This would mean we are performing one [RAM->VRAM] transfer per 16 frames rather than one per frame.

Of course, for more modern devices we could get 64 instead of 16, since more recent iOS devices can handle 2048x2048 textures.
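
For reference, a sketch of the call itself, assuming a live 1024x1024 RGBA texture being refilled in blocks; the helper name is mine. At full-texture size one transfer carries 4 MB, i.e. the bytes of 16 contiguous 512x512 8-bit frames (64 on a 2048x2048 texture):

```swift
import OpenGLES

// Sketch: push a block of packed frame bytes into a sub-rectangle of an
// existing texture in a single [RAM -> VRAM] transfer.
func uploadFrameBlock(_ pixels: Data, into texture: GLuint,
                      x: GLint, y: GLint, width: GLsizei, height: GLsizei) {
    glBindTexture(GLenum(GL_TEXTURE_2D), texture)
    pixels.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        glTexSubImage2D(GLenum(GL_TEXTURE_2D), 0, x, y, width, height,
                        GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), raw.baseAddress)
    }
}
```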

I will first try technique (B) and leave it at that if it works (I don't want to over-code), and look at this if needed.

I still can't find any way to query how many GL textures it is possible to hold on the graphics chip. I have been told that when you try to allocate memory for a texture, GL just returns 0 when it has run out of memory. However, to implement this properly I would want to make sure that I am not sailing too close to the wind resource-wise... I don't want my animation to use up so much VRAM that the rest of my rendering fails...
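
Unextended OpenGL ES exposes no free-VRAM query; the closest thing to the behaviour described above is checking glGetError() for GL_OUT_OF_MEMORY right after each allocation. A sketch:

```swift
import OpenGLES

// Probe for a failed texture allocation: glGetError() reports
// GL_OUT_OF_MEMORY after an allocation that could not be satisfied, so call
// this immediately after glTexImage2D and back off (free textures,
// downscale) on failure.
func lastTextureUploadSucceeded() -> Bool {
    return glGetError() != GLenum(GL_OUT_OF_MEMORY)
}
```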




Answer 2:


You would be able to get this working just fine with the CoreGraphics APIs; there is no reason to dive deep into OpenGL for a simple 2D problem like this. For the general approach you should take to creating colored frames from a grayscale frame, see colorizing-image-ignores-alpha-channel-why-and-how-to-fix. Basically, you need to use CGContextClipToMask() and then render a specific color, so that what is left is the diamond colored in with the specific color you have selected. You could do this at runtime, or you could do it offline and create one video for each of the colors you want to support.

It would be easier on your CPU if you did the operation N times and saved the results into files, but modern iOS hardware is much faster than it used to be. Beware of memory usage issues when writing video processing code; see video-and-memory-usage-on-ios-devices for a primer that describes the problem space. You could code it all up with texture atlases and complex OpenGL stuff, but an approach that makes use of videos would be a lot easier to deal with, and you would not need to worry so much about resource usage. See my library linked in the memory post for more info if you are interested in saving time on the implementation.
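
A minimal sketch of that colouring pass in Swift, assuming the greyscale frame carries its shape in its alpha channel (the gotcha the linked post is about); the function name is mine. Run it once per frame per colour at runtime, or offline to bake out the 12 coloured sequences:

```swift
import UIKit

// Clip to the greyscale frame used as a mask, then flood the clipped area
// with the chosen colour, as described above.
func colorized(frame: UIImage, with color: UIColor) -> UIImage? {
    guard let mask = frame.cgImage else { return nil }
    let rect = CGRect(origin: .zero, size: frame.size)

    UIGraphicsBeginImageContextWithOptions(frame.size, false, frame.scale)
    defer { UIGraphicsEndImageContext() }
    guard let ctx = UIGraphicsGetCurrentContext() else { return nil }

    // CoreGraphics draws with a flipped coordinate system relative to UIKit.
    ctx.translateBy(x: 0, y: frame.size.height)
    ctx.scaleBy(x: 1, y: -1)

    // Swift spelling of CGContextClipToMask(): clip, then fill with the colour.
    ctx.clip(to: rect, mask: mask)
    ctx.setFillColor(color.cgColor)
    ctx.fill(rect)

    return UIGraphicsGetImageFromCurrentImageContext()
}
```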



Source: https://stackoverflow.com/questions/6327477/ios-playing-a-frame-by-frame-greyscale-animation-in-a-custom-colour
