OpenGL 16 bit display via Tao/C#


If you create a texture, the 'type' parameter of glTexImage only describes the data type your texture data is in before OpenGL converts it into its own format. To create a texture with 16 bits per channel you need something like GL_LUMINANCE16 as the internal format (the format parameter remains GL_LUMINANCE). If there's no GL_LUMINANCE16 in your OpenGL 1.4 headers, check whether GL_EXT_texture is available and try GL_LUMINANCE16_EXT instead.
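
A minimal sketch of that upload with the Tao bindings (the names pixels16, width and height are placeholders; pinning with GCHandle is just one way to feed Tao's IntPtr overload):

```csharp
using System;
using System.Runtime.InteropServices;
using Tao.OpenGl;

static class Luminance16Texture
{
    public static int Create(ushort[] pixels16, int width, int height)
    {
        int[] tex = new int[1];
        Gl.glGenTextures(1, tex);
        Gl.glBindTexture(Gl.GL_TEXTURE_2D, tex[0]);

        // Rows of 16-bit texels are 2-byte aligned, not the default 4.
        Gl.glPixelStorei(Gl.GL_UNPACK_ALIGNMENT, 2);

        // Pin the managed array so OpenGL can read it through a raw pointer.
        GCHandle handle = GCHandle.Alloc(pixels16, GCHandleType.Pinned);
        try
        {
            // internal format: GL_LUMINANCE16 asks for 16 bits of storage per texel
            // format/type:     GL_LUMINANCE / GL_UNSIGNED_SHORT describe the
            //                  layout of the data we are uploading
            Gl.glTexImage2D(Gl.GL_TEXTURE_2D, 0, Gl.GL_LUMINANCE16,
                            width, height, 0,
                            Gl.GL_LUMINANCE, Gl.GL_UNSIGNED_SHORT,
                            handle.AddrOfPinnedObject());
        }
        finally
        {
            handle.Free();
        }
        return tex[0];
    }
}
```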

One of these should work. If neither does, you can encode each 16 bit value as a pair of 8 bit channels in a GL_LUMINANCE_ALPHA texture and decode it again inside a shader.
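
A rough sketch of that fallback, assuming shader support is available (GL 2.0, or the ARB shader extensions on 1.4-class hardware); the Pack helper and the shader string are illustrative, not part of Tao:

```csharp
using Tao.OpenGl;

static class PackedLuminance
{
    // Split each 16-bit sample into two 8-bit channels.
    public static byte[] Pack(ushort[] pixels16)
    {
        byte[] packed = new byte[pixels16.Length * 2];
        for (int i = 0; i < pixels16.Length; i++)
        {
            packed[2 * i]     = (byte)(pixels16[i] >> 8);   // high byte -> luminance
            packed[2 * i + 1] = (byte)(pixels16[i] & 0xFF); // low byte  -> alpha
        }
        return packed;
    }

    // Upload the packed array with: internal format GL_LUMINANCE8_ALPHA8,
    // format GL_LUMINANCE_ALPHA, type GL_UNSIGNED_BYTE.

    // Matching GLSL fragment shader that recombines the two bytes:
    public const string DecodeFragment = @"
uniform sampler2D tex;
void main() {
    vec4 t = texture2D(tex, gl_TexCoord[0].st);
    // channels arrive in 0..1; reconstruct (high * 256 + low) / 65535
    float v = (t.r * 255.0 * 256.0 + t.a * 255.0) / 65535.0;
    gl_FragColor = vec4(v, v, v, 1.0);
}";
}
```

Note that this packed encoding only survives GL_NEAREST sampling: linear filtering interpolates the two bytes independently and corrupts the reconstructed value.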

I've never worked with depths greater than 8 bits per channel, but here's what I'd try first:

Turn off filtering on the texture and see how it affects the output.

Set the texturing glHints to best quality (both tweaks are sketched below).
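
A quick sketch of both with Tao, to be called with the texture in question bound (GL_PERSPECTIVE_CORRECTION_HINT is the hint most likely to matter here, though which hints a driver actually honours varies):

```csharp
using Tao.OpenGl;

static class TextureQuality
{
    // Call with the 16 bit texture bound to GL_TEXTURE_2D.
    public static void Apply()
    {
        // Nearest-neighbour sampling: the driver returns raw texel values
        // instead of blending neighbours, which rules out precision loss
        // in the filtering stage.
        Gl.glTexParameteri(Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MIN_FILTER, Gl.GL_NEAREST);
        Gl.glTexParameteri(Gl.GL_TEXTURE_2D, Gl.GL_TEXTURE_MAG_FILTER, Gl.GL_NEAREST);

        // Hint the driver towards quality over speed.
        Gl.glHint(Gl.GL_PERSPECTIVE_CORRECTION_HINT, Gl.GL_NICEST);
    }
}
```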

You could consider using a single channel floating point texture through one of the GL_ARB_texture_float, GL_ATI_texture_float or GL_NV_float_buffer extensions if the hardware supports it. I can't recall whether GL 1.4 has floating point textures or not, though.
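
If you want to probe for that at runtime, something along these lines should work; GL_LUMINANCE32F_ARB is spelled out by hand here in case the Tao build in use predates the GL_ARB_texture_float constants:

```csharp
using System;
using System.Runtime.InteropServices;
using Tao.OpenGl;

static class FloatTexture
{
    // Token from the GL_ARB_texture_float spec.
    const int GL_LUMINANCE32F_ARB = 0x8818;

    // Returns false when the extension is missing, so the caller can
    // fall back to one of the integer paths above.
    public static bool TryUpload(float[] pixels, int width, int height)
    {
        string extensions = Gl.glGetString(Gl.GL_EXTENSIONS);
        if (extensions == null || !extensions.Contains("GL_ARB_texture_float"))
            return false;

        GCHandle handle = GCHandle.Alloc(pixels, GCHandleType.Pinned);
        try
        {
            // A float internal format keeps the full precision of the
            // samples; the data itself is uploaded as plain GL_FLOAT.
            Gl.glTexImage2D(Gl.GL_TEXTURE_2D, 0, GL_LUMINANCE32F_ARB,
                            width, height, 0,
                            Gl.GL_LUMINANCE, Gl.GL_FLOAT,
                            handle.AddrOfPinnedObject());
        }
        finally
        {
            handle.Free();
        }
        return true;
    }
}
```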
