Render to a 16-bit unsigned integer 2D texture in WebGL2
As stated in the official WebGL 2 specs and docs (look here), the gl.RGBA16UI internal size format is a color-renderable format. That means I should be able to render to an RGBA16UI texture. I can easily fill a texture from a Uint16Array and then use it, but I fail to fill a texture by rendering into it with a shader: I only get zeros when I sample the values in the next shader.

Has anyone already succeeded in rendering to an unsigned integer texture with WebGL 2? I would be very grateful.
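For reference, here is a stripped-down sketch of the kind of setup I'm attempting (the dimensions and shader source are simplified placeholders, not my actual code):

```javascript
// gl is assumed to be a WebGL2RenderingContext; width/height are placeholders.
const width = 256, height = 256;

// Destination texture. Integer formats are not filterable, so NEAREST
// filtering is used to keep the texture complete.
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA16UI, width, height, 0,
              gl.RGBA_INTEGER, gl.UNSIGNED_SHORT, null);

// Attach it as the color target of a framebuffer and check completeness.
const fbo = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                        gl.TEXTURE_2D, tex, 0);
console.log(gl.checkFramebufferStatus(gl.FRAMEBUFFER) ===
            gl.FRAMEBUFFER_COMPLETE);

// Fragment shader writing into the attachment: with an unsigned integer
// color format, the output must be declared uvec4 rather than vec4.
const fragmentSource = `#version 300 es
precision highp int;
out uvec4 outColor;
void main() {
  outColor = uvec4(12345u, 0u, 0u, 65535u);
}`;
```

(Note that, as far as I understand, clearing an integer attachment also has to go through gl.clearBufferuiv rather than gl.clear, in case that matters.)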