Question
I'm using the following vertex shader (courtesy http://stemkoski.github.io/Three.js/Shader-Heightmap-Textures.html) to generate terrain from a grayscale height map:
uniform sampler2D bumpTexture;
uniform float bumpScale;

varying float vAmount;
varying vec2 vUV;

void main()
{
    vUV = uv;
    vec4 bumpData = texture2D( bumpTexture, uv );
    vAmount = bumpData.r; // assuming the map is grayscale, it doesn't matter whether you use r, g, or b

    // move the position along the normal
    vec3 newPosition = position + normal * bumpScale * vAmount;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( newPosition, 1.0 );
}
I'd like to have 32 bits of resolution, and have generated a heightmap that encodes heights as RGBA. I have no idea how to go about changing the shader code to accommodate this. Any direction or help?
Answer 1:
bumpData.r, .g, .b, and .a are all quantities in the range [0.0, 1.0], equivalent to the original byte values divided by 255.0.
So depending on your endianness, a naive conversion back to the original int might be:
(bumpData.r * 255.0) +
(bumpData.g * 255.0 * 256.0) +
(bumpData.b * 255.0 * 256.0 * 256.0) +
(bumpData.a * 255.0 * 256.0 * 256.0 * 256.0)
So that's the same as a dot product with the vector (255.0, 65280.0, 16711680.0, 4278190080.0), which is likely to be a much more efficient way to implement it.
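Applied to the shader from the question, the decode might look like the sketch below. The divisor 4294967295.0 (2^32 - 1) is an assumption that the packed value should map back into [0.0, 1.0], and the channel order assumes red holds the least-significant byte; swap the vector components if your encoder wrote the bytes the other way around. (Bear in mind a single highp float has a 24-bit mantissa, so not all 32 bits survive the conversion exactly.)

uniform sampler2D bumpTexture;
uniform float bumpScale;

varying float vAmount;
varying vec2 vUV;

void main()
{
    vUV = uv;
    vec4 bumpData = texture2D( bumpTexture, uv );

    // reconstruct the packed 32-bit value via the dot product
    // (red = least-significant byte), then normalize to [0.0, 1.0]
    float heightValue = dot( bumpData, vec4( 255.0, 65280.0, 16711680.0, 4278190080.0 ) );
    vAmount = heightValue / 4294967295.0;

    // move the position along the normal
    vec3 newPosition = position + normal * bumpScale * vAmount;
    gl_Position = projectionMatrix * modelViewMatrix * vec4( newPosition, 1.0 );
}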
Answer 2:
With three.js:
const generateHeightTexture = (width) => {
    // let max_texture_width = RENDERER.capabilities.maxTextureSize;
    // one float per texel, zero-filled
    let pixels = new Float32Array(width * width);
    pixels.fill(0, 0, pixels.length);
    // single-channel float texture: each texel stores a full 32-bit float height
    let texture = new THREE.DataTexture(pixels, width, width, THREE.AlphaFormat, THREE.FloatType);
    texture.magFilter = THREE.LinearFilter;
    texture.minFilter = THREE.NearestFilter;
    // texture.anisotropy = RENDERER.capabilities.getMaxAnisotropy();
    texture.needsUpdate = true;
    console.log('Built Physical Texture:', width, 'x', width);
    return texture;
};
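To feed the resulting texture to the vertex shader from the question, it would be bound as the bumpTexture uniform of a ShaderMaterial. A minimal sketch, assuming terrainVertexShader and terrainFragmentShader hold your shader sources and scene is an existing THREE.Scene:

const heightTexture = generateHeightTexture(256);
const terrainMaterial = new THREE.ShaderMaterial({
    uniforms: {
        bumpTexture: { value: heightTexture },
        bumpScale: { value: 100.0 }
    },
    vertexShader: terrainVertexShader,
    fragmentShader: terrainFragmentShader
});
// one plane segment per texel keeps vertices aligned with height samples
const terrain = new THREE.Mesh(new THREE.PlaneGeometry(1024, 1024, 255, 255), terrainMaterial);
scene.add(terrain);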
Source: https://stackoverflow.com/questions/28652695/webgl-heightmap-using-vertex-shader-using-32-bits-instead-of-8-bits