textures

MeshLab: How to import XYZRGB file

Submitted by 半城伤御伤魂 on 2020-02-21 10:31:09
Question: When I import an XYZRGB file (that I generate programmatically), MeshLab renders the point cloud, but the colors are missing. How can I assign textures/colors to the vertices? I have tried "Vertex Attribute Transfer", but nothing happens.
Answer 1: Rename the extension from .xyz to .txt, then choose the "x y z r g b" option when importing the file in MeshLab (source: http://www.laserscanningforum.com/forum/viewtopic.php?f=168&t=8052).
Source: https://stackoverflow.com/questions/22850905/meshlab-how-to-import-xyzrgb-file
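An "x y z r g b" line simply lists the three coordinates followed by the three color components. The sample below is made up for illustration (it assumes space-separated values and 0-255 integer colors; adjust if your exporter writes normalized 0-1 floats):

```
0.103 4.562 7.891 255 0 0
0.105 4.570 7.880 128 64 12
0.110 4.581 7.874 0 255 0
```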

OpenGL compressed textures and extensions

Submitted by 删除回忆录丶 on 2020-02-01 04:48:05
Question: I have an nVidia Quadro NVS 295/PCIe/SSE2 card. When I call glGetString(GL_EXTENSIONS), print the values, and grep for "compress", I get this list: GL_ARB_compressed_texture_pixel_storage, GL_ARB_texture_compression, GL_ARB_texture_compression_rgtc, GL_EXT_texture_compression_dxt1, GL_EXT_texture_compression_latc, GL_EXT_texture_compression_rgtc, GL_EXT_texture_compression_s3tc, GL_NV_texture_compression_vtc. But then again, glCompressedTexImage2D says that glGet with GL_COMPRESSED_TEXTURE…
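For context, the list that glGet reports can be printed with a few lines of plain OpenGL; this is a minimal sketch (not the asker's code) that assumes a current GL context plus <vector> and <cstdio>:

```cpp
// Query how many generic compressed formats the driver advertises, then fetch them.
GLint count = 0;
glGetIntegerv(GL_NUM_COMPRESSED_TEXTURE_FORMATS, &count);

std::vector<GLint> formats(count > 0 ? count : 0);
if (count > 0)
    glGetIntegerv(GL_COMPRESSED_TEXTURE_FORMATS, formats.data());

for (GLint fmt : formats)
    std::printf("compressed format: 0x%04X\n", static_cast<unsigned>(fmt));
```

Note that drivers are not required to list every format they actually accept in glCompressedTexImage2D here; the extension strings above (S3TC, RGTC, LATC, VTC) are the more reliable indicator of what the card supports.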

three.js change texture on material

Submitted by 人走茶凉 on 2020-02-01 02:42:05
Question: I'm setting up a texture on a mesh in three.js, and when it loads it looks how I want it to: texture = THREE.ImageUtils.loadTexture("textures/hash.png"); texture.needsUpdate = true; uniforms = { color: { type: "c", value: new THREE.Color( 0xffffff ) }, texture: { type: "t", value: texture }, }, vertexShader = "varying vec2 vUv; void main() {vUv = uv;gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );}", fragmentShader = "uniform vec3 color; uniform sampler2D texture;…
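To change the texture afterwards, the usual pattern is to replace the uniform's value rather than rebuild the material. A hedged sketch (the material variable and file name are placeholders, not from the original post; THREE.ImageUtils.loadTexture matches the older API used in the question, newer three.js uses THREE.TextureLoader):

```javascript
// Load a replacement image and point the existing "texture" uniform at it.
var newTexture = THREE.ImageUtils.loadTexture("textures/other.png");
newTexture.needsUpdate = true;                 // make sure the image is (re)uploaded to the GPU
material.uniforms.texture.value = newTexture;  // the ShaderMaterial now samples the new image
```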

CUDA kernel for add(a,b,c) using texture objects for a & b - works correctly for 'increment operation' add(a,b,a)?

Submitted by 吃可爱长大的小学妹 on 2020-01-25 11:31:47
Question: I want to implement a CUDA function add(a,b,c) for adding (component-wise) two one-channel floating-point images 'a' and 'b' together and storing the result in the floating-point image 'c', so 'c = a + b'. The function will be implemented by first binding texture objects 'aTex' and 'bTex' to the pitch-linear images 'a' and 'b', and then accessing the images 'a' and 'b' inside the kernel only via the texture objects 'aTex' and 'bTex'. The sum is stored in 'c' via a simple write to global…
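The kernel body for such a function is short; below is a minimal sketch under the stated setup (the names aTex, bTex and c come from the question, the pitch/size parameters are assumptions), reading both inputs through texture objects and writing the sum straight to global memory:

```cuda
// c = a + b, with a and b accessed only through texture objects bound to
// pitch-linear memory (point filtering, unnormalized coordinates assumed).
__global__ void add(cudaTextureObject_t aTex, cudaTextureObject_t bTex,
                    float* c, size_t cPitch, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    float va = tex2D<float>(aTex, x + 0.5f, y + 0.5f);  // element (x, y) of a
    float vb = tex2D<float>(bTex, x + 0.5f, y + 0.5f);  // element (x, y) of b

    float* cRow = reinterpret_cast<float*>(reinterpret_cast<char*>(c) + y * cPitch);
    cRow[x] = va + vb;
}
```

As for the add(a,b,a) case in the title: each thread reads only the texel it later overwrites, so the in-place variant usually appears to work, but CUDA does not guarantee that texture fetches are coherent with global writes made in the same kernel launch, so it is safer not to alias an input texture with the output.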

Xna transform a 2d texture like photoshop transforming tool

Submitted by 百般思念 on 2020-01-25 04:55:10
Question: I want to create in XNA 4 the same transforming effect that Photoshop's Transform tool provides: it is used to scale, rotate, skew, and generally distort the perspective of any graphic you're working with. This covers everything I want to do in XNA with any texture: http://www.tutorial9.net/tutorials/photoshop-tutorials/using-transform-in-photoshop/ Skew: skew transformations slant objects either vertically or horizontally. Distort: distort transformations allow you to stretch an image in ANY…
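SpriteBatch alone can only translate, scale and rotate, so Photoshop-style skew, distort and perspective are normally done by mapping the texture onto a quad whose four corners you move independently. A hedged sketch for XNA 4 (myTexture and the corner positions are placeholders; assumed to run inside Draw() of a Game):

```csharp
// A BasicEffect that textures geometry using screen coordinates.
BasicEffect effect = new BasicEffect(GraphicsDevice)
{
    TextureEnabled = true,
    Texture = myTexture,
    Projection = Matrix.CreateOrthographicOffCenter(
        0, GraphicsDevice.Viewport.Width, GraphicsDevice.Viewport.Height, 0, 0, 1)
};
GraphicsDevice.RasterizerState = RasterizerState.CullNone;  // corner order then doesn't matter

// Move any of these four corners to skew/distort/fake perspective.
VertexPositionTexture[] quad =
{
    new VertexPositionTexture(new Vector3(100, 100, 0), new Vector2(0, 0)), // top-left
    new VertexPositionTexture(new Vector3(300,  80, 0), new Vector2(1, 0)), // top-right
    new VertexPositionTexture(new Vector3(120, 260, 0), new Vector2(0, 1)), // bottom-left
    new VertexPositionTexture(new Vector3(320, 280, 0), new Vector2(1, 1)), // bottom-right
};

foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawUserPrimitives(PrimitiveType.TriangleStrip, quad, 0, 2);
}
```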

Render to 16bits unsigned integer 2D texture in WebGL2

Submitted by 99封情书 on 2020-01-25 04:36:05
Question: As stated in the official WebGL 2 specs/docs (look here), the gl.RGBA16UI internal format is color-renderable, which means I should be able to render to an RGBA16UI texture. I can easily fill a texture with a Uint16Array and then use it, but I fail at filling a texture by rendering into it with a shader: I only get zeros when I sample the values in the next shader. Has anyone already succeeded in rendering to an unsigned integer texture with WebGL 2? I would be very grateful, I'm…
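The two details that most often produce all-zero reads are the fragment shader output type and filtering: an integer attachment needs a uvec4 output and NEAREST filtering. A hedged sketch of the attachment setup (gl, width and height are assumed to exist; not the asker's code):

```javascript
// RGBA16UI color attachment for WebGL2.
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST); // integer textures can't be filtered
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA16UI, width, height);

const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

// The fragment shader rendering into this attachment must declare an unsigned
// integer output, e.g. "out uvec4 fragColor;" - writing a float vec4 yields zeros.
// Integer attachments are also cleared with clearBufferuiv, not gl.clear():
gl.clearBufferuiv(gl.COLOR, 0, new Uint32Array([0, 0, 0, 0]));
```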

How do you use not-power-of-2 textures in LWJGL?

Submitted by 折月煮酒 on 2020-01-25 00:04:50
Question: I'm using Slick-Util's Texture class to load textures for LWJGL, but apparently it forces your texture to a power of two. How would you use a texture that is not a power of two? (I really don't want to add transparent pixels to the edge of the image to make it a power of two.) I've seen somewhere that it's possible, but I can't find any way of doing it.
Answer 1: The reason the slick-util package forces you to use power-of-two textures is that this is what graphics cards like to load. Where possible, you…
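If you bypass Slick-Util and upload the pixels yourself, OpenGL 2.0-class hardware accepts non-power-of-two sizes directly. A hedged sketch against the LWJGL 2-era GL11 API (class and file names are placeholders, not from the original post):

```java
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import javax.imageio.ImageIO;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;

public final class NpotTextureLoader {

    /** Loads an image of any size and uploads it as-is, without power-of-two padding. */
    public static int loadTexture(String path) throws IOException {
        BufferedImage img = ImageIO.read(new File(path));
        int w = img.getWidth(), h = img.getHeight();

        // Repack the ARGB ints into the RGBA byte order OpenGL expects.
        ByteBuffer pixels = BufferUtils.createByteBuffer(w * h * 4);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int argb = img.getRGB(x, y);
                pixels.put((byte) ((argb >> 16) & 0xFF)); // R
                pixels.put((byte) ((argb >> 8) & 0xFF));  // G
                pixels.put((byte) (argb & 0xFF));         // B
                pixels.put((byte) ((argb >> 24) & 0xFF)); // A
            }
        }
        pixels.flip();

        IntBuffer id = BufferUtils.createIntBuffer(1);
        GL11.glGenTextures(id);
        int texId = id.get(0);
        GL11.glBindTexture(GL11.GL_TEXTURE_2D, texId);
        GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
        GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
        // Any width/height works on OpenGL 2.0+ (or with GL_ARB_texture_non_power_of_two).
        GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA, w, h, 0,
                          GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, pixels);
        return texId;
    }
}
```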

How to bind texture just to one object in OpenGLES?

Submitted by 白昼怎懂夜的黑 on 2020-01-24 19:49:29
Question: I am activating the texture just before drawing the object, but the texture shows up on both objects. Why is that? Should I unbind the texture before drawing the first object? I tried glDisable and glBindTexture, but it did not help. Here is my code: @Override public void onDrawFrame(GL10 gl) { GLES20.glClear(GL10.GL_COLOR_BUFFER_BIT); synchronized (camerObject) { surfaceTextureCamera.updateTexImage(); cameraUpdate = false; } vertexBuffer.position(0); GLES20.glVertexAttribPointer…
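OpenGL ES is a state machine: whichever texture is bound when a draw call executes is the one that call samples, so binding has to be set up per draw call rather than per frame. A hedged sketch of the pattern (program and handle names are placeholders, not the asker's code; uses android.opengl.GLES20 and android.opengl.GLES11Ext):

```java
// Textured object: bind the camera texture right before its own draw call.
GLES20.glUseProgram(texturedProgram);
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, cameraTextureId);
GLES20.glUniform1i(samplerHandle, 0);                 // sampler reads texture unit 0
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);

// Second object: use a program that doesn't sample that texture (or bind the
// texture this object should actually use) before issuing its draw call.
GLES20.glUseProgram(flatColorProgram);
GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
```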

OpenGL ES texture problem, 4 duplicate columns and horizontal lines (Android)

Submitted by 久未见 on 2020-01-24 19:35:09
Question: I have a buffer holding an RGB (or RGBA) texture image, and I want to display it on my Android device with the following code (I use OpenGL from the NDK): glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, this->pBuffer); I have also set the PixelFormat on the Java side with: this.getHolder().setFormat(PixelFormat.RGBA_8888); this.setEGLConfigChooser(8, 8, 8, 8, 0, 0); setRenderer(new MyRenderer()); The image is displayed, but there are four columns (identical and contains…
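A common cause of exactly this symptom (the image repeated in narrow columns, plus banding) is a pixel-format mismatch: a 3-byte-per-pixel RGB buffer described to GL as GL_RGBA, so every row is read too wide. A hedged sketch of the upload for the RGB case (pBuffer and the 256x256 size are taken from the question):

```cpp
// If pBuffer holds 3-byte RGB pixels, tell GL exactly that, and relax the
// default 4-byte row alignment so odd widths also unpack correctly.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,
             256, 256, 0,
             GL_RGB, GL_UNSIGNED_BYTE, this->pBuffer);
```

If the buffer really is RGBA, the same artifact can come from the buffer's row stride or width not matching the width passed to glTexImage2D.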