Question
I'm writing an Android app using OpenGL ES. I followed some online tutorials and managed to load a textured cube using hard-coded vertices/indices/texture coordinates.
As a next step I wrote a parser for Wavefront .obj files. I made a mock file using the vertices etc. from the tutorial, and that loads fine.
However, when I use a file made with a 3D modelling package, all the textures get messed up.
Below is how I'm currently getting the texture coordinates:
- First I load all the texture coordinates (the vt's) into a big vector.
- Next I find the first two texture coordinates for each f triangle (so f 1/2/3 2/5/2 3/4/1 means I take the 2nd and 5th texture coordinates; since .obj starts counting from 1 rather than 0, I subtract 1 from that position, multiply it by 2 to get the x coordinate's position in my vt array, and do the same but add 1 for the y coordinate's position).
- I take those texture coordinates I just found and add them to another vector.
- Once I've gone through all the vertices, I turn that vector into a FloatBuffer and pass it to glTexCoordPointer in my draw method.
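To make that index arithmetic concrete, here is how the lookup works out for the first vertex of the example face (this just illustrates the step above, using the same textureCoordinates vector as in the parser below):

    // "f 1/2/3 ...": the first vertex references vt number 2 (1-based),
    // and each vt contributes two floats, so in the flat vt array:
    int texturePosition = (2 - 1) * 2;                           // = 2, position of the x coordinate
    float xcoord = textureCoordinates.get(texturePosition);      // element 2
    float ycoord = textureCoordinates.get(texturePosition + 1);  // element 3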
Here is the code for parsing the file:
private void openObjFile(String filename, Context context, GL10 gl) {
    Vector<String> lines = openFile(filename, context); // opens the file
    Vector<String[]> tokens = new Vector<String[]>();
    Vector<Float> vertices = new Vector<Float>();
    Vector<Float> textureCoordinates = new Vector<Float>();
    Vector<Float> vertexNormals = new Vector<Float>();

    // tokenise
    for (int i = 0; i < lines.size(); i++) {
        String line = lines.get(i);
        tokens.add(line.split(" "));
    }

    for (int j = 0; j < tokens.size(); j++) {
        String[] linetokens = tokens.get(j);

        // get rid of comments
        //if(linetokens[0].equalsIgnoreCase("#")){
        //    tokens.remove(j);
        //}

        // get texture from .mtl file
        if (linetokens[0].equalsIgnoreCase("mtllib")) {
            parseMaterials(linetokens[1], context, gl);
        }

        // vertices
        if (linetokens[0].equalsIgnoreCase("v")) {
            vertices.add(Float.valueOf(linetokens[1]));
            vertices.add(Float.valueOf(linetokens[2]));
            vertices.add(Float.valueOf(linetokens[3]));
        }

        // texture coordinates
        if (linetokens[0].equalsIgnoreCase("vt")) {
            textureCoordinates.add(Float.valueOf(linetokens[1]));
            textureCoordinates.add(Float.valueOf(linetokens[2]));
        }

        // vertex normals
        if (linetokens[0].equalsIgnoreCase("vn")) {
            vertexNormals.add(Float.valueOf(linetokens[1]));
            vertexNormals.add(Float.valueOf(linetokens[2]));
            vertexNormals.add(Float.valueOf(linetokens[3]));
        }
    }

    // vertices
    this.vertices = GraphicsUtil.getFloatBuffer(vertices);

    Mesh mesh = null;
    Vector<Short> indices = null;
    Vector<Float> textureCoordinatesMesh = null;
    Vector<Float> vertexNormalsMesh = null;

    for (int j = 0; j < tokens.size(); j++) {
        String[] linetokens = tokens.get(j);

        if (linetokens[0].equalsIgnoreCase("g")) {
            if (mesh != null) {
                mesh.setIndices(GraphicsUtil.getShortBuffer(indices));
                mesh.setNumindices(indices.size());
                mesh.setNormals(GraphicsUtil.getFloatBuffer(vertexNormalsMesh));
                mesh.setTextureCoordinates(GraphicsUtil.getFloatBuffer(textureCoordinatesMesh));
                meshes.add(mesh);
            }
            mesh = new Mesh();
            indices = new Vector<Short>();
            textureCoordinatesMesh = new Vector<Float>();
            vertexNormalsMesh = new Vector<Float>();
        } else if (linetokens[0].equalsIgnoreCase("usemtl")) {
            String material_name = linetokens[1];
            for (int mn = 0; mn < materials.size(); mn++) {
                if (materials.get(mn).getName().equalsIgnoreCase(material_name)) {
                    mesh.setTextureID(materials.get(mn).getTextureID());
                    mn = materials.size();
                }
            }
        } else if (linetokens[0].equalsIgnoreCase("f")) {
            for (int v = 1; v < linetokens.length; v++) {
                String[] vvtvn = linetokens[v].split("/");

                short index = Short.parseShort(vvtvn[0]);
                index -= 1;
                indices.add(index);

                if (v != 3) {
                    int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2;
                    float xcoord = (textureCoordinates.get(texturePosition));
                    float ycoord = (textureCoordinates.get(texturePosition + 1));
                    // normalise
                    if (xcoord > 1 || ycoord > 1) {
                        xcoord = xcoord / Math.max(xcoord, ycoord);
                        ycoord = ycoord / Math.max(xcoord, ycoord);
                    }
                    textureCoordinatesMesh.add(xcoord);
                    textureCoordinatesMesh.add(ycoord);
                }

                int normalPosition = (Integer.parseInt(vvtvn[2]) - 1) * 3;
                vertexNormalsMesh.add(vertexNormals.get(normalPosition));
                vertexNormalsMesh.add(vertexNormals.get(normalPosition) + 1);
                vertexNormalsMesh.add(vertexNormals.get(normalPosition) + 2);
            }
        }
    }

    if (mesh != null) {
        mesh.setIndices(GraphicsUtil.getShortBuffer(indices));
        mesh.setNumindices(indices.size());
        mesh.setNormals(GraphicsUtil.getFloatBuffer(vertexNormalsMesh));
        mesh.setTextureCoordinates(GraphicsUtil.getFloatBuffer(textureCoordinatesMesh));
        meshes.add(mesh);
    } // Adding the final mesh
}
And here is the code for drawing (the first method is on the model, the second is each Mesh's own draw):
public void draw(GL10 gl) {
    gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);

    // Counter-clockwise winding.
    gl.glFrontFace(GL10.GL_CCW);
    gl.glEnable(GL10.GL_CULL_FACE);
    gl.glCullFace(GL10.GL_BACK);

    // Pass the vertex buffer in
    gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertices);

    for (int i = 0; i < meshes.size(); i++) {
        meshes.get(i).draw(gl);
    }

    // Disable the buffers
    gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
}
public void draw(GL10 gl) {
    if (textureID >= 0) {
        // Enable textures
        gl.glEnable(GL10.GL_TEXTURE_2D);

        // Get specific texture.
        gl.glBindTexture(GL10.GL_TEXTURE_2D, textureID);

        // Use UV coordinates.
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);

        // Pass in texture coordinates
        gl.glTexCoordPointer(2, GL10.GL_FLOAT, 0, textureCoordinates);
    }

    // Pass in the vertex normals
    gl.glNormalPointer(GL10.GL_FLOAT, 0, normals);
    gl.glEnableClientState(GL10.GL_NORMAL_ARRAY);

    gl.glDrawElements(GL10.GL_TRIANGLES, numindices, GL10.GL_UNSIGNED_SHORT, indices);

    if (textureID >= 0) {
        // Disable buffers
        gl.glDisableClientState(GL10.GL_NORMAL_ARRAY);
        gl.glDisableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    }
}
I'd really appreciate any help with this. It's frustrating to be not quite able to load the model from a file, and I'm really not sure what I'm doing wrong or missing here.
Answer 1:
I have to admit to being a little confused by the framing of your code. Specific things I would expect to be an issue:
- you decline to copy a texture coordinate to the final mesh list for the third vertex associated with any face; this should put all of your coordinates out of sync after the first two
- your texture coordinate normalisation step is unnecessary — to the extent that I'm not sure why it's in there — and probably broken (what if xcoord is larger than ycoord on the first line, then smaller on the second?)
- OBJ considers (0, 0) to be the top left of a texture, OpenGL considers it to be the bottom left, so unless you've set the texture matrix stack to invert texture coordinates in code not shown, you need to invert them yourself, e.g.
textureCoordinatesMesh.add(1.0f - ycoord);
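Putting those three points together, a minimal sketch of what the face-parsing branch could look like, assuming (as your current code does) that every face is a triangle with v/vt/vn present for each vertex:

    } else if (linetokens[0].equalsIgnoreCase("f")) {
        for (int v = 1; v < linetokens.length; v++) {
            String[] vvtvn = linetokens[v].split("/");

            short index = Short.parseShort(vvtvn[0]);
            index -= 1;                                   // OBJ indices are 1-based
            indices.add(index);

            // Copy a texture coordinate for EVERY vertex of the face (no "if (v != 3)"),
            // drop the normalisation step, and flip V because OBJ's origin is top-left
            // while OpenGL's is bottom-left.
            int texturePosition = (Integer.parseInt(vvtvn[1]) - 1) * 2;
            textureCoordinatesMesh.add(textureCoordinates.get(texturePosition));
            textureCoordinatesMesh.add(1.0f - textureCoordinates.get(texturePosition + 1));

            // Note get(normalPosition + 1) and get(normalPosition + 2): the posted code
            // adds 1 and 2 to the looked-up value instead of to the index.
            int normalPosition = (Integer.parseInt(vvtvn[2]) - 1) * 3;
            vertexNormalsMesh.add(vertexNormals.get(normalPosition));
            vertexNormalsMesh.add(vertexNormals.get(normalPosition + 1));
            vertexNormalsMesh.add(vertexNormals.get(normalPosition + 2));
        }
    }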
Besides that, some generic OBJ comments that I'm sure you're already aware of and that don't relate to the problem here: you should expect to handle files that don't supply normals, and files that supply neither normals nor texture coordinates (you currently assume both are present), and OBJ faces can have an arbitrary number of vertices, not just three. They are always planar and convex, though, so you can draw them as a fan or break them into triangles as though they were a fan (see the sketch below).
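If you do want to handle larger faces, here is a rough sketch of that fan triangulation. handleFaceVertex is a hypothetical helper, not something from your code: it would do the per-vertex index/texture-coordinate/normal work shown above, checking that the vt and vn parts of the token are actually present before reading them.

    // Fan-triangulate one "f" line: a convex face with vertices p1 p2 ... pn
    // becomes the triangles (p1, p2, p3), (p1, p3, p4), ..., (p1, pn-1, pn).
    private void addFace(String[] linetokens) {
        int vertexCount = linetokens.length - 1;      // tokens after the leading "f"
        for (int v = 2; v < vertexCount; v++) {
            handleFaceVertex(linetokens[1]);          // the fan's centre vertex
            handleFaceVertex(linetokens[v]);
            handleFaceVertex(linetokens[v + 1]);
        }
    }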
Source: https://stackoverflow.com/questions/5585368/problems-using-wavefront-objs-texture-coordinates-in-android-opengl-es