Question
I have an object moving about in a flat 2D coordinate system (0-1000,0-1000).
In a 3D space coordinate system, I have a sphere and a camera looking at the sphere.
I then want to convert these 2D coordinates to 3D world/space coordinates and move the object around the surface of the sphere. So it's basically like converting lat/long into a 3D spatial coordinate.
Currently I'm using setFromSpherical() but maybe I'm using it the wrong way.
Vector3 my3DVector = new Vector3(0,0,0);
my3DVector.setFromSpherical((float)Math.toRadians(azimuthAngle), (float)Math.toRadians(polarAngle));
Now my understanding of the 2 parameters are as follows:
- Azimuth angle is in radians, between 0 and 2π, and is effectively LONGITUDE
- Polar angle is in radians, between 0 and π, and is effectively LATITUDE
As my 2D grid (lat/long) grid size is 1000x1000, I calculate azimuth and polar like this:
float azimuthAngle = (flatPos.X / 1000f) * 360f; // divide by 1000f to avoid integer division
float polarAngle = (flatPos.Y / 1000f) * 180f;
This gives the normalised 2D grid values (azimuth 0-360 degrees, polar 0-180 degrees), which are then converted to radians and passed to setFromSpherical(). This works, and the object moves around the surface of the 3D sphere. However, when the 2D X value passes 1000 it wraps back to 0, the calculated 3D Z value jumps, and the object jumps to the other side of the sphere. Wrapping in the 2D Y value works fine, though.
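To make the problem easy to reproduce, here is a stripped-down, libGDX-free sketch of the whole mapping (grid → degrees → radians → point on a unit sphere). The spherical-to-Cartesian formula is my assumption of what setFromSpherical() does with a y-up axis convention (x = cos(az)·sin(polar), y = cos(polar), z = sin(az)·sin(polar)), not copied from the libGDX source, so the axes may be oriented differently in the real engine:

```java
// Standalone repro of the 2D-grid -> sphere-surface mapping, without libGDX.
// Assumed convention (y-up): x = cos(az)*sin(polar), y = cos(polar), z = sin(az)*sin(polar).
public class GridToSphere {
    static double[] toSphere(double gridX, double gridY) {
        double azimuthDeg = (gridX / 1000.0) * 360.0; // longitude, 0..360
        double polarDeg   = (gridY / 1000.0) * 180.0; // colatitude, 0..180
        double az = Math.toRadians(azimuthDeg);
        double po = Math.toRadians(polarDeg);
        return new double[] {
            Math.cos(az) * Math.sin(po), // x
            Math.cos(po),                // y
            Math.sin(az) * Math.sin(po)  // z
        };
    }

    public static void main(String[] args) {
        // The wrap case: gridX just below 1000 vs gridX = 0, at the equator (gridY = 500).
        double[] a = toSphere(999.0, 500.0);
        double[] b = toSphere(0.0, 500.0);
        System.out.printf("x=999: (%.3f, %.3f, %.3f)%n", a[0], a[1], a[2]);
        System.out.printf("x=0:   (%.3f, %.3f, %.3f)%n", b[0], b[1], b[2]);
    }
}
```

With this standalone version the two points at the wrap boundary come out nearly identical, which is what I would expect mathematically, so I can use it to compare against the values I actually get in the game.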
Any thoughts? Thanks
Source: https://stackoverflow.com/questions/34297871/libgdx-converting-2d-map-coords-to-3d-space-coords