Question
I'm working on code for a software skinner (bone/skin animation), and I'm at the "optimization" phase (the skinner works pretty well and skins a 4,900-triangle mesh with 22 bones in 1.09 ms on a 2 GHz Core Duo notebook). What I need to know is:
1) Can someone show me the way (maybe with pseudocode) to transform a float3 (an array of 3 floats, representing a coordinate) by a float4x3 matrix?
2) Can someone show me the way (maybe with pseudocode) to transform a float3 (an array of 3 floats, representing a normal) by a float3x3 matrix?
I ask because I know that in the skinning process you can avoid using part of the matrix without any change in the animated result (and so recover some computation time). A sketch of what I mean is below.
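For reference, here's the shape of what I'm after. This is only a sketch under assumptions I'd need confirmed: row-major storage and the row-vector convention (v' = v * M), so rows 0-2 of the float4x3 are the rotation/scale basis and row 3 is the translation; a column-major layout would transpose the indexing.

```cpp
// Sketch of both transforms, assuming row-major storage and the
// row-vector convention (v' = v * M): rows 0-2 of the float4x3 hold
// the rotation/scale basis, row 3 holds the translation.
// Note: `out` must not alias `in`.

// 1) Coordinate vs. float4x3 -- the translation row participates.
void transformPoint(const float m[12], const float in[3], float out[3])
{
    out[0] = in[0]*m[0] + in[1]*m[3] + in[2]*m[6] + m[9];
    out[1] = in[0]*m[1] + in[1]*m[4] + in[2]*m[7] + m[10];
    out[2] = in[0]*m[2] + in[1]*m[5] + in[2]*m[8] + m[11];
}

// 2) Normal vs. float3x3 -- a normal is a direction, not a position,
//    so the translation row is the part of the matrix that gets skipped.
void transformNormal(const float m[9], const float in[3], float out[3])
{
    out[0] = in[0]*m[0] + in[1]*m[3] + in[2]*m[6];
    out[1] = in[0]*m[1] + in[1]*m[4] + in[2]*m[7];
    out[2] = in[0]*m[2] + in[1]*m[5] + in[2]*m[8];
}
```

From what I've read, the normal version only stays correct while the bone matrices are rigid (rotation + translation); with non-uniform scale the 3x3 part would have to be replaced by its inverse-transpose and the result renormalized.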
Thanks!
Answer 1:
Optimizing vector/matrix operations via mathematical reduction is possible, but tricky. You can find some information on the topic here, here, and here.
Now, this may not be quite what you're looking for, but...
You can use the machine's GPU (graphics processor) to vastly increase the performance of vector/matrix operations. Many such operations can be sped up by several orders of magnitude by taking advantage of the SIMD processing available on the GPU; the sketch below shows why your skinning workload fits that model.
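The key property is that every skinned vertex is computed independently of every other vertex, so the whole mesh is one big parallel "map". Here's a minimal CPU-side sketch of that structure; the names, the flat data layout, and the fixed four-influences-per-vertex limit are all assumptions for illustration, and the OpenMP pragma stands in for the per-lane parallelism a GPU would provide:

```cpp
#include <cstddef>

// Hypothetical flat layout: 3 floats per vertex position, 4 bone
// influences per vertex, and `palette` = the per-bone float4x3
// matrices (12 floats each, row-major, same convention as above).
void skinAllVertices(std::size_t vertexCount,
                     const float* positions,   // 3 floats per vertex
                     const int*   boneIndex,   // 4 ints per vertex
                     const float* boneWeight,  // 4 floats per vertex
                     const float* palette,     // 12 floats per bone
                     float*       outPositions)
{
    // Each iteration reads and writes only its own vertex, so the
    // loop is an independent "map" -- the property GPGPU exploits.
    #pragma omp parallel for
    for (long i = 0; i < (long)vertexCount; ++i) {
        const float* p = &positions[3*i];
        float acc[3] = {0.0f, 0.0f, 0.0f};
        for (int k = 0; k < 4; ++k) {
            const float w = boneWeight[4*i + k];
            if (w == 0.0f) continue;          // skip unused influences
            const float* m = &palette[12 * boneIndex[4*i + k]];
            acc[0] += w * (p[0]*m[0] + p[1]*m[3] + p[2]*m[6] + m[9]);
            acc[1] += w * (p[0]*m[1] + p[1]*m[4] + p[2]*m[7] + m[10]);
            acc[2] += w * (p[0]*m[2] + p[1]*m[5] + p[2]*m[8] + m[11]);
        }
        outPositions[3*i + 0] = acc[0];
        outPositions[3*i + 1] = acc[1];
        outPositions[3*i + 2] = acc[2];
    }
}
```

Compile with OpenMP enabled (e.g. -fopenmp for GCC); without it the pragma is ignored and the loop runs serially, which makes a convenient baseline for measuring the speedup.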
There are two reasonably good GPGPU libraries available to C# developers:
- Microsoft's Accelerator Library, documentation available here.
- Brahma - an open source GPU library for C# developers that leverages LINQ.
Source: https://stackoverflow.com/questions/2104718/vector-transformation-with-matrix