How do modern x86 processors actually compute multiplication?
Question: I was watching a lecture on algorithms, and the professor used multiplication as an example of how naive algorithms can be improved... It made me realize that multiplication is not that obvious: when I am coding I just treat it as a simple atomic operation, but multiplication actually requires an algorithm to run; it does not work like adding numbers. So I wonder: what algorithm do modern desktop processors actually use? I guess they don't rely on logarithm tables, and don't run loops of repeated additions.
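For context, here is a minimal sketch of the naive shift-and-add method the question is contrasting against: the binary "schoolbook" algorithm, looping once per bit rather than once per addend. This is only an illustration of the software-level naive approach, not a claim about what the hardware multiplier does.

```c
#include <stdint.h>
#include <stdio.h>

/* Naive shift-and-add multiplication: for each set bit i of b,
 * add (a << i) to the result. Takes O(bit-width) iterations --
 * far slower than the pipelined multipliers in real x86 hardware. */
static uint64_t shift_add_mul(uint32_t a, uint32_t b) {
    uint64_t result = 0;
    uint64_t shifted = a;      /* holds a << i, updated incrementally */
    while (b != 0) {
        if (b & 1)             /* current low bit of b is set */
            result += shifted;
        shifted <<= 1;
        b >>= 1;
    }
    return result;
}

int main(void) {
    /* 1234 * 5678 = 7006652 */
    printf("%llu\n", (unsigned long long)shift_add_mul(1234, 5678));
    return 0;
}
```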