Why is squaring a number faster than multiplying two random numbers?


Multiplying two n-bit binary numbers takes O(n²) time with the schoolbook method, yet squaring a number can somehow be done more efficiently (with n being the number of bits). How can that be?

Or is it?

14 answers
  • 2021-01-30 07:43

    Do you mean multiplying a number by a power of 2? That is usually quicker than multiplying two arbitrary numbers, since the result can be calculated by a simple bit shift. However, bear in mind that modern microprocessors dedicate a lot of brute-force silicon to these types of calculations, and most arithmetic is performed at blinding speed compared to older microprocessors.
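
    A one-line illustration of that shift identity (a minimal Python sketch):

    x = 37
    assert x * 8 == x << 3  # 8 == 2**3, so multiplying by 8 is a left shift by 3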

  • 2021-01-30 07:43

    If you assume a fixed length equal to the machine's word size, and that the number to be squared is already in memory, a squaring operation requires only one load from memory rather than two, so it could be faster.

    For arbitrary-length integers, multiplication is typically O(N²), but there are algorithms which reduce this for large integers.

    If you assume the simple O(N²) approach to multiplying a by b, then for each bit in a you have to shift b and add it to an accumulator if that bit is one. Over all N bits of a this is on the order of N shifts and N additions, each on an accumulator of O(N) bits, which is where the O(N²) bit-operation count comes from.
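
    A minimal Python sketch of that shift-and-add loop (the function name is illustrative, not from the answer):

    def mul_shift_add(a, b):
        acc = 0
        while a:
            if a & 1:       # if the low bit of a is set,
                acc += b    # add the current shift of b
            a >>= 1         # move to the next bit of a
            b <<= 1         # shift b one place left
        return acc

    assert mul_shift_add(6, 7) == 42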

    Note that

    (x - y)² = x² - 2xy + y²
    

    Hence

    x² = (x - y)² + 2xy - y²
    

    If each y is the largest power of two not greater than x, this reduces the problem to a smaller square, two shifts and two additions. As N shrinks on each iteration you may get an efficiency gain (the symmetry means you visit each point in a triangle rather than a rectangle), but it's still O(N²).

    There may be another better symmetry to exploit.
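
    A minimal recursive sketch of the reduction above, in Python (assuming non-negative integers; the function name is illustrative):

    def square(x):
        # x² = (x - y)² + 2xy - y² with y = 2**k, the top set bit of x,
        # so 2xy and y² are just shifts
        if x < 2:
            return x        # 0 and 1 are their own squares
        k = x.bit_length() - 1
        return square(x - (1 << k)) + (x << (k + 1)) - (1 << (2 * k))

    assert square(66) == 4356

    Each step strips the top bit, so there are at most N recursive calls of two shifts and two additions each, matching the triangle-versus-rectangle picture.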

  • 2021-01-30 07:44

    a^2 = (a+b)*(a-b) + b^2, e.g. 66^2 = (66+6)(66-6) + 6^2 = 72*60 + 36 = 4356

    For higher powers, just square repeatedly:

    66^4 = 4356^2
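
    The same identity in code (a minimal sketch; the name and the choice of b are illustrative):

    def square_via_difference(a, b):
        # a^2 = (a + b) * (a - b) + b^2; pick b so a + b or a - b is a round number
        return (a + b) * (a - b) + b * b

    assert square_via_difference(66, 6) == 4356   # 72 * 60 + 36
    assert square_via_difference(97, 3) == 9409   # 100 * 94 + 9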

  • 2021-01-30 07:46

    First of all great question! I wish there were more questions like this.

    So it turns out that the method I came up with is O(n log n) for general multiplication, counting arithmetic operations only. You can represent any numbers X and Y as

    X = x_{n-1} 2^{n-1} + ... + x_1 2^1 + x_0 2^0
    Y = y_{m-1} 2^{m-1} + ... + y_1 2^1 + y_0 2^0
    

    where

    x_i, y_i \in {0,1}
    

    then

    XY = sum_{k=0}^{m+n} r_k 2^k
    

    where

    r_k = sum_{i=0}^{k} x_i y_{k-i}
    

    which is just a straightforward application of the FFT to find the values of r_k for every k in O((n+m) log(n+m)) time.

    Then for each r_k you must determine how big the overflow is and add it up accordingly. For squaring a number this means O(n log n) arithmetic operations.

    You can add up the r_k values more efficiently using the Schönhage–Strassen algorithm to obtain an O(n log n log log n) bit-operation bound.
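
    A rough Python/NumPy sketch of the convolution step (floating-point FFTs round, so this is only trustworthy for modest sizes; exact implementations use number-theoretic transforms):

    import numpy as np

    def multiply_via_fft(x, y):
        a = [int(c) for c in bin(x)[:1:-1]]   # little-endian bits of x
        b = [int(c) for c in bin(y)[:1:-1]]   # little-endian bits of y
        size = 1 << (len(a) + len(b) - 1).bit_length()   # pad so nothing wraps around
        r = np.rint(np.fft.irfft(np.fft.rfft(a, size) * np.fft.rfft(b, size), size))
        # r[k] = sum_i x_i y_{k-i}; now resolve the overflow in each r_k
        result, carry = 0, 0
        for k, rk in enumerate(r.astype(np.int64)):
            total = int(rk) + carry
            result |= (total & 1) << k   # keep one bit at position k
            carry = total >> 1           # push the rest upward
        return result + (carry << len(r))

    assert multiply_via_fft(66, 66) == 4356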

    The exact answer to your question is already posted by Eric Bainville.

    However, you can get a much better bound than O(n^2) for squaring a number simply because there exist much better bounds for multiplying integers!

  • 2021-01-30 07:46

    The square root of 2^n is 2^(n/2), i.e. 2^(n >> 1), so if your number is a power of two everything is totally simple once you know the exponent. Multiplying is even simpler: 2^4 * 2^8 is 2^(4+8). There's no sense in the statement you've made.

  • 2021-01-30 07:47

    Suppose you want to expand out the multiplication (a+b)×(c+d). It splits up into four individual multiplications: a×c + a×d + b×c + b×d.

    But if you want to expand out (a+b)², then it only needs three multiplications (and a doubling): a² + 2ab + b².

    (Note also that two of the multiplications are themselves squares.)

    Hopefully this begins to give some insight into the speedups that are possible when computing a square rather than a general multiplication.
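
    That observation is what divide-and-conquer squaring exploits; a minimal Python sketch (the name and cutoff are illustrative):

    def rec_square(x, cutoff=1 << 32):
        # x = hi * 2^k + lo, so x² = hi² * 2^(2k) + 2 * hi * lo * 2^k + lo²:
        # three products, and two of them are themselves smaller squares
        if x < cutoff:
            return x * x   # small enough for a single hardware multiply
        k = x.bit_length() // 2
        hi, lo = x >> k, x & ((1 << k) - 1)
        return (rec_square(hi) << (2 * k)) + ((hi * lo) << (k + 1)) + rec_square(lo)

    assert rec_square(12345678901234567890) == 12345678901234567890 ** 2

    The cross term hi * lo is still a general multiplication, so this particular split only buys a constant factor; the point is simply that squaring lets two of the three sub-products be squares.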
