How to calculate MIPS for an algorithm on an ARM processor

抹茶落季 2021-02-01 10:23

I have recently been asked to produce a MIPS (millions of instructions per second) figure for an algorithm we have developed. The algorithm is exposed by a set of C-style functions.

7 Answers
  •  温柔的废话
    2021-02-01 10:48

    I'll bet that your hardware vendor is asking how many MIPS you need.

    As in "Do you need a 1,000 MIPS processor or a 2,000 MIPS processor?"

    Which gets translated by management into "How many MIPS?"

    Hardware offers MIPS. Software consumes MIPS.

    You have two degrees of freedom.

    • The processor's inherent MIPS offering.

    • The number of seconds during which you consume that many MIPS.

    If the processor doesn't have enough MIPS, your algorithm will be "slow".

    If the processor has enough MIPS, your algorithm will be "fast".

    I put "fast" and "slow" in quotes because you need to have a performance requirement to determine "fast enough to meet the performance requirement" or "too slow to meet the performance requirement."

    On a 2,000 MIPS processor, you might take an acceptable 2 seconds. But on a 1,000 MIPS processor this explodes to an unacceptable 4 seconds.
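
    The total workload is fixed, so rate * time stays constant: 2,000 MIPS * 2 s = 4,000 million instructions, and 4,000 million instructions / 1,000 MIPS = 4 s. Here is a minimal C sketch of that arithmetic, using the made-up numbers from this example:

        #include <stdio.h>

        int main(void)
        {
            double fast_mips = 2000.0;  /* rated MIPS of the faster processor */
            double slow_mips = 1000.0;  /* rated MIPS of the slower processor */
            double fast_time = 2.0;     /* seconds measured on the faster one */

            /* The workload never changes: rate * time = instructions. */
            double million_instructions = fast_mips * fast_time;   /* 4000 */

            /* The same workload at half the rate takes twice the time. */
            double slow_time = million_instructions / slow_mips;   /* 4 s */

            printf("Workload: %.0f million instructions\n", million_instructions);
            printf("Time at %.0f MIPS: %.1f s\n", slow_mips, slow_time);
            return 0;
        }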


    How many MIPS do you need?

    1. Get the official MIPS for your processor. See http://en.wikipedia.org/wiki/Instructions_per_second

    2. Run your algorithm on some data.

    3. Measure the exact run time. Average a bunch of samples to reduce uncertainty. (A timing sketch follows below.)

    4. Report. 3 seconds on a 750 MIPS processor is -- well -- 3 seconds at 750 MIPS. MIPS is a rate. Time is time. Distance is the product of rate * time. 3 seconds at 750 MIPS is 750*3 million instructions.

    Remember Rate (in Instructions per second) * Time (in seconds) gives you Instructions.

    Don't say that it's 3*750 MIPS. It isn't; it's 2250 Million Instructions.
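
    Putting steps 2-4 together, here is a minimal sketch of how you might measure and report this, assuming a POSIX system with clock_gettime and a 750 MIPS rating taken from the data sheet in step 1. The run_algorithm function is a hypothetical stand-in for your own C-style API:

        #include <stdio.h>
        #include <time.h>

        /* Stand-in for the algorithm under test; replace this with a call
         * into your own C-style functions.  The volatile accumulator keeps
         * the compiler from optimising the dummy work away. */
        static void run_algorithm(void)
        {
            volatile double acc = 0.0;
            for (long i = 0; i < 5000000L; i++)
                acc += (double)i * 0.5;
        }

        #define SAMPLES     10       /* average several runs to reduce uncertainty */
        #define RATED_MIPS  750.0    /* official rating from step 1 (data sheet)   */

        int main(void)
        {
            struct timespec start, end;
            double total_seconds = 0.0;

            for (int i = 0; i < SAMPLES; i++) {
                clock_gettime(CLOCK_MONOTONIC, &start);
                run_algorithm();                      /* step 2: run on some data */
                clock_gettime(CLOCK_MONOTONIC, &end);

                total_seconds += (double)(end.tv_sec - start.tv_sec)
                               + (double)(end.tv_nsec - start.tv_nsec) / 1e9;
            }

            double avg_seconds = total_seconds / SAMPLES;   /* step 3 */

            /* Step 4: rate * time = work.  Report seconds at the rated MIPS
             * and the implied instruction count, not a bigger "MIPS" number. */
            printf("Average run time: %.3f s at %.0f MIPS\n", avg_seconds, RATED_MIPS);
            printf("That is roughly %.0f million instructions\n",
                   RATED_MIPS * avg_seconds);
            return 0;
        }

    Compile with something like gcc -O2 measure.c -o measure and run it on the target board; on older glibc you may also need -lrt for clock_gettime.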
