Integer calculations on GPU

Backend · unresolved · 1 answer · 1016 views
名媛妹妹 2020-12-17 09:57

For my work it's particularly interesting to do integer calculations, which obviously are not what GPUs were made for. My question is: do modern GPUs support efficient integer arithmetic?

1 Answer
  • 2020-12-17 10:37

    First, you need to consider the hardware you're using: GPU performance differs widely from one manufacturer to another.
    Second, it also depends on the operation considered: additions, for example, may be faster than multiplications.

    In my case, I'm only using NVIDIA devices. For this kind of hardware, the official documentation states equivalent throughput for 32-bit integers and 32-bit single-precision floats on the newer architecture (Fermi). The previous architecture (Tesla) offered equivalent performance for 32-bit integers and floats only for additions and logical operations.

    But once again, this may not be true depending on the device and instructions you use.
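    If you want to check this on your own device rather than trust the documentation, a small microbenchmark is one way to do it. The sketch below (a rough illustration, not a rigorous benchmark: kernel names, grid sizes, and iteration counts are arbitrary choices, and the compiler may partially simplify such loops) times a loop of 32-bit integer adds against the same loop of 32-bit float adds using CUDA events:

    ```cuda
    // Sketch: compare 32-bit integer vs. float add throughput on one GPU.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void int_adds(int *out, int n, int iters) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        int x = i;
        for (int k = 0; k < iters; ++k)
            x = x + k;              // 32-bit integer additions
        out[i] = x;                 // write back so the loop isn't dead code
    }

    __global__ void float_adds(float *out, int n, int iters) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        float x = (float)i;
        for (int k = 0; k < iters; ++k)
            x = x + (float)k;       // 32-bit float additions
        out[i] = x;
    }

    int main() {
        const int n = 1 << 20, iters = 1000;
        int *di; float *df;
        cudaMalloc(&di, n * sizeof(int));
        cudaMalloc(&df, n * sizeof(float));

        cudaEvent_t t0, t1; float ms;
        cudaEventCreate(&t0); cudaEventCreate(&t1);

        cudaEventRecord(t0);
        int_adds<<<n / 256, 256>>>(di, n, iters);
        cudaEventRecord(t1); cudaEventSynchronize(t1);
        cudaEventElapsedTime(&ms, t0, t1);
        printf("int adds:   %.3f ms\n", ms);

        cudaEventRecord(t0);
        float_adds<<<n / 256, 256>>>(df, n, iters);
        cudaEventRecord(t1); cudaEventSynchronize(t1);
        cudaEventElapsedTime(&ms, t0, t1);
        printf("float adds: %.3f ms\n", ms);

        cudaFree(di); cudaFree(df);
        return 0;
    }
    ```

    On Fermi-class or newer hardware you would expect the two timings to be close; to probe multiplies instead, swap the `+` for `*` in both kernels (older Tesla-class parts used a 24-bit integer multiplier, which is where a gap would show up).
    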
