Matching CRC32 from STM32F0 and zlib

隐瞒了意图╮ 2021-02-10 17:53

I'm working on a communication link between a computer running Linux and an STM32F0. I want to use some kind of error detection for my packets, and since the STM32F0 has CRC32 hardware, I'd like to use it. The problem is getting its output to match the CRC32 that zlib computes on the Linux side.
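
For reference, a minimal sketch of what the Linux side computes with zlib's crc32() (the "123456789" check string below is just an illustration, not part of the original packets; build with -lz):

    #include <stdio.h>
    #include <zlib.h>

    int main(void)
    {
        const unsigned char msg[] = "123456789";        /* example data only */
        uLong crc = crc32(0L, Z_NULL, 0);               /* initial CRC value */
        crc = crc32(crc, msg, sizeof msg - 1);          /* feed packet bytes */
        printf("crc32 = %08lx\n", (unsigned long)crc);  /* prints cbf43926   */
        return 0;
    }

This is the value the STM32 hardware has to reproduce for the same bytes.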

5 Answers
  •  花落未央
    2021-02-10 18:21

    From the documentation, it appears that your STM32 code is incomplete. In order to use the CRC hardware you need to:

    1. Enable the CRC peripheral clock via the RCC peripheral.
    2. Set the CRC Data Register to the initial CRC value by configuring the Initial CRC value register (CRC_INIT).
    3. Set the I/O reverse bit order through the REV_IN[1:0] and REV_OUT bits respectively in the CRC Control register (CRC_CR).
    4. Set the polynomial size and coefficients through the POLYSIZE[1:0] bits in the CRC Control register (CRC_CR) and the CRC Polynomial register (CRC_POL) respectively.
    5. Reset the CRC peripheral through the Reset bit in CRC Control register (CRC_CR).
    6. Set the data to the CRC Data register.
    7. Read the content of the CRC Data register.
    8. Disable the CRC peripheral clock.

    Note in particular steps 2, 3, and 4, which define the CRC being computed. The documentation's example leaves REV_IN and REV_OUT false, but for the zlib CRC they need to be true. Depending on how the hardware is implemented, the polynomial may need to be reversed as well (0xedb88320UL). The initial CRC needs to be 0xffffffff, and the final CRC must be inverted to match the zlib CRC.
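
    Putting those steps together, here is a minimal sketch of a zlib-compatible CRC on the F0. It assumes the CMSIS device header of a part with a programmable polynomial (e.g. STM32F07x) and its register/bit names (RCC->AHBENR, CRC->INIT, CRC->POL, CRC->CR, CRC->DR); it keeps the polynomial in normal form (0x04C11DB7) and lets REV_IN/REV_OUT do the reversal, which is one configuration reported to reproduce the zlib CRC. If your part behaves differently, try the reversed polynomial 0xedb88320 instead.

        #include <stdint.h>
        #include <stddef.h>
        #include "stm32f0xx.h"                    /* CMSIS device header (assumed) */

        uint32_t crc32_zlib_hw(const uint8_t *buf, size_t len)
        {
            RCC->AHBENR |= RCC_AHBENR_CRCEN;      /* 1. enable the CRC clock       */

            CRC->INIT = 0xFFFFFFFFu;              /* 2. initial CRC value          */
            CRC->POL  = 0x04C11DB7u;              /* 4. normal-form polynomial     */
            CRC->CR   = CRC_CR_REV_OUT            /* 3. bit-reverse the output,    */
                      | CRC_CR_REV_IN_0           /*    bit-reverse input by byte  */
                      | CRC_CR_RESET;             /* 5. reload INIT into DR        */

            /* 6. feed the packet one byte at a time (byte-wide DR access) */
            for (size_t i = 0; i < len; i++)
                *(volatile uint8_t *)&CRC->DR = buf[i];

            uint32_t crc = ~CRC->DR;              /* 7. read the result, inverted  */

            RCC->AHBENR &= ~RCC_AHBENR_CRCEN;     /* 8. disable the CRC clock      */
            return crc;
        }

    Feeding the check string "123456789" through this should give the same 0xcbf43926 that zlib's crc32() produces on the Linux side.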
