Question
When trying to correctly reverse-engineer and decode data on a UART connection, I have arrived at the following conclusions about the format of the data being received.
- Data is sent in "packets". Each packet is delimited only by idle time (gaps) between transmissions.
- Packets are of variable length. The length is specified by the third byte in the sequence.
- Data is not framed using any special characters or out-of-band signals, but a packet can be assumed to be valid based on its final byte, which is a checksum of the frame.
When using a logic analyzer, it is easy to discern packets. However, feeding the data to a program via the UART makes delimiting packets impossible: all received data is enqueued by the operating system. While handlers can be registered to fire on data-received events, this does not guarantee that the data available in the OS's UART queue will be a whole packet.
Are there any best practices for separating such data?
Addendum:
My current solution (which has huge overhead and a large error rate):
Starting from the first byte in the queue, try to parse a frame. If the size specified in the frame is larger than 0x20 (there are no packets larger than 32 bytes, header and checksum included), then the current "start byte" is considered invalid and dropped, and recognition continues from the next byte, and so on.
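In code, the resynchronisation loop I have in mind looks roughly like this (plain C sketch; the header layout and the checksum are my assumptions based on the observations above, and the length byte is assumed to count the whole packet, header and checksum included):

```c
#include <stdint.h>
#include <stddef.h>

#define MAX_PACKET_LEN 0x20   /* no packet is larger than 32 bytes */
#define HEADER_LEN     3      /* assumed: address, type, length    */

/* Hypothetical checksum: simple 8-bit sum over all bytes before the
 * checksum byte. Replace with whatever the real protocol uses. */
static uint8_t checksum(const uint8_t *p, size_t n)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += p[i];
    return sum;
}

/* Try to parse one frame starting at buf[0].
 * Returns the frame length if a valid frame is found,
 * 0 if more data is needed, or -1 if buf[0] cannot start a frame
 * (the caller should drop one byte and try again). */
static int try_parse(const uint8_t *buf, size_t avail)
{
    if (avail < HEADER_LEN)
        return 0;                      /* not enough data yet */

    uint8_t len = buf[2];              /* third byte is the packet length */
    if (len > MAX_PACKET_LEN || len < HEADER_LEN + 1)
        return -1;                     /* implausible length: resync */

    if (avail < len)
        return 0;                      /* wait for the rest of the frame */

    if (checksum(buf, len - 1) != buf[len - 1])
        return -1;                     /* bad checksum: resync */

    return len;                        /* valid frame of 'len' bytes */
}
```

The caller keeps appending received bytes to a buffer, consumes `len` bytes on success, drops a single byte when -1 is returned, and simply waits for more data when 0 is returned.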
The other solution I am working on is using a microcontroller to parse the data and frame it properly, either in-band or out-of-band. This is a better solution, as such a time-sensitive protocol really calls for an RTOS. But still, there must be a way to implement this on a normal OS.
Logic Analyzer:
(The first and second bytes are NOT constant. I have deduced that the first byte is an address (or maybe a timeslot) and the second byte is a packet type.)

Answer 1:
> The other solution I am working on is using a microcontroller
(Modbus is a serial protocol that seems to also rely on idle time to delimit message frames.)
The best way to detect such gaps is to use a USART/UART that can measure this with hardware at the actual receiver input. Any software solution is likely to be prone to inaccuracy and false events due to latency.
The USARTs in Atmel ARM (and maybe AVR32) SoCs have a "receiver time-out" feature. Each received character restarts this timer. An interrupt can be generated after a specified interval (i.e. when no more characters are received for a while). This timeout could be treated as an End-of-Message event.
From an Atmel datasheet:
> The Receiver Time-out provides support in handling variable-length frames. This feature detects an idle condition on the RXD line. When a time-out is detected, the bit TIMEOUT in the Channel Status Register (US_CSR) rises and can generate an interrupt, thus indicating to the driver an end of frame.
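On such a part the setup might look roughly like this (a sketch only, using the register and bit names from Atmel's SAM CMSIS/ASF headers; the time-out value and USART instance are assumptions, and exact names may vary between parts):

```c
#include <stdint.h>
#include "sam.h"   /* Atmel CMSIS device header (assumed) */

/* The time-out is counted in bit periods; a gap of roughly four character
 * times (about 40 bit periods at 8N1) is an assumed threshold. */
#define RX_TIMEOUT_BITS 40

void usart_rx_timeout_init(void)
{
    USART0->US_RTOR = RX_TIMEOUT_BITS;        /* receiver time-out value        */
    USART0->US_IER  = US_IER_TIMEOUT          /* interrupt on idle time-out     */
                    | US_IER_RXRDY;           /* ...and on each received byte   */
    USART0->US_CR   = US_CR_STTTO;            /* arm: wait for a char, then count */
    NVIC_EnableIRQ(USART0_IRQn);
}

void USART0_Handler(void)
{
    uint32_t status = USART0->US_CSR;

    if (status & US_CSR_RXRDY) {
        uint8_t c = (uint8_t)USART0->US_RHR;  /* each received char restarts the timer */
        /* append c to the current frame buffer ... */
        (void)c;
    }

    if (status & US_CSR_TIMEOUT) {
        /* RXD has been idle: treat the buffered bytes as a complete frame */
        /* hand the buffer to the packet parser here ... */
        USART0->US_CR = US_CR_STTTO;          /* clear TIMEOUT and re-arm       */
    }
}
```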
ADDENDUM
A possible software solution would require a (high-resolution) periodic timer that the U(S)ART driver would use to count time intervals between received characters. Using PIO instead of DMA, the driver would have to reset the count_of_intervals as each character is received. When the count exceeds a threshold (i.e. count * interval_time > inter_message_gap_time), the receiver has been silent too long, indicating an inter-message gap.
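A rough sketch of that idea (plain C with hypothetical names and timing values; the tick handler is assumed to be driven by a periodic hardware timer and the receive handler by the UART interrupt):

```c
#include <stdint.h>
#include <stdbool.h>

#define INTERVAL_US       100    /* period of the timer tick (assumed)        */
#define INTER_MSG_GAP_US  2000   /* silence that marks an end of message      */
#define GAP_THRESHOLD     (INTER_MSG_GAP_US / INTERVAL_US)

static volatile uint32_t idle_intervals = 0;
static volatile bool     end_of_message = false;

/* Called from the UART receive interrupt for every character (PIO, not DMA). */
void on_char_received(uint8_t c)
{
    idle_intervals = 0;              /* receiver is not silent: restart count */
    /* append c to the current frame buffer ... */
    (void)c;
}

/* Called from the periodic timer interrupt every INTERVAL_US microseconds. */
void on_timer_tick(void)
{
    if (idle_intervals < GAP_THRESHOLD) {
        if (++idle_intervals == GAP_THRESHOLD)
            end_of_message = true;   /* count * interval_time exceeded the gap */
    }
}
```

The main loop (or a worker thread) would poll end_of_message, hand the buffered bytes to the frame parser, and clear the flag before the next message starts.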
Source: https://stackoverflow.com/questions/27152926/parsing-time-delimited-uart-data