Please note that this is not homework, and I did search before starting this new thread. I found Store an int in a char array? while looking for an answer, but didn't get a clear one.
1) It sort of works, since you use an array of characters for the transport, but I would personally use a binary protocol. If you can use all 4 bytes of your variable, take a look at the htonl/ntohl functions (they are available on virtually every Unix, and on Windows since Win2k); a sketch is shown after the decode example below. Otherwise, see below.
2) With a binary protocol, the encoding would be
uint32_t a = 0xff00ffaa;
char byte_array[1024]; // this is the array to be transferred over the network
// leave byte_array[0] for version byte
// drop the high-order byte of a, since you only want the 3 lowest bytes
byte_array[1] = (char)((a & 0x00FF0000)>>16);
byte_array[2] = (char)((a & 0x0000FF00)>>8);
byte_array[3] = (char)(a & 0x000000FF);
and decoding would be
uint32_t a = 0;
a |= (uint32_t)(unsigned char)byte_array[1] << 16; // cast to unsigned char first to avoid sign extension
a |= (uint32_t)(unsigned char)byte_array[2] << 8;
a |= (uint32_t)(unsigned char)byte_array[3];
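For option 1) above, here is a minimal sketch of the htonl/ntohl approach (this assumes you can spare all 4 bytes; the header is <arpa/inet.h> on POSIX systems and <winsock2.h> on Windows):
#include <arpa/inet.h> // htonl/ntohl
#include <stdint.h>
#include <string.h>
uint32_t a = 0xff00ffaa;
uint32_t wire = htonl(a); // convert to network (big-endian) byte order
char byte_array[1024];
memcpy(&byte_array[1], &wire, sizeof wire); // byte_array[0] stays free for the version byte
// receiver side
uint32_t received;
memcpy(&received, &byte_array[1], sizeof received);
uint32_t value = ntohl(received); // back to host byte order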
Maybe you need to make this work with an existing protocol, in which case, ignore my answer.
Rather than reinvent the wheel here, why don't you use Google's Protocol Buffers library to do this job? Simpler, more flexible and very efficient.
What you're doing will sort of work. You're not transferring the bytes of the data - you're transferring the numeric value of the data as text. As a result, a buffer of size 5 is far too small for the data you're sending (0xFF00FFAA has a numeric value of 4278255530 - 10 characters as decimal text).
To transfer the bytes you need to do something like the following (this puts the bytes on the wire in little-endian order):
Encode:
char array[1024]; // outgoing network data
int next = 0;
array[next++] = value & 0xFF;
array[next++] = (value >> 8) & 0xFF;
array[next++] = (value >> 16) & 0xFF;
array[next++] = (value >> 24) & 0xFF;
These statements strip off the bytes of the value and assign them to successive values in your array.
Decode:
char array[1024]; // incoming network data
int next = 0;
value = 0;
value |= (unsigned)((unsigned char*)array)[next++]; // read each byte as unsigned char so it isn't sign-extended
value |= (unsigned)((unsigned char*)array)[next++] << 8;
value |= (unsigned)((unsigned char*)array)[next++] << 16;
value |= (unsigned)((unsigned char*)array)[next++] << 24;
These statements pull the bytes out of the array and push them back into the value.
If you want to optimize your network format and not transfer all of the bytes, you can eliminate some of the data. But remember that your sender and receiver need to agree on what to expect, so the type or length of each data element being passed has to be communicated somehow.
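As an illustration of that last point, here is a minimal sketch (my own, with a made-up tag value) that prefixes the field with a type byte and a length byte so the receiver knows what follows:
unsigned char buf[1024]; // outgoing network data
int next = 0;
unsigned int value = 0xFF00FFAA;
buf[next++] = 0x01; // hypothetical tag meaning "unsigned 32-bit integer"
buf[next++] = 4; // length in bytes of the field that follows
buf[next++] = value & 0xFF; // payload, same little-endian order as above
buf[next++] = (value >> 8) & 0xFF;
buf[next++] = (value >> 16) & 0xFF;
buf[next++] = (value >> 24) & 0xFF;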
It's probably best to use some existing tool. If you can't: do you care about endianness (i.e. is this a cross-platform protocol)?
Otherwise, you can simply do something like...
unsigned char msg[1024];
int writeIndex = 0;
[...]
int mynum = 12345;
memcpy(msg + writeIndex, &mynum, sizeof mynum);
writeIndex += sizeof mynum;
and to decode
//[...] also declare readIndex;
memcpy(&mynum, msg + readIndex, sizeof mynum);
readIndex += sizeof mynum;
(you could replace the notion of msg + index with an unsigned char pointer, though this is unlikely to matter).
Using memcpy like this is liable to be slower, but also more readable. If necessary, you could implement a memcpy clone as a #define or inline function - it's just a short loop of assignments, after all.
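For example, a minimal sketch of such an inline helper (copy_bytes is a name I made up; it is just a byte-copy loop and assumes C99 or later):
#include <stddef.h>
static inline void copy_bytes(unsigned char *dst, const void *src, size_t n)
{
    const unsigned char *s = src;
    for (size_t i = 0; i < n; ++i) // copy n bytes, one at a time
        dst[i] = s[i];
}
// usage, mirroring the decode above
copy_bytes((unsigned char *)&mynum, msg + readIndex, sizeof mynum);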
You're storing as ASCII, where you should be storing the bytes themselves.
The encoding should be something like:
uint32_t a = 0xff00ffaa;
unsigned char byte_array[1024];
Notice how I made your target array unsigned, to indicate that it's "raw bytes", and not actually characters.
byte_array[0] = a >> 24;
byte_array[1] = a >> 16;
byte_array[2] = a >> 8;
byte_array[3] = a >> 0;
This serializes the variable a into the first four bytes of byte_array using big-endian byte ordering, which is the de facto default for many network protocols.
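The matching decode is not shown in the answer above, but it is simply the reverse; because byte_array is unsigned char there is no sign extension to worry about:
uint32_t b = ((uint32_t)byte_array[0] << 24) |
             ((uint32_t)byte_array[1] << 16) |
             ((uint32_t)byte_array[2] << 8)  |
              (uint32_t)byte_array[3];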
You may also want to see my answer here: question 1577161.
What you have will not work in the manner in which you have it. For example, a is 32-bit and in your example it has the high-order bits set, which means it cannot fit into a 4-digit number in your printf statement (0xff00ffaa = 4278255530, which is more than 4 digits). It will most likely overflow the buffer; exactly what happens depends on how your compiler's C library implements printf when there is not enough buffer space.
For the printf statement you have, the maximum value you could pass in would be 9999 for 4 characters. Likewise, in your example of transferring the data with the 3-byte length field, you would have a maximum length of 999. In theory your length could reach 1000 if you added 1 to it, but the buffer you have declared is 1024 bytes, while the maximum buffer length you would actually need is 1004 bytes.
Using ASCII characters does make messages/data portable across systems, but at the expense of extra bandwidth/space and the programming time and effort needed to convert the data back and forth from ASCII.
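To make that size cost concrete, here is a small sketch of the ASCII round trip (my illustration, using the standard snprintf/strtoul functions):
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
uint32_t a = 0xff00ffaa; // 4278255530: ten decimal digits
char text[16];
int needed = snprintf(text, sizeof text, "%u", (unsigned)a);
// needed is 10 here, so a 4- or 5-byte buffer would have truncated the value
uint32_t back = (uint32_t)strtoul(text, NULL, 10); // receiver converts it back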
It seems like you have a good idea, but it still needs a bit of work.