For your screen of 1920 x 1080 with 4-byte color, you are looking at approximately 8 MB per frame. At 20 FPS that is roughly 160 MB/s, so getting from 8 MB per frame down to 200 KB (4 MB/s at 20 FPS) is a great improvement.
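For concreteness, here is that arithmetic as a quick Python sketch (the numbers are the ones from your question):

```python
# Back-of-the-envelope bandwidth math for raw vs. compressed frames.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4            # 32-bit color
FPS = 20

raw_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL   # 8,294,400 bytes ~ 8.3 MB
raw_rate = raw_frame * FPS                     # ~166 MB/s (the ~160 MB/s above)

target_frame = 200 * 1024                      # target: 200 KB per frame
target_rate = target_frame * FPS               # ~4 MB/s

print(f"raw:    {raw_frame / 1e6:.1f} MB/frame, {raw_rate / 1e6:.0f} MB/s")
print(f"target: {target_frame / 1e6:.1f} MB/frame, {target_rate / 1e6:.1f} MB/s")
```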
I would like to draw your attention to certain aspects that I am not sure you are focusing on; hopefully this helps.
- The more you compress your screen image, the more processing it might need
- You actually need to focus on compression mechanisms designed for a series of continuously changing images, i.e. video codecs (sans audio though), for example H.264 (see the encoder sketch after this list).
- Remember, you need to use some kind of real-time protocol for transferring your data. The idea behind that is, if one of your frames makes it to the destination machine with a lag, you might as well drop the next few frames to play catch-up (see the scheduler sketch after this list). Otherwise you will be in a perennially lagging situation, which I doubt the users are going to enjoy.
- You can always sacrifice quality for performance. The simplest such mechanism that you see in similar technologies (like MS Remote Desktop, VNC, etc.) is to send an 8-bit color (2 bits each for A, R, G and B) instead of the 4-byte color you are using (see the packing sketch after this list).
- Another way to improve your situation would be to focus on a specific rectangle of the screen that you want to stream, instead of streaming the whole desktop. This will reduce the size of the frame itself.
- Another way would be to scale your screen image down to a smaller image before transmitting and then scale it back up before displaying (see the crop-and-scale sketch after this list, which combines this and the previous item).
- After sending the initial screen, you can always send the diff between `newpixels` and `previouspixels`. Needless to say, the original screen and the diff screens should all be LZ4 compressed/decompressed (see the diff sketch after this list). Every so often you should send the full frame instead of the diff, especially if you use some lossy algorithm to compress the diffs.
- Does `UpdatedRegions` have overlapping areas? Can that be optimized to not send duplicate pixel information (see the merge sketch after this list)?
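The sketches below are in Python purely for illustration; the same ideas carry over to whatever language your capture code is written in. For the codec point, one low-effort way to get H.264 without coding against a codec API directly is to pipe raw frames into an ffmpeg subprocess (this assumes ffmpeg is on the PATH and that your captured frames arrive as raw BGRA bytes):

```python
import subprocess

WIDTH, HEIGHT, FPS = 1920, 1080, 20

# Spawn ffmpeg to encode raw BGRA frames from stdin into an H.264
# elementary stream on stdout. ultrafast/zerolatency trade compression
# ratio for low encode latency, which suits screen streaming.
encoder = subprocess.Popen(
    [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "bgra",
        "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS),
        "-i", "-",
        "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
        "-f", "h264", "-",
    ],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
)

def encode_frame(raw_bgra: bytes) -> None:
    """Feed one captured frame (WIDTH * HEIGHT * 4 bytes) to the encoder."""
    encoder.stdin.write(raw_bgra)
    encoder.stdin.flush()
```

In a real sender you would read `encoder.stdout` from a separate thread and push the encoded bytes onto the network; otherwise the stdout pipe can fill up and deadlock the process.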
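For the catch-up behavior, here is a minimal sketch of a drop-late-frames policy on the sending side. `FrameScheduler` is a hypothetical helper, not part of any particular protocol stack:

```python
import time

class FrameScheduler:
    """Drop frames when the sender falls behind, instead of queueing them."""

    def __init__(self, fps: float):
        self.interval = 1.0 / fps
        self.next_deadline = time.monotonic()

    def should_send(self) -> bool:
        now = time.monotonic()
        if now > self.next_deadline + self.interval:
            # More than a full frame late: skip this frame and reset the
            # clock to "now" rather than trying to deliver every missed
            # frame and lagging forever.
            self.next_deadline = now
            return False
        self.next_deadline += self.interval
        return True
```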
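For the 8-bit color idea, a sketch of packing 2 bits per channel with numpy, assuming frames arrive as (height, width, 4) uint8 arrays in ARGB channel order (adjust the indexing if yours differ):

```python
import numpy as np

def pack_argb_2222(frame: np.ndarray) -> np.ndarray:
    """Quantize a (H, W, 4) uint8 ARGB frame to 2 bits per channel,
    packed into one byte per pixel: AARRGGBB."""
    a, r, g, b = (frame[..., i] >> 6 for i in range(4))  # keep top 2 bits
    return ((a << 6) | (r << 4) | (g << 2) | b).astype(np.uint8)

def unpack_argb_2222(packed: np.ndarray) -> np.ndarray:
    """Expand each 2-bit channel back to 8 bits (0, 85, 170, 255)."""
    channels = [(packed >> shift) & 0b11 for shift in (6, 4, 2, 0)]
    return np.stack([c.astype(np.uint8) * 85 for c in channels], axis=-1)
```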
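For cropping and downscaling, a sketch with Pillow; the region tuple and scale factor are placeholders you would tune for your application:

```python
from PIL import Image

def shrink_for_transport(frame: Image.Image,
                         region: tuple[int, int, int, int],
                         factor: int = 2) -> Image.Image:
    """Crop to the region of interest, then downscale before sending."""
    cropped = frame.crop(region)  # (left, upper, right, lower)
    return cropped.resize(
        (cropped.width // factor, cropped.height // factor),
        Image.BILINEAR,
    )

def restore_for_display(small: Image.Image, factor: int = 2) -> Image.Image:
    """Scale back up on the receiving side (quality is lost, bandwidth is saved)."""
    return small.resize((small.width * factor, small.height * factor),
                        Image.BILINEAR)
```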
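For the diff idea, a sketch using XOR plus the `lz4` package; XOR turns unchanged pixels into long runs of zeros, which LZ4 compresses extremely well:

```python
import lz4.frame
import numpy as np

def encode_diff(prev: np.ndarray, new: np.ndarray) -> bytes:
    """XOR against the previous frame, then LZ4-compress the result."""
    diff = np.bitwise_xor(prev, new)
    return lz4.frame.compress(diff.tobytes())

def decode_diff(prev: np.ndarray, payload: bytes) -> np.ndarray:
    """Decompress and XOR back against the previous frame."""
    diff = np.frombuffer(lz4.frame.decompress(payload), dtype=prev.dtype)
    return np.bitwise_xor(prev, diff.reshape(prev.shape))
```

A full keyframe is then just `lz4.frame.compress(new.tobytes())`, sent every N frames as mentioned above.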
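For overlapping update regions, one simple (if conservative) fix is to merge any overlapping rectangles into their bounding boxes before extracting pixels, so no pixel is encoded twice:

```python
def merge_overlapping(rects):
    """Merge overlapping (left, top, right, bottom) rectangles into
    bounding boxes so no pixel is sent more than once."""
    merged = []
    for rect in rects:
        rect = list(rect)
        i = 0
        while i < len(merged):
            m = merged[i]
            overlaps = not (rect[2] <= m[0] or m[2] <= rect[0]
                            or rect[3] <= m[1] or m[3] <= rect[1])
            if overlaps:
                # Absorb the overlapping rectangle and restart the scan,
                # since the grown rectangle may now overlap earlier ones.
                rect = [min(rect[0], m[0]), min(rect[1], m[1]),
                        max(rect[2], m[2]), max(rect[3], m[3])]
                merged.pop(i)
                i = 0
            else:
                i += 1
        merged.append(rect)
    return [tuple(r) for r in merged]
```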
The ideas above can be applied one on top of the other to get a better user experience. Ultimately, it depends on the specifics of your application and end-users.
EDIT:
Color quantization can be used to reduce the number of bits used for a color. Below are some links to concrete implementations of color quantization:
- Optimizing Color Quantization for Images
- nQuant library
Usually the quantized colors are stored in a color palette, and only the index into this palette is given to the decoding logic.
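As a sketch of the palette approach with Pillow (its `quantize()` defaults to median cut for RGB input; 256 colors here is an arbitrary choice):

```python
from PIL import Image

def quantize_frame(frame: Image.Image, colors: int = 256) -> Image.Image:
    """Reduce the frame to a small palette: each pixel becomes a 1-byte
    index into the palette instead of 3-4 bytes of raw color."""
    # Pillow's quantize() uses median cut for RGB images by default.
    return frame.convert("RGB").quantize(colors=colors)

# On the wire you would send the palette (colors * 3 bytes) once, plus
# the 1-byte-per-pixel index array for each frame.
```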