On Windows 10 build 10.1607.14393.10 (a.k.a. the Anniversary Update) I am unable to get MJPG capture streams anymore. There used to be both MJPG and YUY2 resolutions; now I am getting only NV12.
Since this is the answer, let me state first that several workarounds exist, ranging from hacks to expensive development.
Now, on to the proper design of a "frame server":
We also have a frame server in our zSpace system design. It serves decompressed images, the compressed camera feeds (four of them, at almost 1 Gpixel per second total), blob detection information and 3D pose triangulation results to multiple clients (applications, including remote ones) at the same time. The whole thing, using shared memory and/or sockets, is just a few hundred lines of straight C code. I've implemented it and it works on Windows and Linux.
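For a sense of scale, here is a minimal sketch of the shared-memory half of such a frame server on Linux (POSIX shm). The zSpace code itself is not public, so the names, slot count and frame size below are purely illustrative:

```c
/* Minimal sketch of a shared-memory frame ring (Linux/POSIX shm).
 * Names, sizes and layout are illustrative, not the actual zSpace code. */
#include <fcntl.h>
#include <stdatomic.h>
#include <stdint.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

#define RING_SLOTS  8
#define FRAME_BYTES (1920 * 1080 * 2)     /* e.g. one YUY2 1080p frame */

typedef struct {
    atomic_uint_fast64_t seq;             /* odd while the writer is inside the slot */
    uint64_t timestamp_us;
    uint32_t bytes;
    uint8_t  data[FRAME_BYTES];
} frame_slot;

typedef struct {
    atomic_uint_fast64_t head;            /* index of the most recently published frame */
    frame_slot slots[RING_SLOTS];
} frame_ring;

/* Map (or create) the ring; producer passes create = 1, consumers pass 0. */
static frame_ring *ring_open(const char *name, int create)
{
    int fd = shm_open(name, create ? (O_CREAT | O_RDWR) : O_RDWR, 0600);
    if (fd < 0) return NULL;
    if (create && ftruncate(fd, sizeof(frame_ring)) != 0) { close(fd); return NULL; }
    void *p = mmap(NULL, sizeof(frame_ring), PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);
    return p == MAP_FAILED ? NULL : (frame_ring *)p;
}

/* Producer: write one frame into the next slot and publish it. */
static void ring_publish(frame_ring *r, const uint8_t *frame, uint32_t bytes, uint64_t ts)
{
    uint64_t next = atomic_load(&r->head) + 1;
    frame_slot *s = &r->slots[next % RING_SLOTS];
    atomic_fetch_add(&s->seq, 1);         /* mark slot "being written" (odd) */
    s->timestamp_us = ts;
    s->bytes = bytes;
    memcpy(s->data, frame, bytes);
    atomic_fetch_add(&s->seq, 1);         /* mark slot "stable" (even) */
    atomic_store(&r->head, next);         /* publish */
}

/* Consumer: copy out the latest frame; returns 0 if it was torn and should be retried. */
static int ring_read_latest(frame_ring *r, uint8_t *out, uint32_t *bytes)
{
    uint64_t head = atomic_load(&r->head);
    frame_slot *s = &r->slots[head % RING_SLOTS];
    uint64_t before = atomic_load(&s->seq);
    if (before & 1) return 0;             /* writer is inside the slot right now */
    *bytes = s->bytes;
    memcpy(out, s->data, s->bytes);
    return atomic_load(&s->seq) == before; /* unchanged => the copy is consistent */
}
```

A consumer that only wants the latest frame never blocks the producer; slower clients simply skip frames. Socket delivery for remote clients sits on top of the same slots.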
The deficiency of the Microsoft "improvement" lies in its ignorance of client needs, and I believe it is easy to fix.
For the sake of the argument, let's assume the camera streams a compressed format (it could be MJPG/H.26x/HEVC/something new and better).
Let's say there are several possible classes of clients: some want the compressed stream untouched (for recording or for streaming over the network), some need fully decompressed frames, and some are perfectly happy with NV12.
Enough? Today they all get NV12 (which actually constitutes an even bigger data loss: half the bandwidth of the U (Cb) and V (Cr) samples compared to YUY2).
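To put numbers on that chroma loss (my arithmetic, assuming a 1920x1080 stream):

```c
/* Per-frame byte counts for 1920x1080, illustrating the chroma loss NV12 implies
 * relative to YUY2. Plain arithmetic, no camera specifics assumed. */
#include <stdio.h>

int main(void)
{
    const long w = 1920, h = 1080;
    const long yuy2_luma   = w * h;       /* one Y byte per pixel                 */
    const long yuy2_chroma = w * h;       /* 4:2:2: one U + one V per 2 pixels    */
    const long nv12_luma   = w * h;       /* one Y byte per pixel                 */
    const long nv12_chroma = w * h / 2;   /* 4:2:0: one U + one V per 4 pixels    */

    printf("YUY2: %ld bytes/frame (%ld chroma)\n", yuy2_luma + yuy2_chroma, yuy2_chroma);
    printf("NV12: %ld bytes/frame (%ld chroma)\n", nv12_luma + nv12_chroma, nv12_chroma);
    return 0;
}
```

That is 2,073,600 chroma bytes per frame for YUY2 versus 1,036,800 for NV12: exactly half.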
Now, since Microsoft is implementing a frame server, it has to decompress the data one way or another, possibly in multiple ways. For that, the uncompressed data has to land in memory, and it can (conditionally) be kept there in case some client can benefit from using it. The original media graph design allowed splitters, and anybody with a little coding proficiency can implement a conditional splitter that only pushes data to the pins that have clients (sinks) attached.
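To show how little code a conditional splitter actually takes, here is a sketch in plain C with invented pin/sink types (not any real DirectShow or Media Foundation interface):

```c
/* Sketch of a conditional splitter: a frame is forwarded only to output pins
 * that currently have a sink attached. The pin/sink types are invented for
 * illustration and are not part of any Microsoft API. */
#include <stddef.h>
#include <stdint.h>

typedef void (*sink_fn)(const uint8_t *data, size_t bytes, void *ctx);

typedef struct {
    sink_fn deliver;    /* NULL means no client is connected to this pin */
    void   *ctx;
} output_pin;

typedef struct {
    output_pin *pins;
    size_t      pin_count;
} splitter;

/* Push one frame downstream; a pin without a sink costs nothing but a branch. */
static void splitter_push(splitter *s, const uint8_t *data, size_t bytes)
{
    for (size_t i = 0; i < s->pin_count; i++)
        if (s->pins[i].deliver)
            s->pins[i].deliver(data, bytes, s->pins[i].ctx);
}
```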
Actually, a correct implementation should take the clients' needs into account (and this information is already present and readily available from all the clients, in the form of media type negotiation and the attributes that control graph behavior). It should then apply decompressors and other filters only when needed, paying close attention to CPU cache locality, and serve the requested data to the appropriate clients from the appropriate memory through the appropriate mechanisms. That would allow all kinds of optimizations across the potential permutations of the client mix mentioned above, and beyond.
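The "decode only when somebody asked for it" logic could look like the sketch below; the format enum, client struct and decoder hook are invented for illustration, since the real thing would hang off media type negotiation:

```c
/* Sketch: forward the compressed stream untouched, and run a decoder only for
 * formats that at least one client actually requested. All types are invented. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

typedef enum { FMT_COMPRESSED, FMT_NV12, FMT_YUY2 } pixel_format;

typedef struct {
    pixel_format wanted;
    void (*deliver)(const uint8_t *data, size_t bytes, void *ctx);
    void *ctx;
} client;

/* Placeholder for a real MJPG/H.26x decoder producing the requested format. */
typedef size_t (*decoder_fn)(const uint8_t *in, size_t in_bytes,
                             pixel_format out_fmt, uint8_t *out, size_t out_cap);

static void serve_frame(const uint8_t *compressed, size_t bytes,
                        client *clients, size_t n,
                        decoder_fn decode, uint8_t *scratch, size_t scratch_cap)
{
    /* Pass the compressed stream through untouched to clients that want it. */
    for (size_t i = 0; i < n; i++)
        if (clients[i].wanted == FMT_COMPRESSED)
            clients[i].deliver(compressed, bytes, clients[i].ctx);

    /* Decode at most once per requested uncompressed format, and only if asked. */
    const pixel_format fmts[] = { FMT_NV12, FMT_YUY2 };
    for (size_t f = 0; f < sizeof fmts / sizeof fmts[0]; f++) {
        bool wanted = false;
        for (size_t i = 0; i < n; i++)
            wanted = wanted || (clients[i].wanted == fmts[f]);
        if (!wanted)
            continue;                     /* nobody asked: skip the decode entirely */
        size_t out_bytes = decode(compressed, bytes, fmts[f], scratch, scratch_cap);
        for (size_t i = 0; i < n; i++)
            if (clients[i].wanted == fmts[f])
                clients[i].deliver(scratch, out_bytes, clients[i].ctx);
    }
}
```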
If Microsoft needs help in designing and implementing a frame server that satisfies this simple (not to say trivial) set of requirements, all it has to do is ask, instead of breaking a huge class of applications and services.
I wonder how Microsoft is planning to network-stream HoloLens input? Via NV12? Or via yet another hack?
"Developers, Developers, Developers..." :(