I have a command-line program that "does a lot of work" and produces "a lot of statistics". It's stock-trading software, pretty sensitive to delays, bugs, and the like, so I don't …
It would probably be more efficient and resilient to have the 'server' store its information in a database and have the 'client' poll it as and when required.
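A minimal sketch of that polling approach, assuming the statistics can be flattened into (name, value, timestamp) rows; the SQLite file name, table, and column names are made up for illustration, not taken from the question.

```python
# Sketch only: producer inserts stats rows, GUI polls for rows it hasn't seen.
import sqlite3
import time

DB_PATH = "stats.db"  # hypothetical shared database file

def init_db():
    with sqlite3.connect(DB_PATH) as con:
        con.execute(
            "CREATE TABLE IF NOT EXISTS stats ("
            "  id INTEGER PRIMARY KEY AUTOINCREMENT,"
            "  name TEXT, value REAL, ts REAL)"
        )

def write_stat(name, value):
    # Called by the 'server' (the command-line program) for each statistic.
    with sqlite3.connect(DB_PATH) as con:
        con.execute(
            "INSERT INTO stats (name, value, ts) VALUES (?, ?, ?)",
            (name, value, time.time()),
        )

def poll_new(last_id):
    # Called periodically by the 'client' (the GUI); returns rows newer
    # than the last id it has already displayed.
    with sqlite3.connect(DB_PATH) as con:
        return con.execute(
            "SELECT id, name, value, ts FROM stats WHERE id > ? ORDER BY id",
            (last_id,),
        ).fetchall()

if __name__ == "__main__":
    init_db()
    write_stat("orders_per_sec", 1234.5)
    last_id = 0
    for row_id, name, value, ts in poll_new(last_id):
        print(row_id, name, value, ts)
        last_id = row_id
```

The GUI just remembers the highest `id` it has rendered and asks for anything newer on each timer tick.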
This is going way beyond a single comment, of course. What you will eventually need is a dispatching mechanism that notifies subscribers about data record updates. Since the amount of data is huge, you might not want to opt for persistent storage. If dispatching to multiple subscribers looks overwhelming so far, you can start with simple inter-process communication (shared memory or named pipes if the consumer and provider are on the same machine, or something more distributed otherwise).
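A hedged sketch of the dispatching idea: nothing is persisted, each update is simply pushed to whoever is subscribed. The class and method names (`StatsDispatcher`, `subscribe`, `publish`) are illustrative, not from the answer; in a real setup `publish` would sit on the producer side of a named pipe or socket rather than in the same process.

```python
# Sketch of a no-persistence dispatcher: updates are forwarded, not stored.
from typing import Callable, Dict, List

class StatsDispatcher:
    def __init__(self) -> None:
        self._subscribers: List[Callable[[Dict], None]] = []

    def subscribe(self, callback: Callable[[Dict], None]) -> None:
        # A GUI (or any other consumer) registers interest in updates.
        self._subscribers.append(callback)

    def publish(self, record: Dict) -> None:
        # Called by the producer each time a statistic is updated.
        for callback in self._subscribers:
            callback(record)

if __name__ == "__main__":
    dispatcher = StatsDispatcher()
    dispatcher.subscribe(lambda rec: print("GUI got:", rec))
    dispatcher.publish({"name": "latency_us", "value": 42})
```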
You can try a database with a queuing system, or a messaging appliance: the server posts messages to a queue and the GUI "subscribes" to them.
Microsoft Message Queuing
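MSMQ itself is a Windows service with its own API; the sketch below only illustrates the producer/consumer pattern this answer describes, using Python's standard `multiprocessing.Queue` so it runs anywhere. The message contents and the sentinel convention are assumptions for the example.

```python
# Queue pattern sketch: the server enqueues messages, the GUI consumes them.
from multiprocessing import Process, Queue

def server(queue: Queue) -> None:
    # The command-line program pushes each statistic as a message.
    for i in range(5):
        queue.put({"name": "fills", "value": i})
    queue.put(None)  # sentinel: no more messages

def gui(queue: Queue) -> None:
    # The GUI blocks on the queue and reacts to each message as it arrives.
    while True:
        message = queue.get()
        if message is None:
            break
        print("display:", message)

if __name__ == "__main__":
    q = Queue()
    consumer = Process(target=gui, args=(q,))
    consumer.start()
    server(q)
    consumer.join()
```

With MSMQ (or any broker), the queue lives outside both processes, so the producer and the GUI can start, stop, and fail independently.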
You can also take a look at memory-mapped files:
Memory-mapped file
These can be shared amongst applications simultaneously when configured properly.
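A minimal sketch of sharing statistics through a memory-mapped file; the file name and the fixed 16-byte layout (a counter plus a value, both as doubles) are assumptions made for the example.

```python
# Sketch: producer updates a small mapped region in place, reader maps the
# same file and sees the latest values.
import mmap
import os
import struct

PATH = "stats.mmap"   # hypothetical shared file
SIZE = 16             # two packed doubles: counter + value

def writer(counter: int, value: float) -> None:
    if not os.path.exists(PATH):
        with open(PATH, "wb") as f:
            f.write(b"\x00" * SIZE)
    with open(PATH, "r+b") as f:
        with mmap.mmap(f.fileno(), SIZE) as mm:
            mm[:SIZE] = struct.pack("dd", float(counter), value)

def reader() -> tuple:
    with open(PATH, "rb") as f:
        with mmap.mmap(f.fileno(), SIZE, access=mmap.ACCESS_READ) as mm:
            return struct.unpack("dd", mm[:SIZE])

if __name__ == "__main__":
    writer(1, 99.5)
    print(reader())  # -> (1.0, 99.5)
```

In practice you would add a lock, a sequence number, or a dirty flag so the reader never observes a half-written record.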
Hope that helps,
Jeffrey Kevin Pry
I suggest you look at a technology like DDS (Data Distribution Service). There are no databases involved and it is not hard to implement. I recently began looking at OpenSplice DDS; there are a number of different implementations out there.
You can use any form of IPC (inter-process communication).
Since you're planning to go over the network in the future, I recommend WCF. If the GUI program isn't .NET, then you may need to use a lower-level solution, such as named pipes or sockets.
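A hedged sketch of the lower-level socket option: the command-line program listens on a local port and streams one JSON line per statistic, so any GUI in any language that can open a TCP socket can read it. The port number and message format are assumptions, not part of the question.

```python
# Sketch: server streams newline-delimited JSON stats, client reads them.
import json
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # hypothetical local endpoint
ready = threading.Event()

def serve_stats(stats):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()  # listener is up; the client may connect now
        conn, _ = srv.accept()
        with conn:
            for stat in stats:
                conn.sendall((json.dumps(stat) + "\n").encode())

def read_stats():
    with socket.create_connection((HOST, PORT)) as cli:
        for line in cli.makefile("r"):
            print("GUI got:", json.loads(line))

if __name__ == "__main__":
    stats = [{"name": "pnl", "value": v} for v in (1.0, 2.5, 3.75)]
    threading.Thread(target=serve_stats, args=(stats,), daemon=True).start()
    ready.wait()
    read_stats()
```

The same wire format works unchanged when the GUI later moves to another machine; only the host name changes.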