I am developing a component which needs to process a live feed and broadcast the data to listeners very quickly (with about 100-nanosecond-level accuracy, …
You can use these simple extension methods on your event handlers:
using System;
using System.Threading;
using System.Threading.Tasks;

// Extension methods must live in a (non-generic) static class; the class name here is arbitrary.
public static class EventExtensions {
    public static void Raise<T>(this EventHandler<T> handler, object sender, T e) where T : EventArgs {
        if (handler != null) handler(sender, e);
    }

    public static void Raise(this EventHandler handler, object sender, EventArgs e) {
        if (handler != null) handler(sender, e);
    }

    public static void RaiseOnDifferentThread<T>(this EventHandler<T> handler, object sender, T e) where T : EventArgs {
        if (handler != null) Task.Factory.StartNewOnDifferentThread(() => handler.Raise(sender, e));
    }

    public static void RaiseOnDifferentThread(this EventHandler handler, object sender, EventArgs e) {
        if (handler != null) Task.Factory.StartNewOnDifferentThread(() => handler.Raise(sender, e));
    }

    public static Task StartNewOnDifferentThread(this TaskFactory taskFactory, Action action) {
        return taskFactory.StartNew(action: action, cancellationToken: new CancellationToken());
    }
}
Usage:
public static void Test() {
    myEventHandler.RaiseOnDifferentThread(null, EventArgs.Empty);
}
The cancellationToken is necessary to guarantee StartNew() actually uses a different thread, as explained here.
I can't speak to whether this will reliably meet the 100 ns requirement, but here's an alternative: give the end user a way to hand you a ConcurrentQueue that you fill and that they can listen to on a separate thread.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

class Program
{
    static void Main(string[] args)
    {
        var multicaster = new QueueMulticaster<int>();

        var listener1 = new Listener(); // Make a couple of listening Q objects.
        listener1.Listen();
        multicaster.Subscribe(listener1);

        var listener2 = new Listener();
        listener2.Listen();
        multicaster.Subscribe(listener2);

        multicaster.Broadcast(6); // Send a 6 to both concurrent Queues.
        Console.ReadLine();
    }
}

// The listeners would run on their own thread and poll the Q like crazy.
class Listener : IListenToStuff<int>
{
    public ConcurrentQueue<int> StuffQueue { get; set; }

    public void Listen()
    {
        StuffQueue = new ConcurrentQueue<int>();
        var t = new Thread(ListenAggressively);
        t.Start();
    }

    void ListenAggressively()
    {
        while (true)
        {
            int val;
            if (StuffQueue.TryDequeue(out val))
                Console.WriteLine(val);
        }
    }
}

// Simple class that allows you to subscribe a Queue to a broadcast event.
public class QueueMulticaster<T>
{
    readonly List<IListenToStuff<T>> _subscribers = new List<IListenToStuff<T>>();

    public void Subscribe(IListenToStuff<T> subscriber)
    {
        _subscribers.Add(subscriber);
    }

    public void Broadcast(T value)
    {
        foreach (var listenToStuff in _subscribers)
        {
            listenToStuff.StuffQueue.Enqueue(value);
        }
    }
}

public interface IListenToStuff<T>
{
    ConcurrentQueue<T> StuffQueue { get; set; }
}
Given that you can't hold up processing for the other listeners, this implies multiple threads. Dedicated listening threads on the listeners seem like a reasonable approach to try, and the concurrent queue seems like a decent delivery mechanism. In this implementation the listeners just poll constantly, but you could probably use thread signaling, for example an AutoResetEvent, to reduce the CPU load (a sketch of that follows).
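For illustration only, here is a minimal sketch of that signaling idea, reusing the IListenToStuff<T> interface and using directives from the listing above; the SignaledListener and DataReady names are mine, not part of the original code:

// Sketch: a listener that sleeps on an AutoResetEvent instead of spinning.
// The broadcaster sets the event after enqueueing, which wakes the listener.
class SignaledListener : IListenToStuff<int>
{
    public ConcurrentQueue<int> StuffQueue { get; set; }
    public AutoResetEvent DataReady { get; } = new AutoResetEvent(false);

    public void Listen()
    {
        StuffQueue = new ConcurrentQueue<int>();
        new Thread(() =>
        {
            while (true)
            {
                DataReady.WaitOne();                   // block until signaled; no busy wait
                int val;
                while (StuffQueue.TryDequeue(out val)) // drain everything that arrived
                    Console.WriteLine(val);
            }
        }) { IsBackground = true }.Start();
    }
}

On the broadcast side you would then call listener.StuffQueue.Enqueue(value) followed by listener.DataReady.Set(). The trade-off is a small wake-up latency in exchange for not burning a core per listener.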
It seems like you are looking for tasks. The following is an extension method I wrote for my job that asynchronously invokes an event so that every event handler runs on its own thread. I can't comment on its speed since that has never been a requirement for me.
UPDATE
Based on the comments I adjusted it so that only one task is created to call all of the subscribers.
/// <summary>
/// Extension method to safely encapsulate asynchronous event calls with checks
/// </summary>
/// <param name="evnt">The event to call</param>
/// <param name="sender">The sender of the event</param>
/// <param name="args">The arguments for the event</param>
/// <remarks>
/// This method safely calls each event handler attached to the event. This method uses <see cref="System.Threading.Tasks"/> to
/// asynchronously invoke the handlers without any exception handling. As such, if any of the event handlers throws an exception the application will
/// most likely crash when the task is collected. This is an explicit decision, since it is really in the hands of the event handler
/// creators to make sure they handle issues that occur due to their code. There isn't really a way for the event raiser to know
/// what is going on.
/// </remarks>
[System.Diagnostics.DebuggerStepThrough]
public static void AsyncSafeInvoke(this EventHandler evnt, object sender, EventArgs args)
{
    // Used to make a temporary copy of the event to avoid the possibility of
    // a race condition if the last subscriber unsubscribes
    // immediately after the null check and before the event is raised.
    EventHandler handler = evnt;
    if (handler != null)
    {
        // Take a snapshot of the invocation list so that a single task can call
        // every subscriber that was attached at the moment the event was raised.
        var invocationList = handler.GetInvocationList();
        Task.Factory.StartNew(() =>
        {
            foreach (EventHandler h in invocationList)
            {
                // Explicitly not catching any exceptions. While there are several possibilities for handling these
                // exceptions, such as a callback, the correct place to handle the exception is in the event handler.
                h.Invoke(sender, args);
            }
        });
    }
}
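For what it's worth, here is a minimal usage sketch, assuming AsyncSafeInvoke is placed in a static class visible to the caller; the Feed class, its Tick event, and the handlers below are illustrative names only, not part of the answer:

// Hypothetical publisher that raises its event asynchronously.
public class Feed
{
    public event EventHandler Tick;

    public void OnTick()
    {
        // All current subscribers are invoked on a single background task.
        Tick.AsyncSafeInvoke(this, EventArgs.Empty);
    }
}

public static class Demo
{
    public static void Run()
    {
        var feed = new Feed();
        feed.Tick += (s, e) => Console.WriteLine("listener 1");
        feed.Tick += (s, e) => Console.WriteLine("listener 2");
        feed.OnTick(); // returns immediately; handlers run on a worker thread
    }
}

Because the handlers run off the raising thread, they need to be thread-safe with respect to anything they touch.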
100 ns is a very tough target to hit. I believe it will take a deep understanding of what you're doing and why to hit that kind of performance.
However, asynchronously invoking event subscribers is an easy problem to solve. It's already been answered here by, who else, Jon Skeet.
foreach (MyDelegate action in multicast.GetInvocationList())
{
    action.BeginInvoke(...);
}
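To flesh that pattern out (purely as a hedged illustration; multicast, sender, and the EventHandler delegate type here are placeholders), the elided arguments would typically be the handler's parameters plus a callback that calls EndInvoke, so exceptions are not silently lost. Note that delegate BeginInvoke is only supported on the .NET Framework, not on .NET Core / .NET 5+.

foreach (EventHandler action in multicast.GetInvocationList())
{
    // Each subscriber runs on a ThreadPool thread; EndInvoke is called in the
    // callback so any exception surfaces there instead of being dropped.
    action.BeginInvoke(sender, EventArgs.Empty,
        ar => ((EventHandler)ar.AsyncState).EndInvoke(ar),
        action);
}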
edit: I should also mention that you need to be running on a real-time operating system to give tight performance guarantees to your users.
Signals and shared memory are very fast. You could send a separate signal to tell applications to read a message from a shared-memory location. Of course, the signal is still an event that your application has to consume on a high-priority thread if you want low latency. I would include a time-tag in the data so the receiver can compensate for the inevitable latency.
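If the producer and the listeners are separate processes on Windows, one way to sketch that idea in .NET is a memory-mapped file for the data plus a named EventWaitHandle as the signal. The map name, signal name, and fixed 16-byte payload below are assumptions for illustration, not a prescribed design:

using System;
using System.IO.MemoryMappedFiles;
using System.Threading;

class SharedMemoryFeed
{
    // Writer side: put the payload (a timestamp in ticks plus a value)
    // into shared memory, then signal any waiting reader.
    static void Publish(int value)
    {
        using (var mmf = MemoryMappedFile.CreateOrOpen("feed_shm", 16))
        using (var view = mmf.CreateViewAccessor())
        using (var signal = new EventWaitHandle(false, EventResetMode.AutoReset, "feed_signal"))
        {
            view.Write(0, DateTime.UtcNow.Ticks); // time-tag so the reader can measure latency
            view.Write(8, value);
            signal.Set();
        }
    }

    // Reader side: block on the named signal, then read the payload.
    static void Consume()
    {
        using (var mmf = MemoryMappedFile.CreateOrOpen("feed_shm", 16))
        using (var view = mmf.CreateViewAccessor())
        using (var signal = new EventWaitHandle(false, EventResetMode.AutoReset, "feed_signal"))
        {
            signal.WaitOne();
            long ticks = view.ReadInt64(0);
            int value = view.ReadInt32(8);
            Console.WriteLine($"{value} (sent {DateTime.UtcNow.Ticks - ticks} ticks ago)");
        }
    }
}

With an auto-reset event only one waiting reader wakes per Set, so a real broadcast would need one signal per consumer or a different signaling scheme; the time-tag written at offset 0 is what lets the receiver compensate for delivery latency.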