This is for small payloads.
I am looking to achieve 1,000,000,000 per 100ms.
The standard BinaryFormatter is very slow. The DataContractSerializer is even slower than the BinaryFormatter.
The only reason to serialize objects is to make them compatible with a generic transport medium. Network, disk, etc. The perf of the serializer never matters because the transport medium is always so much slower than the raw perf of a CPU core. Easily by two orders of magnitude or more.
Which is also the reason that attributes are an acceptable trade-off: they are also I/O bound, since their initialization data has to be read from the assembly metadata, which requires a disk read the first time.
So, if you are setting perf requirements, you need to focus 99% on the capability of the transport medium. A billion 'payloads' in 100 milliseconds requires very beefy hardware. Assuming a payload is 16 bytes, you'd need to move 160 gigabytes per second. This is quite beyond even the memory bus bandwidth inside the machine. DDR RAM moves at about 5 gigabytes per second. A one gigabit Ethernet NIC moves at 125 megabytes per second, burst. A commodity hard drive moves at 65 megabytes per second, assuming no seeking.
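To make the arithmetic explicit, here is a quick back-of-the-envelope check (the 16-byte payload size is an assumption for illustration, not a measured figure):

```csharp
using System;

// Back-of-the-envelope bandwidth check for the stated requirement.
// The 16-byte payload size is assumed for illustration.
long payloadCount = 1_000_000_000;   // payloads per window
long payloadBytes = 16;              // assumed size of one payload
double windowSeconds = 0.1;          // the 100 ms window

double bytesPerSecond = payloadCount * payloadBytes / windowSeconds;
Console.WriteLine($"{bytesPerSecond / 1e9:F0} GB/s required");
// Output: 160 GB/s required
```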
Your goal is not realistic with current hardware capabilities.
You could write custom serialization by implementing ISerializable on your data structures, as sketched below. Even so, you will probably face some "impedance" from the hardware itself when serializing at these rates.
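A rough sketch of that approach (the Payload type and its fields are hypothetical, invented for illustration):

```csharp
using System;
using System.Runtime.Serialization;

// Hypothetical payload type; the fields are illustrative only.
[Serializable]
public sealed class Payload : ISerializable
{
    public int Id;
    public double Value;

    public Payload(int id, double value)
    {
        Id = id;
        Value = value;
    }

    // Deserialization constructor required by the ISerializable pattern.
    private Payload(SerializationInfo info, StreamingContext context)
    {
        Id = info.GetInt32("id");
        Value = info.GetDouble("value");
    }

    // Called by the formatter to capture the object's state.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("id", Id);
        info.AddValue("value", Value);
    }
}
```

Note that ISerializable still goes through a formatter, so it mostly saves you the reflection cost, not the formatter's own overhead.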
Your performance requirement rules out every available serializer. A custom BinaryWriter and BinaryReader would be the fastest you could get.
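A minimal sketch of that hand-rolled approach, assuming the same hypothetical int-plus-double payload:

```csharp
using System;
using System.IO;

// Hand-rolled binary serialization: no reflection, no per-object
// metadata, just the raw field bytes in a fixed layout.
public static class PayloadCodec
{
    public static void Write(BinaryWriter writer, int id, double value)
    {
        writer.Write(id);     // 4 bytes
        writer.Write(value);  // 8 bytes
    }

    public static (int Id, double Value) Read(BinaryReader reader)
        => (reader.ReadInt32(), reader.ReadDouble());
}

// Usage: each payload is a fixed 12 bytes on the wire.
class Demo
{
    static void Main()
    {
        using var stream = new MemoryStream();
        using (var writer = new BinaryWriter(stream, System.Text.Encoding.UTF8, leaveOpen: true))
            PayloadCodec.Write(writer, 42, 3.14);

        stream.Position = 0;
        using var reader = new BinaryReader(stream);
        var (id, value) = PayloadCodec.Read(reader);
        Console.WriteLine($"{id} {value}"); // 42 3.14
    }
}
```

Even then, the writer is just formatting bytes; the transport medium still sets the ceiling.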