Let's try to compute the point at which your counts could become large enough to overflow a 64-bit integer.
Let's assume you're doing your measurements once per microsecond. At a rate of 1 million increments per second, it'll take 2^64/1'000'000 seconds for a 64-bit number to overflow. That works out to over half a million years of counting. Even if you increase the rate to once per nanosecond, it would still take well over 500 years.
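If you want to check the arithmetic yourself, here's a quick C++ sketch that just plugs in the two rates from the paragraph above (it computes in `double` because 2^64 itself doesn't fit in a 64-bit integer):

```cpp
#include <initializer_list>
#include <iostream>

int main() {
    // 2^64: the number of distinct values a 64-bit unsigned counter can hold.
    const double range = 18446744073709551616.0;
    const double seconds_per_year = 365.25 * 24 * 60 * 60;

    // Once per microsecond (1e6/s) and once per nanosecond (1e9/s).
    for (double rate : {1e6, 1e9}) {
        std::cout << "At " << rate << " increments/s: "
                  << range / rate / seconds_per_year
                  << " years to overflow\n";
    }
}
```

That prints roughly 584,542 years for the microsecond case and about 584.5 years for the nanosecond case, matching the figures above.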
For the running total, you could (theoretically) run out a little sooner. If, for example, you had 100 Gigabit Ethernet and kept it running at maximum theoretical bandwidth all the time, a 64-bit byte counter would run out in (a little) less than 47 years.
If you limit yourself to technologies most of us can actually afford, about the fastest transfer rates most of us deal with are to/from SSDs. Assuming you had drives that could handle it, the most recent SATA Express specification supports transfers at up to 16 Gb/s. You'd need to saturate that 24/7 for well over 200 years before you used up the full range of a 64-bit integer.
Hmm...maybe we should look at main memory. Let's assume 4 channels of the fastest DDR4 memory yet specified, and (as always) the grossly unrealistic assumption that you'll keep it operating at maximum theoretical bandwidth 24/7. With this, you could still count all transfers to and from memory for over 4 years at a time before you'd be in any danger of a 64-bit integer overflowing.
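Here's the same back-of-the-envelope check for all three bandwidth scenarios, counting bytes at peak theoretical rates. One assumption to flag: I'm reading "the fastest DDR4 yet specified" as DDR4-3200 at 25.6 GB/s per channel; a different speed grade in that neighborhood gives a similar result.

```cpp
#include <iostream>

int main() {
    const double range = 18446744073709551616.0;  // 2^64
    const double seconds_per_year = 365.25 * 24 * 60 * 60;

    // Peak theoretical transfer rates, in bytes per second.
    struct Link { const char *name; double bytes_per_sec; };
    const Link links[] = {
        { "100 Gb Ethernet",          100e9 / 8 },   //  12.5 GB/s
        { "SATA Express, 16 Gb/s",     16e9 / 8 },   //   2   GB/s
        { "DDR4-3200 x 4 channels",   4 * 25.6e9 },  // 102.4 GB/s
    };

    for (const Link &l : links)
        std::cout << l.name << ": "
                  << range / l.bytes_per_sec / seconds_per_year
                  << " years to overflow a 64-bit byte counter\n";
}
```

That works out to roughly 47 years for the Ethernet link, about 292 years for SATA Express, and a bit under 6 years for the 4-channel memory bus.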
Of course, you could try to over-clock the CPU and RAM to get there a little faster, but that would probably be a losing game--anything beyond the most modest overclock will probably reduce the life expectancy of the parts, so the machine would probably die before the 64-bit integer overflowed.
Bottom line: Your need for 128-bit integers seems questionable at best.