I'm curious as to whether there is a real difference between the money datatype and something like decimal(19,4) (which is what money uses).
I want to give a different view of MONEY vs. NUMERIC, based largely on my own expertise and experience. My point of view here favors MONEY, because I have worked with it for a long time and never really used NUMERIC much.
MONEY Pro:
Native data type. MONEY uses a native data type (integer), the same as a CPU register (32- or 64-bit), so calculations avoid unnecessary overhead: they are smaller and faster. MONEY needs 8 bytes, while NUMERIC(19,4) needs 9 bytes (12.5% bigger).
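The storage-size claim is easy to verify directly in SQL Server; a minimal check (the variable names are mine):

```sql
-- Compare the storage footprint of MONEY and NUMERIC(19,4).
DECLARE @m MONEY          = 1234.5678;
DECLARE @n NUMERIC(19, 4) = 1234.5678;

SELECT DATALENGTH(@m) AS money_bytes,   -- 8
       DATALENGTH(@n) AS numeric_bytes; -- 9
```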
MONEY is faster, as long as it is used for what it was meant for (as money). How fast? My simple SUM test over 1 million rows shows MONEY at 275 ms and NUMERIC at 517 ms; that is almost twice as fast. Why a SUM test? See the next point.
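The answer does not show the actual test script, so the following is only a sketch of how such a SUM benchmark might be set up (the dbo.SumTest table, its columns, and the row generator are my assumptions); absolute timings will of course vary with hardware and caching.

```sql
-- Hypothetical setup: the same random values stored once as MONEY
-- and once as NUMERIC(19,4), one million rows.
CREATE TABLE dbo.SumTest (
    m MONEY          NOT NULL,
    n NUMERIC(19, 4) NOT NULL
);

INSERT INTO dbo.SumTest (m, n)
SELECT TOP (1000000)
       CAST(v AS MONEY),
       CAST(v AS NUMERIC(19, 4))
FROM (
    SELECT ABS(CHECKSUM(NEWID())) % 1000000 / 100.0 AS v
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
) AS src;

-- Compare the elapsed time of the two aggregations.
SET STATISTICS TIME ON;
SELECT SUM(m) FROM dbo.SumTest;  -- MONEY
SELECT SUM(n) FROM dbo.SumTest;  -- NUMERIC(19,4)
SET STATISTICS TIME OFF;
```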
MONEY Con:
MONEY is less precise: it is fixed at four decimal places. Money usually doesn't need to be more precise than that, and MONEY is meant to be used as money, not as a general-purpose number. But, and this is a big but: even if your application involves real money, if it doesn't use that money mostly in SUM operations (as in accounting) and instead performs lots of divisions and multiplications, then you should not use MONEY, because intermediate results are kept to only four decimal places and the rounding error accumulates.
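To make the division/multiplication hazard concrete, here is a small demonstration (the specific values are illustrative): dividing and then multiplying MONEY values loses the digits beyond the fourth decimal place in the intermediate quotient, while NUMERIC keeps a much larger intermediate scale.

```sql
-- MONEY keeps the intermediate quotient to only four decimal places.
DECLARE @mon1 MONEY = 100, @mon2 MONEY = 339, @mon3 MONEY = 10000;
DECLARE @num1 NUMERIC(19, 4) = 100,
        @num2 NUMERIC(19, 4) = 339,
        @num3 NUMERIC(19, 4) = 10000;

SELECT @mon1 / @mon2 * @mon3 AS money_result,   -- 2949.0000 (100/339 kept as 0.2949)
       @num1 / @num2 * @num3 AS numeric_result; -- 2949.8525 (more intermediate scale)
```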