
The MONEY datatype has been used for columns like VendorHours, OverTime, and Expenses in one of our tables.

I am designing another table that is related to this one and will have similar columns, but I am considering the DECIMAL datatype instead of MONEY, since DECIMAL is more precise.

Later I found out that the MONEY datatype is being used because it takes 8 bytes, whereas DECIMAL takes 9 bytes for precisions 10-19.

Columns like VisitHour and OverTime would fit into a DECIMAL with precision 9, which takes only 5 bytes. So is it a good idea to use DECIMAL(9,2) instead of MONEY?

I will be doing a lot of calculations on those fields inside stored procedures for reports.


1 Answer


It depends... if you're not going to do anything with the values other than add or subtract them, there's no problem. If you're going to do anything else (like multiply or divide), you'll need at least 4 decimal places to do the calculations without losing overall accuracy.
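As a rough illustration of why, here is a sketch in Python's decimal module rather than T-SQL, with a made-up rate and amount: an intermediate kept at only 2 decimal places (DECIMAL(9,2)-style) drifts much further from the exact result than one kept at 4 decimal places (MONEY's scale).

```python
from decimal import Decimal, ROUND_HALF_UP

def scaled(value, places):
    """Round to a fixed number of decimal places, half away from zero."""
    return value.quantize(Decimal(1).scaleb(-places), ROUND_HALF_UP)

rate = Decimal("0.0725")     # hypothetical multiplier (e.g. a tax rate)
amount = Decimal("19.99")
exact = amount * rate        # 1.449275

two_dp = scaled(exact, 2)    # 1.45   -- DECIMAL(9,2)-style intermediate
four_dp = scaled(exact, 4)   # 1.4493 -- MONEY-style (4 decimal places)

# Over 1000 identical line items the 2-decimal intermediate drifts
# about 72 cents from the exact total of 1449.275:
print(two_dp * 1000)         # 1450.00
print(four_dp * 1000)        # 1449.3000
```

The same effect applies inside a stored procedure: every intermediate stored back into a 2-decimal column quietly re-rounds, and those errors compound across a long report.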

MONEY more accurately represents the real-world situation, where each value is rounded to the nearest cent as it is calculated, and then the average is rounded again. In a long calculation chain, the difference can wind up being considerably larger than one cent... but due to the business-logic constraint that all intermediate values contain non-fractional cents, the MONEY result will be accurate, whereas the "more precise" DECIMAL result will not.
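The classic example of that constraint, again sketched with Python's decimal module and an invented three-way split: each intermediate must itself be a payable, whole-cent amount, so rounding at every step matches the ledger, while the "round only at the end" chain produces a total that no sequence of real payments adds up to.

```python
from decimal import Decimal, ROUND_HALF_UP

CENT = Decimal("0.01")

def to_cents(value):
    """Round to whole cents, half away from zero."""
    return value.quantize(CENT, ROUND_HALF_UP)

total = Decimal("10.00")

# Business rule: each of the 3 installments is a whole-cent amount.
installment = to_cents(total / 3)        # 3.33 -- what each payer owes
ledger_total = installment * 3           # 9.99 -- what is actually paid

# "More precise" chain: carry full precision, round once at the end.
precise_total = to_cents(total / 3 * 3)  # 10.00 -- matches no real payment
```

Here 9.99 is the "accurate" answer in the answer's sense: it is the sum of the cent-rounded amounts that actually move through the accounts, and the higher-precision 10.00 disagrees with every ledger entry derived from it.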