The SQL:2003 standard (§6.1 Data Types), whose relevant excerpt you can see transcribed in that answer on SOen (the standard itself is not free to consult; it must be purchased), establishes a small difference between these two types:
- `NUMERIC` must have exactly the specified precision, as well as the specified scale;
- `DECIMAL` must have at least the specified precision, and the same scale.
In other words, if an implementation chooses to represent `DECIMAL` with greater precision than requested, it is free to do so. But if it chooses to implement it with exactly the requested precision (making the type functionally equivalent to `NUMERIC`), that is also in accordance with the standard.
MySQL, as bfavaretto pointed out in the comments, is one of the implementations that does not distinguish between the two types (SQL Server is another). According to the documentation, in MySQL "`NUMERIC` is implemented as `DECIMAL`", so their behavior is identical. And, as required by the standard, the precision used is exactly the precision requested.
As for using one or the other, I don't have enough experience to comment, but Maniero's argument in his answer, that using `DECIMAL` can harm portability (i.e., potentially lead to different results when the database is migrated from one DBMS to another), is already a good reason, in my opinion, to avoid that type.
I don't know about MySQL, but according to that answer on SOen, per the SQL standards `NUMERIC` must have exactly the specified precision, while `DECIMAL` must have at least the specified precision (but may have more). Implementations, however, may or may not treat them as equal; SQL Server, for example, does not distinguish between the types, which is in accordance with the standard. I have no information on how this is handled in MySQL.– mgibsonbr
MySQL also makes no distinction.
– bfavaretto
@mgibsonbr The post you linked was very enlightening. Consider turning it into an answer.
– emanuelsn