What is the difference between DECIMAL and NUMERIC?
It's not clear to me if they're totally interchangeable.
To my understanding they work the same way — using DECIMAL or NUMERIC is interchangeable. In short, DECIMAL(7,2) is the same as NUMERIC(7,2).
Thanks for reaching out.
There is one notable difference between NUMERIC and DECIMAL in standard SQL. The NUMERIC data type is strict; it enforces the exact precision and scale that you have specified. This is in stark contrast to DECIMAL, which allows more numbers than the stated precision.
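To make the distinction concrete, here is a small sketch using Python's `decimal` module to mimic the precision/scale check a database performs. The function name and the rounding mode are my own illustration, not anything from the SQL standard; real DBMSs differ in whether they reject, truncate, or round an out-of-range value:

```python
from decimal import Decimal, ROUND_HALF_UP

def fits_numeric(value, precision, scale):
    """Mimic a strict NUMERIC(p, s) check: after rounding to `scale`
    fractional digits, the value must use at most `precision` digits total."""
    quantized = Decimal(value).quantize(
        Decimal(1).scaleb(-scale), rounding=ROUND_HALF_UP
    )
    return len(quantized.as_tuple().digits) <= precision

# A strict NUMERIC(5,2) rejects 123456.789: rounded to 123456.79,
# it needs 8 digits, more than the declared precision of 5.
print(fits_numeric("123456.789", 5, 2))  # False

# A lenient DECIMAL(5,2) implementation is permitted by the standard
# to store it anyway, i.e. to behave as if the precision were larger.
print(fits_numeric("123456.789", 8, 2))  # True
```

The point of the second call is only to illustrate what "allows more numbers than the stated precision" means: the implementation may silently grant a wider precision than you declared for DECIMAL, but not for NUMERIC.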
Hope this helps.
Could you clarify this statement " This is in stark contrast to DECIMAL, which allows more numbers than the stated precision." with an example?
If a given data point is, say, 123456.789, does DECIMAL(5,2) allow the entire data point to be stored as an attribute value, or does it round it to 123456.79 — thereby allowing more digits than the stated precision of 5 while still enforcing the scale of 2?