Joakim Backman's answer is specific, but this may bring additional clarity to it.
There is a minor difference. As per SQL For Dummies, 8th Edition (2013):
The DECIMAL data type is similar to NUMERIC. ... The difference is that your implementation may specify a precision greater than what you specify — if so, the implementation uses the greater precision. If you do not specify precision or scale, the implementation uses default values, as it does with the NUMERIC type.
It seems that on some SQL implementations the difference shows up in data integrity: DECIMAL may accept values beyond the declared precision (up to some implementation-defined limit), whereas NUMERIC does not.
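As a rough illustration (the table and column names below are made up, and the exact behavior depends entirely on your DBMS), the distinction only matters in how strictly the declared precision is enforced:

```sql
-- Both columns request precision 5, scale 2.
-- Per the standard, num_col gets exactly the declared precision,
-- while dec_col may be given greater precision by the implementation.
CREATE TABLE price_example (
    num_col NUMERIC(5, 2),  -- exactly as declared: values up to 999.99
    dec_col DECIMAL(5, 2)   -- at least as declared: some systems may allow more
);

-- On an implementation that grants DECIMAL extra precision, a value like
-- 1234.56 might be accepted into dec_col but rejected as an overflow
-- for num_col. On many popular databases the two behave identically.
```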