PRECISION is not an executable command but a specification that appears inside data type declarations and CAST expressions. In numeric contexts (DECIMAL, NUMERIC) it defines the maximum number of significant digits that can be stored, while an optional SCALE controls the digits to the right of the decimal point. In temporal contexts (TIME, TIMESTAMP, INTERVAL) precision is the number of fractional-seconds digits. Declaring precision helps enforce data quality, determines storage size, and influences rounding during inserts, updates, and arithmetic operations. If an inserted or calculated value exceeds the declared precision, the database rounds, truncates, or raises an error, depending on the dialect and session settings. Precision accepts an integer, typically 1-38 for numeric types and 0-9 for fractional seconds. Omitting precision falls back to an implementation-specific default (e.g., NUMERIC defaults to the engine's maximum; fractional seconds default to 0 in MySQL and 6 in PostgreSQL and Oracle). PRECISION cannot be altered by itself; instead, ALTER TABLE … ALTER COLUMN must redefine the full data type.
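As a minimal sketch, the statements below declare precision in both numeric and temporal contexts and inside a CAST expression; the orders table and its columns are hypothetical, and exact rounding behavior of the CAST depends on the dialect.

    -- Hypothetical table: precision/scale for numeric columns, fractional seconds for temporal ones
    CREATE TABLE orders (
        order_id     INTEGER,
        total_amount DECIMAL(10, 2),   -- up to 10 significant digits, 2 to the right of the decimal point
        tax_rate     NUMERIC(5, 4),    -- e.g. 0.0825
        placed_at    TIMESTAMP(3)      -- 3 fractional-seconds digits (milliseconds)
    );

    -- Precision also appears inside CAST expressions
    SELECT CAST(123.456 AS DECIMAL(5, 2));   -- typically yields 123.46 after rounding to scale 2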
precision
(INTEGER) - total number of significant or fractional-seconds digits
scale
(INTEGER) - optional, digits to the right of the decimal point (must be <= precision); not used for temporal types
SCALE, DECIMAL, NUMERIC, TIME, TIMESTAMP, INTERVAL, CAST, ALTER TABLE
SQL-92
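A minimal sketch of how precision and scale interact in a declaration, assuming a hypothetical measurements table; whether an invalid combination is rejected, and at what point, varies by engine.

    CREATE TABLE measurements (
        reading     DECIMAL(6, 3),  -- valid: scale 3 <= precision 6
        whole_units DECIMAL(6)      -- scale omitted: defaults to 0, whole numbers only
        -- a declaration like DECIMAL(3, 6) is rejected in most dialects: scale exceeds precision
    );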
PRECISION sets the total number of allowed digits for numeric types and the number of fractional-seconds digits for temporal types.
Precision can be changed after a table is created: use ALTER TABLE … ALTER COLUMN to redefine the column with a new data type that includes the desired precision and scale.
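For example, widening a hypothetical total_amount column from DECIMAL(10, 2) to DECIMAL(12, 2) requires restating the full type; the exact ALTER syntax differs by dialect, as sketched below.

    -- SQL Server / standard-style syntax
    ALTER TABLE orders ALTER COLUMN total_amount DECIMAL(12, 2);

    -- PostgreSQL requires the TYPE keyword
    -- ALTER TABLE orders ALTER COLUMN total_amount TYPE DECIMAL(12, 2);

    -- MySQL uses MODIFY
    -- ALTER TABLE orders MODIFY total_amount DECIMAL(12, 2);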
Specifying precision is optional. If omitted, the database applies its default precision, which differs across engines (e.g., NUMERIC defaults to maximum precision; fractional-seconds defaults vary).
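The sketch below omits precision entirely and relies on engine defaults; the audit_log table is hypothetical and the defaults noted in the comments are engine-specific.

    CREATE TABLE audit_log (
        amount     NUMERIC,    -- defaults to the engine's maximum precision in many products
        created_at TIMESTAMP   -- fractional-seconds default varies by engine
    );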
Typical errors include declaring a precision that exceeds the engine's maximum, specifying a scale larger than the precision, or a numeric value out of range error when an inserted value needs more digits than the declaration allows.
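A minimal sketch of the out-of-range case, using a hypothetical prices table with DECIMAL(5, 2), which leaves only three digits to the left of the decimal point; whether the second insert fails or is silently adjusted depends on the dialect and strict-mode settings.

    CREATE TABLE prices (unit_price DECIMAL(5, 2));

    INSERT INTO prices (unit_price) VALUES (999.99);    -- fits: 5 digits total, 2 after the point
    INSERT INTO prices (unit_price) VALUES (1234.56);   -- typically rejected: numeric value out of range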