LOAD DATA INFILE imports large text files into MySQL tables in a single statement, making bulk loads dramatically faster than issuing one INSERT per row.
Why use LOAD DATA INFILE over many INSERTs?
LOAD DATA INFILE streams the entire file directly into the storage engine, bypassing SQL parsing for every row. This cuts network round-trips and reduces redo/undo overhead, resulting in imports that are 10-20× faster than loops of INSERT statements.
What file formats work best?
Plain CSV or TSV files exported from spreadsheets, ETL tools, or other databases import cleanly. Use the same delimiter, enclosure, and line-ending settings in both the export and the LOAD DATA INFILE statement to avoid parsing errors.
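As a minimal sketch, assume a hypothetical orders table and a spreadsheet export that used commas, double-quote enclosures, and Windows CRLF line endings; the statement mirrors those settings:

LOAD DATA LOCAL INFILE '/path/to/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'   -- CRLF, typical of Excel exports on Windows
IGNORE 1 LINES;              -- skip the header row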
When do I need the LOCAL keyword?
Use LOCAL if the file sits on the client machine: LOAD DATA LOCAL INFILE '/path/to/orders.csv' ... Omit LOCAL when the file is on the MySQL server and secure_file_priv allows the directory.
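To see where server-side loads are allowed, check the variable first; the path below is only illustrative (/var/lib/mysql-files/ is a common packaged default, but your value may differ):

SHOW VARIABLES LIKE 'secure_file_priv';   -- '' = any directory; NULL = server-side loads disabled
LOAD DATA INFILE '/var/lib/mysql-files/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';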
Which clauses handle a typical CSV?
Specify FIELDS TERMINATED BY ',' and ENCLOSED BY '"' for CSV, and match LINES TERMINATED BY '\n'. Include IGNORE 1 LINES to skip headers. Add SET created_at = NOW() to populate audit columns not present in the file.
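Putting those clauses together in one statement (file path and table name are illustrative):

LOAD DATA LOCAL INFILE '/path/to/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES               -- skip the header
SET created_at = NOW();      -- audit column not present in the CSV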
Can I load only some of a table's columns?
Yes: list the target columns after the table name. Columns omitted from the list take their default values, letting you skip auto-increment IDs or computed fields.
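For instance, with hypothetical columns customer_id, sku, quantity, and ordered_at, the auto-increment id never appears in the list:

LOAD DATA LOCAL INFILE '/path/to/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(customer_id, sku, quantity, ordered_at);   -- omitted columns take their defaults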
How do I handle duplicate keys?
Add REPLACE to overwrite conflicting rows or IGNORE to skip them. Pick one; omitting both aborts the entire load on the first duplicate. (With LOCAL, the server cannot abort a half-transferred file, so it behaves as if IGNORE were specified.)
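Either keyword goes right before INTO TABLE; a sketch of the REPLACE variant:

LOAD DATA LOCAL INFILE '/path/to/orders.csv'
REPLACE INTO TABLE orders                    -- swap in IGNORE to keep existing rows instead
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;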
Is the import transactional?
LOAD DATA INFILE obeys autocommit. For InnoDB tables, wrap it in START TRANSACTION / COMMIT to ensure an all-or-nothing import. This is crucial when loading into multiple related tables.
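A sketch of an all-or-nothing import into two related tables (names hypothetical, InnoDB assumed):

START TRANSACTION;
LOAD DATA LOCAL INFILE '/path/to/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
LOAD DATA LOCAL INFILE '/path/to/order_items.csv'
INTO TABLE order_items
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
COMMIT;   -- or ROLLBACK if either load reported problems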
How do I measure import speed?
Sample the server's row-insert counters while the load runs and compute rows per second from the difference; the general query log only records the statement itself, not its progress. For million-row CSVs on SSDs, speeds of 50k–150k rows/sec are typical.
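One low-tech way to get that number is to read InnoDB's standard Innodb_rows_inserted status variable before and after a fixed interval:

SHOW GLOBAL STATUS LIKE 'Innodb_rows_inserted';   -- note the value
-- wait ten seconds while the load runs, then:
SHOW GLOBAL STATUS LIKE 'Innodb_rows_inserted';   -- (new - old) / 10 = rows per second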
What else helps large imports succeed?
Disable secondary indexes beforehand, load from the server's local disk, import into an empty staging table first, and always validate row counts. Re-enable indexes and run checksum comparisons before promoting the data, as in the sketch below.
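A rough shape for that workflow, assuming a hypothetical staging_orders table (note that DISABLE KEYS only affects MyISAM's non-unique indexes; for InnoDB you would drop and re-create secondary indexes instead):

ALTER TABLE staging_orders DISABLE KEYS;   -- MyISAM only; InnoDB just emits a note
LOAD DATA LOCAL INFILE '/path/to/orders.csv'
INTO TABLE staging_orders
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
ALTER TABLE staging_orders ENABLE KEYS;
SELECT COUNT(*) FROM staging_orders;       -- compare with the source file's data line count
CHECKSUM TABLE staging_orders;             -- compare against a trusted copy before promoting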
Does the load block other queries?
InnoDB places a metadata lock but allows concurrent reads unless you use REPLACE, which may escalate locks. For MyISAM, the table is write-locked for the duration.
Can it read compressed files directly?
No, decompress first. Use gunzip -c file.csv.gz | mysql --local-infile=1 -e "LOAD DATA LOCAL INFILE '/dev/stdin' ..." for one-liner imports; the --local-infile=1 flag matters because many client builds disable LOCAL loads by default.
What happens to malformed rows?
Add IGNORE. MySQL downgrades row-level errors to warnings, visible via SHOW WARNINGS, and continues loading the remaining data.
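For example, a load that downgrades row errors and then surfaces the first few warnings (paths and names illustrative, as above):

LOAD DATA LOCAL INFILE '/path/to/orders.csv'
IGNORE INTO TABLE orders
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
SHOW WARNINGS LIMIT 10;   -- per-row truncation and conversion notes appear here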