LOAD DATA INFILE or mysqlimport quickly bulk-loads CSV data into a MySQL table.
LOAD DATA INFILE streams a CSV file directly into a table, bypassing row-by-row INSERTs, so large imports finish in seconds instead of minutes.
Use LOAD DATA INFILE 'file.csv' INTO TABLE table_name; and add options to match your delimiters, headers, and encodings.
Add IGNORE 1 LINES so MySQL skips the first CSV line and header text never becomes data.
List target columns after the table name. Unlisted CSV fields can be routed to user variables and discarded.
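For instance, a minimal sketch that discards the third CSV field (the @dummy variable and these column names are illustrative):

LOAD DATA LOCAL INFILE '/tmp/orders.csv'
INTO TABLE Orders
FIELDS TERMINATED BY ','
IGNORE 1 LINES
(id, customer_id, @dummy, total_amount);
-- @dummy captures the unwanted CSV field and is simply never used.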
LOCAL paths read from the client machine (LOAD DATA LOCAL INFILE). Non-LOCAL paths read from the server's filesystem, and the file must sit inside the directory named by secure_file_priv.
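To see which directory (if any) non-LOCAL loads may read from, check the variable:

SHOW VARIABLES LIKE 'secure_file_priv';
-- An empty value means no restriction; NULL means server-side
-- LOAD DATA INFILE is disabled entirely.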
LOAD DATA LOCAL INFILE '/tmp/orders.csv'
INTO TABLE Orders
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, customer_id, order_date, total_amount);
Script LOAD DATA in a .sql file run by the mysql client, or use the command-line mysqlimport tool, and schedule either in a cron job for hands-free updates. (MySQL does not permit LOAD DATA inside stored procedures.)
mysqlimport is a command-line wrapper around LOAD DATA INFILE (LOCAL when invoked with --local); it derives the target table name from the filename, so Products.csv loads into the Products table.
mysqlimport --local --fields-terminated-by=, \
--ignore-lines=1 demo_db /tmp/Products.csv
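For a hands-free nightly run, the same command drops into a crontab; the schedule and file path here are assumptions:

# Assumed crontab entry: reimport Products.csv into demo_db every night at 02:00.
# mysqlimport reads credentials from the [client] section of ~/.my.cnf.
0 2 * * * mysqlimport --local --fields-terminated-by=, --ignore-lines=1 demo_db /tmp/Products.csv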
1) Always load into a staging table first (see the sketch after this list).
2) Set sql_mode='STRICT_ALL_TABLES' to surface bad rows as errors instead of silent truncations.
3) Disable indexes only for very large files, then rebuild.
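A minimal staging-table sketch combining points 1 and 2 (orders_staging is an assumed name):

SET SESSION sql_mode = 'STRICT_ALL_TABLES';
CREATE TABLE orders_staging LIKE Orders;
LOAD DATA LOCAL INFILE '/tmp/orders.csv'
INTO TABLE orders_staging
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 LINES;
-- Validate the staged rows, then promote the clean ones.
INSERT INTO Orders SELECT * FROM orders_staging;
DROP TABLE orders_staging;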
No FILE privilege is needed for LOCAL: the file is read from the client, so only local_infile has to be enabled on both ends (the local_infile system variable on the server, mysql --local-infile=1 on the client).
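Enabling it looks like this (SET GLOBAL requires SUPER or SYSTEM_VARIABLES_ADMIN):

SET GLOBAL local_infile = 1;
-- then connect with: mysql --local-infile=1 demo_db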
No, compressed files cannot be read directly. Decompress first or use a pipeline (gunzip -c file.gz | mysql ...).
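One working form of that pipeline, sketched with assumed names, streams the decompressed CSV through the client's standard input:

# With -e, mysql takes its SQL from the option, so stdin (the pipe) is free
# to serve as the data file via /dev/stdin.
gunzip -c /tmp/orders.csv.gz | mysql --local-infile=1 demo_db \
  -e "LOAD DATA LOCAL INFILE '/dev/stdin' INTO TABLE Orders FIELDS TERMINATED BY ',' IGNORE 1 LINES"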
Use IGNORE instead of REPLACE so duplicate-key rows are skipped rather than overwriting existing data, then check SHOW WARNINGS for the rejected rows after the import.
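Sketched on the earlier Orders example (file and column names assumed):

LOAD DATA LOCAL INFILE '/tmp/orders.csv'
IGNORE INTO TABLE Orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 LINES
(id, customer_id, order_date, total_amount);
-- Rows that collide with an existing key are skipped, not overwritten.
SHOW WARNINGS;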