How to Migrate from BigQuery to MySQL

Galaxy Glossary

How do I migrate data from BigQuery to MySQL?

Migrate data from Google BigQuery tables into a MySQL database using EXPORT DATA and LOAD DATA INFILE commands.


Description

Why migrate from BigQuery to MySQL?

Teams move analytical datasets from BigQuery to MySQL when they need OLTP-style serving, cheaper storage, or tighter integration with application workflows. MySQL’s broad hosting options and ACID guarantees make it attractive for production workloads.

What are the high-level steps?

1. Export tables from BigQuery to Cloud Storage.
2. Copy the files to the MySQL host.
3. Create equivalent MySQL schemas.
4. Bulk-load the data with LOAD DATA INFILE or mysqlimport.
5. Verify row counts and automate incremental syncs.

How do I export BigQuery data?

EXPORT DATA OPTIONS(
  uri='gs://my-bucket/customers-*.csv',
  format='CSV',
  overwrite=true,
  header=true
) AS
SELECT * FROM `project.dataset.Customers`;

This command writes sharded CSVs with headers to Cloud Storage. Use one export per table.

How do I transfer files to the MySQL server?

Use gsutil or Storage Transfer Service to pull the files onto the MySQL host, e.g. gsutil cp gs://my-bucket/customers-*.csv /tmp/. Server-side LOAD DATA INFILE reads files as the mysqld process, so the files must be readable by that OS user and located under the directory allowed by the secure_file_priv setting; the connecting account also needs the FILE privilege.

How do I prepare MySQL schemas?

CREATE TABLE Customers(
  id         BIGINT PRIMARY KEY,
  name       VARCHAR(255),
  email      VARCHAR(255),
  created_at DATETIME
) ENGINE=InnoDB;

Match column order and types to the exported CSV to avoid casting errors.
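As a starting point, the mapping below sketches common BigQuery-to-MySQL type equivalents; the Events table and its columns are illustrative, not part of the examples above.

```sql
-- Illustrative type mapping (hypothetical table):
--   BigQuery INT64     -> BIGINT
--   BigQuery FLOAT64   -> DOUBLE
--   BigQuery NUMERIC   -> DECIMAL(38,9)
--   BigQuery BOOL      -> TINYINT(1)
--   BigQuery STRING    -> VARCHAR(n) or TEXT
--   BigQuery DATE      -> DATE
--   BigQuery TIMESTAMP -> DATETIME (store values as UTC)
CREATE TABLE Events(
  id         BIGINT PRIMARY KEY,  -- INT64
  score      DOUBLE,              -- FLOAT64
  amount     DECIMAL(38,9),       -- NUMERIC
  is_active  TINYINT(1),          -- BOOL
  label      VARCHAR(255),        -- STRING
  event_date DATE,                -- DATE
  created_at DATETIME             -- TIMESTAMP (UTC)
) ENGINE=InnoDB;
```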

How do I load data into MySQL efficiently?

LOAD DATA INFILE '/tmp/customers-000000000000.csv'
INTO TABLE Customers
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(id, name, email, @created)
SET created_at = STR_TO_DATE(@created,'%Y-%m-%d %H:%i:%s');

Sharded CSVs can be loaded in parallel sessions, one file per session. For large loads, drop secondary indexes beforehand and recreate them afterward; InnoDB cannot disable them in place (ALTER TABLE ... DISABLE KEYS only affects MyISAM's non-unique indexes).
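A minimal sketch of that pattern, assuming Customers has a secondary index named idx_email (hypothetical):

```sql
-- Drop the secondary index before the bulk load.
ALTER TABLE Customers DROP INDEX idx_email;

-- Load each shard in a separate session, e.g.:
-- session 1: LOAD DATA INFILE '/tmp/customers-000000000000.csv' INTO TABLE Customers ...
-- session 2: LOAD DATA INFILE '/tmp/customers-000000000001.csv' INTO TABLE Customers ...

-- Rebuild the index once all shards are loaded.
ALTER TABLE Customers ADD INDEX idx_email (email);
```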

How do I validate row counts?

Run SELECT COUNT(*) FROM Customers; in MySQL and compare with bq query --use_legacy_sql=false 'SELECT COUNT(*) FROM `project.dataset.Customers`'. A mismatch usually indicates truncated loads or encoding issues.
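Beyond raw counts, an aggregate fingerprint catches silent corruption such as shifted columns. The queries below are a sketch assuming the Customers schema above; run the same shape in both systems and diff the results.

```sql
-- MySQL:
SELECT COUNT(*)        AS row_count,
       SUM(id)         AS id_sum,
       MIN(created_at) AS min_created,
       MAX(created_at) AS max_created
FROM Customers;

-- BigQuery (same shape):
-- SELECT COUNT(*), SUM(id), MIN(created_at), MAX(created_at)
-- FROM `project.dataset.Customers`;
```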

How can I automate the migration?

Wrap EXPORT and LOAD commands in a CI job. Use checksum tables or CDC tools like Airbyte to sync daily deltas after the initial full load.

Best practices for BigQuery → MySQL migration?

Export gzip-compressed CSV to cut network time. Align time zones to UTC. Convert BigQuery TIMESTAMP to MySQL DATETIME with explicit formatting. Keep a rollback backup of the target database.

Common mistakes and how to fix them

Using default BIGINT for BigQuery NUMERIC

NUMERIC has higher precision; cast it to DECIMAL(38,9) in MySQL to prevent overflow.
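A minimal sketch using a hypothetical Invoices table; BigQuery NUMERIC is a fixed-point type with precision 38 and scale 9, so mirror it exactly rather than defaulting to BIGINT:

```sql
CREATE TABLE Invoices(
  id     BIGINT PRIMARY KEY,
  amount DECIMAL(38,9)  -- matches BigQuery NUMERIC; BIGINT would overflow or drop the fraction
) ENGINE=InnoDB;
```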

Forgetting to IGNORE 1 ROWS in LOAD DATA

Headers become data, shifting columns. Always include IGNORE 1 ROWS when your export adds headers.

Further reading

Check Google’s EXPORT DATA docs and MySQL’s LOAD DATA INFILE guide for advanced options.


How to Migrate from BigQuery to MySQL Example Usage


-- Export Orders from BigQuery and import into MySQL
-- BigQuery (Orders)
EXPORT DATA OPTIONS(
  uri='gs://my-bucket/orders-*.csv',
  format='CSV',
  overwrite=true,
  header=true
) AS
SELECT id, customer_id, order_date, total_amount
FROM `project.dataset.Orders`;

-- MySQL (Orders)
CREATE TABLE Orders(
  id           BIGINT PRIMARY KEY,
  customer_id  BIGINT,
  order_date   DATE,
  total_amount DECIMAL(12,2)
);

LOAD DATA INFILE '/tmp/orders-000000000000.csv'
INTO TABLE Orders
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(id, customer_id, @odate, total_amount)
SET order_date = STR_TO_DATE(@odate,'%Y-%m-%d');

How to Migrate from BigQuery to MySQL Syntax


-- 1. BigQuery export (Customers example)
EXPORT DATA OPTIONS(
  uri='gs://my-bucket/customers-*.csv',
  format='CSV',
  overwrite=true,
  header=true,
  field_delimiter=',',          -- default ','
  quote='"',                    -- optional
  compression='GZIP'            -- optional
) AS
SELECT id, name, email, created_at
FROM `project.dataset.Customers`;

-- 2. MySQL DDL (matching schema)
CREATE TABLE Customers(
  id         BIGINT PRIMARY KEY,
  name       VARCHAR(255),
  email      VARCHAR(255),
  created_at DATETIME
) ENGINE=InnoDB;

-- 3. MySQL bulk load
LOAD DATA INFILE '/tmp/customers-000000000000.csv'
INTO TABLE Customers
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(id, name, email, @created)
SET created_at = STR_TO_DATE(@created,'%Y-%m-%d %H:%i:%s');


Frequently Asked Questions (FAQs)

Can I export directly from BigQuery to MySQL without files?

No native streaming path exists. You must export to Cloud Storage or use a data-integration tool that performs extract-load.

What file format is fastest for LOAD DATA?

GZIP-compressed CSV is usually the best overall choice because it shrinks the network transfer. Note that LOAD DATA INFILE cannot read gzip directly, so decompress the files (e.g. with gunzip) on the MySQL host before loading.

How do I keep BigQuery and MySQL in sync after migration?

Use incremental exports based on updated_at columns or deploy a CDC tool like Airbyte or Debezium.
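An incremental export can be sketched as below; it assumes Customers has an updated_at column and that the watermark value ('2024-06-01 00:00:00' here) is tracked by your orchestration job.

```sql
EXPORT DATA OPTIONS(
  uri='gs://my-bucket/customers-delta-*.csv',
  format='CSV',
  overwrite=true,
  header=true
) AS
SELECT *
FROM `project.dataset.Customers`
WHERE updated_at > TIMESTAMP '2024-06-01 00:00:00';
```

On the MySQL side, load the delta files into a staging table and merge them with INSERT ... ON DUPLICATE KEY UPDATE so re-runs stay idempotent.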
