Create a secure, performant JDBC connection between Amazon Redshift and Looker for modeling and dashboarding.
Looker needs a live SQL connection to Redshift so it can run queries in real time, build Explores, and refresh dashboards without ETL.
You need a Redshift user with SELECT on the target schemas, the cluster’s host and port, Looker’s outbound IP range allow-listed in your security group, and the Redshift JDBC driver (bundled with Looker, so there is nothing to install).
Create the account and grant read access: run `CREATE USER looker_user PASSWORD '•••';`, then `GRANT USAGE ON SCHEMA public TO looker_user;` and `GRANT SELECT ON ALL TABLES IN SCHEMA public TO looker_user;`.
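The same statements as a commented block; `public` is the example schema used throughout, and `'•••'` stands in for a real password:

```sql
-- Dedicated read-only account for Looker ('•••' is a placeholder password).
CREATE USER looker_user PASSWORD '•••';

-- Allow the user to see objects in the schema at all.
GRANT USAGE ON SCHEMA public TO looker_user;

-- Read-only access to tables that exist today; tables created later
-- are covered by the default-privileges step further down.
GRANT SELECT ON ALL TABLES IN SCHEMA public TO looker_user;
```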
Add Looker’s outbound IP addresses to the Redshift security group’s ingress rules (TCP 5439 by default); without this, connection attempts simply hang and time out during the handshake.
In Admin → Connections, click “New Connection”, choose Redshift, and supply Host, Port, Database, Username, and Password. You can leave the additional JDBC params field empty for a first test; to improve stability and transfer rate, add `ssl=true&tcpKeepAlive=true&defaultRowFetchSize=10000` (SSL, TCP keepalives, and a larger row fetch size).
Click “Test” in Looker. A green check confirms that the JDBC connection, authentication, and permissions all work. If it fails, re-check the security group/VPC rules and the user’s grants.
In Looker’s SQL Runner, execute `SELECT customer_id, SUM(total_amount) AS revenue FROM orders GROUP BY 1 LIMIT 10;`. A result set confirms read access.
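If the query fails with a permission error, the grants can be checked directly from SQL Runner; this sketch assumes the `public` schema and the `orders` table from the example above.

```sql
-- Confirm that looker_user holds the privileges granted earlier.
SELECT HAS_SCHEMA_PRIVILEGE('looker_user', 'public', 'USAGE')        AS schema_usage,
       HAS_TABLE_PRIVILEGE('looker_user', 'public.orders', 'SELECT') AS table_select;
```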
Use a dedicated, non-superuser account; restrict it to the schemas Looker needs; enable SSL; rotate the password regularly (for example via AWS Secrets Manager); and audit user changes through STL_USERLOG.
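For the auditing point, a minimal check against STL_USERLOG (which records create, alter, and drop events for database users) could look like this:

```sql
-- Recent changes to the Looker account, newest first.
SELECT username, action, usesuper, recordtime
FROM stl_userlog
WHERE username = 'looker_user'
ORDER BY recordtime DESC
LIMIT 20;
```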
To keep new tables readable, set `ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO looker_user;` for each account that creates tables (default privileges apply only to objects created after the statement runs, by the user it is set for); a scheduled `GRANT SELECT ON ALL TABLES IN SCHEMA public TO looker_user;` works as a catch-all for anything created before that.
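A sketch of both approaches; `etl_user` is a hypothetical stand-in for whichever account creates new tables in your warehouse:

```sql
-- Run once per table-creating user: future tables etl_user creates in public
-- become readable by looker_user automatically.
ALTER DEFAULT PRIVILEGES FOR USER etl_user IN SCHEMA public
    GRANT SELECT ON TABLES TO looker_user;

-- Optional scheduled catch-all for tables created before the default was set.
GRANT SELECT ON ALL TABLES IN SCHEMA public TO looker_user;
```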
SSL is not required to connect, but enabling `ssl=true` encrypts data in transit and is recommended; the cluster can also enforce it by setting `require_ssl` to true in its parameter group.
Yes, via IAM authentication: generate short-lived database credentials with Redshift’s GetClusterCredentials API, supply them as the connection’s username and password, and rotate them automatically with a script.
Limit concurrency in Looker’s connection settings (Max Connections and the PDT builder limit), apply sensible row limits in Explores, and monitor STL_QUERY for long-running statements.
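A monitoring sketch, assuming the `looker_user` account from earlier steps: it lists the slowest queries that user ran in the last 24 hours.

```sql
-- Slowest Looker-issued queries over the past day.
SELECT q.query,
       TRIM(q.querytxt)                          AS query_text,
       q.starttime,
       DATEDIFF(seconds, q.starttime, q.endtime) AS duration_s
FROM stl_query q
JOIN pg_user u ON u.usesysid = q.userid
WHERE u.usename = 'looker_user'
  AND q.starttime > DATEADD(hour, -24, GETDATE())
ORDER BY duration_s DESC
LIMIT 20;
```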