Building a Looker Studio Custom Connector

Galaxy Glossary

How do I build a Looker Studio custom connector?

A Looker Studio custom connector is a Google-hosted Apps Script (or Cloud Function) service that translates a third-party data source into the standardized schema Looker Studio (formerly Data Studio) expects for visualizations.


Description

What Is a Looker Studio Custom Connector?

A Looker Studio custom connector is a lightweight web service—most commonly written in Google Apps Script—that implements Google’s getData(), getSchema(), and authentication interfaces. Once deployed to Google Cloud, it lets analysts query almost any REST, SQL, or file-based system from inside Looker Studio as if it were a native data source.

Why Build Your Own Connector?

  • Unlock data silos. Many SaaS tools, internal microservices, or flat-file exports are not yet covered by the community connector gallery.
  • Control performance & cost. You can cache, sample, or aggregate data before it hits Looker Studio’s quota-limited front end.
  • Enforce governance. Custom connectors are executable code, so you can embed row-level security, API key rotation, and audit logging.
  • Maintain brand consistency. White-label the authentication flow and configuration screen so business users have a polished experience.

High-Level Architecture

  1. User adds your connector in Looker Studio’s “Create Data Source” UI.
  2. Looker Studio loads an Add-on style configuration panel served by your Apps Script.
  3. When the report runs, Looker Studio calls your getData() endpoint with user-provided parameters.
  4. Your code queries the third-party API or database, reformats rows into a JSON table, and returns them.
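At its core, step 4 is just reshaping API output into Looker Studio's rows-of-values format. A minimal sketch in plain JavaScript (the payload and field names are hypothetical, not a real API):

```javascript
// Reshape a hypothetical API payload into Looker Studio's
// {values: [...]} row format, ordered to match the schema fields.
function toLookerStudioRows(apiRows, fieldNames) {
  return apiRows.map(function (r) {
    return {values: fieldNames.map(function (name) { return r[name]; })};
  });
}

var apiRows = [
  {date: '20240101', impressions: 120},
  {date: '20240102', impressions: 95}
];
var rows = toLookerStudioRows(apiRows, ['date', 'impressions']);
// rows[0] → {values: ['20240101', 120]}
```

Everything else in a connector (auth, config, schema) exists to feed this transformation.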

Prerequisites

  • Google Cloud project with OAuth consent screen approved.
  • Basic knowledge of JavaScript/TypeScript.
  • Access to the target data source (API keys, database creds, etc.).
  • Optional: clasp CLI for local Apps Script development.

Step-by-Step Guide

1. Bootstrap the Project

# Install clasp if you haven’t
npm i -g @google/clasp

# Log in
clasp login

# Create a new script bound to a Cloud project
clasp create --type standalone --title "My Custom Connector"

2. Wire Up OAuth2 (If Needed)

Most third-party APIs require OAuth2. Add Google's apps-script-oauth2 helper to your script as an Apps Script library (in the editor: Libraries → add by script ID), or copy its source files into your project; it is not installed through npm.

Configure the flow in Code.gs:

var PROP = PropertiesService.getScriptProperties();

function getOAuthService() {
  return OAuth2.createService('myService')
      .setAuthorizationBaseUrl('https://provider.com/oauth2/auth')
      .setTokenUrl('https://provider.com/oauth2/token')
      .setClientId(PROP.getProperty('CLIENT_ID'))
      .setClientSecret(PROP.getProperty('CLIENT_SECRET'))
      .setCallbackFunction('authCallback')
      .setPropertyStore(PropertiesService.getUserProperties());
}

// Required by setCallbackFunction: completes the OAuth2 handshake.
function authCallback(request) {
  var authorized = getOAuthService().handleCallback(request);
  return HtmlService.createHtmlOutput(
      authorized ? 'Success! You can close this tab.' : 'Access denied.');
}

3. Implement getConfig()

function getConfig(request) {
  var cc = DataStudioApp.createCommunityConnector();
  var config = cc.getConfig();
  config.newTextInput()
      .setId('project_id')
      .setName('Project ID')
      .setHelpText('Enter the numeric project identifier.');
  return config.build();
}

4. Define the Schema in getSchema()

function getSchema(request) {
  return {
    schema: [
      {name: 'date', label: 'Date', dataType: 'STRING', semantics: {conceptType: 'DIMENSION', semanticType: 'YEAR_MONTH_DAY'}},
      {name: 'impressions', label: 'Impressions', dataType: 'NUMBER', semantics: {conceptType: 'METRIC', isReaggregatable: true}}
    ]
  };
}

5. Fetch Data in getData()

function getData(request) {
  var url = 'https://api.provider.com/v1/metrics?project=' +
      encodeURIComponent(request.configParams.project_id);
  var response = UrlFetchApp.fetch(url, {
    headers: {Authorization: 'Bearer ' + getOAuthService().getAccessToken()}
  });
  var json = JSON.parse(response.getContentText());

  var rows = json.data.map(function(r) {
    return {values: [r.date, r.impressions]};
  });

  return {
    schema: getSchema(request).schema,
    rows: rows
  };
}
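One detail the getData() example glosses over: Looker Studio passes the fields a chart actually uses in request.fields, and your response schema and row values should contain only those fields, in that order. A pure-JavaScript helper (the field names are illustrative) shows the idea:

```javascript
// Given the full schema and the requested field descriptors, return
// the matching schema entries in the order Looker Studio asked for.
function filterSchema(fullSchema, requestedFields) {
  return requestedFields.map(function (f) {
    return fullSchema.filter(function (s) { return s.name === f.name; })[0];
  });
}

var fullSchema = [
  {name: 'date', dataType: 'STRING'},
  {name: 'impressions', dataType: 'NUMBER'}
];
// request.fields arrives as an array like this when a chart uses
// both fields, metric first.
var requested = [{name: 'impressions'}, {name: 'date'}];
var schema = filterSchema(fullSchema, requested);
// schema → impressions entry first, then date
```

Build each row's values array from the same filtered list so values always line up with the schema.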

6. Deploy

  1. Run clasp push to upload code.
  2. In the Apps Script editor, choose Deploy → New deployment and select the connector deployment type.
  3. Map the deployment to your Cloud project.
  4. Submit for OAuth verification if the connector is public.

7. Test in Looker Studio

Create a new data source, choose “Build Your Own” at the bottom, and paste the deployment ID. After authorizing, you should see your fields and be able to build charts.

Best Practices

  • Pagination & Caching. Respect API limits by paginating and using CacheService or Google Cloud Memorystore.
  • Field Reaggregation. Mark metrics as isReaggregatable = true only when safe (e.g., sums, counts).
  • Error Handling. Surface failures with DataStudioApp.createCommunityConnector().newUserError().setText(...).throwException() so users get actionable messages instead of a generic error.
  • Version Pinning. Create immutable script versions (clasp version, then deploy that version); Looker Studio caches deployments aggressively.
  • Automated Testing. Write Jest tests for getSchema() and formatResponse() helpers, then CI-deploy via GitHub Actions.
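The pagination advice above boils down to looping until the API stops returning a cursor. A dependency-free sketch, where fetchPage and the nextCursor field are stand-ins for your real API client:

```javascript
// Accumulate all pages from a cursor-based API. fetchPage is any
// function that takes a cursor and returns {items, nextCursor}.
// maxPages caps the loop so a buggy API can't run forever.
function fetchAllPages(fetchPage, maxPages) {
  var items = [];
  var cursor = null;
  for (var i = 0; i < maxPages; i++) {
    var page = fetchPage(cursor);
    items = items.concat(page.items);
    if (!page.nextCursor) break;
    cursor = page.nextCursor;
  }
  return items;
}

// Fake two-page API for illustration.
function fakeApi(cursor) {
  return cursor === null
      ? {items: [1, 2], nextCursor: 'p2'}
      : {items: [3], nextCursor: null};
}
var all = fetchAllPages(fakeApi, 10); // → [1, 2, 3]
```

In a real connector, fetchPage would wrap UrlFetchApp.fetch and you would cache the assembled result (CacheService) keyed by the config parameters.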

Common Misconceptions

“Apps Script can’t handle large datasets.”

Apps Script has memory/time limits, but you can stream data through Cloud Functions or BigQuery, then proxy results.

“I must open-source my code.”

Unlisted deployments can remain private to your domain; only gallery connectors require a repo.

“Schema changes require users to reconnect.”

Field additions are backward-compatible. Removing or renaming fields, however, can break existing reports and blends, so treat those as breaking changes.

Complete Working Example

// File: Code.gs
function getAuthType() {
  return {type: 'NONE'}; // public API, no auth
}

function getConfig(request) {
  var cc = DataStudioApp.createCommunityConnector();
  var config = cc.getConfig();
  config.newTextInput()
      .setId('subreddit')
      .setName('Subreddit')
      .setPlaceholder('e.g., dataengineering'); // name only; "r/" is added in the URL below
  return config.build();
}

function getSchema() {
  return {schema: [
    {name: 'created_utc', label: 'Date', dataType: 'STRING', semantics: {conceptType: 'DIMENSION', semanticType: 'YEAR_MONTH_DAY'}},
    {name: 'score', label: 'Score', dataType: 'NUMBER', semantics: {conceptType: 'METRIC', isReaggregatable: true}}
  ]};
}

function getData(request) {
  var url = 'https://www.reddit.com/r/' +
      encodeURIComponent(request.configParams.subreddit) + '/top.json?limit=100&t=day';
  var json = JSON.parse(UrlFetchApp.fetch(url, {muteHttpExceptions: true}).getContentText());
  var rows = json.data.children.map(function(post) {
    var p = post.data;
    return {values: [Utilities.formatDate(new Date(p.created_utc * 1000), 'GMT', 'yyyyMMdd'), p.score]};
  });
  return {schema: getSchema().schema, rows: rows};
}
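Utilities.formatDate only exists inside Apps Script, which makes the date logic above hard to unit-test locally. A plain-JavaScript equivalent of that epoch-seconds-to-yyyyMMdd conversion (an illustrative helper, not part of the connector API) can run under Node:

```javascript
// Node-testable equivalent of the Utilities.formatDate call in
// getData(): epoch seconds → 'yyyyMMdd' in UTC.
function toYearMonthDay(epochSeconds) {
  var d = new Date(epochSeconds * 1000);
  var pad = function (n) { return n < 10 ? '0' + n : String(n); };
  return d.getUTCFullYear() + pad(d.getUTCMonth() + 1) + pad(d.getUTCDate());
}

toYearMonthDay(0); // → '19700101'
```

Extracting small pure helpers like this is what makes the Jest-based testing suggested under Best Practices feasible.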

When to Use Galaxy Instead

Galaxy is a SQL editor, so if your data already lives in a relational database (PostgreSQL, Snowflake, BigQuery) you can often skip the connector entirely. Have analysts query the tables directly in Galaxy, endorse the SQL, and share collections, then build the Looker Studio report on its native database connectors, which require no code.

Why Building a Looker Studio Custom Connector is important

Custom connectors let teams visualize data from systems that lack native Looker Studio support. Instead of exporting CSVs or building one-off dashboards, you deliver a reusable, governed integration that any analyst can use with zero code. This accelerates time-to-insight, reduces manual work, and keeps stakeholders in the BI tool they already know.


Frequently Asked Questions (FAQs)

What languages can I use to build a connector?

Google officially supports JavaScript (Apps Script) and any language that can run in Cloud Functions or Cloud Run. JavaScript is most common because deployment is frictionless.

Do I need my own server?

No. Apps Script runs on Google’s infrastructure. For heavy workloads you can proxy to Cloud Functions or an external server, but many connectors remain 100% serverless.

How long can getData() run?

Apps Script executions are capped at roughly six minutes, but a report feels broken well before that. For long-running jobs, pre-aggregate data in BigQuery or cache results (CacheService, Cloud Storage) and return small, pre-computed chunks.
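Pre-aggregation just means collapsing raw rows to the grain the report needs before they reach Looker Studio. A plain-JavaScript sketch (field names illustrative):

```javascript
// Sum a metric per day so getData() returns a few hundred rows
// instead of millions of raw events.
function aggregateDaily(rows) {
  var totals = {};
  rows.forEach(function (r) {
    totals[r.date] = (totals[r.date] || 0) + r.impressions;
  });
  return Object.keys(totals).sort().map(function (date) {
    return {date: date, impressions: totals[date]};
  });
}

var agg = aggregateDaily([
  {date: '20240101', impressions: 10},
  {date: '20240101', impressions: 5},
  {date: '20240102', impressions: 7}
]);
// agg → [{date: '20240101', impressions: 15}, {date: '20240102', impressions: 7}]
```

In practice you would push this aggregation into SQL (a GROUP BY in BigQuery) and have the connector fetch only the summarized result.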

Can I skip a connector and query SQL directly in Galaxy?

Yes. If your data source is a SQL database, Galaxy’s modern editor and AI copilot let you build and share queries faster than writing a connector. Then use Looker Studio’s built-in database connection to visualize the results.
