
Option 1: Setup BigQuery as Data Source for Local Product Inventory


Description:

Google BigQuery is a scalable, fully managed, serverless cloud data warehouse that supports lightning-fast SQL queries over large datasets.
When used as a Datahash source for Local Product Inventory (LPI), BigQuery enables you to send real-time or batch updates of store-level product availability to ad platforms — powering local inventory ads and improving local commerce performance.

Prerequisites

  • Google Cloud Project with the BigQuery API enabled
  • Service Account with:
    • BigQuery User role
    • BigQuery Data Editor role
  • Service Account JSON Key for authentication
  • Local Product Inventory dataset prepared in the Datahash-required schema
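As a rough illustration of what a store-level inventory record looks like before it is loaded into BigQuery, the sketch below uses hypothetical column names (store_code, itemid, quantity, price); they are assumptions for illustration only, so consult the Datahash LPI schema reference for the authoritative field list.

```python
# Hypothetical Local Product Inventory row. The field names here are
# assumptions, not the authoritative Datahash schema.
sample_row = {
    "store_code": "STORE-001",   # physical store identifier
    "itemid": "SKU-12345",       # product identifier
    "quantity": 14,              # units available at this store
    "price": "19.99 USD",        # current local price
}

def looks_complete(row):
    """Check that every expected field is present and non-empty."""
    required = {"store_code", "itemid", "quantity", "price"}
    return required <= row.keys() and all(row[k] not in (None, "") for k in required)

print(looks_complete(sample_row))  # True
```

A check like this is worth running over the whole dataset before setup, since rows with missing store or item identifiers cannot be matched to local inventory ads.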

Getting Started:  

Select BigQuery Source  

  • In the left navigation, go to Sources → Warehouse
  • Click the BigQuery connector tile

Choose File Data Type  

  • On the File Data Type selection screen, choose Local Product Inventory
  • Provide a Source Name for your connection
  • Click Next

Choose Integration Method  

You can integrate via either of two methods:

  • Table Path — pull data directly from a defined table in BigQuery
  • Query Path — pull data via a custom SQL query

Option A: Table Path

  • Provide your BigQuery credentials and the table to pull from
  • Click Validate Credentials
  • If validation succeeds, click Finish to complete setup

Option B: Query Path  

  • Provide the following credentials:
    • JSON Key → Service Account JSON key file
    • Project ID → GCP Project hosting the dataset
    • Dataset ID → Dataset name in BigQuery
  • Click Validate Credentials
  • On the “Configure” screen, enter your SQL Query to fetch Local Product Inventory data
  • Click Preview Results to verify output
  • If everything looks good, click Finish to complete the setup. The source connector will be marked ‘Completed’. If you exit before finishing, the connector remains in Pending status and can be completed at any time later by clicking its Connection Name.
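The SQL entered on the Configure screen is ordinary BigQuery SQL. The sketch below shows the shape of such a query; the project, dataset, table, and column names are placeholders, not values from your account.

```python
# Sketch of a Query Path SQL statement. The project, dataset, table and
# column names below are placeholders -- substitute your own.
PROJECT = "my-gcp-project"
DATASET = "inventory"
TABLE = "local_product_inventory"

query = f"""
SELECT store_code, itemid, quantity, price
FROM `{PROJECT}.{DATASET}.{TABLE}`
WHERE quantity > 0
"""

print(query.strip())
```

Filtering out zero-quantity rows in the query (as above) is optional; whether out-of-stock rows should be sent depends on how you want availability reflected downstream.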

__________________________________________________________________________________

Where to find the Service Account JSON Key, Project ID, and Dataset ID in Google Cloud Platform:

  • Navigate to IAM & Admin → Service Accounts
  • Create or select an existing service account with BigQuery User + BigQuery Data Editor roles
  • Go to Keys → Add Key → Create New Key → JSON
  • Save the JSON file
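A downloaded service-account key is a small JSON document. The fields checked below (type, project_id, private_key, client_email) are standard in GCP key files; a quick stdlib-only sanity check before uploading the key might look like this sketch:

```python
import json

def check_service_account_key(path):
    """Light sanity check on a downloaded GCP service-account JSON key."""
    with open(path) as f:
        key = json.load(f)
    if key.get("type") != "service_account":
        raise ValueError(f"unexpected key type: {key.get('type')!r}")
    required = ("project_id", "private_key", "client_email")
    missing = [field for field in required if not key.get(field)]
    if missing:
        raise ValueError(f"key file is missing fields: {missing}")
    return key["project_id"]
```

As a convenient side effect, the `project_id` inside the key file is the same Project ID the connector form asks for.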

Project ID:

  • Found in the project list on your GCP dashboard

__________________________________________________________________________________

Dataset ID:

  1. Log in to your Google Cloud Console: https://console.cloud.google.com
  2. In BigQuery Studio, click the dataset to view its Dataset ID in the info panel

__________________________________________________________________________________

Table Name

Expand the dataset in the BigQuery Explorer panel and copy the table name.
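For the Table Path method, BigQuery identifies a table by the three parts collected above, joined as project.dataset.table. A small illustrative helper (the names used are placeholders) that assembles and sanity-checks the path:

```python
def table_path(project_id, dataset_id, table_name):
    """Assemble a fully qualified BigQuery table path: project.dataset.table."""
    parts = (project_id, dataset_id, table_name)
    if not all(parts):
        raise ValueError("project_id, dataset_id and table_name are all required")
    return ".".join(parts)

# Example with placeholder names:
print(table_path("my-gcp-project", "inventory", "local_product_inventory"))
# my-gcp-project.inventory.local_product_inventory
```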

How to create a BigQuery Service Account:

  • Enable the BigQuery API for your project (APIs & Services → Library → BigQuery API → Enable). Once done, you will get the “API Enabled” badge.
  • On the Service accounts page, choose your BigQuery project, and then choose Create service account.
  • On the Service account details page, enter a descriptive value for Service account name. Choose Create and continue. The Grant this service account access to the project page opens.
  • For Select a role, choose BigQuery User, and then choose BigQuery Data Editor.
  • Choose Continue, and then choose Done.
  • Choose Keys, Add key, Create new key.
  • Choose JSON, and then choose Create. Choose the folder to save your private key or check the default folder for downloads in your browser.
