Is two-way communication between BigQuery and Google Sheets possible?
In other words, if you add a row or modify an entry in Google Sheets, the change is reflected in the corresponding table in BigQuery, and vice versa (no schema changes).
You can have a one-way connection in either direction, but not a true two-way sync.
From Google Sheets to Google BigQuery:
You can define a Sheets file as an external data source in BigQuery. This way, any updates to the sheet will be reflected in queries run from BigQuery.
Setting this up from the command-line:
Authenticate with Google Drive scopes:
gcloud auth login --enable-gdrive-access
Get the Drive URI of your sheet.
Create the external table definition file:
bq mkdef \
--noautodetect \
--source_format=source_format \
"drive_uri" \
path_to_schema_file > /tmp/mytable_def.json
Modify the file in a text editor to add any additional options.
Create the external table to query.
bq mk --external_table_definition=/tmp/mytable_def.json mydataset.mytable
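For illustration, a filled-in version of the two commands might look like the following; the spreadsheet ID, dataset, table, and column names are placeholders rather than values from the question:
# Create a definition for a Sheets-backed external table with an inline schema
bq mkdef \
--noautodetect \
--source_format=GOOGLE_SHEETS \
"https://docs.google.com/spreadsheets/d/SPREADSHEET_ID" \
"name:STRING,amount:FLOAT" > /tmp/mytable_def.json
# Create the external table from that definition
bq mk --external_table_definition=/tmp/mytable_def.json mydataset.mytable
After this, mydataset.mytable can be queried like any other table, with each read going to the live sheet.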
Modifying the data in an external table from BigQuery (for example, via DML) is not supported.
From BigQuery to Sheets:
Use Connected Sheets to visualize BigQuery data with Google Sheets.
Updating BigQuery data via Connected Sheets is not supported.
You can add rows to a Google Sheet and see the results reflected in BigQuery when you query. Additionally, you can add columns to the sheet, make the appropriate schema changes in BigQuery, and see the resulting values.
You cannot, however, run DML from BigQuery that would result in additional rows being added to a Google Sheet.
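As a rough illustration (the dataset, table, and column names here are just placeholders), reads work while DML does not:
# Reads go to the live sheet, so rows added in Sheets show up here:
bq query --use_legacy_sql=false 'SELECT COUNT(*) AS n FROM mydataset.mytable'
# DML against the Sheets-backed external table is rejected, with an error
# about DML not being supported on external tables:
bq query --use_legacy_sql=false 'INSERT INTO mydataset.mytable (name) VALUES ("new row")'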
Related
We currently have a DBT instance that sits over our Google BigQuery data warehouse, and we've recently been asked to incorporate some data from Google Sheets into our modelling.
With that, is it possible for DBT to connect directly to Google Sheets? I.e., can we configure Google Sheets as a direct external data source in the .yml file, or have DBT run some sort of federated BigQuery SQL statement?
There's a DBT package called dbt-external-tables (https://hub.getdbt.com/dbt-labs/dbt_external_tables/latest/), but that only seems to work with BigQuery + files in Google Cloud Storage buckets.
But the common and most straightforward option I'm seeing in forums and documentation is to create an external table in BigQuery on top of the Google Sheet, and then have DBT connect to that external BigQuery table.
Just wanted to check if the above common option for integrating DBT x Google Sheets x BigQuery is in fact the only option, or if there's actually a way to have DBT connect directly to Google Sheets before hitting BigQuery?
Thanks
From what I can see on the dbt-external-tables side, the BigQuery adapter compiles the create_external_table macro down to a DDL statement.
Unfortunately, I just don't see a similar DDL statement available for the Google Sheets "external" definition. If I had to guess, the web UI probably executes something equivalent to the bq CLI client behind the scenes to create these tables.
If a section is ever added to this guide which includes a DDL definition for Google Drive-based external sources, it would probably be a relatively easy addition to the previously mentioned dbt macro for external tables. Until then, you will have to define this yourself through the UI, the bq client, or the REST API.
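Until that happens, a workable pattern is to create the Sheets-backed external table once outside of dbt and then declare it as an ordinary source. A minimal sketch with the bq client, using made-up dataset, table, and column names:
# One-off setup (or a CI/bootstrap step) outside of dbt:
bq mkdef --noautodetect --source_format=GOOGLE_SHEETS \
"https://docs.google.com/spreadsheets/d/SPREADSHEET_ID" \
"name:STRING,amount:FLOAT" > /tmp/sheet_def.json
bq mk --external_table_definition=/tmp/sheet_def.json analytics.raw_sheet_data
# dbt then treats analytics.raw_sheet_data as a normal source table in its
# sources .yml and downstream models.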
I have a query running against a table created from a Google Sheet. Sometimes this query runs fine, whereas other times I see errors such as:
Error while reading data, error message: Failed to read the spreadsheet. Errors: Deadline=118.95772243s
How can I make querying this table more reliable?
As noted in the documentation for external data sources, there can be some latency with Google Sheets since the data is not persisted in BigQuery, which is why the query behaves inconsistently. The same documentation recommends the following:
1) Export the Google Sheets data as a CSV and store it in Google Cloud Storage.
2) Load the data from Cloud Storage into BigQuery.
Note: Loading data into BigQuery directly from Google Drive is currently not supported, which is why step 1 above is required for getting Google Sheets data into BigQuery.
In addition, you can schedule the above steps using Google Cloud Composer. Here is an example of how to use Cloud Composer to transfer data from Google Cloud Storage into a BigQuery table.
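As a rough, unscheduled sketch of what that pipeline does (the bucket, file, and table names here are made up), the two steps map onto gsutil and bq load:
# 1) Put the CSV exported from the Google Sheet into Cloud Storage
gsutil cp ./sheet_export.csv gs://my-bucket/exports/sheet_export.csv
# 2) Load it into a native BigQuery table; once the data is persisted,
#    queries no longer hit the Drive API, so the read deadline errors go away
bq load --source_format=CSV --skip_leading_rows=1 --autodetect \
mydataset.my_native_table gs://my-bucket/exports/sheet_export.csv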
I have a table in BigQuery that is coming from Data Prep after some processing. Now I need to get this data into Google Sheets. I am currently importing the data from BigQuery to Google Sheets using the "OWOX BI BigQuery Reports" connector. It works fine until I have to refresh it again. All the new columns that I create in Google Sheets after importing the data get removed every time I refresh the data using the connector mentioned above. Is there a better way to fetch data from BigQuery without disrupting the created columns?
You are using Google Sheets wrong. Don't modify the sheet that BigQuery data is loaded into; instead, reuse the data in another sheet with the IMPORTRANGE function. This way you create a copy of the data, and columns created on this new sheet won't disappear.
https://support.google.com/docs/answer/3093340
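For example, in the new sheet you could pull the connector's output through with something like =IMPORTRANGE("https://docs.google.com/spreadsheets/d/SPREADSHEET_ID", "Sheet1!A:F") (the URL and range here are placeholders); columns you add next to the imported range in the new sheet are then untouched when the connector refreshes the original.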
I have created three Google Cloud BigQuery tables mapping to three worksheets in a single Google Sheets spreadsheet. I have named the three tables after the three worksheet names, and I can run SQL queries against the tables.
My problem is that the results from the query always show the records in the first worksheet within the spreadsheet, not the second or third worksheet.
Is it possible to associate BigQuery tables with different worksheets of the same Google Sheets document? If not, is the only solution to have three separate spreadsheet documents, each with the records that BigQuery will query in its first worksheet?
This is currently not supported. BigQuery can only query the first sheet. You can follow https://issuetracker.google.com/issues/35905674 if you're interested in this feature.
We use SAS to manage our data, and we have a table that updates every day.
We use Google Sheets to create a dashboard.
With that in mind, I would like Google Sheets to access the table directly and import all the data, instead of me importing the data manually.
Is there a way to do this?
Google Sheets does not allow direct import of SAS datasets, according to this page:
https://support.google.com/docs/answer/40608?hl=en
However, you can run a SAS program as a batch job to export your SAS dataset to CSV or one of the other supported formats. Then I think you could use Google Apps Script to automate the rest of the import, as per this answer:
How to automatically import data from uploaded CSV or XLS file into Google Sheets
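As a minimal sketch of the batch side, assuming a Unix SAS install and a program named export_dashboard_data.sas that uses PROC EXPORT with DBMS=CSV to write the CSV (both the program name and paths are made up), a nightly cron entry could look like:
# Run the SAS export in batch mode every night at 02:00, writing a fresh CSV
# that the Apps Script import can then pick up
0 2 * * * /path/to/sas /opt/etl/export_dashboard_data.sas -log /var/log/sas/export_dashboard_data.log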