Error while querying google sheets connected permanent table in BigQuery - google-sheets

I am trying to set up a scheduled query on BigQuery using the new UI, and the query includes a reference to a table that is connected to a Google Sheet.
I am unable to schedule the query because BigQuery throws this error:
"Error while reading table: dataset.table_name, error message: Found corrupted data while opening file."
However, when I manually run the query or directly query the table using:
SELECT * FROM dataset.table_name
the query runs and shows accurate results, even though the error still pops up.
Is there a workaround to scheduling this query?

There are some known issues and limitations for scheduled queries. For instance, you might need to update your credentials so that the scheduled query can access the Google Drive data behind the sheet. Moreover, you need to make sure that the destination table for your scheduled query is in the same region as the source data; a location mismatch can produce this kind of error message.
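If it helps, here is a minimal sketch of the location check (using the google-cloud-bigquery Python client; the project and dataset names are placeholders for your own):

from google.cloud import bigquery

# Compare the location of the dataset holding the Sheets-backed table
# with the location of the dataset that will hold the scheduled query's
# destination table; a mismatch can cause scheduling errors.
client = bigquery.Client(project="my-project")  # placeholder project ID

source = client.get_dataset("my-project.source_dataset")
destination = client.get_dataset("my-project.destination_dataset")

print("source location:     ", source.location)
print("destination location:", destination.location)

if source.location != destination.location:
    print("Location mismatch - move one dataset or recreate the scheduled query in the right region.")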
Hope it helps.

Related

Why don't my stored procedures in MariaDB appear when I connect to Tableau?

I don't know why, but I cannot see the stored procedures when I connect the database to Tableau (I use MariaDB). I can only see the data tables.
Does anyone have the same problem? I am a newbie, so I am not sure if my description is clear or not.
Use the stored procedure's results through a temporary table.
I found that Tableau does not connect to stored procedures, and one way around this is to use the Initial SQL option when you connect to your server to run the procedure and fill a temp table. Once you log in, grab Custom SQL and for that script simply use
select * from #nameoftemptable
and execute it.

What does the log of a Google Sheets-sourced table update look like in BigQuery?

I have several tables in BigQuery that are sourced from Google Sheets. When the Google Sheets table is updated, the table in BigQuery is automatically updated as well. I am trying to understand what the log of this event looks like in Cloud Logging. My end goal is to create a sink of these logs in order to publish to Pub/Sub and run scheduled queries based on these events.
Thank you
When you use an external table (Google Sheets or other), the data is never stored in BigQuery's native storage. It's always external.
Therefore, when you update your Google Sheet, nothing happens in BigQuery. It's only when you query the data that the sheet document is read (again) and you get the latest data.
Therefore, there is no insert log that you can track when you update the data in Google Sheets. The only log that you have is when you perform a request in BigQuery to read the data (external or not), as mentioned by Sakshi.
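To make the "always external" point concrete, here is a small sketch (google-cloud-bigquery Python client; project, dataset and sheet URL are placeholders) showing that the table definition only points at the sheet, and the sheet is only read when a query runs:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# The table definition just references the sheet; no data is copied into BigQuery.
external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = ["https://docs.google.com/spreadsheets/d/EXAMPLE_SHEET_ID"]
external_config.autodetect = True

table = bigquery.Table("my-project.my_dataset.sheet_table")
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)

# Only this step actually reads the sheet (and is what shows up as a job in the logs).
rows = client.query("SELECT * FROM `my-project.my_dataset.sheet_table`").result()
print(f"{rows.total_rows} rows read from the sheet at query time")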
When the external data source (Google Sheet or other) is updated and the BigQuery table associated with it is queried, BigQuery initiates an insert job which is visible in Cloud Logging.
You can find this log in the Cloud Logging console by filtering on the resource type BigQuery Project, i.e. you will see protoPayload.methodName set to google.cloud.bigquery.v2.JobService.InsertJob.
For more information on BigQuery Logs you can refer to this documentation.
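As a rough sketch of pulling those entries programmatically (google-cloud-logging Python client; the project ID is a placeholder), the filter below mirrors what the Logs Explorer builds when you select the BigQuery Project resource type, and the same filter string could be reused for a log sink feeding Pub/Sub:

from google.cloud import logging

client = logging.Client(project="my-project")

# Job entries written when BigQuery runs a job in the project,
# including queries that read the Sheets-backed external table.
log_filter = (
    'resource.type="bigquery_project" '
    'AND protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob"'
)

for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING, max_results=10):
    print(entry.timestamp, entry.log_name)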

Finding DeleteEntity calls to an Azure Storage table

Is there a way to find out if there were any DeleteEntity calls to an Azure table in the last 'N' minutes? Basically my goal is to find all operations that updated the table in the last 'N' minutes.
Update: I am looking for a way to do it with a rest api call for a specific table in the storage.
If using the Azure Portal is an option, you can find this information via Metrics. For example, see the screenshot below.
Basically, here I am taking a sum of all transactions against my table storage where the API call was DeleteEntity.
You can find more information about it here: https://learn.microsoft.com/en-us/azure/storage/common/storage-metrics-in-azure-monitor?toc=%2fazure%2fstorage%2fblobs%2ftoc.json.
UPDATE
If you wish to get this information programmatically, I believe you will need to use the Azure Monitor REST API. I looked up the request sent by the Portal and it sends a request to the /subscriptions/<my-subscription-id>/resourceGroups/<my-resource-group>/providers/Microsoft.Storage/storageAccounts/<my-storage-account>/tableServices/default/providers/Microsoft.Insights/metrics/Transactions endpoint.
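Here is a hedged sketch of that call with Python and requests (the resource path reuses the placeholders above; the api-version and the ApiName dimension filter are my assumptions and may need adjusting):

from datetime import datetime, timedelta, timezone
import requests
from azure.identity import DefaultAzureCredential

resource_id = (
    "/subscriptions/<my-subscription-id>"
    "/resourceGroups/<my-resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<my-storage-account>"
    "/tableServices/default"
)

# Token for the Azure Resource Manager endpoint.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

# Look at the last 30 minutes of the Transactions metric, filtered to DeleteEntity calls.
end = datetime.now(timezone.utc)
start = end - timedelta(minutes=30)

response = requests.get(
    f"https://management.azure.com{resource_id}/providers/Microsoft.Insights/metrics",
    headers={"Authorization": f"Bearer {token}"},
    params={
        "api-version": "2018-01-01",
        "metricnames": "Transactions",
        "aggregation": "Total",
        "interval": "PT1M",
        "timespan": f"{start:%Y-%m-%dT%H:%M:%SZ}/{end:%Y-%m-%dT%H:%M:%SZ}",
        "$filter": "ApiName eq 'DeleteEntity'",
    },
)
response.raise_for_status()
print(response.json())

Note that, as with the Portal chart, this is scoped to the whole table service of the storage account, not a single table.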
UPDATE 2
For a specific table, the only option I can think of is to fetch the data from the Storage Analytics logs, which are stored in the $logs blob container, and then parse the log files manually. You may find these links helpful:
https://learn.microsoft.com/en-us/rest/api/storageservices/storage-analytics-log-format
https://learn.microsoft.com/en-us/rest/api/storageservices/storage-analytics-logged-operations-and-status-messages#logged-operations
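If it helps, a rough sketch of reading those logs with the azure-storage-blob Python package (the connection string is a placeholder, and it assumes version 1.0 of the log format with semicolon-delimited fields):

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<my-connection-string>")
logs = service.get_container_client("$logs")

# Table-service logs are written under table/YYYY/MM/DD/hhmm/NNNNNN.log
for blob in logs.list_blobs(name_starts_with="table/"):
    data = logs.download_blob(blob.name).readall().decode("utf-8")
    for line in data.splitlines():
        fields = line.split(";")
        # In v1.0 the fields start with: version; request-start-time; operation-type; ...
        if len(fields) > 2 and fields[2] == "DeleteEntity":
            print(blob.name, fields[1], fields[2])

You would still need to narrow the results down to the table you care about using the request URL field further along in each log line.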

Google Data Prep - cannot import table from BigQuery (created from Google Sheets) "Nothing found"

I created a table in BigQuery from Google Sheets, but when I tried importing it into Cloud Data Prep it says that there are no tables in the dataset.
I'm not sure whether it's an issue with the Google Sheets integration, because when I check the details of the table it says there are 0 rows even though there are 3 rows (it's test data).
I already tried giving Cloud Data Prep viewer permission on both the dataset and the project, and nothing changes.
Print screens:
Data Prep
BigQuery table info
BigQuery entries
Well, apparently when you create a table from Google Sheets it is not recognized as a regular (native) table, so I ran a query to replicate it into a second table and it works!
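For anyone hitting the same thing, a minimal sketch of that replication step (google-cloud-bigquery Python client; project, dataset and table names are placeholders) would look like this:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Materialize the Sheets-backed (external) table into a native BigQuery table
# that tools like Cloud Data Prep can see.
client.query(
    """
    CREATE OR REPLACE TABLE `my-project.my_dataset.native_copy` AS
    SELECT * FROM `my-project.my_dataset.sheet_table`
    """
).result()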

Fusion Tables 500 response on csv import to private table

Since Monday my Android app users have been getting a 500 return code from Fusion Tables whenever they try to save their data.
Within the app we convert all their geo data into CSV format and POST it to Fusion Tables using an insert command. Before Sunday this appears to have been working fine. Starting on Monday we are seeing problems like the following logcat output with 100% of our saves:
10-29 12:18:34.083: W/System.err(3650): java.io.FileNotFoundException: https://www.googleapis.com/upload/fusiontables/v1/tables/[valid table ID redacted]/import?access_token=[valid access token redacted]
Despite the error message, manually checking the Fusion Tables shows no error and all seems fine.
Given the other problems with Drive on Monday, I'm guessing that the team rolled out some changes over the weekend. Perhaps there was a change that means I need to terminate the stream with a special character or something. Is anyone else experiencing a similar problem, or does anyone have any idea what is going on?
I'm up to about 30 complaints from users now and it's getting a bit old.
We believe this is resolved now -- are you still getting the reports? If it's the problem we identified, the imports were in fact eventually completing even though an error status was returned.
