Is it possible to trigger a Fusion Table to geocode without using the GUI? - google-fusion-tables

I have built a Fusion Tables map for a customer. I update the table rows with server scripts; sometimes I need to delete all the rows and then create new ones. None of my data comes with geocoded addresses.
The map is shown on my client's website, and all updating is done in a server script.
My question is this: once I update the data, the map doesn't show the new data until I log into the Fusion Table and manually trigger a geocode.
Is it possible to trigger that remotely?

You can't trigger the FT geocoder remotely, but you can use the same Google Maps API to first geocode your addresses, then add them and their coordinates to the table. You can see an example of this in Synchronizing Fusion Tables with Google Forms.
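A minimal server-side sketch of that approach, here in Python against the HTTP Geocoding API (the endpoint and response shape follow the Maps documentation; the API key and address are placeholders). Once you have the coordinates, you write them to the table together with the row:

```python
import requests

GEOCODE_URL = "https://maps.googleapis.com/maps/api/geocode/json"

def geocode(address, api_key):
    """Resolve an address to (lat, lng) with the Google Maps Geocoding API."""
    resp = requests.get(GEOCODE_URL, params={"address": address, "key": api_key})
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        return None  # address could not be geocoded
    loc = results[0]["geometry"]["location"]
    return loc["lat"], loc["lng"]

# Example: geocode each row before inserting it (and its coordinates) into the table.
coords = geocode("1600 Amphitheatre Parkway, Mountain View, CA", "YOUR_API_KEY")
print(coords)
```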

Related

How to create an external table in BigQuery from a Google Sheet using dbt?

I want to create an external table in BigQuery whose data source is a Google Sheet. Is it possible to do this using dbt? In the .yml file, where should I put the URI?
The main problem is that I don't have access to create it directly in BigQuery.
One way to handle a Google Sheet as a source is by creating a new table out of it in BigQuery via Connected Sheets.
Then, you create a new source in dbt that relies on that table, and start building your downstream models from there.
As far as I know, you cannot create a source directly from dbt unless it is a seed file, which I would not recommend unless it is a rather static file (e.g. country names and ISO codes, which are not prone to change over time).
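For reference, if someone with the necessary permissions creates that table programmatically rather than through Connected Sheets, a rough sketch with the google-cloud-bigquery client could look like this (project, dataset, table name, and sheet URL are all placeholders); the dbt source then simply points at the resulting table:

```python
from google.cloud import bigquery

# Placeholders: project, dataset, table, and sheet URL are all hypothetical.
client = bigquery.Client(project="my-project")

external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = ["https://docs.google.com/spreadsheets/d/<sheet-id>"]
external_config.options.skip_leading_rows = 1  # skip the header row
external_config.autodetect = True              # infer the schema from the sheet

table = bigquery.Table("my-project.raw.my_sheet")
table.external_data_configuration = external_config
client.create_table(table)  # the credentials need Drive scope to read the sheet
```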
We have a similar situation where the data source is a Google Sheet.
The end user updates the Google Sheet on a periodic basis and we replicate it to our Snowflake datastore using Fivetran.
dbt can then pick up the data seamlessly.

What does the log of a Google Sheets-sourced table update look like in BigQuery?

I have several tables in BigQuery that are sourced from Google Sheets. When the Google Sheets table is updated, the table in BigQuery is automatically updated as well. I am trying to understand what the log of this event looks like in Operations Logging. My end goal is to create a sink of these logs feeding a Pub/Sub topic, and to run scheduled queries based on these events.
Thank you
When you use an external table (Google Sheet or other), the data is never stored in BigQuery native storage; it is always external.
Therefore, when you update your Google Sheet, nothing happens in BigQuery. It's only when you query the data that the sheet document is read (again) and you get the latest data.
So there is no insert log that you can track when you update the data in the Google Sheet. The only log you have is when you perform a request in BigQuery to read the data (external or not), as mentioned by Sakshi.
When the external data source (Google Sheet or other) is updated and the BigQuery table associated with it is queried, BigQuery initiates an insert job, which is visible in Cloud Logging.
You can find this log in the Cloud Logging console by filtering on the resource type BigQuery Project; the entries have protoPayload.methodName set to google.cloud.bigquery.v2.JobService.InsertJob.
For more information on BigQuery Logs you can refer to this documentation.
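A hedged sketch of pulling those entries with the google-cloud-logging client (the project id and result limit are placeholders); the filter mirrors the one described above:

```python
from google.cloud import logging

client = logging.Client(project="my-project")  # hypothetical project id

# Same filter as in the Cloud Logging console: BigQuery insert jobs.
log_filter = (
    'resource.type="bigquery_project" '
    'AND protoPayload.methodName="google.cloud.bigquery.v2.JobService.InsertJob"'
)

for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING, max_results=10):
    print(entry.timestamp, entry.log_name)
```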

Finding DeleteEntity calls to an Azure Storage table

Is there a way to find out whether there were any DeleteEntity calls to an Azure table in the last N minutes? Basically, my goal is to find all operations that updated the table in the last N minutes.
Update: I am looking for a way to do this with a REST API call for a specific table in the storage account.
If using the Azure Portal is an option, you can find this information via Metrics. For example, see the screenshot below.
Basically, here I am taking a sum of all transactions against my table storage where the API call was DeleteEntity.
You can find more information about it here: https://learn.microsoft.com/en-us/azure/storage/common/storage-metrics-in-azure-monitor?toc=%2fazure%2fstorage%2fblobs%2ftoc.json.
UPDATE
If you wish to get this information programmatically, I believe you will need to use the Azure Monitor REST API. I looked up the request sent by the Portal, and it sends a request to the /subscriptions/<my-subscription-id>/resourceGroups/<my-resource-group>/providers/Microsoft.Storage/storageAccounts/<my-storage-account>/tableServices/default/providers/Microsoft.Insights/metrics/Transactions endpoint.
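A rough requests sketch of that call against the Azure Monitor metrics endpoint (subscription, resource group, account name, and the bearer token are placeholders; parameter names follow the Monitor metrics REST API):

```python
import requests

# All identifiers below are placeholders.
resource = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>/tableServices/default"
)
url = f"https://management.azure.com{resource}/providers/Microsoft.Insights/metrics"

params = {
    "api-version": "2018-01-01",
    "metricnames": "Transactions",
    "aggregation": "Total",
    "interval": "PT1M",                      # one-minute grain, easy to sum over the last N minutes
    "$filter": "ApiName eq 'DeleteEntity'",  # only DeleteEntity calls
}
headers = {"Authorization": "Bearer <access-token>"}  # token obtained via Azure AD

resp = requests.get(url, params=params, headers=headers)
resp.raise_for_status()
print(resp.json())
```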
UPDATE 2
For a specific table, the only option I can think of is to fetch the data from the Storage Analytics logs, which are stored in the $logs blob container, and then parse the log files manually. You may find these links helpful:
https://learn.microsoft.com/en-us/rest/api/storageservices/storage-analytics-log-format
https://learn.microsoft.com/en-us/rest/api/storageservices/storage-analytics-logged-operations-and-status-messages#logged-operations
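A minimal sketch of that UPDATE 2 approach with the azure-storage-blob SDK (the connection string is a placeholder, and the delimiter and field positions should be verified against the log-format documentation above):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string; $logs is the container Storage Analytics writes to.
service = BlobServiceClient.from_connection_string("<connection-string>")
logs = service.get_container_client("$logs")

# Table-service logs live under the "table/" prefix, partitioned by date and hour.
for blob in logs.list_blobs(name_starts_with="table/"):
    data = logs.download_blob(blob.name).readall().decode("utf-8")
    for line in data.splitlines():
        fields = line.split(";")        # log entries are semicolon-delimited
        if "DeleteEntity" in fields:    # operation-type field; exact position per the format doc
            print(blob.name, line)
```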

How to insert data into a Google Fusion Table from an iPhone app

I want to insert data from my iPhone app into a Fusion Table. My Fusion Table is private, and my SELECT query works properly when I insert data manually into the table. But I don't want to insert data manually: when my app gets a value, that value should be inserted into the Fusion Table. I have referred to the Google API documentation and also read Insertion into google fusion table, but I still don't understand how to implement this. Can anyone please guide me on this?
Thanks in advance!!!
You should consider using a so-called Service Account. You can create such accounts in the Google API Console. A Service Account acts as a deputy of the user: it can make calls to the API, change data, etc. Just make sure that the Service Account has the correct permissions on the table you want to edit.
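A server-side sketch of that pattern in Python (rather than iOS code), assuming the google-auth library, a downloaded service-account key file, and the Fusion Tables v2 SQL endpoint; the key file, table id, and column names are placeholders:

```python
import requests
from google.oauth2 import service_account
from google.auth.transport.requests import Request

SCOPES = ["https://www.googleapis.com/auth/fusiontables"]

# The key file and table id are placeholders; share the table with the
# service account's e-mail address so it has permission to edit it.
creds = service_account.Credentials.from_service_account_file("key.json", scopes=SCOPES)
creds.refresh(Request())

sql = "INSERT INTO <table-id> (Name, Address) VALUES ('Shop 1', 'Some street 1')"
resp = requests.post(
    "https://www.googleapis.com/fusiontables/v2/query",
    params={"sql": sql},
    headers={"Authorization": f"Bearer {creds.token}"},
)
resp.raise_for_status()
print(resp.json())
```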

Using Rails to place markers on Google Maps from locations stored in a DB

OK, so I added a Google Map to my site that lets users search for locations by city, state, or zip code and shows all shops of a certain type around them. It uses the Places API to grab them, BUT I also have a table in my DB that has a bunch of shop locations set up by users. The table stores the shop details, as well as the address and the latLng (and soon the place reference number if they have a Places account). What I want to do is have my Google Map display all the user-added shop locations as well.
I'm not sure how to have Rails find the shops in the DB that are in proximity to the location the user searched for.
So if I went to the site and typed in 20175 as the zip code, Rails would find all the records that are close to that zip code. I guess the query would have to be based on the latLng.
This site was able to do it: http://www.checkoutmyink.com/shops and is pretty much what I want to do.
If you want to perform spatial algebra with Rails, I would recommend using RGeo. It fits well with ActiveRecord and adapts to a number of spatial databases (including PostGIS).
Still, using the database's spatial functions will be far more efficient for retrieving points within a radius (using a GiST index), e.g. ST_DWithin with PostGIS.
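A sketch of that radius query, shown here as raw SQL via psycopg2 rather than through ActiveRecord, just to illustrate the shape; the connection settings, table, and column names are assumptions:

```python
import psycopg2

# Hypothetical connection settings and schema: a shops table with a
# geography(Point, 4326) column named "location", backed by a GiST index.
conn = psycopg2.connect(dbname="shops_db", user="app", password="secret", host="localhost")

def shops_near(lng, lat, radius_m):
    """Return shops within radius_m metres of a point."""
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT id, name
            FROM shops
            WHERE ST_DWithin(
                location,
                ST_SetSRID(ST_MakePoint(%s, %s), 4326)::geography,
                %s  -- radius in metres
            )
            """,
            (lng, lat, radius_m),
        )
        return cur.fetchall()

print(shops_near(-77.57, 39.09, 10000))  # e.g. shops within 10 km of zip 20175
```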
