Custom template creation in Google Dataflow

Is it possible to create a custom template in Google Dataflow for streaming live data into Cloud SQL?

No. There is no built-in I/O transform for Google Cloud SQL. See the Built-in I/O Transforms page.
Edit: You may be able to connect to Google Cloud SQL with Apache Beam's JdbcIO class. However, the class is annotated @Experimental. See the JdbcIO documentation.
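For illustration, a minimal sketch of what a JdbcIO write to a Cloud SQL (MySQL) instance might look like; the driver, connection string, table, and credentials below are placeholders, not values from the question:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.jdbc.JdbcIO;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.KV;

public class CloudSqlWriteSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create();
    p.apply(Create.of(KV.of(1, "alice"), KV.of(2, "bob")))
     .apply(JdbcIO.<KV<Integer, String>>write()
         .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create(
             "com.mysql.jdbc.Driver",
             "jdbc:mysql://<CLOUD_SQL_IP>:3306/mydb") // placeholder instance address
             .withUsername("user")
             .withPassword("password"))
         .withStatement("INSERT INTO users (id, name) VALUES (?, ?)")
         .withPreparedStatementSetter((element, statement) -> {
           statement.setInt(1, element.getKey());
           statement.setString(2, element.getValue());
         }));
    p.run().waitUntilFinish();
  }
}
```

In a streaming pipeline the input would come from an unbounded source such as PubsubIO rather than Create, but the JdbcIO write stage would look the same.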

Related

Google Cloud Pub/Sub to ingest data from API endpoint and publish as message

I have been trying to build a pipeline in Google Cloud Data Fusion where the data source is a third-party API endpoint. I have been unable to use the HTTP Plugin successfully, but it has been suggested that I use Pub/Sub for the data ingestion.
I've been trying to follow this tutorial as a starting point, but it doesn't help me out with the very first step of the process: ingesting data from API endpoint.
Can anyone provide examples of using Pub/Sub -- or any other viable method -- to ingest data from an API endpoint and send that data down to Data Fusion for transformation and ultimately to BigQuery?
I will also need to be able to dynamically modify the URI (e.g., date filter parameters) in the GET request in this pipeline.
To achieve the first step in the tutorial you are following,
Ingest CSV (Comma-separated values) data to BigQuery using Cloud Data Fusion,
you need to set up a functioning Pub/Sub system. This can be done via the command line, the console, or, best in your case, one of the client libraries. If you follow this tutorial, you should have a functioning Pub/Sub system.
At that point you should be able to follow the original tutorial.
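As a sketch of the client-library route, publishing an API response to Pub/Sub with the Java client might look like this; the project ID, topic name, and payload are hypothetical:

```java
import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;

public class ApiToPubsub {
  public static void main(String[] args) throws Exception {
    // Placeholder project and topic; replace with your own.
    TopicName topic = TopicName.of("my-project", "api-ingest");
    Publisher publisher = Publisher.newBuilder(topic).build();
    try {
      // In a real pipeline this payload would be the body returned by the
      // third-party API call (with whatever date-filter parameters you need).
      String payload = "{\"example\": \"api response body\"}";
      PubsubMessage message = PubsubMessage.newBuilder()
          .setData(ByteString.copyFromUtf8(payload))
          .build();
      publisher.publish(message).get(); // block until the publish completes
    } finally {
      publisher.shutdown();
    }
  }
}
```

Since you control the code that calls the API before publishing, dynamically modifying the GET request's URI is straightforward at this stage.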

How to traffic data between Google Cloud SQL and Flutter?

The Cloud SQL documentation about connecting with external apps didn't help me much. Isn't there some library to handle data traffic like the ones Firebase's Cloud Firestore and Realtime Database offer?
Either use Cloud Functions to provide an API for Flutter with access to the DB, or run your own custom server in Google Cloud that does the same.
SQL databases should never be accessed over the internet directly; instead, hide them behind a web server that exposes only a limited or specialized API.
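To illustrate the first option, here is a minimal sketch of an HTTP Cloud Function (using the Java Functions Framework) that fronts the database. The JDBC URL, credentials, and table are placeholders; a production version would use the Cloud SQL connector and a secret manager rather than inline credentials:

```java
import com.google.cloud.functions.HttpFunction;
import com.google.cloud.functions.HttpRequest;
import com.google.cloud.functions.HttpResponse;
import java.io.BufferedWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// The Flutter app calls this function over HTTPS instead of talking to the DB.
public class GetUsers implements HttpFunction {
  @Override
  public void service(HttpRequest request, HttpResponse response) throws Exception {
    // Placeholder JDBC URL and credentials.
    try (Connection conn = DriverManager.getConnection(
             "jdbc:mysql://<CLOUD_SQL_IP>:3306/mydb", "user", "password");
         Statement stmt = conn.createStatement();
         ResultSet rs = stmt.executeQuery("SELECT id, name FROM users LIMIT 10")) {
      StringBuilder json = new StringBuilder("[");
      while (rs.next()) {
        if (json.length() > 1) json.append(",");
        json.append("{\"id\":").append(rs.getInt("id"))
            .append(",\"name\":\"").append(rs.getString("name")).append("\"}");
      }
      json.append("]");
      response.setContentType("application/json");
      BufferedWriter writer = response.getWriter();
      writer.write(json.toString());
    }
  }
}
```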

Pushing data from iOS to Google Cloud BigQuery

I am new to Google Cloud Platform and not quite sure about the whole architecture, but what I am trying to achieve is to save some data to Google Cloud from an iOS application and do some analytics work on that data using Google Cloud products such as Dataproc and Datalab. From what I have read so far, I would need to create a dataset in Google Cloud BigQuery and create a table in it. I have done this using the Google Cloud web UI, but now I want to populate the table from my iOS app, and I can't find how to do that.
The most painless route would be to wire up Firebase Analytics and then turn on its daily log export to BigQuery, as described by Google in the walkthrough Importing Firebase Analytics Data into BigQuery. Google then maintains the entire analytics export stack for you, seeing as they also maintain Firebase. The downside is that the export happens only daily.
Alternatively, you'd be looking at using the BigQuery REST API to upload data, as documented by Google in their Loading Data with a POST Request how-to guide. The iOS tooling for that would be the usual NSURLSession and NSURLSessionDataTask APIs, or whatever abstraction you prefer that's built atop them.
Google does maintain a collection of iOS-native APIs, but unfortunately, BigQuery is not included amongst the supported APIs as of May 2017. There are native BigQuery clients for Go, C#, and Java, amongst others. So you could upload through your own API to a server you control, and then use one of those client libraries server-side to implement the actual BigQuery integration, if you wished.
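For that server-side route, a minimal sketch using the Java BigQuery client's streaming insert might look like this; the dataset, table, and row contents are hypothetical:

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.InsertAllRequest;
import com.google.cloud.bigquery.InsertAllResponse;
import com.google.cloud.bigquery.TableId;
import java.util.Map;

public class BigQueryIngest {
  public static void main(String[] args) {
    // Uses application-default credentials on the server.
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
    // Placeholder dataset/table; the row would come from the iOS app's upload.
    TableId table = TableId.of("analytics_dataset", "events");
    InsertAllResponse response = bigquery.insertAll(
        InsertAllRequest.newBuilder(table)
            .addRow(Map.of("event", "app_open", "user_id", "u123"))
            .build());
    if (response.hasErrors()) {
      System.err.println("Insert errors: " + response.getInsertErrors());
    }
  }
}
```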

Accessing Cloud Pub/Sub Message attributes in Cloud DataFlow

From what I have read about Dataflow, the Pub/Sub source only provides the message body to work with in the pipeline. We have a use case where we want to inspect the attributes of the message to make certain decisions. Is there any way of achieving this currently? I'm open to extending the Pub/Sub I/O to incorporate this if required.
Currently, there is no way to access the message attributes of your messages via the PubsubIO connector, but it would clearly be useful to do so. This is tracked in Apache Beam (incubating) as the issue BEAM-404.
I recommend following this issue to keep abreast of new developments.

Using google endpoints with sql

I'm still new to GAE, and I would like more wisdom about a couple of things.
I searched the documentation, but I think I'm just too stupid to understand some things from it.
How can I combine Google Cloud SQL with endpoints? Is there such possibility?
How can I use Endpoints to upload videos to the Google platform?
Anything you can do in a regular (non-Cloud Endpoints) API should also work with Cloud Endpoints, including Google Cloud SQL. But since Cloud Endpoints is in preview, you might encounter bugs or changes when it comes out of preview. Create your Cloud SQL test models on a regular App Engine app first, then use them from Cloud Endpoints; that way you minimize the debugging of errors.
https://developers.google.com/appengine/docs/python/cloud-sql/
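For example, a smoke test of a Cloud SQL connection from an App Engine app of that era might look roughly like this; the project, instance, database, and table names are placeholders, and the GoogleDriver shown assumes the legacy App Engine Java runtime:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CloudSqlSmokeTest {
  public static void main(String[] args) throws Exception {
    // Legacy App Engine Java driver; the instance path is a placeholder.
    Class.forName("com.mysql.jdbc.GoogleDriver");
    try (Connection conn = DriverManager.getConnection(
             "jdbc:google:mysql://my-project:my-instance/mydb?user=root");
         PreparedStatement stmt =
             conn.prepareStatement("INSERT INTO notes (body) VALUES (?)")) {
      stmt.setString(1, "hello from App Engine");
      stmt.executeUpdate();
    }
  }
}
```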
You will need to use the Blobstore API:
https://developers.google.com/appengine/docs/python/blobstore/
In your Endpoints API, have a method that creates the upload URL, and use that URL to upload from your app. The upload handler is then triggered once the whole file has been uploaded; process your BlobInfo key there and store it appropriately.
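A rough sketch of that flow using the App Engine Java Blobstore API (the link above covers the Python equivalent); the servlet path and form field name are assumptions:

```java
import com.google.appengine.api.blobstore.BlobKey;
import com.google.appengine.api.blobstore.BlobstoreService;
import com.google.appengine.api.blobstore.BlobstoreServiceFactory;
import java.io.IOException;
import java.util.List;
import java.util.Map;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class VideoUploadServlet extends HttpServlet {
  private final BlobstoreService blobstore = BlobstoreServiceFactory.getBlobstoreService();

  // GET: hand the client a one-time upload URL (an Endpoints method could
  // return the same string). "/upload-complete" is a placeholder path that
  // must be mapped to this servlet's doPost below.
  @Override
  public void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
    resp.setContentType("text/plain");
    resp.getWriter().print(blobstore.createUploadUrl("/upload-complete"));
  }

  // POST: App Engine invokes this once the whole file has been uploaded.
  @Override
  public void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
    Map<String, List<BlobKey>> uploads = blobstore.getUploads(req);
    BlobKey videoKey = uploads.get("video").get(0); // "video" is the assumed form field name
    // Persist videoKey (e.g. in Datastore) so the video can be served later.
    resp.sendRedirect("/videos?key=" + videoKey.getKeyString());
  }
}
```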
