preprocessing in google cloud ml engine using google-api-php-client-services - google-cloud-ml-engine

I am using google-api-php-client-services to train a model on data from my PHP website in Google Cloud. How do I do the preprocessing described in https://cloud.google.com/blog/big-data/2016/12/how-to-classify-images-with-tensorflow-using-google-cloud-machine-learning-and-cloud-dataflow using this library?
Is there any alternative to google-api-php-client to interact with ml_engine in PHP?

That is the only client library available for CMLE, but it only provides PHP wrappers for Google Cloud APIs such as CMLE's API.
The actual preprocessing in the blog post is written as a Dataflow job in Python. As the blog post describes, you run it by executing a Python program.
You won't be able to write your Dataflow job using PHP.
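For context, a Dataflow preprocessing job in outline looks like the following. This is a minimal sketch with placeholder project, bucket paths, and a stand-in transform, not the blog post's actual pipeline:

    # Minimal Apache Beam preprocessing sketch, run on Google Cloud Dataflow.
    # The project, bucket paths, and transform below are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-project",                    # placeholder project ID
        temp_location="gs://my-bucket/tmp",      # placeholder staging bucket
        region="us-central1",
    )

    with beam.Pipeline(options=options) as p:
        (p
         | "ReadInput" >> beam.io.ReadFromText("gs://my-bucket/input.csv")
         | "Preprocess" >> beam.Map(lambda line: line.strip().lower())  # stand-in transform
         | "WriteOutput" >> beam.io.WriteToText("gs://my-bucket/preprocessed/part"))

From PHP, the most you could do is trigger a script like this externally; the pipeline code itself has to stay in Python (or Java).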

Related

Google Cloud Pub/Sub to ingest data from API endpoint and publish as message

I have been trying to build a pipeline in Google Cloud Data Fusion where the data source is a third-party API endpoint. I have been unable to successfully use the HTTP Plugin, but it has been suggested that I use Pub/Sub for the data ingest.
I've been trying to follow this tutorial as a starting point, but it doesn't help me out with the very first step of the process: ingesting data from an API endpoint.
Can anyone provide examples of using Pub/Sub -- or any other viable method -- to ingest data from an API endpoint and send that data down to Data Fusion for transformation and ultimately to BigQuery?
I will also need to be able to dynamically modify the URI (e.g., date filter parameters) in the GET request in this pipeline.
In order to achieve the first step in the tutorial you are following (ingest CSV (comma-separated values) data to BigQuery using Cloud Data Fusion), you need to set up a functioning Pub/Sub system. This can be done via the command line, the console, or, in your case, one of the client libraries, which is probably the best fit. If you follow this tutorial you should end up with a working Pub/Sub setup.
At that point you should be able to follow the original tutorial.
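For the ingest step itself, here is a minimal sketch of fetching from an HTTP endpoint and publishing each record to Pub/Sub with the google-cloud-pubsub Python client. The project, topic, endpoint URL, and date parameter are placeholders; the date parameter also shows one way to modify the request URI dynamically:

    # Fetch JSON from a third-party API endpoint and publish each record to Pub/Sub.
    # Project ID, topic name, endpoint URL, and query parameters are placeholders.
    import json

    import requests
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-project", "api-ingest")  # placeholder topic

    def ingest(date_filter):
        # The date filter shows how the GET request URI can be modified dynamically.
        response = requests.get(
            "https://api.example.com/v1/records",  # placeholder endpoint
            params={"date": date_filter},
        )
        response.raise_for_status()
        for record in response.json():
            future = publisher.publish(topic_path, json.dumps(record).encode("utf-8"))
            future.result()  # block until Pub/Sub acknowledges the message

    ingest("2020-01-01")

Once messages are flowing into the topic, the Data Fusion side of the tutorial (Pub/Sub source, transforms, BigQuery sink) should apply as written.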

How to use API instead of using Google BigQuery Data Transfer Service?

I am trying to create a BigQuery Data Transfer config for Google AdWords through the API using a programming language (Python, Java). I looked at the documentation for the BigQuery Data Transfer API, but I could not find a clear process for this; maybe I did not understand it properly. Can anyone help me understand how to use the API to get daily analytics data from YouTube instead of paying YouTube to use their BigQuery Data Transfer?
You need to get started with the AdWords API:
https://developers.google.com/adwords/api/docs/guides/first-api-call
Refer to the Getting Started section of the Python client library README file to download and install the AdWords API client library for Python.
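As for creating the transfer config itself through the API, below is a minimal sketch using the google-cloud-bigquery-datatransfer Python client; the project, dataset, display name, data source ID, and params are placeholders to adapt to your data source:

    # Create a BigQuery Data Transfer config through the API (placeholder values).
    from google.cloud import bigquery_datatransfer

    client = bigquery_datatransfer.DataTransferServiceClient()
    parent = "projects/my-project"                 # placeholder project ID

    transfer_config = bigquery_datatransfer.TransferConfig(
        destination_dataset_id="my_dataset",       # placeholder dataset
        display_name="adwords-daily-transfer",
        data_source_id="adwords",                  # connector ID for the data source
        params={"customer_id": "123-456-7890"},    # placeholder AdWords customer ID
        schedule="every 24 hours",
    )

    created = client.create_transfer_config(
        parent=parent, transfer_config=transfer_config
    )
    print("Created transfer config:", created.name)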

Custom template creation in Google dataflow

Is it possible to create custom template in Google dataflow for live data streaming into cloud SQL?
No. Google Cloud SQL is not a supported I/O transform. See Built-in I/O Transforms.
Edit: You may be able to connect to Google Cloud SQL with Apache Beam's JdbcIO class. However, that class is annotated as @Experimental. See the JdbcIO documentation.
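JdbcIO is part of the Java SDK. If you are on the Python SDK, one common workaround (not an official I/O transform) is to write to Cloud SQL from a custom DoFn with a database driver. A rough sketch, assuming a MySQL Cloud SQL instance reachable at a placeholder host and the pymysql driver:

    # Rough workaround sketch: write pipeline elements to Cloud SQL (MySQL) from a DoFn.
    # Host, credentials, database, and table are placeholders; the pymysql driver is assumed.
    import apache_beam as beam
    import pymysql

    class WriteToCloudSQL(beam.DoFn):
        def start_bundle(self):
            self.conn = pymysql.connect(
                host="10.0.0.5", user="app", password="secret", database="mydb"
            )

        def process(self, row):
            with self.conn.cursor() as cur:
                cur.execute("INSERT INTO events (payload) VALUES (%s)", (row,))
            self.conn.commit()

        def finish_bundle(self):
            self.conn.close()

    with beam.Pipeline() as p:
        (p
         | beam.Create(["event-a", "event-b"])
         | beam.ParDo(WriteToCloudSQL()))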

Do GCP ML Engine online predictions support API keys?

I'm attempting to make online predictions using https://ml.googleapis.com/v1/projects/<project>/models/<model>/versions/<version>:predict?key=<mykey>
I created a service account with full ML Engine Admin access, then generated an unrestricted API key. Many of Google's online docs mention ?key= as a valid auth method for their APIs. Can anyone confirm that this is supported by ML Engine today?
I was able to make the same request using Authorization: Bearer <access token> with my own personal access token, but I do not want to do that. Additionally, the request will ultimately be made via PHP, and the client libraries for PHP do not support ML Engine yet.
Thanks
It's not supported by CMLE Online Prediction.
The list of GCP services that accept API keys is here: https://cloud.google.com/docs/authentication/api-keys
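Since API keys are not accepted, the request needs an OAuth access token, for example from a service account. A minimal sketch in Python with placeholder key file, project, model, version, and payload (the same Authorization: Bearer pattern applies from PHP or any other language):

    # Call the ML Engine online prediction endpoint with a service-account access token.
    # Key file path, project, model, version, and instances payload are placeholders.
    import requests
    from google.auth.transport.requests import Request
    from google.oauth2 import service_account

    credentials = service_account.Credentials.from_service_account_file(
        "key.json", scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    credentials.refresh(Request())  # obtain an OAuth access token

    url = ("https://ml.googleapis.com/v1/projects/my-project/"
           "models/my-model/versions/v1:predict")
    response = requests.post(
        url,
        headers={"Authorization": "Bearer {}".format(credentials.token)},
        json={"instances": [[1.0, 2.0, 3.0]]},   # placeholder instances
    )
    response.raise_for_status()
    print(response.json())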

Use Google Gears geolocation from a python app

I'd like to use the Google geolocation API in my app, written in Python. My problem is that Google provides a JSON interface (easily useable from Python) but from http://code.google.com/p/gears/wiki/GeolocationAPI I see that the API "is published to allow developers to provide their own network location server for use through the Gears API. Google's network location server is only to be used through the Gears API. See section 5.3 of the Gears Terms of Service at [address]."
It is a very strange thing: there is a very convenient JSON API, but I cannot use it directly; I have to go through Google Gears instead. How can I do that from a Python app?
For example, I see that the geolocation service provided by Firefox calls the JSON API directly. Why is Firefox able to do that?
Thanks,
Alessio Palmero Aprosio
Google has deprecated Gears entirely, as the geolocation feature is now standard in modern browsers (for certain values of "standard").
The pylocation module may provide the information you need. It can output the geolocation data in text, JSON, or XML.
