Request to Machine Learning service on Bluemix using Python

I'm struggling to find a way to send data to an SPSS model deployed on the Bluemix Machine Learning service and get the prediction back.
I've run a lot of tests using the requests library in Python and with curl commands, but I haven't succeeded.
I'm too new to Bluemix to understand the service documentation.
Any help is appreciated.
Thanks

I managed to pass data in and receive the prediction with the code as follows:
import requests, urllib3, json

access_key = "INSERT_ACCESS_KEY_HERE"
username = "INSERT_USERNAME_HERE"
password = "INSERT_PASSWORD_HERE"
# The basic-auth headers built here end up unused below: the access key is passed
# in the URL and the bearer token in the Authorization header.
headers = urllib3.util.make_headers(basic_auth='{}:{}'.format(username, password))

# Scoring payload: the table name, the column headers, and one row of input values.
payload_online = {"tablename": "INSERT_TABLENAME_HERE", "header": [INSERT_TABLE_HEADERS_HERE], "data": [[INSERT_DATA_TO_USE_FOR_THE_PREDICTION_HERE]]}

# Scoring endpoint: the deployment's context id plus the service access key.
url = 'https://ibm-watson-ml.mybluemix.net/pm/v1/score/INSERT_CONTEXTID_HERE?accesskey=INSERT_THE_ACCESS_KEY'
header = {'Content-Type': 'application/json', 'Authorization': "INSERT_TOKEN_HERE"}

response_online = requests.post(url, json=payload_online, headers=header)
print(response_online.text)
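As a small follow-up (not part of the original answer): a minimal sketch for checking the HTTP status and decoding the JSON body. The exact shape of the scoring response depends on the deployed model, so only generic parsing is shown.
# Minimal sketch: verify the status code and decode the JSON body.
# The structure of the returned prediction depends on the deployed SPSS model.
if response_online.status_code == 200:
    prediction = response_online.json()
    print(prediction)
else:
    print("Scoring request failed:", response_online.status_code, response_online.text)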

Related

How to authenticate a call to a private google cloud function from a container on a VM on the vpc?

In GCP, I have a Compute Engine VM on the default VPC running Docker. I have a container app (a Python FastAPI web app) that needs to call a private cloud function to post some data using requests.
head = {
    'Content-Type': 'application/json;charset=UTF-8',
    'Accept': 'application/json',
    'Authorization': 'Bearer ' + token
}
resp = requests.post(url,
                     data=msgObj,
                     headers=head,
                     timeout=1.50)
This is all working fine, but the call to the private cloud function needs Bearer Authorization. I can't seem to find a way to programmatically get this token. I can get a token for testing by using
gcloud auth print-identity-token > token.txt on the host VM. This works, but is not acceptable for production use.
Any thoughts?
I have tried to use a token generated by another private cloud function that posts data to the container app.
const auth = new GoogleAuth();
const token = await auth.getAccessToken();
But this token didn't work. (401 Unauthorized)
Getting the token via gcloud (see above) from the VM host works, but it is obviously hardcoded and will not last long - it's only for testing.
This use case (calling a private cloud function from a container) doesn't seem to be covered by anything I can find. There is a video from Google on how to do this, but it was useless - it basically says to use the gcloud approach above.
The answer was actually straightforward once I found it from several clues.
add google-auth==2.6.2 to the requirements file for the container
add the following to your container code
import google.auth.transport.requests
import google.oauth2.id_token

# The audience must be the URL of the cloud function being called.
audience = function_url

# Fetch an identity token for that audience using the environment's default credentials.
auth_req = google.auth.transport.requests.Request()
id_token = google.oauth2.id_token.fetch_id_token(auth_req, audience)
Then use the id_token in the header for the function call
head = {
    'Content-Type': 'application/json;charset=UTF-8',
    'Accept': 'application/json',
    'Authorization': 'Bearer ' + id_token
}
resp = requests.post(audience,
                     data=msgObj,
                     headers=head)
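For context (this goes beyond the original answer): fetch_id_token mints the token from the environment's default credentials (on a GCE VM, the attached service account via the metadata server), and Google-issued ID tokens are short-lived, so a long-running container should fetch a fresh one per call rather than caching it forever. A minimal sketch of wrapping the steps above in a helper; the function name and parameters are illustrative.
import requests
import google.auth.transport.requests
import google.oauth2.id_token

def call_private_function(function_url, payload):
    # Mint a fresh identity token for the target function (the audience).
    auth_req = google.auth.transport.requests.Request()
    token = google.oauth2.id_token.fetch_id_token(auth_req, function_url)
    headers = {
        'Content-Type': 'application/json;charset=UTF-8',
        'Accept': 'application/json',
        'Authorization': 'Bearer ' + token,
    }
    return requests.post(function_url, json=payload, headers=headers, timeout=10)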

Envoy Lua Filter - How to make HTTP request?

I would like to make an HTTP request from my Lua filter to an external server.
According to the Envoy documentation, an HTTP call can be made using request_handle:httpCall:
function envoy_on_request(request_handle)
  -- Make an HTTP call to an upstream host with the following headers, body, and timeout.
  local headers, body = request_handle:httpCall(
    "lua_cluster",
    {
      [":method"] = "POST",
      [":path"] = "/",
      [":authority"] = "lua_cluster"
    },
    "hello world",
    5000)
end
I have created a cluster called lua_cluster in my envoy.yaml file as needed, but the request doesn't reach my server and I'm getting a 400 response.
Possible solution?
When changing the authority header from [":authority"] = "lua_cluster" to [":authority"] = "<cluster's url hostname>", the request arrived at the server and I got a 200 response from the server. Can someone explain this? Is it a valid thing to do?
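A likely explanation (not from the original thread): the first argument to httpCall ("lua_cluster") is what Envoy uses to pick the upstream cluster, while the :authority pseudo-header is forwarded to the upstream as its Host/authority value. Many servers reject requests whose Host they don't recognize, which would account for the 400, so setting :authority to the hostname the upstream expects is a valid and common thing to do.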

ODATA javascript client libraries (what is their value over simple Fetch or AJAX)?

Currently we are using client-side JavaScript fetch to connect to our OData V4 ERP server:
const BaseURL = 'https://pwsepicorapp.com/ERP10.2/api/v1/Erp.BO.JobEntrySvc/'
const fetchJobNum = (async () => {
  let url = BaseURL + 'GetNextJobNum'
  const reply = await fetch(url, {
    method: 'POST',
    mode: 'cors',
    headers: {
      'Accept': 'application/json',
      'Authorization': 'Basic xxxx',
      'x-api-key': '0HXJZgldKZjKIXNgIycD4c4DPqSrzn2UFCPHbiR1aY7IW',
      'Access-Control-Allow-Origin': '*',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({})
  })
  let rsp = await reply.json()
  let job = rsp.parameters.opNextJobNum
  return job
})
And this works fine for us. We recently started looking at JavaScript OData libraries (Apache Olingo, o.js, JayData, or other ones suggested at https://www.odata.org/libraries/).
But what I don't see is an objective guide to help a developer understand what these libraries provide and why.
I.e. I think they read the metadata for the particular OData service. Fine, but what power does that add?
Perhaps my mental block is that we are only:
searching only JSON data
not doing any nested queries (only simple $filter, $select)
just doing simple GET, POST, PATCH
Or perhaps these libraries were needed for functionality that was missing before OData V4.
Can anyone give a succinct description of the features of these libraries and their UNIQUE VALUE PROPOSITIONS (to borrow a venture capital term) to developers? I bet others would find this useful.
Short Answer
You are right. If all you are doing is just these simple operations, you don't need any of these libraries; at the end of the day they are just REST calls that follow some specific conventions (i.e. the OData specification).
Long Answer
The reason we have all these client-side APIs is that OData offers/defines a lot more stuff.
Let's try to go through it with an example. The example I am using is Batch Requests in OData. In the simplest of terms, OData defines a way to club multiple HTTP requests into one. It has a well-defined syntax for it, which looks something like this:
POST /service/$batch HTTP/1.1
Host: host
OData-Version: 4.0
Content-Type: multipart/mixed; boundary=batch_36522ad7-fc75-4b56-8c71-56071383e77b
Content-Length: ###
--batch_36522ad7-fc75-4b56-8c71-56071383e77b
Content-Type: application/http
GET /service/Customers('ALFKI') HTTP/1.1
Host: host
--batch_36522ad7-fc75-4b56-8c71-56071383e77b
Content-Type: application/http
GET /service/Products HTTP/1.1
Host: host
--batch_36522ad7-fc75-4b56-8c71-56071383e77b--
Now there are quite a few things here.
You define a boundary of the form batch_<unique identifier>, separate the individual HTTP requests with it, and when you are done you close the body with the boundary followed by --.
You set that boundary in the Content-Type header and send the additional headers (like Content-Type and Content-Length) properly, as you can see in the example above.
Now, coming back to your original question: sure, you can use a lot of string concatenation in your JavaScript code to generate the right payload, make an AJAX call, and then parse back a similar kind of response, but as an application developer all you care about is batching your GET, POST, PUT and DELETE requests and performing the operations you desire.
Now if you use a client library (the example is generic and might differ from library to library), the code should look something like this:
OData.request({
  requestUri: "http://ODataServer/Myservice.svc/$batch",
  method: "POST",
  data: { __batchRequests: [
    { requestUri: "Customers('ALFKI')", method: "GET" },
    { requestUri: "Products", method: "GET" }
  ]}
}, function (data, response) {
  // success handler
}, undefined, OData.batchHandler);
So, in purely business-proposition terms, libraries like these can save you quite a few man-hours (depending on your application's size) that would otherwise be spent generating the right payload strings or URL strings (in the case of filters, navigation properties, etc.) and debugging through the code when you miss a bracket or misspell a header name. That time can instead go into building the core logic of the application/product, while the library does the standardized, repetitive and boring (opinionated thought) work for you.
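To make that concrete, here is a minimal hand-rolled OData query in Python (chosen only to keep the sketch short); the service URL, entity set, and field names are hypothetical. Everything below - URL construction, query-option encoding, response unwrapping - is exactly the kind of boilerplate a client library takes off your hands.
import requests

# Hypothetical OData V4 service and entity set, for illustration only.
base_url = "https://example.com/odata/v4/"
entity_set = "Jobs"

# Hand-building the query options a client library would normally generate.
params = {
    "$select": "JobNum,PartNum",
    "$filter": "Company eq 'EPIC06'",
    "$top": "10",
}

resp = requests.get(base_url + entity_set, params=params,
                    headers={"Accept": "application/json"},
                    auth=("user", "password"))
resp.raise_for_status()

# OData V4 JSON wraps collection results in a "value" array.
for row in resp.json().get("value", []):
    print(row)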

How do I receive Oanda stream using oanda API?

I'd like to receive trading information using the Oanda v20 API. On the site it says that I can use the REST API to get a stream, but I can't find out how to do it using their library.
Or using JAX-RS?
Thanks.
Try this in Python 3, or put it into a Jupyter notebook:
import requests
import json

headers = {'Content-Type': 'application/json',
           "Authorization": "Bearer <<YOUR ACCESS CODE HERE>>"}

# Streaming prices
baseurl = 'https://stream-fxpractice.oanda.com/v3/accounts/<<your account id here>>/pricing/stream'
payload = {'instruments': 'EUR_USD'}

r = requests.get(baseurl, params=payload, headers=headers, stream=True)
print(r.headers)
print('\n')

for line in r.iter_lines():
    if line:
        print(json.loads(line.decode("utf-8")))
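As a small extension (not part of the original answer): the v20 pricing stream interleaves heartbeat messages with price ticks, so it is common to filter on the message's type field. A hedged sketch, assuming the standard v20 stream field names:
for line in r.iter_lines():
    if not line:
        continue
    msg = json.loads(line.decode("utf-8"))
    # HEARTBEAT messages just keep the connection alive; PRICE messages carry quotes.
    if msg.get("type") == "PRICE":
        print(msg.get("instrument"), msg.get("closeoutBid"), msg.get("closeoutAsk"))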

SOAP request in ruby with authentication/credentials

I am trying to send XML/SOAP data to a server using an HTTP POST request, and am starting by converting a working Perl script to Ruby on Rails. Using some resources, I have written some preliminary code, but I am unsure how to add user authentication (running this code causes a connection timeout error).
My code so far:
http = Net::HTTP.new(' [host] ', [port])
path = '/rest/of/the/path'
data = [ XML SOAP string ]
headers = {
  'Content-Type' => 'application/atom+xml',
  'Host' => ' [host] '
}
resp, data = http.post(path, data, headers)
Adding http.basic_auth 'user', 'pass' gave me a NoMethodError.
Perl code for supplying credentials:
my $ua = new LWP::UserAgent(keep_alive=>1);
$ua->credentials($netloc, '', "$user", "$pwd");
...
my $request = POST ($url, Content_Type=> 'application/atom+xml', Content=> $soap_req);
my $response = $ua->request($request);
The server uses NTLM, so maybe there is a gem you could recommend (like this?). It looks like the Perl script is using a user agent, so I would like to do something similar in Ruby. In summary, how do I add user authentication to my request?
Have you looked at the savon gem, https://github.com/savonrb/savon?
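If NTLM is the sticking point, one note to add (this is my addition, not the original answer's): Savon sits on top of the HTTPI adapter layer, which has NTLM support via the rubyntlm gem, so it should be able to cover both the SOAP envelope handling and the credentials your Perl script supplies through LWP::UserAgent.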
