I have an API built using Spray that handles file uploads.
I am trying to write a test for the upload functionality but I'm not getting anywhere fast. I'm not sure how to structure the test to simulate a file upload.
I have the following test...
"Valid POST Requests should return success" in {
Post("/upload", HttpEntity(MediaTypes.`multipart/form-data`, """{"filename":"a.wav"}""")) ~>
sealRoute(uploadRoute) ~> check {
response.status should be equalTo OK
responseAs[String] === "..."
}
}
Running this produces the following error message...
Content-Type with a multipart media type must have a non-empty 'boundary' parameter' is not equal to ...
This error seems similar to mocking POST/upload requests with Apache Bench, where you have to specify a POST file and the boundary that separates the form items.
I was hoping for something closer to how cURL works.
Either way, can anyone point me in the right direction as to how to correctly structure such a test?
Thanks
So I managed to get this working by cobbling together some code from a variety of posts I found - primarily posts relating to using spray-client to do file uploads.
Probably not the prettiest but works for me! :)
"Valid POST Requests should return success" in {
val file = new File("a.wav")
val httpEntity = HttpEntity(MediaTypes.`multipart/form-data`, HttpData(file)).asInstanceOf[HttpEntity.NonEmpty]
val formFile = FormFile("file", httpEntity)
val mfd = MultipartFormData(Seq(BodyPart(formFile, "file")))
Post("/upload", mfd) ~> sealRoute(uploadRoute) ~> check {
response.status should be equalTo OK
body.contentType.toString() === "application/json; charset=UTF-8"
responseAs[String] === "Success!"
}
}
I have the same issue, or question.
Try adding a boundary by doing:
Post("/upload", HttpEntity(MediaTypes.multipart/form-data.withBoundary("-somerandomboundary"), """{"filename":"a.wav"}""")) ~>
Although, you might face the next bump I face, which is an error saying it requires a start boundary.
I do not know the difference between these two endpoints:
a) /api/sn_chg_rest/v1/change/emergency
b) /api/now/table/change_request?sys_id=-1&sysparm_query=type=emergency
b) Once submitted, the change type switches to "normal".
Issue: I am unable to submit a request of type Emergency, Standard, or Expedited.
Things I have tried:
url = 'https://xxxx.service-now.com/api/now/table/change_request?sys_id=-1&sysparm_query=type=expedited' <<changes to normal; the site only allows edits to emergency or normal once submitted with this link>>
url = 'https://xxxx.service-now.com/api/sn_chg_rest/v1/change/emergency' <<this one seems to work only for emergency and normal; the user is also locked into emergency and normal even when logged in to edit the type manually after submitting via script>>
Outcome of the code below, in conjunction with the "Things I have tried": a CHG#XXX is created, but no matter what value "sysparm_query=type=xxxxxx" is set to (Normal, Expedited, Emergency, or Standard), the type on the ServiceNow site defaults to "Normal" once the code below runs and creates the request using the POST method.
# Need to install requests package for python
# easy_install requests
import requests

# Set the request parameters
url = 'https://xxxx.service-now.com/api/now/table/change_request?sysparm_fields=type'

# Eg. User name="admin", Password="admin" for this code sample.
user = 'admin'
pwd = 'admin'

# Set proper headers
headers = {"Content-Type": "application/json", "Accept": "application/json"}

# Do the HTTP request
response = requests.post(url, auth=(user, pwd), headers=headers, data="{\"type\":\"Emergency\"}")

# Check for HTTP codes other than 200
if response.status_code != 200:
    print('Status:', response.status_code, 'Headers:', response.headers, 'Error Response:', response.json())
    exit()

# Decode the JSON response into a dictionary and use the data
data = response.json()
print(data)
Alternative options for url THAT MAY NOT WORK ('https://xxxx.service-now.com' plus option A, B, or C) are as follows:
A) POST /sn_chg_rest/change/standard/{standard_change_template_id}
B) POST api/sn_chg_rest/change/normal
C) POST Versioned URL /api/sn_chg_rest/{version}/change/emergency
Link for A, B, C above: https://developer.servicenow.com/dev.do#!/reference/api/orlando/rest/change-management-api#changemgmt-POST-emerg-create-chng-req
Resources:
https://docs.servicenow.com/bundle/paris-it-service-management/page/product/change-management/task/t_AddNewChangeType.html
https://developer.servicenow.com/dev.do#!/reference/api/orlando/rest/change-management-api
API_URL="/api/sn_chg_rest/v1/change/emergency"
This might have worked; going to confirm.
Yup, this works! I'm still unable to submit Standard or Expedited, but that might be a setting that needs to be enabled (not sure). Looking into it further. Some progress.
I can't find information about the response content or status code that Zapier will accept from my site during the test step.
I have my site on Laravel and a Zapier app for this site. In my Zapier app I have an action, "Create New Project". I built my "create" according to the example. Everything works except the testing step. I tested with the following Zap:
Trello -> "New card created" trigger. Test successful.
My app -> "Create new project". Test fails with We had trouble sending your test through. Could not handle special, non-standard characters. Please contact support.
Strangely, the project was created successfully. Therefore, I think the problem lies in the response from my site to zapier:
// creates/project.js
// My perform function:
perform: (z, bundle) => {
  const promise = z.request({
    url: `${process.env.BASE_URL}/api/folder`,
    method: 'POST',
    body: JSON.stringify({
      title: bundle.inputData.title,
    }),
    headers: {
      'content-type': 'application/json',
    }
  });
  return promise.then((response) => JSON.parse(response.content));
}
// ZapierController.php
public function addFolder(Request $request)
{
    // Make the record in the DB, etc., then return the same data that came in the request
    return response()->json(['title' => $request['title']]);
}
Expected result - successful test on "Test this step". Can anyone help me?
David here, from the Zapier Platform team.
I already answered your support ticket, but I figured I'd reply here in case anyone else has the same issue.
The root problem becomes clear when you run zapier logs on the CLI:
== Log
Unhandled error: CheckError: Invalid API Response:
- Got a non-object result, expected an object from create ("title")
What happened:
Executing creates.сFolder.operation.perform with bundle
Invalid API Response:
- Got a non-object result, expected an object from create ("title")
Your server can reply with any 2xx status code, but the output needs to be valid JSON. Something like {"title": "my title here"} would certainly work. Users find it more helpful to get info about the project they just created, so the name, id, etc. would be even better.
As for why this surfaced as a character encoding issue, I have no clue. We plan on getting to the bottom of it though!
I understand the whole Dialogflow process and I have a working, deployed bot with 2 different intents. How do I actually get the response from the bot when a user answers questions? (I set the bot's fulfillment to go to my domain.) I'm using a Rails 5 app deployed with Heroku.
Thanks!
If you have already set the GOOGLE_APPLICATION_CREDENTIALS path to the JSON file, you can now test using a Ruby script.
Create a Ruby file, e.g. chatbot.rb.
Write the code below in the file.
project_id = "Your Google Cloud project ID"
session_id = "mysession"
texts = ["hello"]
language_code = "en-US"
require "google/cloud/dialogflow"
session_client = Google::Cloud::Dialogflow::Sessions.new
session = session_client.class.session_path project_id, session_id
puts "Session path: #{session}"
texts.each do |text|
query_input = { text: { text: text, language_code: language_code } }
response = session_client.detect_intent session, query_input
query_result = response.query_result
puts "Query text: #{query_result.query_text}"
puts "Intent detected: #{query_result.intent.display_name}"
puts "Intent confidence: #{query_result.intent_detection_confidence}"
puts "Fulfillment text: #{query_result.fulfillment_text}\n"
end
Insert your project_id. You can find this information in your agent's settings in Dialogflow: click the gear icon to the right of the agent's name in the left menu.
Run the Ruby file in the terminal or in whatever you use to run Ruby files. You should then see the bot replying to the "hello" message you sent.
Note: do not forget to install the google-cloud gem first (for example, gem install google-cloud-dialogflow, which provides the require used above).
I'm not entirely familiar with Dialogflow, but if you want to receive a response when an action occurs in another app, this usually means you need to receive webhooks from them.
A WebHook is an HTTP callback: an HTTP POST that occurs when something happens; a simple event-notification via HTTP POST. A web application implementing WebHooks will POST a message to a URL when certain things happen.
I would recommend checking their fulfillment documentation for an example. Hope this helps you out.
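To make that concrete, here is a minimal sketch of what a Rails 5 fulfillment endpoint could look like. It assumes Dialogflow's V2 webhook request format (the queryResult fields); the FulfillmentController name and the /fulfillment route are hypothetical and not part of the original question:
# config/routes.rb (hypothetical route)
# post '/fulfillment', to: 'fulfillment#create'

# app/controllers/fulfillment_controller.rb
class FulfillmentController < ApplicationController
  # Dialogflow POSTs JSON without a CSRF token, so skip the check for this endpoint
  skip_before_action :verify_authenticity_token

  def create
    query_result = params[:queryResult] || {}
    intent       = query_result[:intent] || {}

    user_text   = query_result[:queryText]
    intent_name = intent[:displayName]

    # Whatever JSON is returned here becomes the bot's reply for the matched intent
    render json: { fulfillmentText: "You said '#{user_text}' (intent: #{intent_name})" }
  end
end
If fulfillment is enabled for the intent, Dialogflow shows the returned fulfillmentText to the user in place of the intent's static response.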
I'm trying to create a table on BigQuery - I have a single dataset and need to use the API to add a table and import data (json.tar.gz) from Cloud Storage. I need to be able to use the Ruby client to automate the whole process. I have two questions:
I have read the docs and tried to get it to upload (code below), but I have not been successful and have absolutely no idea what I'm doing wrong. Could somebody please enlighten me or point me in the right direction?
Once I make the request, how do I know when the job has actually finished? From the API, I presume I'm meant to use a jobs.get request? Having not completed the first part, I have been unable to look at this aspect.
This is my code below.
config = {
  'configuration' => {
    'load' => {
      'sourceUris' => ["gs://person-bucket/person_json.tar.gz"],
      'schema' => {
        'fields' => [
          { 'name' => 'person_id', 'type' => 'integer' },
          { 'name' => 'person_name', 'type' => 'string' },
          { 'name' => 'logged_in_at', 'type' => 'timestamp' },
        ]
      },
      'destinationTable' => {
        'projectId' => "XXXXXXXXX",
        'datasetId' => "personDataset",
        'tableId' => "person"
      },
      'createDisposition' => 'CREATE_IF_NEEDED',
      'maxBadRecords' => 10,
    }
  },
  'jobReference' => { 'projectId' => 'XXXXXXXXX' }
}
multipart_boundary="xxx"
body = "--#{multipart_boundary}\n"
body += "Content-Type: application/json; charset=UTF-8\n\n"
body += "#{config.to_json}\n"
body += "--#{multipart_boundary}\n"
body +="Content-Type: application/octet-stream\n\n"
body += "--#{multipart_boundary}--\n"
param_hash = {:api_method=> bigquery.jobs.insert }
param_hash[:parameters] = {'projectId' => 'XXXXXXXX'}
param_hash[:body] = body
param_hash[:headers] = {'Content-Type' => "multipart/related; boundary=#{multipart_boundary}"}
result = #client.execute(param_hash)
puts JSON.parse(result.response.header)
I get the following error:
{"error"=>{"errors"=>[{"domain"=>"global", "reason"=>"wrongUrlForUpload", "message"=>"Uploads must be sent to the upload URL. Re-send this request to https://www.googleapis.com/upload/bigquery/v2/projects/XXXXXXXX/jobs"}], "code"=>400, "message"=>"Uploads must be sent to the upload URL. Re-send this request to https://www.googleapis.com/upload/bigquery/v2/projects/XXXXXXXX/jobs"}}
From the request header, it appears to be going to the same URI the error says it should go to, and I am quite at a loss for how to proceed. Any help would be much appreciated.
Thank you and have a great day!
Since this is a "media upload" request, there is a slightly different protocol for making the request. The ruby doc here http://rubydoc.info/github/google/google-api-ruby-client/file/README.md#Media_Upload describes it in more detail. I'd use resumable upload rather than multipart because it is simpler.
Yes, as you suspected, the way to know when it is done is to do a jobs.get() to look up the status of the running job. The job id will be returned in the response from jobs.insert(). If you want more control, you can pass your own job id, so that in the event that the jobs.insert() call returns an error you can find out whether the job actually started.
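As a sketch of that polling loop (again assuming the same legacy client objects as in the question, with a placeholder project id), you could pull the job id out of the jobs.insert() response and poll jobs.get() until the job state is DONE:
# Grab the job id that jobs.insert() returned
insert_response = JSON.parse(result.response.body)
job_id = insert_response['jobReference']['jobId']

# Poll jobs.get() until BigQuery reports the job as DONE
loop do
  status_result = @client.execute(
    :api_method => bigquery.jobs.get,
    :parameters => { 'projectId' => 'XXXXXXXX', 'jobId' => job_id })
  job = JSON.parse(status_result.response.body)
  break if job['status']['state'] == 'DONE'   # states are PENDING, RUNNING, DONE
  sleep 2
end

# Once DONE, job['status']['errorResult'] (if present) tells you why the load failed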
Thank you for that. Answer resolved. Please see here:
How to import a json from a file on cloud storage to Bigquery
I think that the line of code in the docs for the resumable uploads section (http://rubydoc.info/github/google/google-api-ruby-client/file/README.md#Media_Upload) should read:
result = client.execute(:api_method => drive.files.insert,
Otherwise, this line will throw an error with 'result' undefined:
upload = result.resumable_upload
I'm building an app which is architected as a Rails server providing RESTful APIs to the client. The Rails server uses RABL. The client is an AngularJS client performing standard $http calls (GETs, PUTs, etc.).
Occasionally my Rails server will produce an error (say, a validation error attached to the object) or no error at all, in which case I would still want to display something to the user: either the error, e.g. "The record did not save because...", or "The record was updated successfully".
I'm trying to map out a pattern on both the Rails side and the Angular/client side to handle this.
As for Rails:
I can certainly pass back a node in each of my RABL files to contain error arrays
I can also return different RABL by checking in the controller before returning
Most suggest using HTTP status codes (which makes sense) as per here (although there doesn't seem to be consistent usage of the codes for something like a validation error).
As for Angular:
I suppose I can write a response interceptor, but I'm not sure how that would fully get fleshed out.
I guess I'm hoping that I don't have to reinvent the wheel here and someone can point me to a pattern that's currently used and suggested (and localized).
I went ahead and implemented what I thought needed to be done. Thanks to digger69 for some help with this.
On the Rails side, I went with using an HTTP status code. As per here, I agreed with using a 400 HTTP status code for validation errors.
In my controllers I now have something like the following:
def create
  my_obj = MyObj.build_with_params(params)
  if my_obj.save
    respond_with(my_obj) # regular RABL response
  else
    respond_with_errors(my_obj.errors)
  end
end
In my application_controller.rb I defined a common method respond_with_errors
# Respond back to the client with an HTTP 400 status plus the errors array
def respond_with_errors(errors)
  render :json => { :errors => errors }, :status => :bad_request
end
Note that the :bad_request symbol is already defined for Rails as per here
On the client side I needed to intercept HTTP calls (not only for validation but for authentication failures too, and probably more). Here is an example of my code in Angular (thanks to this post for the help with that):
var interceptor = ['$rootScope', '$q', function (scope, $q) {
  function success(response) {
    return response;
  }

  function error(response) {
    var status = response.status;
    if (status == 401) { // unauthorized - redirect to login again
      window.location = "/";
    } else if (status == 400) { // validation error - display errors
      alert(JSON.stringify(response.data.errors)); // here really we need to format this, but just showing as an alert
    } else {
      // otherwise reject other status codes
      return $q.reject(response);
    }
  }

  return function (promise) {
    return promise.then(success, error);
  }
}];
$httpProvider.responseInterceptors.push(interceptor);
I can now be consistent with my Rails code and deal with successful returns from HTTP calls on the client. I'm sure I have some more to do, but I think this gives a localized solution.
Use an HTTP response interceptor. I am currently using that successfully in an application.
http://docs.angularjs.org/api/ng.$http
From the documentation:
$provide.factory('myHttpInterceptor', function($q, dependency1, dependency2) {
  return function(promise) {
    return promise.then(function(response) {
      // do something on success
    }, function(response) {
      // do something on error
      if (canRecover(response)) {
        return responseOrNewPromise
      }
      return $q.reject(response);
    });
  }
});
$httpProvider.responseInterceptors.push('myHttpInterceptor');
In my case I created a feedback service, which displays either success or error messages globally. Another option would be to broadcast the responses on the root scope.