I have the following URL
https://development.avalara.net/1.0/tax/get
and would like to POST the following JSON request body
{
    "DocDate": "2011-05-11",
    "CustomerCode": "CUST1",
    "Addresses": [
        {
            "AddressCode": "1",
            "Line1": "435 Ericksen Avenue Northeast",
            "Line2": "#250",
            "PostalCode": "98110"
        }
    ]
}
which should then return a JSON response like
{
    "DocCode": "78b28084-8d9a-477c-9f26-afab1c0c3877",
    "DocDate": "2011-05-11",
    "Timestamp": "2011-05-11 04:26:41",
    "TotalAmount": 10,
    "TotalDiscount": 0,
    "TotalExemption": 0,
    "TotalTaxable": 10,
    "TotalTax": 0.86,
    "TotalTaxCalculated": 0.86,
    "TaxDate": "2011-05-11",
    ...
}
I have tried to use
Ext.Ajax.request
but I get the error
Origin http://localhost is not allowed by Access-Control-Allow-Origin.
which is presumably because the request crosses domains.
So I then tried to use JSONP:
Ext.data.JsonP.request({
    url: 'https://development.avalara.net/1.0/tax/get',
    callbackName: 'test',
    method: 'POST',
    jsonData: '{"DocDate": "2011-05-11", "CustomerCode": "CUST1", "Addresses": [{"AddressCode": "1", "Line1": "435 Ericksen Avenue Northeast", "Line2": "#250", "PostalCode": "98110"}]}',
    success: function (response) {
        // do some successful stuff
        Ext.Msg.alert('Success', response);
    },
    failure: function (response) {
        // complain
        Ext.Msg.alert('Failure', 'fail');
    }
});
But a 404 (Not Found) error is encountered, and the request method is GET instead of POST.
Can anyone tell me how to POST a JSON request body and obtain a JSON response from a different domain?
Thanks in advance.
You have four options:
Use CORS. development.avalara.net would need to set up CORS on the server and allow the domain that the Sencha page is running on.
Reverse-proxy requests through a server on the domain that the Sencha page is running on (see the sketch after this list):
Sencha page (mydomain.com) ---> Web Server (mydomain.com) ---> development.avalara.net
Sencha page (mydomain.com) <--- Web Server (mydomain.com) <--- development.avalara.net
POST the form as a regular form post action, or POST the form inside a hidden iframe:
http://docs.sencha.com/extjs/4.2.1/#!/api/Ext.form.Basic-cfg-standardSubmit
Run the Sencha app inside PhoneGap/Cordova, which does not block cross-domain requests.
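For option 2, a minimal sketch of such a proxy in Node/Express (the route name, port, and omitted Avalara auth headers are illustrative, not part of the original answer):

var express = require('express');
var https = require('https');

var app = express();
app.use(express.json());

// The Sencha page POSTs to /proxy/tax on its own domain; this route
// forwards the JSON body to development.avalara.net and streams the
// response back, so the browser never makes a cross-domain request.
app.post('/proxy/tax', function (req, res) {
    var body = JSON.stringify(req.body);
    var proxyReq = https.request({
        host: 'development.avalara.net',
        path: '/1.0/tax/get',
        method: 'POST',
        headers: {
            // Real Avalara calls also need an Authorization header (omitted here).
            'Content-Type': 'application/json',
            'Content-Length': Buffer.byteLength(body)
        }
    }, function (proxyRes) {
        res.status(proxyRes.statusCode);
        res.set('Content-Type', proxyRes.headers['content-type'] || 'application/json');
        proxyRes.pipe(res);
    });
    proxyReq.on('error', function () { res.status(502).end(); });
    proxyReq.end(body);
});

app.listen(3000);

With this in place, the page can use a plain Ext.Ajax.request against its own origin and the CORS error goes away.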
You cannot do JSON-P with POST requests; JSON-P only supports GET. Your options are:
Use a GET request with JSON-P
Move the server functionality to the same server your ST app is running on
Use something like Cordova, whitelist the server you want to use for your AJAX POST requests, and then use Ext.Ajax.request.
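If the endpoint supported JSON-P, the first option would look roughly like this (a sketch only; it assumes the server reads the callback name from a query parameter and wraps its JSON response in it, which is not something Avalara's endpoint is confirmed to do):

Ext.data.JsonP.request({
    url: 'https://development.avalara.net/1.0/tax/get',
    // JSON-P is always a GET: the data has to travel in the query string.
    params: {
        DocDate: '2011-05-11',
        CustomerCode: 'CUST1'
    },
    callbackKey: 'callback', // query parameter carrying the callback name
    success: function (result) {
        Ext.Msg.alert('Tax', 'TotalTax: ' + result.TotalTax);
    },
    failure: function () {
        Ext.Msg.alert('Error', 'JSON-P request failed');
    }
});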
Is there any way to pass response object values from the first request to the second request as input parameters in a Graph batch request (the 2nd request is dependent on the 1st request - graph/json-batching)?
In the following request, the client is specifying that request 1 should be run first, then request 2.
The 2nd request needs the id from the 1st request's response as a URL variable. What is the way to achieve this?
JSON
{
    "requests": [
        {
            "id": "1",
            "method": "GET",
            "url": "/users/<upn>?$select=id"
        },
        {
            "id": "2",
            "dependsOn": [ "1" ],
            "method": "GET",
            "url": "/users/<id from the 1st request>/presence"
        }
    ]
}
Yes, as @Tiny-wa said, this is not possible as of now. There is already a feature request raised in the Microsoft Graph Feedback Forum; please upvote it so that the product team may implement it in the future.
So for now you need to make two separate requests: make the first request, take the details you need from its response, and use them in the second request.
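For example, a minimal sketch of the two-step flow with fetch (the access token and <upn> are placeholders, and this assumes it runs inside an async function):

// First request: resolve the user's id from the UPN.
const headers = { Authorization: 'Bearer <access_token>' };
const userRes = await fetch(
    'https://graph.microsoft.com/v1.0/users/<upn>?$select=id',
    { headers }
);
const { id } = await userRes.json();

// Second request: use that id in the URL.
const presenceRes = await fetch(
    'https://graph.microsoft.com/v1.0/users/' + id + '/presence',
    { headers }
);
const presence = await presenceRes.json();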
I am using the ugcPost endpoint to generate my shares.
Following the documentation, I make a pre-upload (register) request with multipart settings included, and I receive what looks like the appropriate multipart response. However, there is no AWS key id or anything like it; the headers mentioned in the documentation sample are not the ones I receive after registering the upload as multipart.
The documentation shows this response to a single-part upload request:
{
    "value": {
        "asset": "urn:li:digitalmediaAsset:C5400AQHpR1ANqMWqNA",
        "mediaArtifact": "urn:li:digitalmediaMediaArtifact:(urn:li:digitalmediaAsset:C5400AQHpR1ANqMWqNA,urn:li:digitalmediaMediaArtifactClass:aws-userUploadedVideo)",
        "uploadMechanism": {
            "com.linkedin.digitalmedia.uploading.MediaUploadHttpRequest": {
                "headers": {
                    "Content-Type": "application/octet-stream",
                    "x-amz-server-side-encryption": "aws:kms",
                    "x-amz-server-side-encryption-aws-kms-key-id": "e10ace24-blah-4977-bar-89foo193e2ab"
                },
                "uploadUrl": "https://video-uploads.s3-accelerate.amazonaws.com/C5400AQHpR1ANqMWqNA/aws-userUploadedVideo?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20180120T000018Z&X-Amz-SignedHeaders=content-type%3Bhost%3Bx-amz-server-side-encryption%3Bx-amz-server-side-encryption-aws-kms-key-id&X-Amz-Expires=86400&X-Amz-Credential=AKIAJYU2MA%2F20180120%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=f7c0756a80998786766588878768778768977687d4c687b3f1a0e8"
            }
        }
    }
}
However, I am receiving something like this when I register a multipart upload:
{
    "value": {
        "uploadMechanism": {
            "com.linkedin.digitalmedia.uploading.MultipartUpload": {
                "metadata": "base64_encoded_metadata",
                "partUploadRequests": [
                    {
                        "headers": {
                            "Content-Length": "5242880",
                            "Content-Type": "application/octet-stream"
                        },
                        "urlExpiresAt": 1558459064787,
                        "byteRange": {
                            "lastByte": 5242879,
                            "firstByte": 0
                        },
                        "url": "https://video-uploads-prod.s3-accelerate.amazonaws.com/ABCD/aws-userUploadedVideo?uploadId=xxx&partNumber=1&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=xxx&X-Amz-SignedHeaders=content-length%3Bcontent-type%3Bhost&X-Amz-Expires=86400&X-Amz-Credential=xxx&X-Amz-Signature=xxx"
                    },
                    {...other_parts...}
                ]
            }
        }
    }
}
As I understand it, I am supposed to iterate over the partUploadRequests array and use the headers contained in each 'chunk' entry to perform the multipart upload. But I am not getting the "x-amz-server-side-encryption" headers back from LinkedIn when I register the upload as multipart, so when I attempt to upload a chunk, Amazon returns a signature error:
<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated...(truncated)
I have had no problems making this request outside the realm of multi-part uploads, but when I try to make it multi-part, it fails royally.
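For context, the per-part loop I am attempting looks roughly like this (a simplified sketch; the file handling and the fetch call are illustrative, not my exact code):

const fs = require('fs');

// For each entry in partUploadRequests, PUT the matching byte range to
// its pre-signed URL using exactly the headers LinkedIn returned for it.
async function uploadParts(filePath, partUploadRequests) {
    const file = fs.readFileSync(filePath);
    for (const part of partUploadRequests) {
        const { firstByte, lastByte } = part.byteRange;
        const chunk = file.slice(firstByte, lastByte + 1);
        const res = await fetch(part.url, {
            method: 'PUT',
            headers: part.headers, // only Content-Length and Content-Type are present
            body: chunk
        });
        if (!res.ok) throw new Error('Part upload failed: ' + res.status);
    }
}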
Consistent with their API docs, I'd expect a 4xx error if I didn't have permission to perform this action, but I am not getting such an error; I just don't get the right headers back from LinkedIn.
This is what I send in the body of the initial upload registration request:
{
    "registerUploadRequest": {
        "supportedUploadMechanism": ["MULTIPART_UPLOAD"],
        "fileSize": 123123123,
        "owner": "urn:li:organization:x123123123",
        "recipes": [
            "urn:li:digitalmediaRecipe:feedshare-video"
        ],
        "serviceRelationships": [
            {
                "identifier": "urn:li:userGeneratedContent",
                "relationshipType": "OWNER"
            }
        ]
    }
}
My app has Marketing Developer Platform access, and I am following this documentation: https://learn.microsoft.com/en-us/linkedin/marketing/integrations/community-management/shares/vector-asset-api#register-an-upload
So far I have only been unsuccessful with multi-part uploads: I have been able to publish single-image posts and article posts, but video uploads are eluding me.
Can anyone shed some light on this?
EDIT: Almost immediately after posting this, I found a question (it didn't show up when I searched earlier) that seems to hint that this kind of upload is not yet supported. Can anyone confirm that it is unsupported?
Video Uploads are a restricted feature that is granted to select developers only.
Source:
https://learn.microsoft.com/en-us/linkedin/marketing/integrations/community-management/shares/ugc-post-api
I want to filter and select Project Online data using Enterprise Custom Fields (ECF) in an Azure Logic App. I start with the out-of-the-box Project Online connector to trigger on project publish, then list project and task details.
Using the default connector, I can return project data but it does not contain the ECF data. Looking at the generated code, the Project Online connector uses /_api/ProjectServer. If I make similar requests (outside of Logic Apps) using /_api/Projectdata I get the ECF data.
If I edit the Logic App code with modified URL and fields, I get a 404 error. There seems to be something missing that I cannot see when I make the request with the modified values from within the Logic App.
My guess is the connector is limiting access to /_api/Projectdata (though the trigger itself is using this URL).
The following returns the data I want if I request it straight from the browser:
https://tenant.sharepoint.com/sites/PWA/_api/Projectdata/Projects(guid'project_GUID')
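For comparison, a sketch of the same request made programmatically outside Logic Apps (the bearer token is a placeholder for whatever auth flow applies; in the browser the call simply rides on the signed-in session):

// Direct OData call to the ProjectData endpoint, which does include
// the Enterprise Custom Fields the Project Online connector omits.
const res = await fetch(
    "https://tenant.sharepoint.com/sites/PWA/_api/Projectdata/Projects(guid'project_GUID')",
    {
        headers: {
            Accept: 'application/json',
            Authorization: 'Bearer <access_token>' // placeholder
        }
    }
);
const project = await res.json();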
This is the raw input in the Logic App:
{
    "host": {
        "connection": {
            "name": "/subscriptions/subscription_GUID/resourceGroups/MyRG/providers/Microsoft.Web/connections/projectonline-1"
        }
    },
    "method": "get",
    "path": "/_api/Projectdata/Projects(guid'project_GUID')",
    "queries": {
        "siteUrl": "https://tenant.sharepoint.com/sites/PWA"
    }
}
This is the raw output of the error:
{
    "statusCode": 404,
    "headers": {
        "Access-Control-Allow-Methods": "GET, PUT, PATCH, DELETE, POST",
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Max-Age": "3600",
        "Access-Control-Expose-Headers": "*",
        "Date": "Thu, 20 Sep 2018 16:58:42 GMT",
        "Content-Length": "54",
        "Content-Type": "application/json"
    },
    "body": {
        "statusCode": 404,
        "message": "Resource not found"
    }
}
URLs and GUID have been modified in the examples above.
I have a React app that makes an API call to the endpoint http://localhost:3020/schema/filter. The following is the payload that I pass with the POST request.
let filterParams = {
    "filter": {
        "and": [
            {
                "field": "name",
                "operator": "LIKE",
                "value": "Core"
            },
            {
                "field": "created_at",
                "operator": "GREATER_THAN",
                "value": "05/26/2017"
            },
            {
                "field": "created_at",
                "operator": "LESS_THAN",
                "value": "07/02/2017"
            }
        ]
    }
}
let response = await apiService.post('https://localhost:3020/schema/filter', filterParams)
The API server is a Rails app running the Puma server.
The server console responds with
2017-07-04 12:04:05 +0545: HTTP parse error, malformed request ():
#<Puma::HttpParserError: Invalid HTTP format, parsing fails.
The API server is configured to respond to JSON payloads. Whenever I try to POST the request with the payload, Chrome's browser console responds with
OPTIONS https://localhost:3020/schema/filter net::ERR_SSL_PROTOCOL_ERROR
Similarly, Safari's console returns
Fetch API cannot load https://localhost:3020/schema/filter. An SSL error has occurred and a secure connection to the server cannot be made.
It seems like I am having trouble with SSL or certificates. I tried deleting browser caches, cookies, and the certificate itself. Still no luck.
Any help is appreciated.
This error occurs because your service is posting to https://localhost, but your Rails server is most likely not running with HTTPS.
In your React app, you should do something like:
// Pick the scheme (and host) per environment instead of hard-coding https.
var apiBase = process.env.NODE_ENV === 'production'
    ? 'https://www.productionapp.com'
    : 'http://localhost:3020';
let response = await apiService.post(apiBase + '/schema/filter', filterParams)
Our app (Parse SDK 1.13) has a parse-server (2.2.15) backend, and sometimes (reproducibly) the user's session gets destroyed. After loading some custom objects from parse-server, the app (or, more precisely, the Parse SDK) sends a strange request to parse-server to set the user password, while I am not doing any register/login/password-reset related action:
Request: POST /parse/batch
{"requests": [{
"path": "\/parse\/classes\/_User\/abcu45BFAd",
"method": "PUT",
"body": {
"password": "xyz"
}
}]}
Response:
[{
    "success": {
        "updatedAt": "2016-07-05T23:04:51.041Z"
    }
}]
After that request, the server completely destroys the Session entry in the database without any notice. All subsequent requests fail with error 209 (invalid session token).
I do not intend to set/update the password, and I cannot find where this request comes from. Any hints are highly appreciated, thanks.
Should be fixed with the upcoming 2.2.17 release.