I am trying to use Refit to upload to Azure Blob Storage from a Xamarin.iOS application. This is the interface configuration I am using for Refit:
public interface IMyRefitAPI
{
    [Headers("x-ms-blob-type: BlockBlob")]
    [Put("/{sasTokenKey}")]
    Task<bool> UploadAsync([Body] byte[] content, string sasTokenKey,
        [Header("Content-Type")] string contentType);
}
Where the sasTokenKey parameter looks like this:
"/content-default/1635839001660743375-66f93195-e923-4c8b-a3f1-5f3f9ba9dd32.jpeg?sv=2015-04-05&sr=b&sig=Up26vDxQikFqo%2FDQjRB08YtmK418rZfKx1IHbYKAjIE%3D&se=2015-11-23T18:59:26Z&sp=w"
This is how I am using Refit to call the Azure blob service:
var myRefitApi = RestService.For<IMyRefitAPI>("https://myaccount.blob.core.windows.net");
await myRefitApi.UploadAsync(photoBytes, sasTokenKey, "image/jpeg");
However, I am getting the following error:
Response status code does not indicate success: 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
The SAS URL works fine if I call it directly, like this:
var content = new StreamContent(stream);
content.Headers.Add("Content-Type", "jpeg");
content.Headers.Add("x-ms-blob-type", "BlockBlob");
var task = HttpClient.PutAsync(new Uri(sasTokenUrl), content);
task.Wait();
So basically I am just trying to do the same thing using Refit.
Any idea how to get Refit working with Azure Blob Storage?
Thanks!
[UPDATE] I am now able to upload the bytes to the Azure blob server, but something seems to be wrong with the byte data because I am not able to view the image. Here is the code I am using to convert to a byte array:
byte[] bytes;
using (var ms = new MemoryStream())
{
    stream.Position = 0;
    stream.CopyTo(ms);
    ms.Position = 0;
    bytes = ms.ToArray();
}
[UPDATE] Got it fixed by using a stream instead of a byte array!
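For anyone hitting the same wall, here is a minimal sketch of the stream-based version (assuming Refit's support for Stream bodies, which it sends as StreamContent; photoStream is a placeholder for whatever stream holds your image):

using System.IO;
using System.Threading.Tasks;
using Refit;

public interface IMyRefitAPI
{
    [Headers("x-ms-blob-type: BlockBlob")]
    [Put("/{sasTokenKey}")]
    Task UploadAsync([Body] Stream content, string sasTokenKey,
        [Header("Content-Type")] string contentType);
}

// Usage:
var api = RestService.For<IMyRefitAPI>("https://myaccount.blob.core.windows.net");
await api.UploadAsync(photoStream, sasTokenKey, "image/jpeg");

Note the plain Task return type: Azure's Put Blob returns 201 Created with an empty body, so there is no bool to deserialize.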
I see %2F and %3D in the token, and I'm curious whether Refit is encoding those a second time. Try sending the token without encoding it.
This is an incorrect use of the Authorization header. You use the Authorization header when you want to authorize requests using the account key. If you have a Shared Access Signature, you don't need this header at all, as the authorization information is included in the SAS itself. You can simply use the SAS URL for uploading files.
I have successfully set up an inbound webhook with StrongGrid in .NET Core 3.1.
The endpoint gets called, and I want to parse the values inside the attachment, which is a CSV file.
The code I am using is the following:
var parser = new WebhookParser();
var inboundEmail = await parser.ParseInboundEmailWebhookAsync(Request.Body).ConfigureAwait(false);
await _emailSender.SendEmailAsyncWithSendGrid("info#mydomain.com", "ParseWebhook1", inboundEmail.Attachments.First().Data.ToString());
Please note I am sending an email because I don't know how to debug a webhook with SendGrid, as I am not aware of any CLI.
But this line apparently is not what I am looking for:
inboundEmail.Attachments.First().Data.ToString()
This is what I get in my email:
Id = a3e6a543-2aee-4ffe-a36a-a53k95921998, Tag = HttpMultipartParser.MultipartFormDataParser.ParseStreamAsync, Length = 530 bytes
The CSV I need to parse has three fields: Sku, ProductName, and Quantity. I'd like to get the Sku values.
Any help would be appreciated.
The .Data property contains a Stream, and invoking ToString on a stream object does not return its content. The proper way to read the content of a stream in C# is something like this:
var streamReader = new StreamReader(inboundEmail.Attachments.First().Data);
var attachmentContent = await streamReader.ReadToEndAsync().ConfigureAwait(false);
As far as parsing the CSV, there are literally thousands of projects on GitHub and hundreds on NuGet with the keyword 'CSV'. I'm sure one of them will fit your needs.
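If you just want the Sku values quickly and the file is a plain comma-separated one (header row Sku,ProductName,Quantity, no quoted fields), a minimal hand-rolled sketch like this works; for anything messier, reach for one of those libraries:

using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

static async Task<List<string>> ReadSkusAsync(Stream attachment)
{
    var skus = new List<string>();
    using (var reader = new StreamReader(attachment))
    {
        string line = await reader.ReadLineAsync(); // skip the "Sku,ProductName,Quantity" header
        while ((line = await reader.ReadLineAsync()) != null)
        {
            var fields = line.Split(',');
            if (fields.Length > 0 && fields[0].Length > 0)
                skus.Add(fields[0]); // Sku is the first column
        }
    }
    return skus;
}

// Usage:
var skus = await ReadSkusAsync(inboundEmail.Attachments.First().Data);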
I want the user to be able to upload a file via my application. I don't have DB access; all my data calls are completed via a web service that another person is writing. I needed to secure the web service, so I've consumed it and exposed it via Web API, and added OAuth security.
Now to my problem.
I've written the following.
public Task<FileResult> Post()
{
    if (Request.Content.IsMimeMultipartContent())
    {
        var task = Request.Content.ReadAsByteArrayAsync().ContinueWith(
            o =>
            {
                var result = this.Client.UploadPicture(this.UserId, o.Result);
                if (result.ResultCode == 0)
                {
                    return new FileResult()
                    {
                        Message = "Success",
                        FileId = result.ServerId
                    };
                }
                throw new HttpResponseException(...);
            });
        return task;
    }
    ...
}
I'm pretty much a noob when it comes to Web API and multithreading (I'm not sure why this needs to be handled asynchronously; I'm sure there is a reason, but for now I'd just like a working example and will get to the why later).
My code is loosely based on some R&D and samples I've found on the net, but I haven't come across a scenario quite like the one I need to complete, yet it doesn't seem out of the ordinary: upload a file to the server, and pass the image byte[] to either SQL or another service.
In this line
var result = this.Client.UploadPicture(this.UserId, o.Result);
I'm uploading a byte[] of something....
Then later, when retrieving the byte array of the "image" I uploaded, I get an array of I-don't-know-what: a valid result of something, but it ain't no picture. (The retrieval method works; I've managed to retrieve and view a test image.) Which leads me to believe that the uploaded data is bogus.
How do I get the image byte[]?
MIME multipart is more than just your array of bytes; it also carries metadata and boundary markers. You need to treat it as multipart content and then extract the image byte array out of that.
Filip has a blog post on the subject here.
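For completeness, here is a sketch of what that extraction can look like inside the same controller, using the framework's MultipartMemoryStreamProvider to buffer the parts in memory (Client, UserId, and FileResult are the asker's own types, carried over from the question):

using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;

public async Task<FileResult> Post()
{
    if (!Request.Content.IsMimeMultipartContent())
        throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);

    // Parse the multipart body; each uploaded part becomes one HttpContent.
    var provider = await Request.Content.ReadAsMultipartAsync(new MultipartMemoryStreamProvider());

    // Read only the body of the first part: the raw image bytes,
    // without the multipart boundaries and part headers.
    byte[] imageBytes = await provider.Contents.First().ReadAsByteArrayAsync();

    var result = this.Client.UploadPicture(this.UserId, imageBytes);
    if (result.ResultCode == 0)
        return new FileResult { Message = "Success", FileId = result.ServerId };

    throw new HttpResponseException(HttpStatusCode.InternalServerError);
}

Using async/await here also removes the manual ContinueWith plumbing from the original snippet.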
I've had success creating objects with POST and Content-Type application/xml
I've also had success querying using Content-Type application/x-www-form-urlencoded with a blank request body, which returns all objects of a given type depending on which URI I specify.
I can also get the same to work with something like PageNum=1&ResultsPerPage=1 in the request body and I have figured out how to incorporate that into the signature so I get a valid response.
However, no matter how I format it, I cannot get anything other than a 401 response when I try to use a filter (something basic like Filter=FAMILYNAME :EQUALS: Doe). I've read over the OAuth Core 1.0 Revision A specification on how all parameter names and values are escaped using RFC 3986 percent-encoding, but I feel like I'm missing a step or formatting something incorrectly. I've seen inconsistent information in my searching through Intuit's forums on what exactly the proper format is.
Any help on this would be greatly appreciated. I've been struggling with this for a good week now.
The response I get when trying to use a filter is:
HTTP Status 401 - message=Exception authenticating OAuth; errorCode=003200; statusCode=401
----Update----
I am seeing the same error when I try to use filters with the new IPP developer tools (IPP API Explorer). I'm using the IDS V2 QBO API Explorer. I'm able to use that tool to do a retrieve-all POST, and the response shows all of my customers, but when I try to use a filter I get:
Server Error
401 - Unauthorized: Access is denied due to invalid credentials.
You do not have permission to view this directory or page using the credentials that you supplied.
Any ideas? If I'm getting the same error from the API Explorer tool, it makes me think the problem is something else entirely.
----Final Update----
I have finally had success with filters, and I believe I have figured out what my problem was. I was always suspicious that I was able to get queries with pagination like "PageNum=1&ResultsPerPage=1" to work but could not get something like "Filter=FAMILYNAME :EQUALS: Doe", and I suspected the problem was with the white space in the filter format. What threw me off while tracking this down earlier was that I could not get the filters to work in the IDS V2 QBO API Explorer either, which made me suspect something else was going on. I decided to ignore the API Explorer altogether and focus on why I could get it to work the one way but not the other.
I believe my problem came down to improper encoding of the Filter's value in the signature. That explains the 401 invalid signature errors I was getting.
"Filter=Name :EQUALS: Doe" becomes "Filter=Name%20%3AEQUALS%20%3ADoe" after normalization.
Percent-Encoding that should give "Filter%3DName%2520%253AEQUALS%2520%253ADoe".
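Both passes can be reproduced with Uri.EscapeDataString, which applies the same RFC 3986 rules OAuth specifies (a standalone illustration, not IPP-specific code):

using System;

class FilterEncodingDemo
{
    static void Main()
    {
        // Pass 1 (normalization): encode the parameter value only;
        // the '=' separator stays literal.
        string normalized = "Filter=" + Uri.EscapeDataString("Name :EQUALS: Doe");
        Console.WriteLine(normalized);
        // Filter=Name%20%3AEQUALS%3A%20Doe

        // Pass 2 (signature base string): the whole normalized parameter
        // string is encoded again, so '=' becomes %3D once while the spaces
        // and colons from pass 1 end up double-encoded (%2520, %253A).
        Console.WriteLine(Uri.EscapeDataString(normalized));
        // Filter%3DName%2520%253AEQUALS%253A%2520Doe
    }
}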
In essence you have to "double" encode the blank spaces and the colons, but not the equals sign. I tried many permutations of the encoding, but I believe my mistake was that I was either not double-encoding, or, when I was double-encoding, I was including the '=' sign. Either way breaks your signature. Thanks for everyone's input.
I used an online tool to take me through the steps of properly signing an OAuth request. While going through those steps, I realized my problem was in the steps where you normalize the request parameters and then percent-encode them: I was including the '=' of the filter in the normalization step, which breaks your signature. The tool I used can be found at:
http://hueniverse.com/2008/10/beginners-guide-to-oauth-part-iv-signing-requests/
Thanks for everyone's input.
Do you get a 401 with the same request in the API Explorer?
http://ippblog.intuit.com/blog/2013/01/new-ipp-developer-tool-api-explorer.html
Also, are you using the static base URL or retrieving it at runtime?
https://ipp.developer.intuit.com/0010_Intuit_Partner_Platform/0050_Data_Services/0400_QuickBooks_Online/0100_Calling_Data_Services/0010_Getting_the_Base_URL
If you are using the static base URL, try switching to the runtime base URL to see if you still get the error.
peterl answered one of my questions on here that may also answer yours: I had been trying to put the filters in the body when they should have gone into the header. Here is peterl's code sample for getting all unpaid invoices (open balance greater than 0.00) for a particular customer:
http://pastebin.com/raw.php?i=7VUB6whp
public List<Intuit.Ipp.Data.Qbo.Invoice> GetQboUnpaidInvoices(DataServices dataServices, int startPage, int resultsPerPage, IdType CustomerId)
{
    StringBuilder requestXML = new StringBuilder();
    StringBuilder responseXML = new StringBuilder();
    var requestBody = String.Format("PageNum={0}&ResultsPerPage={1}&Filter=OpenBalance :GreaterThan: 0.00 :AND: CustomerId :EQUALS: {2}", startPage, resultsPerPage, CustomerId.Value);

    HttpWebRequest httpWebRequest = WebRequest.Create(dataServices.ServiceContext.BaseUrl + "invoices/v2/" + dataServices.ServiceContext.RealmId) as HttpWebRequest;
    httpWebRequest.Method = "POST";
    httpWebRequest.ContentType = "application/x-www-form-urlencoded";
    httpWebRequest.Headers.Add("Authorization", GetDevDefinedOAuthHeader(httpWebRequest, requestBody));

    requestXML.Append(requestBody);
    UTF8Encoding encoding = new UTF8Encoding();
    byte[] content = encoding.GetBytes(requestXML.ToString());
    using (var stream = httpWebRequest.GetRequestStream())
    {
        stream.Write(content, 0, content.Length);
    }

    HttpWebResponse httpWebResponse = httpWebRequest.GetResponse() as HttpWebResponse;
    using (Stream data = httpWebResponse.GetResponseStream())
    {
        Intuit.Ipp.Data.Qbo.SearchResults searchResults = (Intuit.Ipp.Data.Qbo.SearchResults)dataServices.ServiceContext.Serializer.Deserialize<Intuit.Ipp.Data.Qbo.SearchResults>(new StreamReader(data).ReadToEnd());
        return ((Intuit.Ipp.Data.Qbo.Invoices)searchResults.CdmCollections).Invoice.ToList();
    }
}
protected string GetDevDefinedOAuthHeader(HttpWebRequest webRequest, string requestBody)
{
    OAuthConsumerContext consumerContext = new OAuthConsumerContext
    {
        ConsumerKey = consumerKey,
        ConsumerSecret = consumerSecret,
        SignatureMethod = SignatureMethod.HmacSha1,
        UseHeaderForOAuthParameters = true
    };

    // URIs not used - we already have OAuth tokens
    OAuthSession oSession = new OAuthSession(consumerContext, "https://www.example.com",
        "https://www.example.com",
        "https://www.example.com");
    oSession.AccessToken = new TokenBase
    {
        Token = accessToken,
        ConsumerKey = consumerKey,
        TokenSecret = accessTokenSecret
    };

    IConsumerRequest consumerRequest = oSession.Request();
    consumerRequest = ConsumerRequestExtensions.ForMethod(consumerRequest, webRequest.Method);
    consumerRequest = ConsumerRequestExtensions.ForUri(consumerRequest, webRequest.RequestUri);
    if (webRequest.Headers.Count > 0)
    {
        ConsumerRequestExtensions.AlterContext(consumerRequest, context => context.Headers = webRequest.Headers);
        if (webRequest.Headers[HttpRequestHeader.ContentType] == "application/x-www-form-urlencoded")
        {
            Dictionary<string, string> formParameters = new Dictionary<string, string>();
            foreach (string formParameter in requestBody.Split('&'))
            {
                formParameters.Add(formParameter.Split('=')[0], formParameter.Split('=')[1]);
            }
            consumerRequest = consumerRequest.WithFormParameters(formParameters);
        }
    }

    consumerRequest = consumerRequest.SignWithToken();
    return consumerRequest.Context.GenerateOAuthParametersForHeader();
}
You can also see my original question here on Stack Overflow: Query for All Invoices With Open Balances using QuickBooks Online (QBO) Intuit Partner Platform (IPP) DevKit.
I am using the Google Data API for .NET (version 1.9) in my application.
I have created a Google Apps account, and I have set the "Users cannot share documents outside this organization" setting under Google Docs.
When I try to share a file outside of the domain (organization) from the Google Docs web UI, I get an error saying the file cannot be shared outside of my domain.
But when I try the same thing from the API, it succeeds: I get a 200 success from the API. When I then try to access the file from the share link, it says 'You need permission to access this resource'. My question is: shouldn't the API return an error? How can I handle this case?
Here is the code that I am using:
DocumentsRequest request = null;
/* request initialization */
string csBatchReqBody =
    "<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
    "<feed xmlns=\"http://www.w3.org/2005/Atom\" xmlns:gAcl=\"http://schemas.google.com/acl/2007\" xmlns:batch=\"http://schemas.google.com/gdata/batch\">" +
    "<category scheme=\"http://schemas.google.com/g/2005#kind\" term=\"http://schemas.google.com/acl/2007#accessRule\"/>" +
    "<entry><id>https://docs.google.com/feeds/default/private/full/document:1DsELtiNwq-ogOrp8cAONdMpGR4gBF79PjijTae-vVNg/acl/user:myusername@mydomain.com</id>" +
    "<batch:operation type=\"query\"/></entry>" +
    "<entry><batch:id>1</batch:id><batch:operation type=\"insert\"/>" +
    "<gAcl:role value=\"reader\"/><gAcl:scope type=\"user\" value=\"myusername@gmail.com\"/></entry></feed>";
string Url = "https://docs.google.com/feeds/default/private/full/document:1DsELtiNwq-ogOrp8cAONdMpGR4gBF79PjijTae-vVNg/acl/batch";
byte[] byteArray = Encoding.ASCII.GetBytes(csBatchReqBody);
MemoryStream inputStream = new MemoryStream(byteArray);
AtomEntry reply = request.Service.Insert(new Uri(Url), inputStream, "application/atom+xml", "");
MemoryStream stream = new MemoryStream();
reply.SaveToXml(stream);
The API actually returns a 400 if you try to share a file outside the domain and the admins have set the "Users cannot share documents outside this organization" flag.
Your code sends a batch request (even if for a single element), so you'd have to check the batch response to notice the error.
Instead, use the following code to share a document with a single user; it assumes that entry is the DocumentEntry you want to share:
AclEntry acl = new AclEntry();
acl.Scope = new AclScope("username@gmail.com", "user");
acl.Role = new AclRole("reader");
acl = service.Insert(new Uri(entry.AccessControlList), acl);
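With this single-entry form, a rejected share should surface as an exception rather than a silently failed batch entry. A sketch of catching it (GDataRequestException is the GData .NET client's request-failure exception; reading the error body from its ResponseString property is my assumption, worth verifying against your library version):

try
{
    acl = service.Insert(new Uri(entry.AccessControlList), acl);
}
catch (Google.GData.Client.GDataRequestException ex)
{
    // The response body should explain why the ACL change was rejected,
    // e.g. the domain policy forbidding sharing outside the organization.
    Console.WriteLine(ex.ResponseString);
}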
I need to store files on Amazon AWS S3, but in order to isolate the user from the AWS authentication I want to go via an ASP page on my site, which the user will be logged into. So:
The application sends the file using the Delphi Indy library TIdHTTP.Put(FileStream) routine to the ASP page, along with some authentication stuff (mine, not AWS) on the querystring.
The ASP page checks the auth details and then, if they are OK, stores the file on S3 using my Amazon account.
The problem I have is: how do I access the data coming in from the Indy PUT using JScript in the ASP page and pass it on to S3? I'm OK with the AWS signing, etc.; it's just the nuts and bolts of connecting the two bits (the incoming request and the outgoing AWS request)...
TIA
R
An HTTP PUT will store the file at the location given in the request: it "requests that the enclosed entity be stored under the supplied Request-URI".
The disadvantage with the PUT method is that if you are on a shared hosting environment it may not be available to you.
So if the web server supports PUT, the file should be available at the given location in the (virtual) file system. The PUT request will be handled by the server and not by ASP:

In the case of PUT, the web server handles the request itself: there is no room for a CGI or ASP application to step in. The only way for your application to capture a PUT is to operate on the low-level, ISAPI filter level.

http://www.15seconds.com/issue/981120.htm
Are you sure you need PUT and cannot use POST, which will send the file to a URL where your ASP script can read it from the request stream?
OK, I've got a bit further with this. The code at the ASP end is:
var PostedDataSize = Request.TotalBytes;
var PostedData = Request.BinaryRead(PostedDataSize);

var PostedDataStream = Server.CreateObject("ADODB.Stream");
PostedDataStream.Open();
PostedDataStream.Type = 1; // adTypeBinary
PostedDataStream.Write(PostedData);
Response.Write("PostedDataStream.Size = " + PostedDataStream.Size + "<br>");

var XML = AmazonAWSPUTRequest(BucketName, AWSDestinationFileID, PostedDataStream);
.....
function AmazonAWSPUTRequest(Bucket, Filename, InputStream)
{
    ....
    XMLHttp.open("PUT", URL + FRequest, false);
    XMLHttp.setRequestHeader(....
    XMLHttp.setRequestHeader(....
    ...
    Response.Write("InputStream.Size = " + InputStream.Size + "<br>");
    XMLHttp.send(InputStream);
}
So I use BinaryRead and write the result to a binary stream. If I write out the size of the stream, I get the size of the file I POSTed from my application, so I reckon the data is in there somewhere. I then call a routine (with the stream as a parameter) which sets up the AWS authentication/signing and does a PUT.
The AWS call returns no errors, and a file of the correct name is created in the right place, but it has a size of zero! InputStream.Size has the same value as the stream parameter passed to the routine, i.e. the size of the original file.
Any ideas?
POSTSCRIPT: Found the problem. It's caught me a few times with streams, this one: when you write data to a stream, don't forget to reset the stream position back to zero before trying to read from the stream again. I.e., just before the line:
XMLHttp.send (InputStream) ;
I needed to add:
InputStream.Position = 0 ;
My thanks for the interest and suggestions.