I was working on my blob download function when I ran into some problems.
I want the user to be able to download a blob, and I want a specific filename on that item when it's downloaded to the user's computer. I also want the user to decide which folder the item should be saved to.
This is my not-so-good-looking code so far:
var fileName = "tid.txt9c6b412a-270a-4f67-8e65-7ce2bf87503d";
var containerName = "uploads";
CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(containerName);
var blob = container.GetBlockBlobReference(fileName);
using (var filestream = System.IO.File.OpenWrite(@"C:\Info\tid.txt9c6b412a-270a-4f67-8e65-7ce2bf87503d"))
{
    blob.DownloadToStream(filestream);
}
fileName here is the blob name.
Is it possible to change the name? The file extension gets all messed up by my GUID.
At the moment the download folder is C:\Info. How would this work when the website is published? How can I let the user decide which folder the item should be saved to? Am I doing this right?
Thank you in advance.
/Filip
How would this work when the website is published?
Slow for the user and expensive for you. You are streaming the BLOB through your app, so you'll bottleneck. Use Shared Access Signatures and download the blob directly from the browser. Use Content-Disposition as part of the URL to have the browser prompt the user with a Save As dialog. See Javascript download a URL - Azure Blob Storage.
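Here is a rough sketch of that approach using the storage client library your snippet already uses. The container and blob names come from your code; the 15-minute expiry and the friendly name "tid.txt" in the Content-Disposition header are my assumptions, so adjust them to whatever you need:
// Assumes a version of Microsoft.WindowsAzure.Storage that supports SAS Content-Disposition headers.
var account = CloudStorageAccount.DevelopmentStorageAccount;
var container = account.CreateCloudBlobClient().GetContainerReference("uploads");
var blob = container.GetBlockBlobReference("tid.txt9c6b412a-270a-4f67-8e65-7ce2bf87503d");

var policy = new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15) // short-lived, read-only link
};
var headers = new SharedAccessBlobHeaders
{
    // The browser saves the file as "tid.txt" instead of the blob name with the GUID.
    ContentDisposition = "attachment; filename=tid.txt"
};

// Redirect the user (or render an <a href>) to this URL; the bytes flow
// directly from blob storage to the browser, not through your web role.
string downloadUrl = blob.Uri + blob.GetSharedAccessSignature(policy, headers);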
Your question: Is it possible to change the name?
The name of the blob and the name on the user's disk are your/his choice. There is no need for them to match, except perhaps to avoid confusion. On the off chance that your user will upload it again (with changes, perhaps?) save some metadata so the original file and the updated file can be related in blob storage.
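For example, when you upload the blob you could record something like this (the "OriginalFileName" key is just an illustration, not a convention):
// Store the user-facing name alongside the blob so a later upload of the same
// file can be related back to it.
blob.Metadata["OriginalFileName"] = "tid.txt";
blob.SetMetadata();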
Once you execute the line:
var blob = container.GetBlockBlobReference(fileName);
... you have told Azure all it needs to know to locate the blob.
In the line:
using (var filestream = System.IO.File.OpenWrite...
... you tell your code where to put the file on the disk. You say it's a website, so this statement will put the file onto the web server's disk, not your user's. To get the file onto the user's disk, you need one more step - download the file from the web server (web role instance) to your user's computer. You can give him control of the folder and file name. Here is the relevant section in MSDN:
Downloading and Uploading Files
Is this download function acceptable? Slow/expensive or is it as good as it gets?
public void DownloadFile(string blobName)
{
    CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(blobName);

    // Buffer the whole blob in the web server's memory before writing it to the response.
    using (MemoryStream memStream = new MemoryStream())
    {
        blob.DownloadToStream(memStream);

        // Tell the browser to save the file under the blob name.
        Response.ContentType = blob.Properties.ContentType;
        Response.AddHeader("Content-Disposition", "attachment; filename=" + blobName);
        Response.AddHeader("Content-Length", blob.Properties.Length.ToString());
        Response.BinaryWrite(memStream.ToArray());
        Response.End();
    }
}
Related
I have a web app in which I can create a CSV file and save it to my C drive.
It works fine when running locally, but once I deploy the application to Azure
I'm getting:
UnauthorizedAccessException: Access to the path 'C:\Tuesday_HH19_MI6.csv' is denied.
How can I allow the website to access and create a file on the user's local drive?
I attached the entire exception log from Azure if this helps.
Thank you
A Web App running in Azure can't directly save a file to your user's local drive, but it can generate the CSV and then prompt them to download it via the browser. You have a few options depending on whether you are trying to send a file that already exists on the filesystem or one you have generated dynamically as a byte array or stream.
Here are some sample controller methods to give you an idea. Your controller could be doing a bunch of stuff before the return statement; these examples are simplified.
For an existing file on the filesystem, use a FileResult:
public FileResult DownloadFile()
{
    // Create the CSV, save it to the filesystem, then return it from that path.
    return File("/Files/MyFile.csv", "text/csv", "MyFileName.csv");
}
If the file is generated in memory and you have it as a byte array:
public FileContentResult DownloadContent()
{
    // Create CSV as byte array
    var myfile = MyMethodtoCreateCSV();
    return new FileContentResult(myfile, "text/csv")
    {
        FileDownloadName = "MyFileName.csv"
    };
}
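And if you generate the CSV into a stream rather than a byte array, a FileStreamResult works the same way. This is just a sketch; the in-memory CSV stands in for whatever you actually produce, and it assumes the usual System.IO / System.Text usings:
public FileStreamResult DownloadStream()
{
    // Build the CSV into any readable stream positioned at the start.
    var stream = new MemoryStream(Encoding.UTF8.GetBytes("Col1,Col2\r\n1,2\r\n"));
    return new FileStreamResult(stream, "text/csv")
    {
        FileDownloadName = "MyFileName.csv"
    };
}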
I am trying to use Refit to upload to Azure Blob Storage from a Xamarin iOS application. This is the interface configuration I am using for Refit:
[Headers("x-ms-blob-type: BlockBlob")]
[Put("/{fileName}")]
Task<bool> UploadAsync([Body]byte[] content, string sasTokenKey,
[Header("Content-Type")] string contentType);
Where the sasTokenKey parameter looks like this:
"/content-default/1635839001660743375-66f93195-e923-4c8b-a3f1-5f3f9ba9dd32.jpeg?sv=2015-04-05&sr=b&sig=Up26vDxQikFqo%2FDQjRB08YtmK418rZfKx1IHbYKAjIE%3D&se=2015-11-23T18:59:26Z&sp=w"
This is how I am using Refit to call the azure blob server:
var myRefitApi = RestService.For<IMyRefitAPI>("https://myaccount.blob.core.windows.net");
myRefitApi.UploadAsync(photoBytes, sasTokenKey, "image/jpeg");
However, I am getting the following error:
Response status code does not indicate success: 403 (Server failed to
authenticate the request. Make sure the value of Authorization header is
formed correctly including the signature.)
The SAS URL is working fine if I call it directly like this:
var content = new StreamContent(stream);
content.Headers.Add("Content-Type", "jpeg");
content.Headers.Add("x-ms-blob-type", "BlockBlob");
var task = HttpClient.PutAsync(new Uri(sasTokenUrl), content);
task.Wait();
So basically I am just trying to do the same thing using Refit.
Any idea how to get Refit working with Azure Blob Storage?
Thanks!
[UPDATE] I am now able to upload the bytes to the Azure blob server, but something seems to be wrong with the byte data because I am not able to view the image. Here is the code I am using to convert to a byte array.
byte[] bytes;
using (var ms = new MemoryStream())
{
    stream.Position = 0;
    stream.CopyTo(ms);
    ms.Position = 0;
    bytes = ms.ToArray();
}
[UPDATE] Got it fixed by using a stream instead of a byte array!
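For anyone landing on this later, here is a rough, unverified sketch of what the stream-based interface might look like; it simply mirrors the interface from the question with a Stream body (Refit wraps Stream bodies in StreamContent), and depending on the Refit version you may still need to keep the SAS out of a route parameter so it doesn't get percent-encoded a second time (see the answer below about %2F and %3D):
// Illustrative only - not a verified fix.
[Headers("x-ms-blob-type: BlockBlob")]
[Put("/{sasTokenKey}")]
Task UploadAsync([Body] Stream content, string sasTokenKey,
    [Header("Content-Type")] string contentType);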
I see %2F and %3D and I'm curious whether Refit is encoding those a second time. Try sending the token without encoding it.
This is an incorrect use of the Authorization header. You use the Authorization header when you want to authorize requests using the account key. If you have a Shared Access Signature, then you really don't need this header, as the authorization information is included in the SAS itself. You can simply use the SAS URL for uploading files.
I have an MVC 5 internet application that uploads files to Azure blob storage. I have currently implemented the code to create a SAS for blobs in a container.
I have an MVC view that lists many images via the <img src> tag for many different objects. Each of these images is the same image. By that I mean they have the same fileName and are in the same container.
Is it possible to check if a blob already has a current SAS URL, and if so, retrieve that SAS URL?
Thanks in advance.
EDIT
Here is my code to get a SAS URL:
public string GetBlobSasUri(string containerName, string fileName)
{
    CloudBlobContainer container = GetCloudBlobContainer(containerName);

    // Get a reference to a blob within the container.
    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);

    // Set the expiry time and permissions for the blob.
    // In this case the start time is specified as a few minutes in the past, to mitigate clock skew.
    // The shared access signature will be valid immediately.
    SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
    sasConstraints.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5);
    sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddHours(4);
    sasConstraints.Permissions = SharedAccessBlobPermissions.Read;

    // Generate the shared access signature on the blob, setting the constraints directly on the signature.
    string sasBlobToken = blob.GetSharedAccessSignature(sasConstraints);

    // Return the URI string for the blob, including the SAS token.
    return blob.Uri + sasBlobToken;
}
If I pass in a containerName of "Test" and a fileName of "Example.png", a SAS is created and returned. If I then pass the same parameters into the function, can I check whether the fileName already has a SAS created for it, and if so, return the same SAS URL?
SAS stands for Shared Access Signature. That means it is just a digital signature identifying a given resource; it is not attached to the resource in any way, and it is not some randomly generated token assigned to the resource. A SAS is verified during the request - evaluated, signature checked, resource checked, action checked. Thus the service itself (the blob service) and the resources in that service (containers, blobs) have no idea whether a SAS exists or not.
Having said that, you have a couple of possible approaches:
Keep a cache of SAS URLs, where the cache key is your blob URI. Configure an appropriate cache expiry time with respect to the SAS lifetime (a sketch follows below).
Create a new SAS for every request (in case you cannot correlate requests).
Here is the MSDN doc on Constructing a Shared Access Signature URI.
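For the first option, here is a rough sketch built on top of your existing GetBlobSasUri method. It assumes System.Runtime.Caching is available and caches for 3 hours, i.e. less than the 4-hour SAS lifetime in your code, so a cached URL never outlives its signature:
// Requires: using System.Runtime.Caching;
public string GetOrCreateBlobSasUri(string containerName, string fileName)
{
    string cacheKey = containerName + "/" + fileName;

    // Return a previously generated SAS URL if one is still cached.
    string cached = MemoryCache.Default.Get(cacheKey) as string;
    if (cached != null)
    {
        return cached;
    }

    // Otherwise generate a new SAS URL with the existing method and cache it
    // for less time than the SAS is valid.
    string sasUrl = GetBlobSasUri(containerName, fileName);
    MemoryCache.Default.Set(cacheKey, sasUrl, DateTimeOffset.UtcNow.AddHours(3));
    return sasUrl;
}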
I'm working on a SharePoint mobile solution where I'm using the web services exposed in server/_vti_bin/sitedata.asmx, server/_vti_bin/Lists.asmx and server/_vti_bin/copy.asmx.
I'm able to successfully fetch the list of sites, document libraries and files using the services defined in server/_vti_bin/sitedata.asmx.
Now I'm actually trying to upload an image file from Photo Albums available in iOS to SharePoint. For this, I tried using the CopyIntoItems web service, where I'm getting the following error response.
<CopyResult ErrorCode="DestinationInvalid" ErrorMessage="The Copy web service method must be called on the same domain that contains the destination url." DestinationUrl="http://xxxxserveripxxxxxx/Shared Documents/image1.png"/>
But I came to know that this service can be used only if the file to be uploaded is also from the same source (i.e., from SharePoint).
Is there any other way to upload a file available on the iPhone to SharePoint?
I also tried the AddAttachment service defined in server/_vti_bin/Lists.asmx, but I'm unable to identify the input parameters, which require a list name and a list item ID.
I'm trying to upload a file to Shared Documents, so I have the list name value (the GUID in curly braces for Shared Documents), but what should the list item ID value be?
These are the details I have for the "Shared Documents" document library.
{
AllowAnonymousAccess = false;
AnonymousViewListItems = false;
BaseTemplate = DocumentLibrary;
BaseType = DocumentLibrary;
DefaultViewUrl = "/Shared Documents/Forms/AllItems.aspx";
Description = "Share a document with the team by adding it to this document library.";
InheritedSecurity = true;
InternalName = "{425F837A-F110-4876-98DE-C92902446935}";
LastModified = "2013-07-26 20:09:58Z";
ReadSecurity = 1;
Title = "Shared Documents";
},
So, I'm using the InternalName value for the listName tag.
What should be the value of listItemID?
Am I going the right way, or is there any other approach to upload a local file from a mobile device to SharePoint?
Thanks
Sudheer
Are you actually calling a URL, or are you using the IP (you x'ed it out and said server IP)? If you don't have Alternate Access Mappings defined for the IP, uploads will fail but GET requests will generally work OK.
I have a file stored in a SharePoint library like
filePathAndName = "http://spstore/sites/appsitename/documentlibraryname/abc.xls"
I need to be able to open the abc.xls file using
byte[] buffer = System.IO.File.ReadAllBytes(filePathAndName);
but I get an error stating "URI formats are not supported". How do I get the full path to the file?
You have to download the file first. For example, you could use a WebClient to send an HTTP request to the remote server and retrieve the file contents:
using (var client = new WebClient())
{
    byte[] file = client.DownloadData("http://spstore/sites/appsitename/documentlibraryname/abc.xls");
    // TODO: do something with the file data
}
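One thing to watch out for: if the SharePoint site requires Windows authentication (which many on-premises sites do - this is an assumption about your setup), you will also need to pass credentials before calling DownloadData:
using (var client = new WebClient())
{
    // Use the current Windows identity, or supply an explicit NetworkCredential.
    client.UseDefaultCredentials = true;
    byte[] file = client.DownloadData("http://spstore/sites/appsitename/documentlibraryname/abc.xls");
    // TODO: do something with the file data
}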