I have an MVC 5 internet application that uploads files to Azure blob storage. I have currently implemented the code to create a SAS for blobs in a container.
I have an MVC view that lists many images via the <img src> tag for many different objects. Each of these images is the same image; by that I mean they have the same fileName and are in the same container.
Is it possible to check whether a blob already has a current SAS URL, and if so, retrieve that SAS URL?
Thanks in advance.
EDIT
Here is my code to get a SAS URL:
public string GetBlobSasUri(string containerName, string fileName)
{
    CloudBlobContainer container = GetCloudBlobContainer(containerName);

    // Get a reference to a blob within the container.
    CloudBlockBlob blob = container.GetBlockBlobReference(fileName);

    // Set the expiry time and permissions for the blob.
    // The start time is specified as a few minutes in the past to mitigate clock skew,
    // so the shared access signature will be valid immediately.
    SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
    sasConstraints.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5);
    sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddHours(4);
    sasConstraints.Permissions = SharedAccessBlobPermissions.Read;

    // Generate the shared access signature on the blob, setting the constraints directly on the signature.
    string sasBlobToken = blob.GetSharedAccessSignature(sasConstraints);

    // Return the URI string for the blob, including the SAS token.
    return blob.Uri + sasBlobToken;
}
If I pass in a containerName of "Test" and a fileName of "Example.png", a SAS is created and returned. If I then pass the same parameters into the function, can I check whether the fileName already has a SAS created for it, and if so, return the same SAS URL?
A SAS is a Shared Access Signature: a digital signature that authorizes access to a given resource. It is not attached to the resource in any way, and it is not some randomly generated token assigned to the resource. A SAS is verified during the request; it is evaluated with a signature check, a resource check, and an action check. Thus the service itself (the blob service) and the resources in that service (containers, blobs) have no idea whether a SAS exists or not.
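To see that nothing is stored server-side, note that GetSharedAccessSignature is a purely local computation (an HMAC over the policy parameters and your account key); no network call is made. A minimal sketch, assuming the blob reference from the question's code and a fixed policy for illustration:

    // No request is sent to the blob service here. With an identical policy
    // (fixed start/expiry), the generated token is byte-for-byte identical.
    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
    {
        SharedAccessStartTime = new DateTime(2020, 1, 1, 0, 0, 0, DateTimeKind.Utc),
        SharedAccessExpiryTime = new DateTime(2020, 1, 1, 4, 0, 0, DateTimeKind.Utc),
        Permissions = SharedAccessBlobPermissions.Read
    };
    string token1 = blob.GetSharedAccessSignature(policy);
    string token2 = blob.GetSharedAccessSignature(policy);
    System.Diagnostics.Debug.Assert(token1 == token2); // deterministic HMAC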
Having said that, you have a couple of possible approaches:
Have a cache that stores the SAS, where the cache key is your blob URI; configure an appropriate cache expiry time with respect to the SAS lifetime (see the sketch below).
Create a SAS for every request (in case you cannot correlate requests).
Here is the MSDN doc on Constructing a Shared Access Signature URI.
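For the caching approach, a minimal sketch assuming System.Runtime.Caching's MemoryCache and the GetBlobSasUri method from the question (GetCachedBlobSasUri is a hypothetical wrapper name); the cache entry is deliberately kept shorter than the 4-hour SAS lifetime so a cached URL never outlives its signature:

    public string GetCachedBlobSasUri(string containerName, string fileName)
    {
        // The cache key plays the role of the blob URI.
        string cacheKey = containerName + "/" + fileName;
        string cachedUri = MemoryCache.Default.Get(cacheKey) as string;
        if (cachedUri != null)
        {
            return cachedUri;
        }
        string sasUri = GetBlobSasUri(containerName, fileName);
        // Expire the cache entry well before the 4-hour SAS itself expires.
        MemoryCache.Default.Set(cacheKey, sasUri, DateTimeOffset.UtcNow.AddHours(3.5));
        return sasUri;
    }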
JIRA provides a way to access the attachments of an issue using basic auth and JWT auth mechanisms, with which we can download those files. We're able to download the files using both authentication mechanisms.
Sample JWT auth:
curl -X GET --url https://{site-name}.atlassian.net/secure/attachment/1001/example.txt -H 'Authorization: jwt '
Issue / Our requirement:
But is there a way to generate a temporarily accessible URL for a JIRA issue's attachments, with the token embedded in the URI itself? I've added an example of that below.
example url:
https://{site-name}.atlassian.net/attachment/1001/example.txt?token={temp_access_token}
When accessing / clicking the above-mentioned URL, the download should start automatically, even if the user isn't logged into their account.
Reason for our requirement:
We're creating a JIRA cloud based service/app, and one of its features is providing access to the user's attachments through our application. Our limitation (cloud service cost) is that we can't download all the huge attachments and store and manage them ourselves. So we're looking for a solution that lets users download directly from JIRA's servers.
In your JWT generation steps, you can define how long the JWT should be valid, and you can attach a JWT to the URL like this: <Jira Base Url>/rest/api/3/...?jwt=.... This way, you could generate a JWT on demand and it will only be valid for the time that you define.
In the Java example on the page Understanding JWT for Connect apps, you can see how they set the expirationTime. Just do the same, on demand. Here is the important part of the code snippet:
public class JWTSample {
    public String createUriWithJwt()
            throws UnsupportedEncodingException, NoSuchAlgorithmException {
        long issuedAt = System.currentTimeMillis() / 1000L;
        long expiresAt = issuedAt + 180L;
        /* ... */
        JwtJsonBuilder jwtBuilder = new JsonSmartJwtJsonBuilder()
                .issuedAt(issuedAt)
                .expirationTime(expiresAt)
                .issuer(key);
        /* ... */
        String jwtToken = /* ... */;
        String apiUrl = baseUrl + apiPath + "?jwt=" + jwtToken;
        return apiUrl;
    }
}
Security concern: I'm explicitly mentioning that you should generate these links on demand because you should not set the expiration time to more than 5-10 minutes (which is already quite high). Otherwise, an attacker just needs to retrieve your generated link (URLs are often logged somewhere) and is then able to retrieve the attachment as well.
Alternative Approach
Since you mentioned you'll build a service/app, why not chain the attachment download through your service? This way you wouldn't have to expose a JWT, which is a potential security threat. For example: you offer a download button in your UI, this sends an HTTP request to your service, and your service downloads the attachment and then forwards it to your user. However, this would not comply with your requirement to give access to unauthenticated users, if that's really what you want to do.
I am trying to use Refit to upload to Azure blob storage from a Xamarin iOS application. This is the interface configuration I am using for Refit:
[Headers("x-ms-blob-type: BlockBlob")]
[Put("/{fileName}")]
Task<bool> UploadAsync([Body]byte[] content, string sasTokenKey,
[Header("Content-Type")] string contentType);
Where the sasTokenKey parameter looks like this:
"/content-default/1635839001660743375-66f93195-e923-4c8b-a3f1-5f3f9ba9dd32.jpeg?sv=2015-04-05&sr=b&sig=Up26vDxQikFqo%2FDQjRB08YtmK418rZfKx1IHbYKAjIE%3D&se=2015-11-23T18:59:26Z&sp=w"
This is how I am using Refit to call the Azure blob service:
var myRefitApi = RestService.For<IMyRefitAPI>("https://myaccount.blob.core.windows.net");
myRefitApi.UploadAsync(photoBytes, sasTokenKey, "image/jpeg");
However, I am getting the following error:
Response status code does not indicate success: 403 (Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.)
The SAS URL works fine if I call it directly like this:
var content = new StreamContent(stream);
content.Headers.Add("Content-Type", "image/jpeg");
content.Headers.Add("x-ms-blob-type", "BlockBlob");
var task = httpClient.PutAsync(new Uri(sasTokenUrl), content);
task.Wait();
So basically I am just trying to do the same thing using Refit.
Any idea how to get Refit working with Azure Blob Storage?
Thanks!
[UPDATE] I am now able to upload the bytes to the Azure blob server, but something seems to be wrong with the byte data because I am not able to view the image. Here is the code I am using to convert to a byte array:
byte[] bytes;
using (var ms = new MemoryStream())
{
    stream.Position = 0;
    stream.CopyTo(ms);
    ms.Position = 0;
    bytes = ms.ToArray();
}
[UPDATE] Got it fixed by using a stream instead of a byte array!
I see %2F and %3D in your SAS token, and I'm curious whether Refit is encoding those a second time. Try sending the token without encoding it.
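To illustrate that failure mode: if an already percent-encoded token is encoded again, the signature no longer matches what the blob service computes. A small sketch, using the signature value from the question's SAS token:

    // An already-encoded SAS signature fragment, encoded a second time:
    string sig = "Up26vDxQikFqo%2FDQjRB08YtmK418rZfKx1IHbYKAjIE%3D";
    string doubleEncoded = Uri.EscapeDataString(sig);
    // "%2F" becomes "%252F" and "%3D" becomes "%253D", which breaks the
    // signature check and produces exactly this kind of 403 response.
    Console.WriteLine(doubleEncoded);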
This is an incorrect use of the Authorization header. You use the Authorization header when you want to authorize requests using the account key. If you have a Shared Access Signature, you really don't need this header, as the authorization information is included in the SAS itself. You can simply use the SAS URL for uploading files.
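For reference, a minimal sketch of that direct approach with a stream body (the fix the asker reported); httpClient (an HttpClient instance) and photoStream are assumed names, while the account URL and sasTokenKey are the values from the question:

    using (var content = new StreamContent(photoStream))
    {
        // MediaTypeHeaderValue is from System.Net.Http.Headers.
        content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
        content.Headers.Add("x-ms-blob-type", "BlockBlob");
        var response = await httpClient.PutAsync(
            new Uri("https://myaccount.blob.core.windows.net" + sasTokenKey), content);
        response.EnsureSuccessStatusCode(); // 201 Created on success
    }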
I was working on my download blob function when I ran into some problems.
I want the user to be able to download a blob, and I want a specific filename on that item when it's downloaded to the user's computer. I also want the user to decide which folder the item should be saved to.
This is my not-so-good-looking code so far:
var fileName = "tid.txt9c6b412a-270a-4f67-8e65-7ce2bf87503d";
var containerName = "uploads";
CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(containerName);
var blob = container.GetBlockBlobReference(fileName);
using (var filestream = System.IO.File.OpenWrite(#"C:\Info\tid.txt9c6b412a-270a-4f67-8e65-7ce2bf87503d"))
{
blob.DownloadToStream(filestream);
}
fileName = the blob name.
Is it possible to change the name? The file extension gets all messed up by my GUID.
At the moment the download folder is C:\Info. How would this work when the website is published? How can I let the user decide which folder the item should be saved to? Am I doing this right?
Thank you in advance.
/Filip
How would this work when the website is published?
Slow for the user and expensive for you. You are streaming the BLOB through your app, so you'll bottleneck. Use Shared Access Signatures and download the blob directly from the browser. Use Content-Disposition as part of the URL to have the browser prompt the user with a Save As dialog. See Javascript download a URL - Azure Blob Storage.
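A minimal sketch of that approach, assuming the same WindowsAzure.Storage SDK as the question (SAS-level header overrides require service version 2013-08-15 or later); SharedAccessBlobHeaders lets the SAS override Content-Disposition so the browser offers a friendly file name ("tid.txt" here is just an example):

    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(30)
    };
    SharedAccessBlobHeaders headers = new SharedAccessBlobHeaders
    {
        // The browser will prompt Save As with this name, whatever the blob is called.
        ContentDisposition = "attachment; filename=tid.txt"
    };
    string sasUrl = blob.Uri + blob.GetSharedAccessSignature(policy, headers);
    // Hand sasUrl to the browser (redirect or <a href="...">); the user picks the folder.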
Your question: Is it possible to change the name?
The name of the blob and the name on the user's disk are your/his choice. There is no need for them to match, except perhaps to avoid confusion. On the off chance that your user will upload it again (with changes, perhaps?), save some metadata so the original file and the updated file can be related in blob storage.
Once you execute the line:
var blob = container.GetBlockBlobReference(fileName);
... you have told Azure all it needs to know to locate the blob.
In the line:
using (var filestream = System.IO.File.OpenWrite...
... you tell your code where to put the file on the disk. You say it's a website, so this statement will put the file onto the web server's disk, not your user's. To get the file onto the user's disk, you need one more step: download the file from the web server (web role instance) to your user's computer. You can give him control of the folder and file name. Here is the relevant section on MSDN:
Downloading and Uploading Files
Is this download function acceptable? Is it slow/expensive, or is it as good as it gets?
public void DownloadFile(string blobName)
{
    CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(blobName);
    MemoryStream memStream = new MemoryStream();
    blob.DownloadToStream(memStream);
    Response.ContentType = blob.Properties.ContentType;
    Response.AddHeader("Content-Disposition", "attachment; filename=" + blobName);
    Response.AddHeader("Content-Length", blob.Properties.Length.ToString());
    Response.BinaryWrite(memStream.ToArray());
    Response.End();
}
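For comparison, a sketch of the same function without the in-memory buffer: writing the blob straight to the response stream avoids holding the whole file in a MemoryStream (same CloudStorageServices helper as above):

    public void DownloadFile(string blobName)
    {
        CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
        CloudBlockBlob blob = blobContainer.GetBlockBlobReference(blobName);
        blob.FetchAttributes(); // populate Properties for the headers below
        Response.ContentType = blob.Properties.ContentType;
        Response.AddHeader("Content-Disposition", "attachment; filename=" + blobName);
        Response.AddHeader("Content-Length", blob.Properties.Length.ToString());
        blob.DownloadToStream(Response.OutputStream); // stream directly, no buffer
        Response.End();
    }

It still routes every byte through the web server, though; the SAS-plus-Content-Disposition approach above removes the server from the download path entirely.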
I'm working on a SharePoint mobile solution where I'm using the web services exposed in server/_vti_bin/sitedata.asmx, server/_vti_bin/Lists.asmx and server/_vti_bin/copy.asmx.
I'm able to successfully fetch the list of sites, document libraries and files using the services defined in server/_vti_bin/sitedata.asmx.
Now I'm actually trying to upload an image file from the Photo Albums available in iOS to SharePoint. For this, I tried using the CopyIntoItems web service, where I'm getting the following error response:
<CopyResult ErrorCode="DestinationInvalid" ErrorMessage="The Copy web service method must be called on the same domain that contains the destination url." DestinationUrl="http://xxxxserveripxxxxxx/Shared Documents/image1.png"/>
But I came to know that this service can be used only if the file to be uploaded is also from the same source (i.e., from SharePoint).
Is there any other way to upload a file available on an iPhone to SharePoint?
I also tried the addAttachment service defined in server/_vti_bin/Lists.asmx, but I'm unable to identify the input parameters, which require a list name and a list item ID.
I'm trying to upload a file to Shared Documents, so for the list name I have the value in curly braces for Shared Documents, but what should the List Item Id value be?
These are the details I've with regard to "Shared Documents" document library.
{
    AllowAnonymousAccess = false;
    AnonymousViewListItems = false;
    BaseTemplate = DocumentLibrary;
    BaseType = DocumentLibrary;
    DefaultViewUrl = "/Shared Documents/Forms/AllItems.aspx";
    Description = "Share a document with the team by adding it to this document library.";
    InheritedSecurity = true;
    InternalName = "{425F837A-F110-4876-98DE-C92902446935}";
    LastModified = "2013-07-26 20:09:58Z";
    ReadSecurity = 1;
    Title = "Shared Documents";
},
So, I'm using the InternalName value for the listName tag.
What should the value of listItemID be?
Am I going about this the right way, or is there another approach to upload a local file from a mobile device to SharePoint?
Thanks
Sudheer
Are you actually calling a URL, or are you using the IP (you x'ed it out and said server IP)? If you don't have Alternate Access Mappings defined for the IP, uploads will fail, but GET requests will generally work OK.
There exists a DocsClient.get_resource_by_id function to get the document entry for a single ID. Is there a similar way to obtain (in a single call) multiple document entries given multiple document IDs?
My application needs to efficiently download the content of multiple files for which I have the IDs. I need the document entries to access the appropriate download URLs (I could construct the URLs manually, but this is discouraged in the API docs). It is also advantageous to have the document type; in the case of spreadsheets, the document entry is required in order to access individual worksheets.
Overall I'm trying to reduce I/O waits, so if there's a way I can bundle the doc ID lookup, it will save me some I/O expense.
[Edit] Backporting AddQuery to gdata v2.0 (from Alain's solution):
import gdata.data
import gdata.docs.client
from gdata.docs.client import DocsClient

client = DocsClient()
# ...
request_feed = gdata.data.BatchFeed()
request_entry = gdata.data.BatchEntry()
request_entry.batch_id = gdata.data.BatchId(text=resource_id)
request_entry.batch_operation = gdata.data.BATCH_QUERY
request_feed.add_batch_entry(entry=request_entry, batch_id_string=resource_id,
                             operation_string=gdata.data.BATCH_QUERY)
batch_url = gdata.docs.client.RESOURCE_FEED_URI + '/batch'
rsp = client.batch(request_feed, batch_url)
rsp.entry is a collection of BatchEntry objects, which appear to refer to the correct resources but which differ from the entries I'd normally get via client.get_resource_by_id().
My workaround is to convert the gdata.data.BatchEntry objects into gdata.docs.data.Resource objects like so:
entry = atom.core.parse(entry.to_string(), gdata.docs.data.Resource)
You can use a batch request to send multiple "GET" requests to the API using a single HTTP request.
Using the Python client library, you can use this code snippet to accomplish that:
def retrieve_resources(gd_client, ids):
    """Retrieve Documents List API Resources using a batch request.

    Args:
      gd_client: authorized gdata.docs.client.DocsClient instance.
      ids: Collection of resource ids to retrieve.

    Returns:
      ResourceFeed containing the retrieved resources.
    """
    # Feed that holds the batch request entries.
    request_feed = gdata.docs.data.ResourceFeed()
    for resource_id in ids:
        # Entry that holds the batch request.
        request_entry = gdata.docs.data.Resource()
        self_link = gdata.docs.client.RESOURCE_SELF_LINK_TEMPLATE % resource_id
        request_entry.id = atom.data.Id(text=self_link)
        # Add the request entry to the batch feed.
        request_feed.AddQuery(entry=request_entry, batch_id_string=resource_id)
    # Submit the batch request to the server.
    batch_url = gdata.docs.client.RESOURCE_FEED_URI + '/batch'
    response_feed = gd_client.Post(request_feed, batch_url)
    # Check the batch request's status.
    for entry in response_feed.entry:
        print '%s: %s (%s)' % (entry.batch_id.text,
                               entry.batch_status.code,
                               entry.batch_status.reason)
    return response_feed
Make sure to sync to the latest version of the project repository.