How do I read the size of a directory in SoftLayer cloud storage using the jclouds Swift API? - jclouds

I am trying to set a per-directory quota, say about 5 MB, and hence I need to read the total size of the blobs currently in a directory.
I am able to get the size of an individual blob using the code below:
if (containerName != null && objectName != null) {
    BlobMetadata metaData = blobStore.blobMetadata(containerName, objectName);
    if (metaData != null) {
        userMetaData = metaData.getUserMetadata();
        // Read the content length inside the null check, so a missing blob
        // does not cause a NullPointerException
        ContentMetadata contMetadata = metaData.getContentMetadata();
        System.out.println("Object Size " + contMetadata.getContentLength());
    }
}
I just haven't been able to find out whether I can get the size of all the blobs in a directory without looping through each blob's metadata.

The container will have that info:
http://jclouds.apache.org/reference/javadoc/1.9.x/org/jclouds/openstack/swift/v1/domain/Container.html#getBytesUsed()
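For reference, a minimal sketch of reading that value with the jclouds 1.9.x Swift API; the endpoint, credentials, region, and container name below are placeholders for your own SoftLayer values. Note that getBytesUsed() reports the total for the entire container, so a strict per-directory quota would still require listing with a prefix and summing sizes.

import org.jclouds.ContextBuilder;
import org.jclouds.openstack.swift.v1.SwiftApi;
import org.jclouds.openstack.swift.v1.domain.Container;

public class ContainerSize {
    public static void main(String[] args) throws Exception {
        // Endpoint, credentials, region, and container name are placeholders;
        // substitute the values for your own account.
        try (SwiftApi swiftApi = ContextBuilder.newBuilder("openstack-swift")
                .endpoint("https://identity.example.com/v2.0")
                .credentials("tenant:user", "password")
                .buildApi(SwiftApi.class)) {
            Container container = swiftApi.getContainerApi("region-1").get("my-container");
            if (container != null) {
                // Total size in bytes of every object in the container
                System.out.println("Bytes used: " + container.getBytesUsed());
            }
        }
    }
}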

Related

Core Data Binary Data Allow External Storage

I am storing an image in Core Data with external storage, as shown in the screenshot below:
The problem is that when I retrieve the couponImage property I never get nil, even if I didn't save any couponImage. The couponImage property is of type NSData?. When I print it to the console it prints the following:
External Resource path = nil // This is where there is no image
External Resource path = EDCS-EDAS-23eD-EDRF-EWQ3-234F
My question is: how do you differentiate between an image that does not exist and one that does?

Where to save files from Firefox add-on?

I am working on a Firefox add-on which, among other things, generates thumbnails of websites for the add-on's own use. So far I've been storing them by their image data URL using simple-storage. There are two problems with this: the storage space is limited, and sending very long strings around doesn't seem optimal (I assume the browser has optimized ways of loading image files, but maybe not data URLs). I think it shouldn't be a problem to save the files to disk; the question is where, though. I googled quite a bit and could not find anything. Is there a natural place for this? Are there any restrictions?
As of Firefox 32, the place to store data for your add-on is supposed to be: [profile]/extension-data/[add-on ID]. This was established by the resolution of "Bug 915838 - Provide add-ons a standard directory to store data, settings". There is a follow-on bug, "Bug 952304 - (JSONStore) JSON storage API for addons to use in storing data and settings" which is supposed to provide an API for easy access.
For the Addon-SDK, you can obtain the addon ID (which you define in package.json) with:
let self = require("sdk/self");
let addonID = self.id;
For XUL and restartless extensions, you should be able to get the ID of your addon (which you define in the install.rdf file) with:
Components.utils.import("resource://gre/modules/Services.jsm");
let addonID = Services.appInfo.ID
You can then do the following to generate a URI for a file in that directory:
// Get the path to the user's profile directory
let userProfileDirectoryPath = Components.classes["@mozilla.org/file/directory_service;1"]
    .getService(Components.interfaces.nsIProperties)
    .get("ProfD", Components.interfaces.nsIFile).path;

/**
 * Generate a URI for a file in the extension's data directory under the
 * user's profile directory.
 */
function generateURIForFileInPrefExtensionDataDirectory(fileName) {
    // Account for the path separator being OS dependent
    let toReturn = "file://" + userProfileDirectoryPath.replace(/\\/g, "/");
    return toReturn + "/extension-data/" + addonID + "/" + fileName;
}
The object myExtension.addonData is a copy I keep of the bootstrap data that is provided to the entry points in bootstrap.js.

Resize / Convert an image from a stream with ImageResizer

I'm trying to figure out how to convert an image from a stream with ImageResizer (http://imageresizing.net/).
I have tried something like this:
Stream s = WebRequest.Create("http://example.com/resources/gfx/unnamed.webp").GetResponse().GetResponseStream();
ImageBuilder.Current.Build(s, "~/resources/gfx/photo3.png", new ResizeSettings("format=png"));
But I just get the error:
"File may be corrupted, empty, or may contain a PNG image with a single dimension greater than 65,535 pixels."
When I do
using (Stream output = File.OpenWrite(Server.MapPath("~/resources/gfx/test.webp")))
using (Stream input = WebRequest.Create("http://example.com/resources/gfx/unnamed.webp").GetResponse().GetResponseStream()) {
input.CopyTo(output);
}
ImageBuilder.Current.Build("~/resources/gfx/test.webp", "~/resources/gfx/photo3.png",
new ResizeSettings("format=png"));
it works fine. Am I missing something here?
It's possible that 'output' has not been flushed to disk. .NET 4+ doesn't guarantee the file's actually written to disk just because you disposed the stream.
I assume you have the ImageResizer.Plugins.WebP plugin installed?

How to download all files in an Azure Container Directory?

I have an ASP.NET app from which I upload files to Azure blob storage. I know that Azure doesn't create structural paths within containers, just blobs, but you can emulate directories by putting a "/" in the URI.
For example, I'd upload a list of files and the URIs look like this:
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName01.jpg
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName02.jpg
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName03.jpg
My download method:
public RemoteFile Download(DownloadRequest request)
{
    var fileFinal = string.Format("{0}/{1}/{2}", request.IDProtocol, request.IDDocument, request.FileName);
    var blobBlock = InitializeDownload(fileFinal);
    if (!blobBlock.Exists())
    {
        throw new FileNotFoundException("Error");
    }
    var stream = new MemoryStream();
    blobBlock.DownloadToStream(stream);
    return File(request.FileName);
}

private CloudBlob InitializeDownload(string uri)
{
    var blobBlock = _blobClient.GetBlobReference(uri);
    return blobBlock;
}
This way, I'm getting just one file, but I need to see and download all the files inside http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/
Thanks
Adding more details: you will need to use one of the listing APIs provided by the client library: CloudBlobContainer.ListBlobs(), CloudBlobContainer.ListBlobsSegmented(), or CloudBlobContainer.ListBlobsSegmentedAsync() (and their various overloads). You can specify the directory prefix, and the service will only enumerate blobs matching that prefix. You can then download each blob. You may also want to look at the 'useFlatBlobListing' argument, depending on your scenario.
http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.blob.cloudblobcontainer.listblobs.aspx
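As a rough sketch of that listing pattern, here it is with the Azure Storage SDK for Java (the .NET calls named above behave analogously); the connection string, container name, prefix, and local download folder are placeholders:

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;
import com.microsoft.azure.storage.blob.ListBlobItem;

public class DownloadDirectory {
    public static void main(String[] args) throws Exception {
        // Connection string, container name, and prefix are placeholders.
        CloudStorageAccount account =
                CloudStorageAccount.parse(System.getenv("AZURE_STORAGE_CONNECTION_STRING"));
        CloudBlobContainer container =
                account.createCloudBlobClient().getContainerReference("myprotocolid-01");
        // useFlatBlobListing = true enumerates every blob under the prefix
        // instead of stopping at virtual sub-directories.
        for (ListBlobItem item : container.listBlobs("MyDocumentID-01/", true)) {
            if (item instanceof CloudBlockBlob) {
                CloudBlockBlob blob = (CloudBlockBlob) item;
                String name = blob.getName().substring(blob.getName().lastIndexOf('/') + 1);
                // One HTTP download per blob; assumes the downloads/ folder exists
                blob.downloadToFile("downloads/" + name);
            }
        }
    }
}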
In addition AzCopy (see http://blogs.msdn.com/b/windowsazurestorage/archive/2012/12/03/azcopy-uploading-downloading-files-for-windows-azure-blobs.aspx) also supports this scenario of downloading all blobs in a given directory path.
Since each blob is a separate web resource, the function above will download only one file. One thing you could do is list all blobs using the listing logic above, download those blobs to your server first, zip them, and then return that zip file to your end user.
Use AzCopy; the current version supports a lot of scenarios, including this one.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
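For example, with AzCopy v10 the whole virtual directory can be pulled down in one command (the account, container, path, SAS token, and destination below are placeholders):

azcopy copy "https://myaccount.blob.core.windows.net/MyProtocolID-01/MyDocumentID-01/?<SAS>" "C:\local\downloads" --recursive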

Upload all files from local storage to Azure Blob Storage

I am currently struggling to upload multiple files from local storage to Azure Blob Storage. I was wondering if anyone could help me; below is the code I was previously using to upload a single zip file.
private void SaveZip(string id, string fileName, string contentType, byte[] data)
{
    // Create a blob in container and upload image bytes to it
    var blob = this.GetContainer().GetBlobReference(fileName);
    blob.Properties.ContentType = contentType;
    // Create some metadata for this image
    var metadata = new NameValueCollection();
    metadata["Id"] = id;
    metadata["Filename"] = fileName;
}

SaveZip(
    Guid.NewGuid().ToString(),
    zipFile.FileName,
    zipFile.PostedFile.ContentType,
    zipFile.FileBytes);
Thanks, Sami.
It's quite straightforward with Set-AzureStorageBlobContent from Azure Storage PowerShell:
ls -File -Recurse | Set-AzureStorageBlobContent -Container upload
MSDN documentation : http://msdn.microsoft.com/en-us/library/dn408487.aspx
I don't think there are any built-in methods you can use to upload multiple files to blob storage. What you can do is upload them one by one, or in parallel.
If you're just starting to work with Blob Storage, I'd encourage you to take a look at the "How to" article we've published. Specifically, the section on "How to Upload a Blob into a Container" should be helpful. Beyond that, Shaun is correct - there is no built-in support in the StorageClient library for uploading multiple files at once, but you can certainly upload them one-by-one, as sketched below.
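To illustrate the one-by-one approach, a rough sketch using the Azure Storage SDK for Java (the same loop translates directly to the .NET client); the connection string, container name, and local folder are placeholders:

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;
import java.io.File;

public class UploadFolder {
    public static void main(String[] args) throws Exception {
        // Connection string, container name, and folder path are placeholders.
        CloudStorageAccount account =
                CloudStorageAccount.parse(System.getenv("AZURE_STORAGE_CONNECTION_STRING"));
        CloudBlobContainer container =
                account.createCloudBlobClient().getContainerReference("upload");
        File[] files = new File("C:/local/folder").listFiles();
        if (files != null) {
            for (File file : files) {
                if (file.isFile()) {
                    // One blob per local file, uploaded sequentially
                    CloudBlockBlob blob = container.getBlockBlobReference(file.getName());
                    blob.uploadFromFile(file.getAbsolutePath());
                }
            }
        }
    }
}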
If your need is just to get it done, and not to make an app out of it, you should consider checking out Cloud Storage Studio.
Like CodeThug said, "You never do anything with the byte array".
You have to upload the data stream to the blob and you are done.
