Can the data at UseMachineKeyStore be backed up and recovered? (RSACryptoServiceProvider)

I have the following code:
const int PROVIDER_RSA_FULL = 1;
const string CONTAINER_NAME = "Example";

var cspParams = new CspParameters(PROVIDER_RSA_FULL);
cspParams.KeyContainerName = CONTAINER_NAME;
cspParams.Flags = CspProviderFlags.UseMachineKeyStore;
cspParams.ProviderName = "Microsoft Strong Cryptographic Provider";
RSACryptoServiceProvider rsa = new RSACryptoServiceProvider(cspParams);
As I understand it, a key pair is generated automatically the first time the container is opened, and any later instantiation with the KeyContainerName "Example" references that same key pair.
I'm using a dedicated host. I want to make sure our hosting company is aware that this key material is important, backs it up, and doesn't lose it, because otherwise all the information I have encrypted and stored in the database will be unrecoverable.
I can't find anything in MSDN about how this works behind the scenes.

The key containers are stored in the file system. Machine keys live under Documents and Settings\All Users\Application Data\Microsoft\Crypto and its subdirectories (C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys on Vista and later).
Be aware that you cannot simply copy those key files to another machine to "reuse" the keys, and they will not survive a machine rebuild!
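If the key material itself must survive a rebuild or a move to new hardware, one option is to export the key pair yourself and store the export securely offsite. A minimal sketch, assuming the same container as above (the backup path is a placeholder, and the exported blob contains the private key, so protect it accordingly):

using System.IO;
using System.Security.Cryptography;

var cspParams = new CspParameters(1) // PROVIDER_RSA_FULL
{
    KeyContainerName = "Example",
    Flags = CspProviderFlags.UseMachineKeyStore
};

// Export the full key pair (including the private key) for backup.
using (var rsa = new RSACryptoServiceProvider(cspParams))
{
    File.WriteAllBytes(@"C:\backup\example-key.bin", rsa.ExportCspBlob(true));
}

// On the rebuilt machine: recreate the container and import the blob.
using (var restored = new RSACryptoServiceProvider(cspParams))
{
    restored.ImportCspBlob(File.ReadAllBytes(@"C:\backup\example-key.bin"));
    restored.PersistKeyInCsp = true; // keep the imported key in the container
}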

Related

How to read a config file in an Electron app

It's my first time using Electron and Node.js. I've built a small app that reads some records from a database and updates them. Everything is working fine. I have a config file with the database credentials, but when I build a portable Windows app, I cannot figure out how to read the config file that I would like to place next to the exe. I would like easy access to the file so I can run the same app against different databases.
Can anyone tell me whether what I want is possible, and how? I already tried to get the exe location but couldn't. I also read a lot of topics here, but nothing seems to solve my problem (I might be doing something wrong).
I'm using electron-builder to build my app.
Thanks in advance.
Edit #1
My config file is:
{
    "user": "X",
    "password": "X",
    "server": "X",
    "database": "X",
    "options": {
        "trustedconnection": true,
        "enableArithAbort": true,
        "trustServerCertificate": true
    }
}
This is what I have, and it works when I run the project with npm start:
const fs = require('fs');
const path = require('path');

const configRootPath = path.resolve(__dirname, 'dbConfig.json');
const dbConfig = JSON.parse(fs.readFileSync(configRootPath, { encoding: 'utf-8' }));
However, when I build it, the app looks for the file in a location different from the one where the executable is.
Electron's app.getPath(name) function will get you the path(s) you are after, irrespective of which OS (operating system) you are using.
Unless your application writes your dbConfig.json file, it may be difficult for your users to understand exactly where they should place their database config file, as each OS runs and stores your application data in a different directory. You would need to be explicit with the user as to where to place their config file(s). Alternatively, your application could create the config file(s) on the user's behalf (automatically or through an HTML form) and save them to a location 'known' to the application.
A common place to store application-specific config files is the user's application data directory. With the application name automatically appended to the directory, it can be found as shown below.
const electronApp = require('electron').app;

let appUserDataPath = electronApp.getPath('userData');
console.log(appUserDataPath);
In your use case, the below would apply.
const electronApp = require('electron').app;
const nodeFs = require('fs');
const nodePath = require('path');

const configRootPath = nodePath.join(electronApp.getPath('userData'), 'dbConfig.json');
const dbConfig = JSON.parse(nodeFs.readFileSync(configRootPath, 'utf-8'));

console.log(configRootPath);
console.log(dbConfig);
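If you specifically want the file to sit next to the portable executable, electron-builder's portable target exposes the executable's directory through the PORTABLE_EXECUTABLE_DIR environment variable at runtime. A sketch under that assumption, with a fallback for running via npm start during development:

const nodeFs = require('fs');
const nodePath = require('path');

// electron-builder's "portable" target sets PORTABLE_EXECUTABLE_DIR at runtime;
// during development it is undefined, so fall back to the app directory.
const baseDir = process.env.PORTABLE_EXECUTABLE_DIR || __dirname;
const configRootPath = nodePath.join(baseDir, 'dbConfig.json');
const dbConfig = JSON.parse(nodeFs.readFileSync(configRootPath, 'utf-8'));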
You can try electron-store to store config.
Electron doesn't have a built-in way to persist user preferences and other data. This module handles that for you, so you can focus on building your app. The data is saved in a JSON file named config.json in app.getPath('userData').
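A minimal sketch of what that could look like for this use case (the name option and keys are illustrative; electron-store persists them to a JSON file inside app.getPath('userData')):

const Store = require('electron-store');

// Stored as dbConfig.json inside app.getPath('userData').
const store = new Store({ name: 'dbConfig' });

// Write once (for example, from a settings form), then read on startup.
store.set('server', 'X');
store.set('database', 'X');
console.log(store.get('server'));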

How can I locate a file to use with the system/child_process API within a Firefox add-on?

I would like to write a Firefox add-on that communicates with a locally installed program to exchange data. It looks like this can be done using either js-ctypes or the low-level system/child_process API, with the latter being the recommended solution.
The child_process API appeals because it sends and receives data abstractly over a pipe rather than directly at the C interface level. However, to use it you need (it seems) to supply the full path to the executable within your code:
var child_process = require("sdk/system/child_process");
var ls = child_process.spawn('/bin/ls', ['-lh', '/usr']);
In my case, the executable is installed by another application and we don't know its exact location - it will differ according to OS, the user's drives, and possibly the user's preference. I imagine this problem is common to most executables that are not built into the OS. So my question is: what means do I have to locate the full path of the executable I want to use? I will need to support multiple OSes, but presumably I could have different solutions for each if needed.
Thanks!
Here's the code I used on Windows - the key was being able to read an environment variable to find the location of the appropriate application folder. After that I assume that my application is stored under a well-known subpath (we don't allow customization of it).
var system = require("sdk/system");
var iofile = require('sdk/io/file');
var child_process = require('sdk/system/child_process');

var progFilesFolder = system.env["programfiles(x86)"],
    targetFile = iofile.join(progFilesFolder, 'FolderName', 'Program.exe'),
    targetFileExists = iofile.exists(targetFile);

if (targetFileExists) {
    var p = child_process.spawn(targetFile);
}
I haven't written the code for Mac yet but I expect it to be similar, with the difference being that there are no drive letters to worry about and the system folders in OS X have standard names (even on localized systems).
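For what it's worth, a sketch of what the Mac version might look like, assuming the application is installed under /Applications (the bundle and binary names are placeholders):

var iofile = require('sdk/io/file');
var child_process = require('sdk/system/child_process');

// OS X application bundles live in a standard location; the binary sits
// inside the bundle under Contents/MacOS.
var targetFile = '/Applications/SomeApp.app/Contents/MacOS/SomeApp';

if (iofile.exists(targetFile)) {
    var p = child_process.spawn(targetFile);
}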

Where to save files from Firefox add-on?

I am working on a Firefox add-on which, among other things, generates thumbnails of websites for use by the add-on. So far I've been storing them as data URLs using simple-storage. There are two problems with this: the storage space is limited, and passing very long strings around doesn't seem optimal (I assume the browser has optimized ways of loading image files, but maybe not data URLs). I think it shouldn't be a problem to save the files to disk; the question is where. I googled quite a bit and could not find anything. Is there a natural place for this? Are there any restrictions?
As of Firefox 32, the place to store data for your add-on is supposed to be: [profile]/extension-data/[add-on ID]. This was established by the resolution of "Bug 915838 - Provide add-ons a standard directory to store data, settings". There is a follow-on bug, "Bug 952304 - (JSONStore) JSON storage API for addons to use in storing data and settings" which is supposed to provide an API for easy access.
For the Addon-SDK, you can obtain the addon ID (which you define in package.json) with:
let self = require("sdk/self");
let addonID = self.id;
For XUL and restartless extensions, you should be able to get the ID of your addon (which you define in the install.rdf file) with:
Components.utils.import("resource://gre/modules/Services.jsm");
let addonID = Services.appInfo.ID;
You can then do the following to generate a URI for a file in that directory:
let userProfileDirectoryPath = Components.classes["@mozilla.org/file/directory_service;1"]
                                         .getService(Components.interfaces.nsIProperties)
                                         .get("ProfD", Components.interfaces.nsIFile).path;

/**
 * Generate a URI for a filename in the extension's data directory under the
 * user's profile directory.
 */
function generateURIForFileInPrefExtensionDataDirectory(fileName) {
    // Account for the path separator being OS dependent.
    let toReturn = "file://" + userProfileDirectoryPath.replace(/\\/g, "/");
    return toReturn + "/extension-data/" + addonID + "/" + fileName;
}
The object myExtension.addonData is a copy that I store of the Bootstrap data provided to entry points in bootstrap.js.
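Note that this directory may not exist until something creates it. A sketch of creating it and writing a file with OS.File (the file name and the imageBytes variable are placeholders):

Components.utils.import("resource://gre/modules/osfile.jsm");

let dirPath = OS.Path.join(OS.Constants.Path.profileDir, "extension-data", addonID);

// Create the directory (and any missing parents below the profile dir),
// then write the file atomically.
OS.File.makeDir(dirPath, { ignoreExisting: true, from: OS.Constants.Path.profileDir })
    .then(function () {
        let filePath = OS.Path.join(dirPath, "thumbnail.png");
        return OS.File.writeAtomic(filePath, imageBytes); // imageBytes: a Uint8Array
    });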

How to download all files in an Azure Container Directory?

I have an ASP.NET app from which I upload files to Azure blob storage. I know that Azure doesn't create real directory structures inside containers, just blobs, but you can emulate directories by putting a "/" in the URI.
For example, I upload a list of files, and my URIs look like this:
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName01.jpg
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName02.jpg
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName03.jpg
My download method:
public RemoteFile Download(DownloadRequest request)
{
    var fileFinal = string.Format("{0}/{1}/{2}", request.IDProtocol, request.IDDocument, request.FileName);
    var blobBlock = InitializeDownload(fileFinal);
    if (!blobBlock.Exists())
    {
        throw new FileNotFoundException("Error");
    }
    var stream = new MemoryStream();
    blobBlock.DownloadToStream(stream);
    return File(request.FileName);
}

private CloudBlob InitializeDownload(string uri)
{
    var blobBlock = _blobClient.GetBlobReference(uri);
    return blobBlock;
}
This way I'm getting just one file, but I need to list and download all the files inside http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/
Thanks
Adding more details. You will need to use one of the listing APIs provided by the client library: CloudBlobContainer.ListBlobs(), CloudBlobContainer.ListBlobsSegmented(), or CloudBlobContainer.ListBlobsSegmentedAsync() (and their various overloads). You can specify the directory prefix, and the service will only enumerate blobs matching that prefix. You can then download each blob. You may also want to look at the 'useFlatBlobListing' argument, depending on your scenario.
http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.blob.cloudblobcontainer.listblobs.aspx
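A minimal sketch of that approach with the WindowsAzure.Storage client library (the container name, prefix, and local path are assumptions taken from the question's URIs):

using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

// List every blob under the virtual directory prefix and download each one.
CloudBlobContainer container = _blobClient.GetContainerReference("MyProtocolID-01");
foreach (IListBlobItem item in container.ListBlobs("MyDocumentID-01/", useFlatBlobListing: true))
{
    var blob = (CloudBlockBlob)item;

    // Recreate the virtual directory structure on the local disk.
    string localPath = Path.Combine(@"C:\downloads", blob.Name.Replace('/', Path.DirectorySeparatorChar));
    Directory.CreateDirectory(Path.GetDirectoryName(localPath));
    blob.DownloadToFile(localPath, FileMode.Create);
}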
In addition AzCopy (see http://blogs.msdn.com/b/windowsazurestorage/archive/2012/12/03/azcopy-uploading-downloading-files-for-windows-azure-blobs.aspx) also supports this scenario of downloading all blobs in a given directory path.
Since each blob is a separate web resource, the function above will download only one file. One thing you could do is list all the blobs using the logic above, download them to your server first, zip them, and then return that zip file to your end user.
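A sketch of that zip-and-return idea using System.IO.Compression (the container object and prefix are assumptions carried over from the question):

using System.IO;
using System.IO.Compression;
using Microsoft.WindowsAzure.Storage.Blob;

// Stream every blob under the prefix into one in-memory zip archive.
var zipStream = new MemoryStream();
using (var zip = new ZipArchive(zipStream, ZipArchiveMode.Create, leaveOpen: true))
{
    foreach (IListBlobItem item in container.ListBlobs("MyDocumentID-01/", useFlatBlobListing: true))
    {
        var blob = (CloudBlockBlob)item;
        var entry = zip.CreateEntry(blob.Name);
        using (var entryStream = entry.Open())
        {
            blob.DownloadToStream(entryStream);
        }
    }
}
zipStream.Position = 0; // rewind before returning the archive to the client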
Alternatively, use AzCopy; it now supports a wide range of scenarios like this.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10

Upload all files from local storage to Azure Blob Storage

I am currently struggling to upload multiple files from local storage to Azure Blob Storage. I was wondering if anyone could help me; below is the code I was previously using to upload a single zip file.
private void SaveZip(string id, string fileName, string contentType, byte[] data)
{
    // Create a blob in the container and upload the image bytes to it
    var blob = this.GetContainer().GetBlobReference(fileName);
    blob.Properties.ContentType = contentType;

    // Create some metadata for this image
    var metadata = new NameValueCollection();
    metadata["Id"] = id;
    metadata["Filename"] = fileName;
}

SaveZip(
    Guid.NewGuid().ToString(),
    zipFile.FileName,
    zipFile.PostedFile.ContentType,
    zipFile.FileBytes);
Thanks, Sami.
It's quite straightforward with Set-AzureStorageBlobContent from the Azure Storage PowerShell module.
ls -File -Recurse | Set-AzureStorageBlobContent -Container upload
MSDN documentation: http://msdn.microsoft.com/en-us/library/dn408487.aspx
I don't think there are any built-in methods you can use to upload multiple files to blob storage. What you can do is upload them one by one, or in parallel, as sketched below.
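A sketch of the one-by-one approach with the WindowsAzure.Storage client library (the local folder is a placeholder, and "container" is assumed to be an already-initialized CloudBlobContainer):

using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

// Upload every file in a local folder, one blob per file.
foreach (string filePath in Directory.GetFiles(@"C:\upload"))
{
    CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(filePath));
    blob.UploadFromFile(filePath);
}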
If you're just starting to work with Blob Storage, I'd encourage you to take a look at the "How to" article we've published. Specifically, the section on "How to Upload a Blob into a Container" should be helpful. Beyond that, Shaun is correct - there is no built-in support in the StorageClient library for uploading multiple files at once, but you can certainly upload them one-by-one.
If your need is just to get it done, and not to make an app out of it, you should consider checking out Cloud Storage Studio.
Like CodeThug said, "You never do anything with the byte array."
You have to upload the data to the blob, and then you're done.
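Concretely, SaveZip is missing one call at the end to actually send the bytes. A sketch of the fix, assuming the old v1.x StorageClient API that the question's code appears to use:

private void SaveZip(string id, string fileName, string contentType, byte[] data)
{
    var blob = this.GetContainer().GetBlobReference(fileName);
    blob.Properties.ContentType = contentType;
    blob.Metadata["Id"] = id;
    blob.Metadata["Filename"] = fileName;

    // The missing step: actually upload the bytes to the blob.
    blob.UploadByteArray(data);
}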
