Loading leveldb from a stream

Is there a way to load a leveldb store from a data stream?
If I were to take the stream of a leveldb instance and tuck it in a DLL as a manifest resource stream, will I have a way to just load that db from that stream later when I retrieve the manifest resource from my DLL? Essentially, I am looking for a way to build, save, and later load a leveldb without ever writing to a physical file on disk.
Thanks in advance for any useful info.
Raja.

You might have already figured this out since it's been a long time since you asked.
leveldb allows you to override the "Environment" such that reads and writes don't need to access a physical file.
You might want to look at this file:
http://code.google.com/p/leveldb/source/browse/helpers/memenv/memenv_test.cc
in particular the DBTest, for an example.
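For reference, here is a minimal sketch of that in-memory Env approach (NewMemEnv comes from leveldb's helpers/memenv; getting the bytes out of your DLL's manifest resource stream and into such an Env would still be up to you):
#include <cassert>
#include <string>
#include "leveldb/db.h"
#include "helpers/memenv/memenv.h"
int main()
{
    // Wrap the default Env so every "file" the DB creates lives in memory only.
    leveldb::Env* mem_env = leveldb::NewMemEnv(leveldb::Env::Default());
    leveldb::Options options;
    options.create_if_missing = true;
    options.env = mem_env;
    leveldb::DB* db = NULL;
    leveldb::Status s = leveldb::DB::Open(options, "/in-memory-db", &db);
    assert(s.ok());
    s = db->Put(leveldb::WriteOptions(), "key", "value");
    assert(s.ok());
    std::string value;
    s = db->Get(leveldb::ReadOptions(), "key", &value);
    assert(s.ok() && value == "value");
    delete db;       // close the DB before deleting the Env it uses
    delete mem_env;
    return 0;
}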

Related

Difference between DownloadTo(stream) and DownloadTo(string) when working with an Azure Storage blob

I am working with Azure Storage and using it to download a blob from an Azure Storage container. I searched and found several overloads of the download method.
I want to understand the difference between the overload that takes a stream and the one that takes a string.
I currently use DownloadTo(string folderTodownLoad). However, if I want to use the stream overload, what should I pass as a parameter, and what would be the purpose or benefit, if any, over the DownloadTo(string) method?
BlobClient Class
DownloadTo(string) downloads directly to your file system and supports downloading multiple blocks at a time.
DownloadTo(stream) downloads a single block at a time to a stream; the advantage of this is that it gives you more flexibility.
A simple example is downloading to a GZipStream so you can decompress a file while downloading it from blob storage.
Another example is downloading to a MemoryStream so you can process the result in memory right away, instead of having to load the file from disk.

Storing CoreData to RackSpace

I am developing an app with Xcode 5 for iOS 7. I have some data stored in Core Data. My requirement is to upload that data to RackSpace. What's the best way to do this?
Where can I find the .sqlite file associated with Core Data?
The SQLite file is wherever you put it. There's no magic to it, you have to tell Core Data exactly where you want the file. You do this when you call addPersistentStoreWithType:configuration:URL:options:error:. The URL argument is the location of the SQLite file.
If you try to use the file directly, make sure that:
You shut down your Core Data stack completely before doing so, to make sure that all unsaved data has been flushed to disk. That means no managed objects, managed object contexts, or persistent store coordinators in memory anywhere.
You also get the SQLite journal files. If your store file is named Foo.sqlite, they will be named Foo.sqlite-wal and Foo.sqlite-shm and will be located in the same directory. If you don't get these files, most or all of your data will be missing.
However, simply uploading the file is not a good solution for syncing data. To sync, you'd have to download a copy of the data, load it, and compare every object in the file with every object that's already on the phone. It's not impossible, but it makes things much more difficult than necessary. There are many options that can simplify the process, including full-service providers like Parse, SDKs that let you use one of a variety of back ends such as Ensembles.io, and others.

XNA Content Loading

Does anybody know how I can stream-load Model data with the XNA 4 content loader?
And not have to specify a filename...
I was hoping to somehow get a stream going, as the model data resides in a database.
And no, I'm not interested in temp files :p
Regards
If you really want to use the Content Pipeline for this, you can subclass ContentManager and override OpenStream. This would assume that the built XNB files reside in the database and you can provide the stream to them when requested :)
Content.Load<Model>() requires a string parameter, so I don't think you'll be able to stream a Model in. I should mention that the required string parameter is a file path, so you wouldn't be able to convert a stream to a string and pass that in.
I believe the sample linked below should help. If I remember correctly (I can't check because I'm at work), it lets you run the content importer on a stream dynamically. So you should be able to dump your file into a MemoryStream and load it, as long as it's one of the file types XNA supports.
Be warned, though, that this is pretty slow because you have to compile every file when you load it.
I'm curious why you need to load from a database. I assume it's on a remote server? In that case the download time plus compile time might be a bit much, since it sounds like you would have to do it every time the game is loaded.
http://create.msdn.com/en-US/education/catalog/sample/winforms_series_2

In Memory INI File Writer

I have an MFC app which is wizard based. The app asks the user a variable number of questions, which are then written to an INI file that is encrypted when the user clicks Finish.
All the INI file parsers I have seen so far read from or write to a physical file on disk. I don't want to do this, as the INI file contains confidential information. Instead, I would like the INI file to exist only in memory and never be written to disk in an unencrypted form.
As the app allows users to go back and change answers, it occurred to me that I could use an in-memory database for this purpose, but again I do not want anything written to disk and don't want to ship a DB with my app if it can be avoided.
I have to use an INI file because, once decrypted, the file will be processed by a 3rd party.
Any suggestions welcomed.
Thanks..
I have an IniFile C++ class which allows you to work with Ini files in memory:
http://www.lemonteam.com/downloads/inifile.h
It's a short, well-documented single .h file. Sample usage:
IniFile ini( "myinifile.ini" );
ini.SetString( "mykey", "myvalue" );
// Nothing actually gets written to disk until you call Flush(), Close() or the object is deleted
ini.Flush();
ini.Close();
You should be able to modify the Flush() method so that it applies some kind of encryption to the saved data.
Sounds like a good application for a memory-mapped file, since you can control when your in-memory view gets flushed back to the file on disk.
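If you do go the memory-mapped route, a minimal Win32/C++ sketch might look like the following (the file name and contents are placeholders; keep in mind the mapping is still backed by a real file, and the OS may write dirty pages on its own schedule, FlushViewOfFile only forces a write):
#include <windows.h>
#include <cstring>
int main()
{
    const DWORD kSize = 64 * 1024;  // reserve 64 KB for the INI text
    HANDLE file = CreateFileA( "settings.dat", GENERIC_READ | GENERIC_WRITE,
                               0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL );
    if ( file == INVALID_HANDLE_VALUE ) return 1;
    HANDLE mapping = CreateFileMappingA( file, NULL, PAGE_READWRITE, 0, kSize, NULL );
    if ( mapping == NULL ) { CloseHandle( file ); return 1; }
    char* view = (char*)MapViewOfFile( mapping, FILE_MAP_WRITE, 0, 0, kSize );
    if ( view == NULL ) { CloseHandle( mapping ); CloseHandle( file ); return 1; }
    // Edit the INI text entirely through the in-memory view...
    strcpy( view, "[Wizard]\r\nAnswer1=42\r\n" );
    // ...and flush it back to the file only when you decide to.
    FlushViewOfFile( view, 0 );
    UnmapViewOfFile( view );
    CloseHandle( mapping );
    CloseHandle( file );
    return 0;
}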
Why would you need to have it in an ini file format if it is never stored to disk?
Why not just keep it in memory as a data structure and use your normal INI file methods to write it to disk when you want to?
If you don't want to save into file, what is the point of using INI file then?
The INI API is basically a property bag, or key-value pairs, backed by a file on disk.
If you don't want to use a file, I suggest you use your own hash or dictionary data structure to store the key-value pairs.
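In that spirit, a minimal C++ sketch (the section and key names are made up; your encryption routine would consume the returned string, so the plaintext never has to exist as a file):
#include <map>
#include <sstream>
#include <string>
typedef std::map<std::string, std::string> Section;
typedef std::map<std::string, Section> IniData;
// Serialize the in-memory answers to INI-formatted text; nothing touches disk here.
std::string ToIniText( const IniData& data )
{
    std::ostringstream out;
    for ( IniData::const_iterator s = data.begin(); s != data.end(); ++s )
    {
        out << "[" << s->first << "]\r\n";
        for ( Section::const_iterator kv = s->second.begin(); kv != s->second.end(); ++kv )
            out << kv->first << "=" << kv->second << "\r\n";
    }
    return out.str();
}
int main()
{
    IniData answers;
    answers["Wizard"]["Question1"] = "Yes";
    answers["Wizard"]["Question2"] = "42";
    std::string plain = ToIniText( answers );
    // Hand 'plain' to your encryption code and write only the ciphertext to disk.
    return 0;
}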

Techniques for writing critical text data

We collect text/CSV-like data from costly experiments over long periods (~days), so file corruption is to be avoided at all costs.
Recently, a file was copied from Explorer on XP while an experiment was in progress, and the data was partially lost, presumably due to a multiple-access conflict.
What are some good techniques to avoid such loss? - We are using Delphi on Windows XP systems.
Some ideas we came up with are listed below - we'd welcome comments as well as your own input.
Use a database as a secondary data storage mechanism and take advantage of the atomic transaction mechanisms
How about splitting the large file into separate files, one for each day?
If these machines are on a network: send an HTTP POST with the logging data to a web server.
(Sending UDP packets would be even simpler.)
Make sure you only copy old data. If you have a timestamp on the filename with a 1 hour resolution, you can safely copy the data older than 1 hour.
If a write fails, cache the result for a later write - so if the file is opened externally the data is still stored internally, or could even be stored elsewhere on disk.
I think what you're looking for is the Win32 CreateFile API, with these flags:
FILE_FLAG_WRITE_THROUGH : Write operations will not go through any intermediate cache; they will go directly to disk.
FILE_FLAG_NO_BUFFERING : The file or device is being opened with no system caching for data reads and writes. This flag does not affect hard disk caching or memory mapped files.
There are strict requirements for successfully working with files opened with CreateFile using the FILE_FLAG_NO_BUFFERING flag, for details see File Buffering.
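For illustration, a minimal C++ use of the write-through flag (the same call is available from Delphi through the Windows API); FILE_FLAG_NO_BUFFERING is left out here because of the alignment requirements mentioned above, and the file name and data are placeholders:
#include <windows.h>
#include <cstring>
int main()
{
    // Open (or create) the log exclusively, so Explorer cannot copy a half-written file,
    // and make every WriteFile go straight to disk.
    HANDLE h = CreateFileA( "experiment.csv", GENERIC_WRITE, 0, NULL,
                            OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL | FILE_FLAG_WRITE_THROUGH, NULL );
    if ( h == INVALID_HANDLE_VALUE ) return 1;
    SetFilePointer( h, 0, NULL, FILE_END );   // append to the end of the file
    const char* line = "12:00:00,42.0,3.14\r\n";
    DWORD written = 0;
    WriteFile( h, line, (DWORD)strlen( line ), &written, NULL );
    CloseHandle( h );
    return 0;
}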
Each experiment must use a 'work' file and a 'done' file. The work file is opened exclusively and the done file is copied to a place on the network. An application on the receiving machine would feed those files into a database. If Explorer tries to move or copy the work file, it will receive an 'Access denied' error.
The 'work' file would become 'done' after a certain period (say, 6/12/24 hours, or whatever period suits you). The application then creates another work file (the name must contain the timestamp) and sends the 'done' file over the network (or a human can do that, which is what you are doing now, if I understand your text correctly).
Copying a file while it is in use is asking for corruption.
Write data to a buffer file in an obscure directory and copy the data to the 'public' data file periodically (every 10 points, for instance), thereby reducing writes and also providing a backup.
Write data points discretely, i.e. open and close the file handle for every data point written - this minimizes the time the file is held open, provided each write is quick relative to the interval between data points.
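A minimal C++ sketch of that last pattern, opening the file only for the duration of a single append (the original setup is Delphi, but the idea is the same; the function name is just illustrative):
#include <fstream>
#include <string>
// Append one CSV line and close the handle again immediately.
// If the open fails (e.g. the file is locked by a copy in progress),
// the caller can cache the line and retry later.
bool AppendDataPoint( const std::string& path, const std::string& csvLine )
{
    std::ofstream out( path.c_str(), std::ios::app );
    if ( !out )
        return false;
    out << csvLine << "\n";
    out.flush();            // push the data to the OS before the handle closes
    return out.good();
}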
