How to avoid renaming OleDbConnection data source - oledbconnection

When transferring the file to another location I always need to change the data source or directory:
Dim cnn = New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Users\Renz\Desktop\FINAL\Database\AuditDB.mdb")
Is there a way I can avoid that?

You could use a path relative to your application's location, e.g.
Dim cnn = New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=Database\AuditDB.mdb")
Here I am presuming that your database is stored in a folder called Database in the same folder as your program.
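If you prefer to resolve the full path at run time, you can also build the connection string from the application's base directory. A minimal sketch in C# (the VB.NET equivalent is a direct translation; the Database\AuditDB.mdb layout is assumed from the question):
// Sketch only: compute the .mdb path relative to the running application
// so the connection string never hard-codes an absolute directory.
// (requires System.IO and System.Data.OleDb)
string dbPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Database", "AuditDB.mdb");
var cnn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + dbPath);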

Related

Sensenet API: Best Way to Create Folder structure

I want to upload content to a specific path in sensenet. This path may not have been created yet, so if it does not exist, the system has to create it.
Using the Client API of sensenet, the method available to create content runs asynchronously. I tried to force it to run synchronously, but that does not seem to work, because sometimes the second folder is not created...
Here's a sample code:
private async Task CreateFolder(String parentPath, String folderName)
{
    var folder = Content.CreateNew(parentPath, "Folder", folderName);
    await folder.SaveAsync();
}

CreateFolder("/Root/Sites/Test/DocumentWorkSpace", "folder").Wait();
CreateFolder("/Root/Sites/Test/DocumentWorkSpace/folder", "subfolder").Wait();
I can use Tools.EnsurePathAsync(path) to create the folder structure. But after this, I want to upload the file... (and I'm having the same problem as with the folder structure reported above.)
Task.Run(() => Tools.EnsurePathAsync(pathDocType)).Wait();
Task.Run(() =>
{
    var stream = new MemoryStream(byteContent);
    Content.UploadAsync(pathDocType, "test.doc", stream).WaitAndUnwrapException();
    stream.Dispose();
}).Wait();
You have multiple options, depending on your use case.
Importing a whole folder structure
Take a look at the import API in the client library. It is actually a single method that you can use to import a folder structure from the file system. It handles all folder creation and upload internally:
await Importer.ImportAsync(sourcePath, targetPath, options);
The options object can be used to customize the behavior of the algorithm (e.g. max degree of parallelism for large structures, defining a custom container type instead of the default Folder or defining an optional logging delegate method).
This method is scalable, meaning it is capable of importing a huge number of folders and files efficiently. It will return after importing every folder and file (filtered by the optional filters in the options parameter).
Ensuring a parent folder chain
There is a single method for creating parent folders (if they do not exist yet).
await Tools.EnsurePathAsync(path);
You can call this with a non-existing path (e.g. /Root/Folder1/Folder2) and it will create it for you. This method only deals with folders; it has nothing to do with files.
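In your case the two calls can simply be chained with await instead of blocking with Wait(), which avoids mixing synchronous and asynchronous code. A minimal sketch, reusing the Tools.EnsurePathAsync and Content.UploadAsync calls from the question (UploadDocumentAsync and byteContent are just placeholder names):
// Sketch only: ensure the parent chain exists, then upload, awaiting both
// calls end-to-end instead of blocking the calling thread with .Wait().
private async Task UploadDocumentAsync(string parentPath, string fileName, byte[] byteContent)
{
    await Tools.EnsurePathAsync(parentPath);

    using (var stream = new MemoryStream(byteContent))
    {
        await Content.UploadAsync(parentPath, fileName, stream);
    }
}

// e.g. await UploadDocumentAsync(pathDocType, "test.doc", byteContent);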

Writing generic function in F# to save csv files

I have a requirement to read a csv file and generate several different projections of the data, saving each of them to a file. I'm using CsvProvider to read the file and then map the data into other CsvProviders, which I save to disk. Currently I have separate save functions for each of these projections. I'm wondering if I could create a generic saveCsv function like this?
let saveCsv<'a when 'a :> CsvProvider> (csvType:'a) fileName data =
    let csv = new csvType(data)
    csv.Save(fileName)
I can't seem to get the type constraint correct, and also, how do I new up an instance of the CSV type?

when to use globalIdField

As far as I can tell, Relay relies on nodeDefinitions for queries when variables are being changed.
It'd appear that all objects with an id field should be a valid node. However, if I have data like this:
type User {
  id: globalIdField('User'),
  name: String,
  folders: [ Folder ]
}
type Folder {
  id: ???,
  ...
}
The data is stored in a document based solution, and the Folder objects are nested in the User object. But Folder objects are given an id so that some other objects could reference the Folder objects under the context of a User.
If Folder implements the nodeInterface, and uses globalIdField, then I need to figure out a way to fetch the Folder object from a globalId, meaning that I might have to scan through all the Users to find it, have a data map that'd allow me to find the object, or normalize the data so that Folders are in their own table.
If it doesn't implement the nodeInterface, and just uses Strings as id field, what happens when I try to mutate some fields on the Folder object?
It's often useful for these objects to have ids, even if there's no real id directly in your database. For example, if you want to write a mutation to rename a folder, it'd be great to have a global ID to reference this folder. Relay also uses them internally when the UI requests some additional data on a node that's not loaded yet.
One way to generate a global ID for the folder could be to take a prefix and add the user id and a way to identify the folder within the user, for example:
var folderGlobalId = ['folder', userID, folderID].join(':');
Whenever you want to resolve this ID on your server, you split at the ':', see that you want to load a folder by looking at the first part, and then go via the user to the right folder.
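The shape of that resolver is the same in any language; here is a minimal sketch (written in C# purely for illustration; LoadUser and FindFolderInUser are hypothetical data-access helpers, not Relay APIs):
// Sketch only: compose and resolve the composite ID described above.
string ComposeFolderId(string userId, string folderId)
{
    return string.Join(":", "folder", userId, folderId);
}

object ResolveId(string globalId)
{
    var parts = globalId.Split(':');
    if (parts[0] == "folder")
    {
        var user = LoadUser(parts[1]);            // first load the owning user
        return FindFolderInUser(user, parts[2]);  // then pick the nested folder
    }
    return null; // other prefixes (e.g. 'user') would be handled here
}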

Strategies For Creating Unique File Names/Locations for Uploaded Files

My ASP.NET MVC app accepts uploaded files and stores them in a single folder. I want the app to accept any filename, but this will fail when users try to upload files with the same name.
I guess I could create a separate folder for each file, but I'd like a clean, flat directory structure. Currently I append a GUID to the file name, but this isn't a nice solution as it results in weird filenames when a user downloads a file.
I thought about storing the file data in a database and then writing it out to a file when it was requested, but this is a lot of overhead.
Any alternative approaches?
In order to keep your directory structure flat, store your files by appending a GUID (as you already do). In your download handler (controller action method), first convert the GUID-based file name back to the original file name by removing the GUID. Then use the FileContentResult class to transfer the file. You can set the FileDownloadName property to specify the file name for the transferred file; in fact, the FileDownloadName property sets the Content-Disposition header under the hood.
Here is a small code example (action method of your download controller class):
string fileToDownload = "test.jpg_4274B9D4-9084-441C-9617-EAD03CC9F47F";

// Strip the appended GUID to recover the original file name.
string originalFileName = fileToDownload.Substring(0, fileToDownload.LastIndexOf('_'));

FileContentResult result = new FileContentResult(
    System.IO.File.ReadAllBytes(Server.MapPath(String.Format("~/files/{0}", fileToDownload))),
    "application/octet-stream"); // generic binary content type
result.FileDownloadName = originalFileName; // sets the Content-Disposition header

return result;
The user downloading the file is prompted to open/save a file with the original file name.
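For completeness, the upload side only needs to append the GUID when saving. A minimal sketch, assuming the files live in a hypothetical ~/files folder and a standard upload action:
// Sketch of the upload action: keep the original name and append a GUID
// so that two uploads with the same file name never collide on disk.
[HttpPost]
public ActionResult Upload(HttpPostedFileBase upload)
{
    string storedName = String.Format("{0}_{1}", Path.GetFileName(upload.FileName), Guid.NewGuid());
    upload.SaveAs(Server.MapPath(String.Format("~/files/{0}", storedName)));
    return RedirectToAction("Index");
}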
Hope this helps.

Best Way To Store Tons of Data

I'm working on an application that will need to pull from a list of data depending on where the user is located in the US. In a sense, I will have a database full of information based on their location, and a conditional statement will determine which value from the list to use.
Example of data:
Tennessee:
Data2 = 25;
Data3 = 58;
...
Texas:
Data2 = 849;
Data3 = 9292;
...
So on...
My question is, what is the best practice to use when developing iOS apps and you have a lot of data? Should you just put all the related data in a file and import that file when you need it, as normal, or is there another method you should use? I know they state you should follow the MVC pattern, and I think in this case my data would be considered the Model, but I just want to double-check whether that applies here.
You have some options here:
SQLite database
Core Data (it's not a relational database model like SQLite)
Write to a plain text file (using NSFileManager)
NSKeyedArchiver
If you want to frequently keep appending data to a single file, I would recommend using SQLite; it is fast and robust.
