Sensenet API: Best Way to Create a Folder Structure

I want to upload content to a specific path in sensenet. This path may not exist yet, so if it does not, the system has to create it.
Using the sensenet Client API, the method available to create content runs asynchronously. I tried to force it to run synchronously, but that does not seem to work, because sometimes the second folder is not created...
Here's a sample code:
private async Task CreateFolder(string parentPath, string folderName)
{
    var folder = Content.CreateNew(parentPath, "Folder", folderName);
    await folder.SaveAsync();
}

CreateFolder("/Root/Sites/Test/DocumentWorkSpace", "folder").Wait();
CreateFolder("/Root/Sites/Test/DocumentWorkSpace/folder", "subfolder").Wait();
I can use Tools.EnsurePathAsync(path) to create the folder structure. But after that, I want to upload the file, and I run into the same problem as with the folder structure reported above.
Task.Run(() => Tools.EnsurePathAsync(pathDocType)).Wait();

Task.Run(() =>
{
    var stream = new MemoryStream(byteContent);
    Content.UploadAsync(pathDocType, "test.doc", stream).WaitAndUnwrapException();
    stream.Dispose();
}).Wait();

You have multiple options, depending on your use case.
Importing a whole folder structure
Take a look at the import API in the client library. It is actually a single method that you can use to import a folder structure from the file system. It handles all folder creation and upload internally:
await Importer.ImportAsync(sourcePath, targetPath, options);
The options object can be used to customize the behavior of the algorithm (e.g. max degree of parallelism for large structures, defining a custom container type instead of the default Folder or defining an optional logging delegate method).
This method is scalable, meaning it is capable of importing a huge number of folders and files efficiently. It returns only after every folder and file (filtered by the optional filters in the options parameter) has been imported.
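For illustration, a minimal sketch of such a call. Note that the ImporterOptions type name and every property name below are assumptions standing in for whatever settings your SenseNet.Client version actually exposes; check the library before copying.

// Sketch only: option type and property names are placeholders, not the documented API.
var options = new ImporterOptions
{
    MaxDegreeOfParallelism = 4,                        // hypothetical: throttle parallel requests for large trees
    ContainerTypeName = "DocumentLibrary",             // hypothetical: container type instead of the default Folder
    LogAction = message => Console.WriteLine(message)  // hypothetical: optional logging delegate
};

await Importer.ImportAsync(@"c:\upload\source", "/Root/Sites/Test/DocumentWorkSpace", options);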
Ensuring a parent folder chain
There is a single method for creating parent folders (if they do not exist yet).
await Tools.EnsurePathAsync(path);
You can call this with a non-existing path (e.g. /Root/Folder1/Folder2) and it will create the missing folders for you. Note that this method only deals with folders; it does not upload files.
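For the scenario in the question (ensure the folder chain, then upload a file into it), here is a minimal sketch that uses only the calls already shown above. The UploadToPathAsync helper name is mine, and error handling is omitted.

private static async Task UploadToPathAsync(string targetPath, string fileName, byte[] byteContent)
{
    // Create the container chain first; existing folders are left untouched.
    await Tools.EnsurePathAsync(targetPath);

    // Then upload the file into the now-existing folder.
    using (var stream = new MemoryStream(byteContent))
    {
        await Content.UploadAsync(targetPath, fileName, stream);
    }
}

// Usage: await the whole chain instead of mixing .Wait()/Task.Run(),
// which can deadlock or race as described in the question.
await UploadToPathAsync("/Root/Sites/Test/DocumentWorkSpace/folder", "test.doc", byteContent);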

Related

Using Dart for HTML5 app, but want to load a file from the server-side

I'm new to Dart, and trying to create my first Dart web game. Sorry if I missed an answered question related to this. I did search, but wasn't having much luck.
To load a level, I would like to be able to read in a text file with the level data, then process that and use it to build the level. Unfortunately, I am running into the issue that dart:io and dart:html cannot both be imported if you want to use the File() object. From what I can tell, dart:html's File() object is client-side, so it would not be able to open the text file I will have on the server.
Is there another way to read in a text file from the server? If not, do I need to set up a database just to store the game data, or is there a better option I'm not thinking about here?
In case it helps, the game data I'm working with currently is just a text file that gives a map of what the level will look like. For example:
~~~~Z~~~~
P
GGGGLLGGG
Each of those characters would denote a type of block to be placed in the level. It's not the best way to store levels, but it is pretty easy to create and easy to read in and process.
Thanks so much for the help!
If the file you are loading is a sibling of the index.html your game is loaded from, then you can just make an HTTP request to fetch the file.
To download web/level1.json you can use
import 'dart:async';
import 'dart:html';

Future<String> getGameData(String name) async {
  var response = await HttpRequest.getString('${name}.json');
  print(response);
  return response;
}

otherMethod() async {
  var data = await getGameData('level1');
  // use data to build the level
}
See also https://api.dartlang.org/stable/1.24.3/dart-html/HttpRequest-class.html

when to use globalIdField

As far as I can tell, Relay relies on nodeDefinitions for queries when variables are being changed.
It'd appear that all objects with an id field should be a valid node. However, if I have data like this:
type User {
  id: globalIdField('User'),
  name: String,
  folders: [ Folder ]
}

type Folder {
  id: ???,
  ...
}
The data is stored in a document based solution, and the Folder objects are nested in the User object. But Folder objects are given an id so that some other objects could reference the Folder objects under the context of a User.
If Folder implements the nodeInterface, and uses globalIdField, then I need to figure out a way to fetch the Folder object from a globalId, meaning that I might have to scan through all the Users to find it, have a data map that'd allow me to find the object, or normalize the data so that Folders are in their own table.
If it doesn't implement the nodeInterface, and just uses Strings as id field, what happens when I try to mutate some fields on the Folder object?
It's often useful for these objects to have ids, even if there's no real id directly in your database. For example, if you want to write a mutation to rename a folder, it'd be great to have a global ID to reference this folder. Relay also uses them internally when the UI requests some additional data on a node that's not loaded yet.
One way to generate a global ID for the folder could be to take a prefix and add the user id and a way to identify the folder within the user, for example:
var folderGlobalID = ['folder', userID, folderID].join(':');
Whenever you want to resolve this ID on your server, you split it at the ':', see from the first part that you need to load a folder, and then go via the user to the right folder.

How to avoid changing the OleDbConnection data source

When transferring the files to another location, I always need to change the data source directory:
Dim cnn = New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Users\Renz\Desktop\FINAL\Database\AuditDB.mdb")
Is there a way I can avoid that?
You could use a path relative to your application's location, e.g.
Dim cnn = New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=Database\AuditDB.mdb")
In this example I am presuming that your database is stored in a folder called Database inside the same folder as your program.

Strategies For Creating Unique File Names/Locations for Uploaded Files

My ASP.NET MVC app accepts uploaded files and stores them in a single folder. I want the app to accept any filename a user uploads, but this fails when users upload files with the same file name.
I guess I could create separate folders for each file but I'd like a clean and flat directory structure. Currently I append a GUID to the file name but this isn't a nice solution as it results in weird filenames when a user downloads a file.
I thought about storing the file data in a database and then writing it out to a file when it was requested, but this is a lot of overhead.
Any alternative approaches?
In order to keep your directory structure flat, store your files by appending a GUID (as you already do). In your download handler (controller action method), first convert the GUID-based file name back to the original file name by removing the GUID. Then use the FileContentResult class to transfer the file. You can set the FileDownloadName property to specify the file name for the file to transfer; under the hood the FileDownloadName property sets the Content-Disposition header.
Here is a small code example (action method of your download controller class):
string fileToDownload = "test.jpg_4274B9D4-9084-441C-9617-EAD03CC9F47F";
string originalFileName = fileToDownload.Substring(0, fileToDownload.LastIndexOf('_'));

FileContentResult result = new FileContentResult(
    System.IO.File.ReadAllBytes(Server.MapPath(String.Format("~/files/{0}", fileToDownload))),
    "application/binary");
result.FileDownloadName = originalFileName; // sets the Content-Disposition header

return result;
The user downloading the file is prompted to open/save a file with the original file name.
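On the upload side, a matching sketch for producing the stored name. Only the name_GUID convention and the ~/files folder come from the example above; the action name and parameter are illustrative.

// Sketch: store an uploaded file as "<originalName>_<GUID>" in ~/files,
// matching the naming convention expected by the download action above.
[HttpPost]
public ActionResult Upload(HttpPostedFileBase upload)
{
    string storedFileName = String.Format("{0}_{1}", upload.FileName, Guid.NewGuid().ToString().ToUpper());
    upload.SaveAs(Server.MapPath(String.Format("~/files/{0}", storedFileName)));
    return new EmptyResult();
}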
Hope this helps.

ASP.NET MVC - Sharing Session State Between Controllers

I am still mostly unfamiliar with Inversion of Control (although I am learning about it now) so if that is the solution to my question, just let me know and I'll get back to learning about it.
I have a pair of controllers which need to share a Session variable. Naturally nothing too special has to happen because of how Session works in the first place, but this got me wondering what the cleanest way to share related objects between two separate controllers is. In my specific scenario I have an UploadController and a ProductController which work in conjunction with one another to upload image files. As files are uploaded by the UploadController, data about the upload is stored in the Session. After this happens I need to access that Session data in the ProductController. If I create a get/set property for the Session variable containing my upload information in both controllers I'll be able to access that data, but at the same time I'll be violating all sorts of DRY, not to mention creating a design that is confusing at best, where an object is shared and modified by two completely disconnected objects.
What do you suggest?
Exact Context:
A file upload View posts a file to UploadController.ImageWithpreview(), which then reads in the posted file and copies it to a temporary directory. After saving the file, another class produces a thumbnail of the uploaded image. The path to both the original file and the generated thumbnail are then returned with a JsonResult to a javascript callback which updates some dynamic content in a form on the page which can be "Saved" or "Cancelled". Whether the uploaded image is saved or it is skipped, I need to either move or delete both it and the generated thumbnail from the temporary directory. To facilitate this, UploadController keeps track of all of the upload files and their thumbnails in a Session-maintained Queue object.
Back in the View: after the form is populated with a generated thumbnail of the image that was uploaded, the form posts back to the ProductsController where the selected file is identified (currently I store the filename in a Hidden field, which I realize is a horrible vulnerability), and then copied out of the temp directory to a permanent location. Ideally, I would like to simply access the Queue I have stored in the Session so that the form does not need to contain the image location as it does now. This is how I have envisioned my solution, but I'll eagerly listen to any comments or criticisms.
A couple of solutions come to mind. You could use a "SessionState" class that maps into the request and gets/sets the info as such (I'm doing this from memory so this is unlikely to compile and is meant to convey the point):
internal class SessionState
{
    // Wraps the ambient session so callers never touch HttpContext directly.
    public string ImageName
    {
        get { return (string)HttpContext.Current.Session["ImageName"]; }
        set { HttpContext.Current.Session["ImageName"] = value; }
    }
}
And then from the controller, do something like:
var sessionState = new SessionState();
sessionState.ImageName = "xyz";
/* Or */
var imageName = sessionState.ImageName;
Alternatively, you could create a controller extension method:
public static class SessionControllerExtensions
{
    public static string GetImageName(this IController controller)
    {
        return (string)HttpContext.Current.Session["ImageName"];
    }

    public static void SetImageName(this IController controller, string imageName)
    {
        HttpContext.Current.Session["ImageName"] = imageName;
    }
}
Then from the controller:
this.SetImageName("xyz");
/* or */
var imageName = this.GetImageName();
This is certainly DRY. That said, I don't particularly like either of these solutions, as I prefer to store as little data, if any, in session. But if your intent is to hold onto all of this information without having to load/discern it from some other source, this is the quickest (dirtiest) way I can think of to do it. I'm quite certain there's a much more elegant solution, but I don't have all of the information about what it is you're trying to do and what the problem domain is.
Keep in mind that when storing information in the session, you will have to dehydrate/rehydrate the objects via serialization and you may not be getting the performance you think you are from doing it this way.
Hope this helps.
EDIT: In response to additional information
Not sure where you're looking to deploy this, but processing images in real time is a surefire way to be hit with a DoS attack. My suggestion is as follows -- this is assuming that the site is public facing and anyone can upload an image:
1) Allow the user to upload an image. This image goes into the processing queue for background processing by the application or some service. Additionally, the name of the image goes into the user's personal processing queue -- likely a table in the database (a minimal in-memory sketch of such a queue follows after this list). Information about background processing in a web app can be found under "Schedule a job in hosted web server".
2) Process these images and, while processing, display a "processing graphic". You can have an ajax request on the product page that checks for images being processed and tries to reload them every X seconds.
3) While an image is being "processed", the user can opt out of processing assuming they're the one that uploaded the image. This is available either on the product page(s) that display the image or on a separate "user queue" view that will allow them to remove the image from consideration.
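As referenced in step 1, here is a minimal in-memory sketch of such a processing queue; step 1 suggests a table in the database, and all type and member names here are assumptions for illustration only.

// Sketch only: an in-memory stand-in for the "processing queue" described above.
// In production this would be a database table polled by a background worker or service.
public class PendingImage
{
    public int ProductId { get; set; }
    public string TempImagePath { get; set; }
    public string UploadedBy { get; set; }
}

public static class ImageProcessingQueue
{
    private static readonly System.Collections.Concurrent.ConcurrentQueue<PendingImage> Queue =
        new System.Collections.Concurrent.ConcurrentQueue<PendingImage>();

    // Called by the upload action after the temp file is saved.
    public static void Enqueue(PendingImage item)
    {
        Queue.Enqueue(item);
    }

    // Called by the background worker that produces the final product images.
    public static bool TryDequeue(out PendingImage item)
    {
        return Queue.TryDequeue(out item);
    }
}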
So, you end up with some more domain objects and those objects are managed by the queue. I'm a strong advocate of convention over configuration so the final destination of the product image(s) should be predefined. Something like:
images/products/{id}.jpg or, if a collection, images/products/{id}/{sequence}.jpg.
You then don't need to know the destination in the form. It's the same for all images.
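A tiny sketch of that convention as a helper method; the method name is mine, the path patterns are the ones given above.

// Maps a product id (and optional sequence number) to its predefined image path,
// following the images/products/{id}.jpg convention described above.
public static string GetProductImagePath(int productId, int? sequence = null)
{
    return sequence == null
        ? string.Format("images/products/{0}.jpg", productId)
        : string.Format("images/products/{0}/{1}.jpg", productId, sequence);
}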
The queue then needs to know where the temp image was uploaded and what the product id was. The queue worker pops items from the queue, processes them, and stores them accordingly.
I know this sounds a little more "structured" than what you originally intended, but I think it's a little cleaner.
Is there complete equivalence between the UploadController and ProductController?
As files are uploaded by the UploadController, data about the upload is stored in the Session. After this happens I need to access that Session data in the ProductController.
As I read it, the UploadController needs read and write access to the upload data, while the ProductController needs only read access.
If that's true, then you can make it explicit by using an immutable wrapper around the upload information and having the UploadController put that into the session.
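A minimal sketch of such a wrapper, assuming the upload information is the image path and thumbnail path mentioned in the question; the type, property and session key names are illustrative.

// Immutable snapshot of one upload: the UploadController constructs and stores it,
// the ProductController can only read it.
public sealed class UploadInfo
{
    public UploadInfo(string imagePath, string thumbnailPath)
    {
        ImagePath = imagePath;
        ThumbnailPath = thumbnailPath;
    }

    public string ImagePath { get; private set; }
    public string ThumbnailPath { get; private set; }
}

// In UploadController, after saving the temp file and its thumbnail:
// Session["PendingUpload"] = new UploadInfo(tempPath, thumbPath);
// In ProductController:
// var info = (UploadInfo)Session["PendingUpload"];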
The Session itself is by definition a public shared noticeboard; it decouples explicit relationships at the cost of allowing anyone to get and put. You could allow the ProductController to know about the UploadController and hence remove the need for passing the upload information via the session. My instinct is that the upload info is of interest beyond the UploadController alone, so using Session is reasonable.
I don't see any DRY violation here, we are explicitly trying to separate responsibilities.
