How to serve static files in Node.js? - upload

In my app users can upload images to the server, and I put the uploaded files in an upload folder like this:
public
upload
But when I serve the
<img src='upload/image.jpg'/>
there is a 404 error.
I use Express with
app.use(express.static(__dirname + '/public'));
Does that mean I have to put all the uploaded files in the public folder? What if there are so many images that the hard drive cannot hold them? Can't I put the rest on another server?

No, you don't have to put all uploaded files in the public folder.
Express lets you configure multiple middleware, so you can add one more to serve static content from the desired directory.
If you want the directory to appear in the URL path, you can mount it like this:
app.use('/upload', express.static(__dirname + '/upload'));
Of course you can also use another server to store uploaded files, and retrieve them by proxying the request to it:
var http = require('http');

app.get('/upload/:fileName', function(req, res) {
    var options = {
        hostname: 'localhost',
        port: 80,
        method: 'GET',
        path: '/upload/' + req.params.fileName
    };
    // Use a separate variable so the Express `req` object isn't shadowed
    var proxyReq = http.request(options, function(response) {
        response.pipe(res);
    });
    proxyReq.on('error', function(err) {
        res.statusCode = 404;
        res.end('Error: file not found');
    });
    proxyReq.end();
});
This works not only when the uploaded files are stored on the original server, but also when they live on another server.

You can put the images in ./public/upload

Try using this:
<img src='/upload/image.jpg'/>
Notice the slash before upload. The problem may not be where you save the files, but how you reference them. If the page containing the img tag isn't in your root directory, the relative URL upload/image.jpg won't resolve correctly; you need /upload/image.jpg.
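To see why the leading slash matters, you can check how a browser resolves the two forms with the WHATWG URL API (example.com and the page path below are just placeholders for illustration):

```javascript
// Relative vs. root-relative URL resolution, as a browser would do it.
// The base URL is a placeholder standing in for the page containing the img tag.
const pageUrl = 'http://example.com/admin/page.html';

// Without a leading slash, the path resolves relative to the page's directory.
const relative = new URL('upload/image.jpg', pageUrl);

// With a leading slash, the path resolves from the site root.
const rootRelative = new URL('/upload/image.jpg', pageUrl);

console.log(relative.href);     // http://example.com/admin/upload/image.jpg
console.log(rootRelative.href); // http://example.com/upload/image.jpg
```

So the same src attribute points at different locations depending on where the page lives, which is exactly the 404 symptom described above.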

If you use
app.use(express.static(__dirname + '/public'));
and ./public contains:
/img.png
/img2.png
/logo.png
yourapp.com/img.png will serve img.png.
If you want yourapp.com/img.png to be a 404 and yourapp.com/public/img.png to serve img.png, you need to replace
app.use(express.static(__dirname + '/public'));
with
app.use("/public", express.static(__dirname + '/public'));


Creating a CSV file on a local drive with ASP.NET in Azure

I have a web app in which I can create a CSV file and save it to my C drive.
It works fine when running locally, but once I deploy the application to Azure
I'm getting:
UnauthorizedAccessException: Access to the path 'C:\Tuesday_HH19_MI6.csv' is denied.
How can I allow the website to access and create a file on the user's local drive?
I attached the entire exception log from Azure if it helps.
Thank you
A web app running in Azure can't save directly to your user's local drive, but it can generate the CSV and then prompt them to download it via the browser. You have a few options depending on whether you are sending a file that already exists on the filesystem or one you have generated dynamically as a byte array or stream.
Here are some sample controller methods to give you an idea. Your controller could do a bunch of work before the return statement; these examples are simplified.
For an existing file on the filesystem, use a FileResult:
public FileResult DownloadFile()
{
    // Create the CSV, save it to the filesystem, then return it
    return File("/Files/FileResult.csv", "text/csv", "MyFileName.csv");
}
If the file is generated in memory and you have it as a byte array:
public FileContentResult DownloadContent()
{
    // Create the CSV as a byte array
    var myfile = MyMethodtoCreateCSV();
    return new FileContentResult(myfile, "text/csv")
    {
        FileDownloadName = "MyFileName.csv"
    };
}

Getting "This site can’t be reached" from remote server

I have a .NET MVC project and I'm trying to download a file from a temporary folder. Here is my view code:
window.open('@Url.Action("DownloadDocument", "Controller")?fileName=' + fileName, '_blank');
And here is my controller code:
public FileResult DownloadDocument(string fileName)
{
    string path = Web.Common.Properties.Settings.Default.TempDocPath + "OpenTable\\";
    if (!Directory.Exists(path))
    {
        Directory.CreateDirectory(path);
    }
    return File(fileName, "application/xml", fileName + ".xml");
}
This code works on IIS and the file is downloaded, but I'm getting a 500 error when trying to download it from the remote server.
This site can’t be reached
The webpage at <...> might be temporarily down or it may have moved permanently to a new web address.
ERR_INVALID_RESPONSE
And the worst part is that there are almost identical code snippets in the system that work on both IIS and the remote server.
Any suggestions what could be done?
UPDATE
Good news: when I tried Postman, it threw a file-not-found error with the wrong path. It's an easy fix:
return File(Path.Combine(path, fileName), "application/xml", fileName + ".xml");
So it puzzles me: how and why was it working locally on IIS with the wrong path?
You need to create your temp folder inside the root folder of your web app, for example ~/Temp, then use the code below to access that folder instead of Web.Common.Properties.Settings.Default.TempDocPath, which I think may just be a constant:
System.Web.Hosting.HostingEnvironment.MapPath("~/Temp")

How to upload large files directly to S3 via Rails in an AngularJS app

In my project I have a requirement to allow users to upload very large files (up to 4GB) to S3.
My current setup includes Rails 4.2, S3, Paperclip, Mongoid for the API and AngularJS (with this upload library) for the front-end. Currently I can upload regular sized (up to 20MB) files perfectly fine.
So, I need to allow the front-end app users to upload zip files from an Angular JS app, through a Rails API to S3.
There is no need for file processing on the server, so all the server needs to do with the file is keep track of its URL (the location on S3) and file name so these details can be referenced later on.
After some search I've found this tutorial and this gem that suggest uploading the files directly to S3 bypassing the app altogether.
In my Angular controller I have this code to upload small sized files:
$scope.submitItem = function(file, file2, file3) {
    Upload.upload({
        url: API_PROVIDER.full_path + 'instant_downloads.json',
        data: {
            file: file,
            data: $scope.instant_download,
            categories: $scope.instant_download.categories,
            avatar: file2,
            subcategories: $scope.subcategories,
            zipfile: file3,
            options: $scope.instant_download.options
        }
    }).then(function (result) {
        if (result.data.success) {
            $scope.showAlert('Instant Download created successfully', 'success');
            $state.go('app.instant_downloads.index');
        }
    }, function (result) {
        angular.forEach(result.message, function(value, key) {
            $scope.showAlert(_.capitalize(key) + ': ' + value.join(', '), 'danger');
        });
    }, function (evt) {
        $scope.progressPercentage = parseInt(100.0 * evt.loaded / evt.total);
    });
};
The above code works well for small files, but if I try it on a large file I get nginx timeout errors.
Then I tried the aforementioned gem (while still using the same AngularJS code) to bypass the server, but I get an error on the S3 side (it seems to time out as well).
Can someone point out how I can upload directly to S3, using the AngularJS controller above, with a callback to the server so I can get confirmation that the file has been uploaded successfully along with its URL?

Load images saved in iOS App Documents directory from cordova html page (cordova)?

I have built an app that has to run on iOS, Android and Windows Phone. All work fine apart from iOS. My app takes a photo using the camera, resizes that image and saves it.
On iOS, these images were being saved into the tmp directory of the application, which was being deleted, so now I save the images into the Documents folder.
I then save the path to this image in my SQLite database. On my HTML page, I reference the file using the URL, such as
var/mobile/application/GUID-HERE/Documents/imageName.jpg
Now it seems that when I rebuild my application in Xcode, the application GUID changes, so all my previously saved file references are now invalid.
So
Can I reference the Documents folder relatively from my HTML page?
OR
Can I stop my application from changing this GUID?
Use the toInternalURL() method of the File plugin. After you save your image, you can get a local URL without that GUID and save that to your database instead. The URL has this format:
cdvfile://localhost/persistent/path/to/file
Documentation
An example (mostly from the file-transfer docs; requires the file and file-transfer plugins):
resolveLocalFileSystemURL(cordova.file.documentsDirectory, function success(entry) {
    download(entry.toInternalURL() + "image.jpg");
});

var download = function(localUrl) {
    var fileTransfer = new FileTransfer();
    var uri = encodeURI("https://cordova.apache.org/images/cordova_bot.png");
    fileTransfer.download(
        uri,
        localUrl,
        function(entry) {
            document.body.innerHTML = entry.toInternalURL();
            var img = document.createElement("img");
            img.setAttribute('src', entry.toInternalURL());
            document.body.appendChild(img);
        },
        function(error) {
            console.log("error code" + error.code);
        },
        false,
        {
            headers: {
                "Authorization": "Basic dGVzdHVzZXJuYW1lOnRlc3RwYXNzd29yZA=="
            }
        }
    );
};

Azure download blob to users computer

I was working on my download blob function when I ran into some problems.
I want the user to be able to download a blob, and I want the item to have a specific filename when it's downloaded to the user's computer. I also want the user to decide which folder the item should be saved to.
This is my not-so-good-looking code so far:
var fileName = "tid.txt9c6b412a-270a-4f67-8e65-7ce2bf87503d";
var containerName = "uploads";
CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(containerName);
var blob = container.GetBlockBlobReference(fileName);
using (var filestream = System.IO.File.OpenWrite(@"C:\Info\tid.txt9c6b412a-270a-4f67-8e65-7ce2bf87503d"))
{
    blob.DownloadToStream(filestream);
}
fileName = the blob name
Is it possible to change the name? The file extension gets all messed up by my GUID.
At the moment the download folder is C:\Info. How would this work when the website is published? How can I let the user decide which folder the item should be saved to? Am I doing this right?
Thank you in advance
/Filip
How would this work when the website is published?
Slow for the user and expensive for you: you are streaming the blob through your app, so it will be a bottleneck. Use Shared Access Signatures and download the blob directly from the browser. Use Content-Disposition as part of the URL to have the browser prompt the user with a Save As dialog. See Javascript download a URL - Azure Blob Storage.
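The Content-Disposition part can be done with the SAS response-header override query parameters. A small sketch (the helper name is made up, and it assumes the SAS URL already contains a query string):

```javascript
// Append Azure's response-content-disposition override (`rscd`) to a SAS URL
// so the browser shows a Save As dialog with a friendly filename instead of
// the raw blob name. Assumes the SAS URL already has a query string.
function withSaveAsName(sasUrl, fileName) {
  const disposition = 'attachment; filename=' + fileName;
  return sasUrl + '&rscd=' + encodeURIComponent(disposition);
}

// Placeholder account/container/signature values for illustration.
const url = withSaveAsName(
  'https://myaccount.blob.core.windows.net/uploads/tid.txt9c6b?sv=2015-04-05&sig=abc',
  'tid.txt'
);
console.log(url);
```

Pointing the browser at that URL (for example via window.location or an anchor tag) downloads the blob straight from storage, so the bytes never pass through your web app.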
Your question: Is it possible to change the name?
The name of the blob and the name on the user's disk are your/his choice; there is no need for them to match, except perhaps to avoid confusion. On the off chance that your user will upload it again (with changes, perhaps?), save some metadata so the original file and the updated file can be related in blob storage.
Once you execute the line:
var blob = container.GetBlockBlobReference(fileName);
... you have told Azure all it needs to know to locate the blob.
In the line:
using (var filestream = System.IO.File.OpenWrite...
... you tell your code where to put the file on the disk. You say it's a website, so this statement will put the file onto the web server's disk, not your user's. To get the file onto the user's disk, you need one more step - download the file from the web server (web role instance) to your user's computer. You can give him control of the folder and file name. Here is the relevant section in MSDN:
Downloading and Uploading Files
Is this download function acceptable? Slow/expensive or is it as good as it gets?
public void DownloadFile(string blobName)
{
    CloudBlobContainer blobContainer = CloudStorageServices.GetCloudBlobsContainer();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(blobName);
    MemoryStream memStream = new MemoryStream();
    blob.DownloadToStream(memStream);
    Response.ContentType = blob.Properties.ContentType;
    Response.AddHeader("Content-Disposition", "Attachment; filename=" + blobName.ToString());
    Response.AddHeader("Content-Length", (blob.Properties.Length - 1).ToString());
    Response.BinaryWrite(memStream.ToArray());
    Response.End();
}
