Download functionality in MVC 4 (asp.net-mvc)

I have created a Web API that connects users to Dropbox via OAuth. I am using an API wrapper to interact with Dropbox, which works locally as I would like; however, when I deploy the API to my Azure server, I am unable to download. I had anticipated this would happen, as my API is currently hard-coded to a path on my machine.
Here is the method I am using:
NOTE: I call this method from an ActionResult, as part of the MVC portion of my project.
public FileSystemInfo DownloadFile(string root, string path)
{
    var uri = new Uri(new Uri(DropboxRestApi.ApiContentServer),
        String.Format("files?root={0}&path={1}", root, UpperCaseUrlEncode(path)));
    var oauth = new OAuth();
    var requestUri = oauth.SignRequest(uri, _consumerKey, _consumerSecret, _accessToken);
    var request = (HttpWebRequest)WebRequest.Create(requestUri);
    request.Method = WebRequestMethods.Http.Get;
    var response = request.GetResponse();

    // Dropbox returns the file metadata in a response header; the body is the file content.
    var metadata = response.Headers["x-dropbox-metadata"];
    var file = ParseJson<FileSystemInfo>(metadata);

    using (Stream responseStream = response.GetResponseStream())
    using (MemoryStream memoryStream = new MemoryStream())
    {
        byte[] buffer = new byte[1024];
        int bytesRead;
        do
        {
            bytesRead = responseStream.Read(buffer, 0, buffer.Length);
            memoryStream.Write(buffer, 0, bytesRead);
        } while (bytesRead > 0);
        file.Data = memoryStream.ToArray();
    }
    return file;
}
Here is where I call the method in my ActionResult:
var file = api.DownloadFile("dropbox", "Public/downloadThis.jpg");
path = file.Path;
file.Save(@"....\Desktop\DemoTest\Downloads\downloadThis.jpg"); // this is the problem; Save is a stream writer
Is there a standard procedure to follow for sending a downloaded file from the server to the user's browser?

public ActionResult download(Models.downloadModel dowld, Models.LoggerView log)
{
    string TC_ID = Request.QueryString["id"].ToString();
    string filename = TC_ID + "_LoggerData" + ".zip";
    Response.ContentType = "application/octet-stream";
    Response.AppendHeader("Content-Disposition", "attachment;filename=" + filename);
    Response.TransmitFile(Server.MapPath("~/files/" + filename));
    Response.End();
    return new EmptyResult(); // Response.End() has already flushed the file to the client
}
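As a side note for the Dropbox case above, the more idiomatic MVC way is to return a FileResult rather than writing to Response yourself, so the browser receives the bytes directly instead of the server trying to save them to a local desktop path. A minimal sketch (the action name and content type here are illustrative, not from the original):
public ActionResult DownloadFromDropbox()
{
    // "api" is the Dropbox wrapper instance from the question;
    // file.Data already holds the downloaded bytes.
    var file = api.DownloadFile("dropbox", "Public/downloadThis.jpg");
    return File(file.Data, "image/jpeg", "downloadThis.jpg");
}
Controller.File sets the Content-Type and Content-Disposition headers for you, so no manual Response work is needed.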


"The request was aborted" error while writing to the request stream

public HttpWebResponse PushFileToWistia(byte[] contentFileByteArray, string fileName)
{
    string boundary = "---------------------------" + DateTime.Now.Ticks.ToString("x");
    StringBuilder postDataBuilder = new StringBuilder();
    postDataBuilder.Append("I am appending all the wistia config and setting here");

    byte[] postData = null;
    using (MemoryStream postDataStream = new MemoryStream())
    {
        byte[] postDataBuffer = Encoding.UTF8.GetBytes(postDataBuilder.ToString());
        postDataStream.Write(postDataBuffer, 0, postDataBuffer.Length);
        postDataStream.Write(contentFileByteArray, 0, contentFileByteArray.Length);
        postDataBuffer = Encoding.UTF8.GetBytes("\r\n--" + boundary + "--");
        postDataStream.Write(postDataBuffer, 0, postDataBuffer.Length);
        postData = postDataStream.ToArray();
    }

    ServicePointManager.Expect100Continue = false;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(AppConfig.WistiaCustomCourseBucket);
    request.Method = "POST";
    request.Expect = String.Empty;
    request.Headers.Clear();
    request.ContentType = "multipart/form-data; boundary=" + boundary;
    request.ContentLength = postData.Length;

    Stream requestStream = request.GetRequestStream();
    requestStream.Write(postData, 0, postData.Length); // for files > 100 MB this call throws "The request was aborted: The request was canceled."
    requestStream.Flush();
    requestStream.Close();

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    return response;
}
The above code works for mp4 video files smaller than 50 MB, but when I try to upload a 100 MB file it throws an exception ("The request was aborted."). I need to support file sizes up to 1.5 GB, so now I am not sure this approach is correct for such large uploads. Any suggestions in the right direction would be helpful... thanks. (I am trying to upload the file to the Wistia server.)
The exception is thrown at this line:
requestStream.Write(postData, 0, postData.Length);
I have tried changing the web.config settings, but it didn't work:
<httpRuntime targetFramework="4.5" maxRequestLength="2048576" executionTimeout="12000" requestLengthDiskThreshold="1024" />
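Also note that on IIS the inbound upload size is additionally capped by request filtering, so maxRequestLength alone may not be enough; a hedged guess at the matching setting (maxAllowedContentLength is in bytes, unlike maxRequestLength which is in KB):
<system.webServer>
  <security>
    <requestFiltering>
      <!-- ~2 GB in bytes; must be at least as large as the biggest upload you accept -->
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
</system.webServer>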
------Async Call-------
MemoryStream wistiaFileStream = null;
using (MemoryStream postDataStream = new MemoryStream())
{
    postDataStream.Write(contentFileByteArray, 0, contentFileByteArray.Length);
    wistiaFileStream = postDataStream;
    postDataStream.Flush();
    postDataStream.Close(); // note: this closes wistiaFileStream too (same object), so the ReadAsync below fails
}
Stream requestStream = await request.GetRequestStreamAsync();
await requestStream.WriteAsync(wistiaMetadata, 0, wistiaMetadata.Length);
using (wistiaFileStream)
{
    byte[] wistiaFileBuffer = new byte[500 * 1024];
    int wistiaFileBytesRead = 0;
    while ((wistiaFileBytesRead = await wistiaFileStream.ReadAsync(wistiaFileBuffer, 0, wistiaFileBuffer.Length)) != 0)
    {
        await requestStream.WriteAsync(wistiaFileBuffer, 0, wistiaFileBytesRead);
        await requestStream.FlushAsync();
    }
    await requestStream.WriteAsync(requestBoundary, 0, requestBoundary.Length);
}
I would suggest moving to async and writing the file directly from the file system to the request, in order to avoid triple-buffering 1.5 GB in memory (warning: the code below is not tested).
public async Task<HttpWebResponse> PushFileToWistiaAsync(string contentFilePath)
{
    string boundary = "---------------------------" + DateTime.Now.Ticks.ToString("x");
    string contentBoundary = "\r\n--" + boundary + "\r\n";
    StringBuilder wistiaMetadataBuilder = new StringBuilder();
    wistiaMetadataBuilder.Append("--" + boundary + "\r\n");
    // Append all the wistia config and setting here
    byte[] wistiaMetadata = Encoding.UTF8.GetBytes(wistiaMetadataBuilder.ToString());
    byte[] requestBoundary = Encoding.UTF8.GetBytes(contentBoundary);

    ServicePointManager.Expect100Continue = false;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(AppConfig.WistiaCustomCourseBucket);
    request.Method = "POST";
    request.Headers.Clear();
    request.Expect = String.Empty;
    request.ContentType = "multipart/form-data; boundary=" + boundary;

    Stream requestStream = await request.GetRequestStreamAsync();
    await requestStream.WriteAsync(wistiaMetadata, 0, wistiaMetadata.Length);
    using (FileStream wistiaFileStream = new FileStream(contentFilePath, FileMode.Open, FileAccess.Read))
    {
        byte[] wistiaFileBuffer = new byte[500 * 1024];
        int wistiaFileBytesRead = 0;
        while ((wistiaFileBytesRead = await wistiaFileStream.ReadAsync(wistiaFileBuffer, 0, wistiaFileBuffer.Length)) != 0)
        {
            await requestStream.WriteAsync(wistiaFileBuffer, 0, wistiaFileBytesRead);
            await requestStream.FlushAsync();
        }
    }
    await requestStream.WriteAsync(requestBoundary, 0, requestBoundary.Length);
    return (HttpWebResponse)(await request.GetResponseAsync());
}
You should play with the buffer size, the amount of data you read at once, and request.SendChunked to achieve reasonable performance.
Here is another approach (not asynchronous, so possibly worse scalability) which writes directly from the buffer to the request:
public HttpWebResponse PushFileToWistia(byte[] contentFileByteArray)
{
    string boundary = "---------------------------" + DateTime.Now.Ticks.ToString("x");
    string contentBoundary = "\r\n--" + boundary + "\r\n";
    StringBuilder wistiaMetadataBuilder = new StringBuilder();
    wistiaMetadataBuilder.Append("--" + boundary + "\r\n");
    // Append all the wistia config and setting here
    byte[] wistiaMetadata = Encoding.UTF8.GetBytes(wistiaMetadataBuilder.ToString());
    byte[] requestBoundary = Encoding.UTF8.GetBytes(contentBoundary);

    ServicePointManager.Expect100Continue = false;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(AppConfig.WistiaCustomCourseBucket);
    request.Method = "POST";
    request.Headers.Clear();
    request.Expect = String.Empty;
    request.ContentType = "multipart/form-data; boundary=" + boundary;
    request.ContentLength = wistiaMetadata.Length + contentFileByteArray.Length + requestBoundary.Length;
    // You can play with SendChunked and AllowWriteStreamBuffering to control the size of chunks you send and performance
    //request.SendChunked = true;
    //request.AllowWriteStreamBuffering = false;

    int contentFileChunkSize = 500 * 1024;
    int contentFileBytesRead = 0;
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(wistiaMetadata, 0, wistiaMetadata.Length);
    while (contentFileBytesRead < contentFileByteArray.Length)
    {
        if ((contentFileBytesRead + contentFileChunkSize) > contentFileByteArray.Length)
        {
            // Last chunk: shrink to the remaining bytes.
            contentFileChunkSize = contentFileByteArray.Length - contentFileBytesRead;
        }
        requestStream.Write(contentFileByteArray, contentFileBytesRead, contentFileChunkSize);
        requestStream.Flush();
        contentFileBytesRead += contentFileChunkSize;
    }
    requestStream.Write(requestBoundary, 0, requestBoundary.Length);
    requestStream.Close();
    // You might need to play with request.Timeout here
    return (HttpWebResponse)request.GetResponse();
}
Also, if you are doing this in a web application and want the asynchronous approach, you need to "async/await" all the way up (so an async action in an async controller, etc.).
In general I would discourage doing this as part of request handling in a web application: the total time observed from the user's perspective is the sum of the upload to your app plus the upload to Wistia, which may be much more than the client timeout allows. In such cases it is usually better to save the file and schedule some other "background task" to do the upload job.
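As an aside, on .NET 4.5 the manual boundary bookkeeping can be avoided entirely with HttpClient and MultipartFormDataContent, which streams the file from disk. A rough, untested sketch; the form field name "file" and the endpoint reuse are assumptions, so check Wistia's API docs:
public async Task<HttpResponseMessage> PushFileToWistiaAsync(string contentFilePath)
{
    using (var client = new HttpClient())
    using (var content = new MultipartFormDataContent())
    using (var fileStream = System.IO.File.OpenRead(contentFilePath))
    {
        client.Timeout = TimeSpan.FromHours(1); // large uploads need a generous timeout

        // StreamContent reads from disk as it sends, so a 1.5 GB file
        // is never buffered wholly in memory.
        var fileContent = new StreamContent(fileStream);
        fileContent.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        content.Add(fileContent, "file", System.IO.Path.GetFileName(contentFilePath));

        return await client.PostAsync(AppConfig.WistiaCustomCourseBucket, content);
    }
}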

Uploading large files to SharePoint 365

I'm using the CSOM to upload files to a SharePoint 365 site.
I've logged in successfully with claims-based authentication using the methods found here: http://www.wictorwilen.se/Post/How-to-do-active-authentication-to-Office-365-and-SharePoint-Online.aspx
But using SaveBinaryDirect on the ClientContext fails with a 405, due to the cookies being attached to the request too late.
Another method of using CSOM to upload files is shown below, but with SP 365 this limits the file size to about 3 MB.
var newFileFromComputer = new FileCreationInformation
{
    Content = fileContents,
    Url = Path.GetFileName(sourceUrl)
};
Microsoft.SharePoint.Client.File uploadedFile = customerFolder.Files.Add(newFileFromComputer);
context.Load(uploadedFile);
context.ExecuteQuery();
Is there ANY way to do this using CSOM, SP 365, and file sizes up to, say, 100 MB?
Indeed, while trying to upload a file to SharePoint Online whose size exceeds the 250 MB limit, the following exception occurs:
Response received was -1, Microsoft.SharePoint.Client.InvalidClientQueryException: The request message is too big. The server does not allow messages larger than 262144000 bytes.
To circumvent this error, chunked file upload methods were introduced which support uploading files larger than 250 MB. The link in the references below includes a sample that demonstrates how to utilize them via the SharePoint CSOM API.
Supported versions:
SharePoint Online
SharePoint On-Premise 2016 or above
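For reference, here is a minimal (untested) sketch of the same chunked pattern via CSOM, using File.StartUpload / ContinueUpload / FinishUpload; the authenticated ClientContext (ctx), library title, and chunk size are assumptions:
public static void ChunkedFileUploadCsom(ClientContext ctx, string libraryTitle, string sourcePath, int chunkSizeBytes)
{
    var uploadId = Guid.NewGuid();
    var fileName = System.IO.Path.GetFileName(sourcePath);
    var folder = ctx.Web.Lists.GetByTitle(libraryTitle).RootFolder;

    // Create a zero-length placeholder first, then append chunks to it.
    var placeholder = new FileCreationInformation
    {
        ContentStream = new System.IO.MemoryStream(),
        Url = fileName,
        Overwrite = true
    };
    var uploadFile = folder.Files.Add(placeholder);
    ctx.Load(uploadFile);
    ctx.ExecuteQuery();

    long offset = 0;
    bool firstChunk = true;
    using (var inputStream = System.IO.File.OpenRead(sourcePath))
    {
        var buffer = new byte[chunkSizeBytes];
        int bytesRead;
        while ((bytesRead = inputStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            using (var chunk = new System.IO.MemoryStream(buffer, 0, bytesRead))
            {
                // Assumes the file spans more than one chunk; a file smaller than
                // chunkSizeBytes would need FinishUpload on the first call instead.
                if (firstChunk)
                {
                    uploadFile.StartUpload(uploadId, chunk);
                    firstChunk = false;
                }
                else if (inputStream.Position == inputStream.Length)
                {
                    uploadFile = uploadFile.FinishUpload(uploadId, offset, chunk);
                }
                else
                {
                    uploadFile.ContinueUpload(uploadId, offset, chunk);
                }
                ctx.ExecuteQuery();
            }
            offset += bytesRead;
        }
    }
}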
The following example demonstrates how to utilize the chunked file upload methods via the SharePoint REST API:
class FileUploader
{
    public static void ChunkedFileUpload(string webUrl, ICredentials credentials, string sourcePath, string targetFolderUrl, int chunkSizeBytes, Action<long, long> chunkUploaded)
    {
        using (var client = new WebClient())
        {
            client.BaseAddress = webUrl;
            client.Credentials = credentials;
            client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
            var formDigest = RequestFormDigest(webUrl, credentials);
            client.Headers.Add("X-RequestDigest", formDigest);

            // Create an empty file first.
            var fileName = System.IO.Path.GetFileName(sourcePath);
            var createFileRequestUrl = string.Format("/_api/web/getfolderbyserverrelativeurl('{0}')/files/add(url='{1}',overwrite=true)", targetFolderUrl, fileName);
            client.UploadString(createFileRequestUrl, "POST");

            var targetUrl = targetFolderUrl + "/" + fileName; // note: Path.Combine would insert a backslash into the URL here
            var firstChunk = true;
            var uploadId = Guid.NewGuid();
            var offset = 0L;

            using (var inputStream = System.IO.File.OpenRead(sourcePath))
            {
                var buffer = new byte[chunkSizeBytes];
                int bytesRead;
                while ((bytesRead = inputStream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    if (firstChunk)
                    {
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/startupload(uploadId=guid'{1}')", targetUrl, uploadId);
                        client.UploadData(endpointUrl, buffer);
                        firstChunk = false;
                    }
                    else if (inputStream.Position == inputStream.Length)
                    {
                        // Last chunk: trim the buffer to the actual number of bytes read.
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/finishupload(uploadId=guid'{1}',fileOffset={2})", targetUrl, uploadId, offset);
                        var finalBuffer = new byte[bytesRead];
                        Array.Copy(buffer, finalBuffer, finalBuffer.Length);
                        client.UploadData(endpointUrl, finalBuffer);
                    }
                    else
                    {
                        var endpointUrl = string.Format("/_api/web/getfilebyserverrelativeurl('{0}')/continueupload(uploadId=guid'{1}',fileOffset={2})", targetUrl, uploadId, offset);
                        client.UploadData(endpointUrl, buffer);
                    }
                    offset += bytesRead;
                    chunkUploaded(offset, inputStream.Length);
                }
            }
        }
    }

    public static string RequestFormDigest(string webUrl, ICredentials credentials)
    {
        using (var client = new WebClient())
        {
            client.BaseAddress = webUrl;
            client.Credentials = credentials;
            client.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f");
            client.Headers.Add("Accept", "application/json; odata=verbose");
            var endpointUrl = "/_api/contextinfo";
            var content = client.UploadString(endpointUrl, "POST");
            var data = JObject.Parse(content);
            return data["d"]["GetContextWebInformation"]["FormDigestValue"].ToString();
        }
    }
}
Source code: FileUploader.cs
Usage
var userCredentials = GetCredentials(userName, password);
var sourcePath = @"C:\temp\jellyfish-25-mbps-hd-hevc.mkv"; // local file path
var targetFolderUrl = "/Shared Documents"; // library relative url
FileUploader.ChunkedFileUpload(webUrl,
    userCredentials,
    sourcePath,
    targetFolderUrl,
    1024 * 1024 * 5, // 5 MB
    (offset, size) =>
    {
        Console.WriteLine("{0:P} completed", (offset / (float)size));
    });
References
Always use File Chunking to Upload Files > 250 MB to SharePoint Online
Well, I haven't found a way to do it with the CSOM, and that is truly infuriating.
A workaround was posted by SEvans in the comments on http://www.wictorwilen.se/Post/How-to-do-active-authentication-to-Office-365-and-SharePoint-Online.aspx .
Basically, just do an HTTP PUT and attach the cookie collection from the claims-based authentication. SEvans' posted workaround is below:
Great piece of code Wictor. As others have noted, SaveBinaryDirect does not work correctly, as the FedAuth cookies never get attached to the HTTP PUT request that the method generates.
Here's my workaround:
// "url" is the full destination path (including filename, i.e. https://mysite.sharepoint.com/Documents/Test.txt)
// "cookie" is the CookieContainer generated from Wichtor's code
// "data" is the byte array containing the files contents (used a FileStream to load)
System.Net.ServicePointManager.Expect100Continue = false;
HttpWebRequest request = HttpWebRequest.Create(url) as HttpWebRequest;
request.Method = "PUT";
request.Accept = "*/*";
request.ContentType = "multipart/form-data; charset=utf-8";
request.CookieContainer = cookie; request.AllowAutoRedirect = false;
request.UserAgent = "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)";
request.Headers.Add("Accept-Language", "en-us");
request.Headers.Add("Translate", "F"); request.Headers.Add("Cache-Control", "no-cache"); request.ContentLength = data.Length;
using (Stream req = request.GetRequestStream())
{ req.Write(data, 0, data.Length); }
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream res = response.GetResponseStream();
StreamReader rdr = new StreamReader(res);
string rawResponse = rdr.ReadToEnd();
response.Close();
rdr.Close();

WinRT StorageFile write downloaded file

I'm struggling with an easy problem. I want to download an image from the web using this code:
WebRequest requestPic = WebRequest.Create(@"http://something.com/" + id + ".jpg");
WebResponse responsePic = await requestPic.GetResponseAsync();
Now I want to write the WebResponse's stream to a StorageFile (e.g. create a file id.jpg in the app's storage), but I haven't found any way to achieve that. I searched the web for it, but with no success; every approach ran into incompatible Stream types and so on.
Could you please help?
I have found the following solution, which works and is not too complicated.
public async static Task<StorageFile> SaveAsync(
    Uri fileUri,
    StorageFolder folder,
    string fileName)
{
    var file = await folder.CreateFileAsync(fileName, CreationCollisionOption.ReplaceExisting);
    var downloader = new BackgroundDownloader();
    var download = downloader.CreateDownload(fileUri, file);
    var res = await download.StartAsync();
    return file;
}
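Hypothetical usage of the helper above, matching the question's URL scheme (the URL and file name are illustrative):
// Downloads the image into the app's local folder via the BackgroundDownloader.
var file = await SaveAsync(
    new Uri("http://something.com/" + id + ".jpg"),
    ApplicationData.Current.LocalFolder,
    id + ".jpg");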
You will need to read the response stream into a buffer and then write the data to a StorageFile. The following code shows an example:
var fStream = responsePic.GetResponseStream();
var file = await ApplicationData.Current.LocalFolder.CreateFileAsync("testfile.txt");
using (var ostream = await file.OpenStreamForWriteAsync())
{
    var buffer = new byte[1024]; // reuse one buffer instead of allocating per iteration
    int count = 0;
    do
    {
        count = fStream.Read(buffer, 0, 1024);
        await ostream.WriteAsync(buffer, 0, count);
    }
    while (fStream.CanRead && count > 0);
}
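Since OpenStreamForWriteAsync already exposes a classic .NET Stream, the same copy can be done in one call with Stream.CopyToAsync; a minimal sketch under the same variable names as above:
// Copies the response stream straight into the StorageFile's write stream.
using (var fStream = responsePic.GetResponseStream())
using (var ostream = await file.OpenStreamForWriteAsync())
{
    await fStream.CopyToAsync(ostream);
}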
This can also be done using the C++ REST SDK in Windows Store apps; it's explained in its HTTP client tutorial.

Uploading PostedFile to FTP

I need to upload a posted file to an FTP location from my controller.
Here is what I have now:
public ActionResult Upload(HttpPostedFileBase file)
{
    string fileName = System.IO.Path.GetFileName(file.FileName);
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://10.10.0.3" + "/" + fileName);
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential("username", "password");
    StreamReader streamReader = new StreamReader(file.InputStream);
    byte[] fileContents = Encoding.UTF8.GetBytes(streamReader.ReadToEnd());
    streamReader.Close();
    request.ContentLength = fileContents.Length;
    Stream requestStream = request.GetRequestStream();
    requestStream.Write(fileContents, 0, fileContents.Length);
    requestStream.Close();
    FtpWebResponse response = (FtpWebResponse)request.GetResponse();
    .....
}
The file is being uploaded and has the correct number of pages, but there is no text in the new file. (These are PDFs; I will validate the type later, I'm just trying to get it to work now.)
Thanks!
You are reading PDF files as if they were text files. Try this instead:
const int BUFFER_SIZE = 4096; // any reasonable chunk size works
var sourceStream = file.InputStream;
request.ContentLength = sourceStream.Length; // must be set before GetRequestStream()
requestStream = request.GetRequestStream();
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead = sourceStream.Read(buffer, 0, BUFFER_SIZE);
do
{
    requestStream.Write(buffer, 0, bytesRead);
    bytesRead = sourceStream.Read(buffer, 0, BUFFER_SIZE);
} while (bytesRead > 0);
sourceStream.Close();
requestStream.Close();
response = (FtpWebResponse)request.GetResponse();
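If you're on .NET 4 or later, the same copy collapses into Stream.CopyTo; a minimal sketch reusing the question's request object:
// Streams the posted file into the FTP request without decoding it as text.
request.ContentLength = file.InputStream.Length;
using (Stream requestStream = request.GetRequestStream())
{
    file.InputStream.CopyTo(requestStream);
}
FtpWebResponse response = (FtpWebResponse)request.GetResponse();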

Downloading Azure Blob files in MVC3

Our ASP.NET MVC 3 application is running on Azure and uses blob storage for files. I have the upload part figured out.
The view will display the file name which, when clicked, should bring up the browser's file download prompt.
Can anyone tell me how to go about doing this?
Two options really... the first is to just redirect the user to the blob directly (if the blobs are in a public container). That would look a bit like:
return Redirect(container.GetBlobReference(name).Uri.AbsoluteUri);
If the blob is in a private container, you could either use a Shared Access Signature and do redirection like the previous example, or you could read the blob in your controller action and push it down to the client as a download:
Response.AddHeader("Content-Disposition", "attachment; filename=" + name); // force download
container.GetBlobReference(name).DownloadToStream(Response.OutputStream);
return new EmptyResult();
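For the Shared Access Signature route mentioned above, a minimal sketch (assuming the 2.x storage client library and an arbitrary one-hour expiry):
// Generate a short-lived read-only URL for a private blob, then redirect to it.
var blob = container.GetBlockBlobReference(name);
string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
});
return Redirect(blob.Uri.AbsoluteUri + sas); // the SAS token already starts with '?'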
Here's a resumable version (useful for large files or allowing seek in video or audio playback) of private blob access:
public class AzureBlobStream : ActionResult
{
    private string filename, containerName;

    public AzureBlobStream(string containerName, string filename)
    {
        this.containerName = containerName;
        this.filename = filename;
    }

    public override void ExecuteResult(ControllerContext context)
    {
        var response = context.HttpContext.Response;
        var request = context.HttpContext.Request;
        var connectionString = ConfigurationManager.ConnectionStrings["Storage"].ConnectionString;
        var account = CloudStorageAccount.Parse(connectionString);
        var client = account.CreateCloudBlobClient();
        var container = client.GetContainerReference(containerName);
        var blob = container.GetBlockBlobReference(filename);
        blob.FetchAttributes();
        var fileLength = blob.Properties.Length;
        var fileExists = fileLength > 0;
        var etag = blob.Properties.ETag;
        var responseLength = fileLength;
        var startIndex = 0;

        // If "If-Match" exists and differs from the etag (or equals "*" with no resource), return 412 Precondition Failed.
        if (request.Headers["If-Match"] == "*" && !fileExists ||
            request.Headers["If-Match"] != null && request.Headers["If-Match"] != "*" && request.Headers["If-Match"] != etag)
        {
            response.StatusCode = (int)HttpStatusCode.PreconditionFailed;
            return;
        }

        if (!fileExists)
        {
            response.StatusCode = (int)HttpStatusCode.NotFound;
            return;
        }

        if (request.Headers["If-None-Match"] == etag)
        {
            response.StatusCode = (int)HttpStatusCode.NotModified;
            return;
        }

        if (request.Headers["Range"] != null && (request.Headers["If-Range"] == null || request.Headers["If-Range"] == etag))
        {
            var match = Regex.Match(request.Headers["Range"], @"bytes=(\d*)-(\d*)");
            startIndex = Util.Parse<int>(match.Groups[1].Value);
            responseLength = (Util.Parse<int?>(match.Groups[2].Value) + 1 ?? fileLength) - startIndex;
            response.StatusCode = (int)HttpStatusCode.PartialContent;
            response.Headers["Content-Range"] = "bytes " + startIndex + "-" + (startIndex + responseLength - 1) + "/" + fileLength;
        }

        response.Headers["Accept-Ranges"] = "bytes";
        response.Headers["Content-Length"] = responseLength.ToString();
        response.Cache.SetCacheability(HttpCacheability.Public); // required for etag output
        response.Cache.SetETag(etag); // required for IE9 resumable downloads
        response.ContentType = blob.Properties.ContentType;
        blob.DownloadRangeToStream(response.OutputStream, startIndex, responseLength);
    }
}
Example:
Response.AddHeader("Content-Disposition", "attachment; filename=" + filename); // force download
return new AzureBlobStream(blobContainerName, filename);
I noticed that writing to the response stream from the action method messes up the HTTP headers: some expected headers are missing and others are not set correctly.
So instead of writing to the response stream, I get the blob content as a stream and pass it to the Controller.File() method:
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
Stream blobStream = blob.OpenRead();
return File(blobStream, blob.Properties.ContentType, "FileName.txt");
