Here I want to point out code I found in "Using plupload with MVC3", whose intention is to upload a single file. My requirement is a bit different: I need to upload a few large files, say 3 files, where each file could be up to 2GB.
[HttpPost]
public ActionResult Upload(int? chunk, string name)
{
var fileUpload = Request.Files[0];
var uploadPath = Server.MapPath("~/App_Data");
chunk = chunk ?? 0;
using (var fs = new FileStream(Path.Combine(uploadPath, name), chunk == 0 ? FileMode.Create : FileMode.Append))
{
// a single Read is not guaranteed to fill the buffer;
// CopyTo reliably appends the whole chunk without a huge allocation
fileUpload.InputStream.CopyTo(fs);
}
return Json(new { message = "chunk uploaded", name = name });
}
$('#uploader').pluploadQueue({
runtimes: 'html5,flash',
url: '@Url.Action("Upload")',
max_file_size: '5mb',
chunk_size: '1mb',
unique_names: true,
multiple_queues: false,
preinit: function (uploader) {
uploader.bind('FileUploaded', function (up, file, data) {
// here file will contain interesting properties like
// id, loaded, name, percent, size, status, target_name, ...
// data.response will contain the server response
});
}
});
I just wonder if anyone can tell me what else I need to add to the above server-side and client-side code to enable me to upload multiple large files. Thanks.
You might well need to add an entry to your web.config file to allow for the large file size (2097152 KB = 2 GB). The execution timeout (in seconds) you can adjust accordingly:
<system.web>
<httpRuntime maxRequestLength="2097152" executionTimeout="3600" />
</system.web>
You can also set the request limit (which is in bytes) to 2 GB:
<system.webServer>
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="2147483648"/>
</requestFiltering>
</security>
</system.webServer>
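As a quick sanity check on those two numbers (maxRequestLength is expressed in kilobytes, maxAllowedContentLength in bytes), the conversion can be verified with a few lines of C#; the helper names here are purely illustrative:

```csharp
using System;

class LimitCheck
{
    // maxAllowedContentLength (requestFiltering) is expressed in bytes
    public static long GbToBytes(int gb) => (long)gb * 1024 * 1024 * 1024;

    // maxRequestLength (httpRuntime) is expressed in kilobytes
    public static long GbToKilobytes(int gb) => (long)gb * 1024 * 1024;

    static void Main()
    {
        Console.WriteLine(GbToBytes(2));     // 2147483648 -> maxAllowedContentLength
        Console.WriteLine(GbToKilobytes(2)); // 2097152    -> maxRequestLength
    }
}
```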
We are trying to embed an SSRS 2016 mobile report into our MVC Application.
We attempted to implement a "Reverse Proxy using HTTP Handler while SSRS authentication is set to basic".
The reason behind the use of Basic Authentication is that not all users that are accessing this page have NTLM access.
In the rsreportserver.config we set the Authentication to be as follows:
<Authentication>
<AuthenticationTypes>
<RSWindowsBasic>
<LogonMethod>3</LogonMethod>
<Realm></Realm>
<DefaultDomain>workgroup</DefaultDomain>
</RSWindowsBasic>
</AuthenticationTypes>
<RSWindowsExtendedProtectionLevel>Off</RSWindowsExtendedProtectionLevel>
<RSWindowsExtendedProtectionScenario>Proxy</RSWindowsExtendedProtectionScenario>
<EnableAuthPersistence>true</EnableAuthPersistence>
</Authentication>
So in general the link that is being redirected to within our page is partitioned as : //AppServer/KEYWORD/ReportPath
In the code-behind we are setting the handlers not only for KEYWORD but also for the elements within the page, like assets and api, such that we get the entire content of the mobile report
(//Server/Reports/mobilereport/ReportPath) and write it to our OutputStream.
Within the handler the authentication header for the request is being set.
The issue is that OAuth.js keeps failing with net::ERR_CONNECTION_RESET.
And we are never redirected to that specific report but to the homepage even if rs:Embed=true is included in the URL.
Why is it constantly failing?
And how can I directly load the report?
Code snippet from Handler:
public void Process(HttpContext context) {
HttpResponse response = context.Response;
string uri = "value";
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(new Uri(uri));
webRequest.Method = context.Request.HttpMethod;
webRequest.ImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation;
string svcCredentials = Convert.ToBase64String(ASCIIEncoding.ASCII.GetBytes(username + ":" + password));
webRequest.Headers.Add("Authorization", "Basic " + svcCredentials);
HttpWebResponse serverResponse = null;
try {
serverResponse = (HttpWebResponse)webRequest.GetResponse();
}
catch (WebException webExc)
{
response.StatusCode = 500;
response.StatusDescription = webExc.Status.ToString();
response.End();
return;
}
response.ContentType = serverResponse.ContentType;
const int blockSize = 2048;
byte[] contentBlock = new byte[blockSize];
long bytesToRead = serverResponse.ContentLength;
Stream dataStream = serverResponse.GetResponseStream();
while (bytesToRead > 0) {
int blockToRead = (int)Math.Min(blockSize, bytesToRead);
int bytesRead = dataStream.Read(contentBlock, 0, blockToRead);
if (bytesRead == 0) break; // connection closed early; avoids an infinite loop
bytesToRead -= bytesRead;
context.Response.OutputStream.Write(contentBlock, 0, bytesRead);
}
serverResponse.Close();
context.Response.OutputStream.Close();
context.Response.End();
}
Note: We are using this handler because we are aware that the report viewer does not cater to mobile reports.
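One thing worth noting about the copy loop in the handler: it trusts serverResponse.ContentLength, which is -1 when the report server answers with chunked transfer encoding, so in that case nothing gets written at all. A length-agnostic copy is more robust; this is only a sketch, and ProxyCopy is an illustrative name, not part of any framework:

```csharp
using System.IO;

public static class ProxyCopy
{
    // Copies until the source stream reports end-of-stream instead of
    // relying on Content-Length, which may be -1 for chunked responses.
    public static long Copy(Stream source, Stream destination, int blockSize = 2048)
    {
        var block = new byte[blockSize];
        long total = 0;
        int read;
        while ((read = source.Read(block, 0, block.Length)) > 0)
        {
            destination.Write(block, 0, read);
            total += read;
        }
        return total;
    }
}
```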
In my MVC application, I am using a dropzone control to allow users to upload a file from their local system to the SQL database.
if (Request.Files.Count > 0)
{
var name = Request.Files[0].FileName;
var size = Request.Files[0].ContentLength;
var type = Request.Files[0].ContentType;
var fileStream = Request.Files[0].InputStream;
byte[] documentBytes = new byte[fileStream.Length];
fileStream.Read(documentBytes, 0, documentBytes.Length);
Documents databaseDocument = new Documents
{
FileContent = documentBytes,
DocumentName = System.IO.Path.GetFileName(name),
DocumentSize = size,
DocumentType = type
};
bool result = this.updateService.SaveDocument(databaseDocument);
}
"updateService" is actually a reference to the WCF service.
I get the error on the "SaveDocument" call in the above code: "The remote server returned an error: (413) Request Entity Too Large".
I have set uploadReadAheadSize (in applicationHost.config) and maxReceivedMessageSize (in the WCF and web configuration files) as suggested on other forums.
Still, the error is not resolved for me.
You could use a stream as the parameter to your service operation if you don't want issues transmitting objects that are too large.
Service interface:
[ServiceContract]
public interface IStreamedService
{
[OperationContract]
void PrepareUpload(long fileLength, string fileName);
[OperationContract]
void UploadFile(Stream fileStream);
}
Service implementation:
public class StreamedService : IStreamedService
{
private static long lengthOfFile;
private static string nameOfFile;
public void PrepareUpload(long fileLength, string fileName)
{
lengthOfFile = fileLength;
nameOfFile = fileName;
}
public void UploadFile(Stream fileStream)
{
if(lengthOfFile==0 || string.IsNullOrEmpty(nameOfFile))
throw new ArgumentException("Upload must be prepared");
var bytes = new byte[lengthOfFile];
var numberOfBytesToRead = bytes.Length;
var numberOfReadBytes = 0;
while (numberOfBytesToRead > 0)
{
var n = fileStream.Read(bytes, numberOfReadBytes, numberOfBytesToRead);
if (n == 0)
break;
numberOfReadBytes += n;
numberOfBytesToRead -= n;
}
// dispose the output stream even if Write throws
using (var fsOut = new FileStream(string.Format(@"c:\temp\{0}", nameOfFile), FileMode.Create))
{
fsOut.Write(bytes, 0, numberOfReadBytes);
}
}
}
Service config:
<system.serviceModel>
<services>
<service name="StreamedService.StreamedService">
<endpoint address="net.tcp://localhost:60000/StreamedService"
binding="netTcpBinding" bindingConfiguration="NewBinding0" contract="Contracts.IStreamedService" />
</service>
</services>
<bindings>
<netTcpBinding>
<binding name="NewBinding0" transferMode="Streamed" maxReceivedMessageSize="67108864" />
</netTcpBinding>
</bindings>
</system.serviceModel>
Client implementation:
var proxy = new ChannelFactory<IStreamedService>("MyEndpoint").CreateChannel();
var fs = new FileStream(@"c:\temp\FileToUpload.zip", FileMode.Open);
proxy.PrepareUpload(fs.Length, "uploadedFile.zip");
proxy.UploadFile(fs);
Client config:
<system.serviceModel>
<bindings>
<netTcpBinding>
<binding name="NewBinding0" transferMode="Streamed" maxReceivedMessageSize="67108864" />
</netTcpBinding>
</bindings>
<client>
<endpoint address="net.tcp://localhost:60000/StreamedService"
binding="netTcpBinding" bindingConfiguration="NewBinding0"
contract="Contracts.IStreamedService" name="MyEndpoint">
</endpoint>
</client>
</system.serviceModel>
The above works with basicHttpBinding as well. And you could of course use a MemoryStream on the server side instead of a FileStream, and then deserialize it to some entity that you want to save to a DB.
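As a sketch of that MemoryStream variant (ReadAllBytes is an illustrative name, not part of WCF): buffer the incoming stream in memory instead of writing it to disk, then hand the resulting bytes to whatever entity you persist:

```csharp
using System.IO;

public static class StreamBuffering
{
    // Buffers the streamed upload in memory instead of writing it to disk;
    // CopyTo deals with partial reads, so no manual read loop is needed.
    public static byte[] ReadAllBytes(Stream input)
    {
        using (var ms = new MemoryStream())
        {
            input.CopyTo(ms);
            return ms.ToArray();
        }
    }
}
```

The returned byte array can then populate the entity (e.g. a FileContent property) before it is saved to the database.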
In my MVC 4 app, I have a view that uploads a file from the client machine with:
<snip>
@using (Html.BeginForm("Batch", "Home", FormMethod.Post, new { enctype = "multipart/form-data" }))
{
<input class="full-width" type="file" name="BatchFile" id="BatchFile" />
<input type="submit" value="Do It" />
}
<snip>
The "Batch" action in the home controller takes that file and processes it in a way that may be very lengthy.... minutes even:
<snip>
[HttpPost]
public FileResult Batch(ModelType modelInstance)
{
// Do the batch work.
string result = LengthyBatchProcess(modelInstance.BatchFile.InputStream);
var encoding = new ASCIIEncoding();
Byte[] byteArray = encoding.GetBytes(result);
Response.AddHeader("Content-Disposition", "attachment;filename=download.csv");
return File(byteArray, "application/csv");
}
<snip>
This all works fine, and it isn't an inherent problem that the user is locked out for the time the batch process takes; in fact, they expect it. The problem is that the user may not know whether the process will take a couple of seconds or a couple of minutes, and I would like to provide status information while LengthyBatchProcess is running. I have researched unobtrusive Ajax, but it does not seem to have the functionality needed, unless there is some way to chain unobtrusive Ajax calls. Any thoughts on how best to architect this? Many thanks in advance.
What you want to achieve requires a bit of work.
One way is to open another channel (ajax call) to get the progress report. Quoting from How do you measure the progress of a web service call?:
Write a separate method on the server that you can query by passing the ID of the job that has been scheduled and which returns an approximate value between 0-100 (or 0.0 and 1.0, or whatever) of how far along it is.
I've found a great tutorial on this matter.
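A minimal sketch of that server-side piece (JobProgress is a hypothetical name, not an existing API): an in-memory store that LengthyBatchProcess updates as it works, and that a separately polled action reads from. A real app would return the job ID to the client when the batch starts and key the polling off it:

```csharp
using System;
using System.Collections.Concurrent;

public static class JobProgress
{
    // job id -> percent complete (0-100)
    private static readonly ConcurrentDictionary<Guid, int> Jobs =
        new ConcurrentDictionary<Guid, int>();

    // Register a new job at 0% and hand back its id
    public static Guid Start()
    {
        var id = Guid.NewGuid();
        Jobs[id] = 0;
        return id;
    }

    // Called periodically from inside the batch process; clamps to 0-100
    public static void Report(Guid id, int percent)
    {
        Jobs[id] = Math.Max(0, Math.Min(100, percent));
    }

    // Called by the polled "progress" action; -1 means unknown job
    public static int Get(Guid id)
    {
        int percent;
        return Jobs.TryGetValue(id, out percent) ? percent : -1;
    }
}
```

The polled controller action would then just return something like Json(new { percent = JobProgress.Get(id) }).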
Yes, you can send the file down in chunks so the user can see the browser's download progress:
try
{
// Do the batch work.
string result = LengthyBatchProcess(modelInstance.BatchFile.InputStream);
var encoding = new ASCIIEncoding();
Byte[] byteArray = encoding.GetBytes(result);
Response.Clear();
Response.ClearContent();
Response.Buffer = true;
Response.AddHeader("Content-Disposition",
"attachment;filename=download.csv");
Response.ContentType = "application/csv";
Response.BufferOutput = false;
for (int i = 0; i < byteArray.Length; i += 10000)
{
// write in 10,000-byte blocks, flushing after each one so the
// browser sees steady progress instead of one final burst
int blockSize = Math.Min(10000, byteArray.Length - i);
Response.OutputStream.Write(byteArray, i, blockSize);
Response.Flush();
}
}
catch (Exception)
{
// swallowing the exception leaves the client with a truncated file;
// log it at the very least
}
finally
{
Response.Flush();
Response.End();
}
I need to upload files (with different extensions, i.e. .txt, .jpg, .pdf, ...) to a server.
I created a site that accepts HTTP requests and maps a virtual directory to a physical one.
All this works fine for download; now I have to implement the upload.
Here is my code:
private void UploadFile(string uploadFileName, string localFileName)
{
//long length = 0;
string boundary = "----------------------------" + DateTime.Now.Ticks.ToString("x");
HttpWebRequest httpWebRequest2 = (HttpWebRequest)WebRequest.Create(uploadFileName);
httpWebRequest2.ContentType = "multipart/form-data; boundary=" + boundary;
httpWebRequest2.Method = "POST";
httpWebRequest2.KeepAlive = true;
httpWebRequest2.Credentials = new NetworkCredential("USER", "PASSWORD", "DOMAIN");
Stream memStream = new System.IO.MemoryStream();
byte[] boundarybytes = System.Text.Encoding.ASCII.GetBytes("\r\n--" +boundary + "\r\n");
string formdataTemplate = "\r\n--" + boundary +"\r\nContent-Disposition: form-data; name=\"{0}\";\r\n\r\n{1}";
memStream.Write(boundarybytes, 0, boundarybytes.Length);
string headerTemplate = "Content-Disposition: form-data; name=\"{0}\"; filename=\"{1}\"\r\nContent-Type: application/octet-stream\r\n\r\n";
string header = string.Format(headerTemplate, "uplTheFile", localFileName);
byte[] headerbytes = System.Text.Encoding.UTF8.GetBytes(header);
memStream.Write(headerbytes, 0, headerbytes.Length);
FileStream fileStream = new FileStream(localFileName, FileMode.Open, FileAccess.Read);
byte[] buffer = new byte[1024];
int bytesRead = 0;
while ((bytesRead = fileStream.Read(buffer, 0, buffer.Length)) != 0)
{
memStream.Write(buffer, 0, bytesRead);
}
// the closing boundary must end with two extra dashes: "\r\n--boundary--\r\n"
byte[] trailerbytes = System.Text.Encoding.ASCII.GetBytes("\r\n--" + boundary + "--\r\n");
memStream.Write(trailerbytes, 0, trailerbytes.Length);
fileStream.Close();
httpWebRequest2.ContentLength = memStream.Length;
Stream requestStream = httpWebRequest2.GetRequestStream();
//error returned in lenght field: "This stream does not support seek operations."
memStream.Position = 0;
byte[] tempBuffer = new byte[memStream.Length];
memStream.Read(tempBuffer, 0, tempBuffer.Length);
memStream.Close();
requestStream.Write(tempBuffer, 0, tempBuffer.Length);
requestStream.Close();
WebResponse webResponse2 = httpWebRequest2.GetResponse();
//error returned from getResponse: "The remote server returned an error: (405) Method Not Allowed."
//I guess cause my folder is in read only
Stream stream2 = webResponse2.GetResponseStream();
StreamReader reader2 = new StreamReader(stream2);
MessageBox.Show(reader2.ReadToEnd());
webResponse2.Close();
httpWebRequest2 = null;
webResponse2 = null;
}
Firstly I got this error: "The remote server returned an error: (405) Method Not Allowed."
So I tried to enable POST by adding a mapping on the web site.
Now, in the folder on the server, I have a web.config file that is:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<system.webServer>
<directoryBrowse enabled="true" />
<handlers>
<add name="File TXT" path="*.*" verb="*" modules="IsapiModule" scriptProcessor="C:\Windows\System32\inetsrv\asp.dll" resourceType="Unspecified" requireAccess="Script" preCondition="bitness64" />
</handlers>
</system.webServer>
</configuration>
This way I do not get any error, but the application does not upload any file.
How do I solve this problem?
A much simpler solution would be to use the System.Net.Webclient class to upload the file.
System.Net.WebClient client = new System.Net.WebClient();
client.UploadFile("http://www.myfileuploadendpoint.com", @"c:\myfiletoupload.txt");
The client also has an UploadFileCompleted event that fires when an asynchronous upload (started with UploadFileAsync) is done:
client.UploadFileCompleted += (sender, args) => { /* do whatever you want */ };
This will save you code and bugs. Good Luck! :)
Neither of the above samples is working for me; "The remote server returned an error: (405) Method Not Allowed." is the exception caught when trying both of the above code samples.
I expect someone to share their thoughts on these exceptions, please.
So, I've implemented plupload using flash runtime in MVC3.
It works perfectly, in the sense that it uploads using the correct action and runs it all. However, I'd really like to be able to control the response and handle it in plupload, but I can't seem to get any response through.
I've tried overriding FileUploaded, but I can't seem to get anything out of the arguments. I've tried returning simple strings, JSON and what have you. I can't seem to get anything out on the client side. And of course, being sent through Flash, I can't even debug the requests with Firebug :/
The same goes for the Error event and throwing exceptions. It correctly interprets the exception as an error, but it's always that #IO ERROR with some code like 2038 coming out the other end. I can't show my exception string or anything at all. Can anyone help?
Bonus question: How would I send session/cookie data along with the plupload, so I can access the session in my action?
The following has worked for me:
[HttpPost]
public ActionResult Upload(int? chunk, string name)
{
var fileUpload = Request.Files[0];
var uploadPath = Server.MapPath("~/App_Data");
chunk = chunk ?? 0;
using (var fs = new FileStream(Path.Combine(uploadPath, name), chunk == 0 ? FileMode.Create : FileMode.Append))
{
// a single Read is not guaranteed to fill the buffer;
// CopyTo reliably appends the whole chunk without a huge allocation
fileUpload.InputStream.CopyTo(fs);
}
return Json(new { message = "chunk uploaded", name = name });
}
and on the client:
$('#uploader').pluploadQueue({
runtimes: 'html5,flash',
url: '@Url.Action("Upload")',
max_file_size: '5mb',
chunk_size: '1mb',
unique_names: true,
multiple_queues: false,
preinit: function (uploader) {
uploader.bind('FileUploaded', function (up, file, data) {
// here file will contain interesting properties like
// id, loaded, name, percent, size, status, target_name, ...
// data.response will contain the server response
});
}
});
As far as the bonus question is concerned, my answer would be "don't use sessions", as they don't scale well. But because I know you probably won't like that answer, you have the possibility to pass a session id in the request using multipart_params:
multipart_params: {
ASPSESSID: '@Session.SessionID'
},
and then on the server perform some hacks to create the proper session.
Look here:
$("#uploader").pluploadQueue({
// General settings
runtimes: 'silverlight',
url: '/Home/Upload',
max_file_size: '10mb',
chunk_size: '1mb',
unique_names: true,
multiple_queues: false,
// Resize images on clientside if we can
resize: { width: 320, height: 240, quality: 90 },
// Specify what files to browse for
filters: [
{ title: "Image files", extensions: "jpg,gif,png" },
{ title: "Zip files", extensions: "zip" }
],
// Silverlight settings
silverlight_xap_url: '../../../Scripts/upload/plupload.silverlight.xap'
});
// Client side form validation
$('form').submit(function (e) {
var uploader = $('#uploader').pluploadQueue();
// Files in queue upload them first
if (uploader.files.length > 0) {
// When all files are uploaded submit form
uploader.bind('StateChanged', function () {
if (uploader.files.length === (uploader.total.uploaded + uploader.total.failed)) {
$('form')[0].submit();
}
});
uploader.start();
} else {
alert('You must queue at least one file.');
}
return false;
});
And in Controller:
[HttpPost]
public string Upload( ) {
HttpPostedFileBase FileData = Request.Files[0];
if ( FileData.ContentLength > 0 ) {
var fileName = Path.GetFileName( FileData.FileName );
var path = Path.Combine( Server.MapPath( "~/Content" ), fileName );
FileData.SaveAs( path );
}
return "File was uploaded successfully!";
}
That's all... no chunk handling is needed in the controller.