Generate a big zip file in ASP.NET Core - asp.net-mvc

I'm building an MVC controller action which builds a zip file containing two files:
Some metadata serialized from the database.
A related content file actually stored in an Azure Storage blob.
So far, my controller works fine up to a certain file size. When the content from Azure gets too big I get an out-of-memory exception, certainly related to the fact that I'm buffering everything in the server's memory, which is not infinite.
So now I'm wondering which approach I should take. Write the data to a temp path on the server, or are there other options?
Here's my controller action for reference:
public async Task<ActionResult> Download(Guid? id)
{
    if (id == null)
    {
        return NotFound();
    }

    var cIApplication = await _context.CIApplications
        .AsNoTracking()
        .SingleOrDefaultAsync(m => m.ID == id);

    if (cIApplication == null)
    {
        return NotFound();
    }

    // Serialize metadata: this will always be small
    byte[] metaData = BinSerializer.SerializeToByteArrayAsync<CIApplication>(cIApplication);

    // Get file from Azure blob: this can reach several GB
    // StorageManagement is a helper class to manipulate Azure storage objects
    StorageManagement storage = new StorageManagement();
    byte[] content = await storage.GetBlobToStream("application", $"{cIApplication.ID}.zip");

    // Zip it and send it
    using (MemoryStream ms = new MemoryStream())
    {
        using (var archive = new ZipArchive(ms, ZipArchiveMode.Create, true))
        {
            var zipArchiveEntry = archive.CreateEntry($"{cIApplication.ID}.bin", CompressionLevel.Fastest);
            using (var zipStream = zipArchiveEntry.Open()) zipStream.Write(metaData, 0, metaData.Length);

            zipArchiveEntry = archive.CreateEntry($"{cIApplication.ID}.zip", CompressionLevel.Fastest);
            using (var zipStream = zipArchiveEntry.Open()) zipStream.Write(content, 0, content.Length);
        }
        return File(ms.ToArray(), "application/zip", $"{cIApplication.Publisher} {cIApplication.Name} {cIApplication.Version}.zip");
    }
}
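One way to avoid buffering the whole blob in memory is to stream it into a zip written to a temporary file, then return that file from disk. This is only a sketch, not the original code: it assumes a hypothetical `GetBlobStreamAsync` helper on `StorageManagement` that returns an open `Stream` over the blob instead of a `byte[]`, and it reuses the `storage` variable from the question.

```csharp
// Sketch of a streaming variant of Download. GetBlobStreamAsync is an
// assumed helper (not in the original StorageManagement class) that
// returns an open Stream over the blob.
public async Task<ActionResult> DownloadStreamed(Guid id)
{
    var tempPath = Path.GetTempFileName();
    using (var fileStream = new FileStream(tempPath, FileMode.Create))
    using (var archive = new ZipArchive(fileStream, ZipArchiveMode.Create))
    {
        var entry = archive.CreateEntry($"{id}.zip", CompressionLevel.Fastest);
        using (var zipStream = entry.Open())
        using (var blobStream = await storage.GetBlobStreamAsync("application", $"{id}.zip"))
        {
            // CopyToAsync moves the data in small buffers instead of one big array
            await blobStream.CopyToAsync(zipStream);
        }
    }
    // PhysicalFile streams the file from disk without loading it into memory
    return PhysicalFile(tempPath, "application/zip", $"{id}.zip");
}
```

The temp file still needs cleanup after the response completes (e.g. a periodic job over the temp directory), but peak memory use no longer depends on the blob size.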

Related

How to upload image in blob storage using ASP.NET MVC web app

I need to upload an image as I also create a new row in a MySQL database. The instruction given to me is that images should be stored in Azure Blob Storage while the info is stored in the MySQL database.
This is my code for creating a new row
// POST: Books/Create
// To protect from overposting attacks, enable the specific properties you want to bind to,
// for more details, see http://go.microsoft.com/fwlink/?LinkId=317598.
[HttpPost]
[ValidateAntiForgeryToken]
public async Task<IActionResult> Create([Bind("Id,Isbn,Price,Rank,Title,Genre,Author,Overview,Summary,Publisher,PubDate,Pages,Length,Width,Height,Weight")] Books books)
{
    try
    {
        if (ModelState.IsValid)
        {
            var responseTask = Client.PostAsJsonAsync("api/Books", books);
            responseTask.Wait();
            var result = responseTask.Result;
            if (result.IsSuccessStatusCode)
            {
                return RedirectToAction(nameof(Index));
            }
        }
    }
    catch
    {
        return BadRequest();
    }
    return View(books);
}
Now I don't have any idea what code to put in so I can upload an image alongside it.
Please help me, I need to pass this project in 24 hours and I am still far from the projected outcome 🥺
So, first, you need to add a new property of type IFormFile to your Books model, so that when you create a new row you can also get the IFormFile for the image.
Then please follow this answer to upload your image to an Azure Storage blob. Here's the code snippet from my test on my side:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

[HttpPost]
public async Task<string> uploadFile(TestModel mod)
{
    var connectionstring = "connection_string";
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionstring);
    BlobContainerClient blobContainerClient = blobServiceClient.GetBlobContainerClient("container_name");
    await blobContainerClient.CreateIfNotExistsAsync();

    BlobClient blobClient = blobContainerClient.GetBlobClient(mod.fileName);
    BlobHttpHeaders httpHeaders = new BlobHttpHeaders()
    {
        ContentType = mod.img.ContentType
    };
    await blobClient.UploadAsync(mod.img.OpenReadStream(), httpHeaders);
    return "success";
}
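For reference, a minimal sketch of the model that snippet assumes. The property names `fileName` and `img` come from the snippet above; the class shape itself is an assumption:

```csharp
using Microsoft.AspNetCore.Http;

// Hypothetical model backing the uploadFile action above:
// img carries the uploaded image, fileName the blob name to store it under.
public class TestModel
{
    public string fileName { get; set; }
    public IFormFile img { get; set; }
}
```

The same idea applies to the Books scenario: add an IFormFile property to Books (or to a view model) and bind the form's file input to it.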

How do I write FileContentResult on disk?

I am trying to use the Rotativa component to store (not to show) a copy of the invoice permanently on the web server's disk. Two questions:
Why do I need to specify a controller action? ("Index", in this case)
How do I write the FileContentResult to local disk without displaying it?
Thanks.
Here is my code:
[HttpPost]
public ActionResult ValidationDone(FormCollection formCollection, int orderId, bool fromOrderDetails)
{
    Order orderValidated = context.Orders.Single(no => no.orderID == orderId);
    CommonUtils.SendInvoiceMail(orderValidated.customerID, orderValidated.orderID);

    var filePath = Path.Combine(Server.MapPath("/Temp"), orderValidated.invoiceID + ".pdf");
    var pdfResult = new ActionAsPdf("Index", new { name = orderValidated.invoiceID }) { FileName = filePath };
    var binary = pdfResult.BuildPdf(ControllerContext);

    FileContentResult fcr = File(binary, "application/pdf");
    // how do I save 'fcr' to disk?
}
You do not need the FileContentResult to create a file. You've got the byte array, which can be saved directly to disk:
var binary = pdfResult.BuildPdf(ControllerContext);
System.IO.File.WriteAllBytes(@"c:\foobar.pdf", binary);
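If you do already hold a FileContentResult (the `fcr` variable in the question), its `FileContents` property exposes the same byte array, so it can be written out the same way. The path below is just an example:

```csharp
// FileContentResult keeps the bytes it was constructed with in FileContents,
// so saving it is the same WriteAllBytes call.
FileContentResult fcr = File(binary, "application/pdf");
System.IO.File.WriteAllBytes(@"c:\invoice-copy.pdf", fcr.FileContents);
```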
// First, give the file a name.
string fileName = "YOUR FILE NAME";
// Build your path; the file extension is required.
string path = Server.MapPath("YourPathInSolution/" + fileName + ".pdf");
// 'binary' is your data to be saved as a file.
byte[] binary = YOUR_DATA;
System.IO.File.WriteAllBytes(path, binary);
That's simple...

File Name from HttpRequestMessage Content

I implemented a POST REST service to upload files to my server. The problem I have right now is that I want to restrict the uploaded files by type. Let's say, for example, I only want to allow .pdf files to be uploaded.
What I tried to do was:
Task<Stream> task = this.Request.Content.ReadAsStreamAsync();
task.Wait();
FileStream requestStream = (FileStream)task.Result;
but unfortunately it's not possible to cast the Stream to a FileStream and access the file name via requestStream.Name.
Is there an easy way (other than writing the stream to disk and then checking the type) to get the file type?
If you upload a file to Web API and you want access to the file data (Content-Disposition), you should upload the file as MIME multipart (multipart/form-data).
Here I showed some examples of how to upload from an HTML form, JavaScript, and .NET.
You can then do something like this; this example checks for pdf/doc files only:
public async Task<HttpResponseMessage> Post()
{
    if (!Request.Content.IsMimeMultipartContent())
    {
        throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.NotAcceptable,
            "This request is not properly formatted - not multipart."));
    }

    var provider = new RestrictiveMultipartMemoryStreamProvider();

    // read contents of request to memory without flushing to disk
    await Request.Content.ReadAsMultipartAsync(provider);

    foreach (HttpContent ctnt in provider.Contents)
    {
        // now read the individual part into a stream
        var stream = await ctnt.ReadAsStreamAsync();
        if (stream.Length != 0)
        {
            using (var ms = new MemoryStream())
            {
                // do something with the file memory stream
            }
        }
    }
    return Request.CreateResponse(HttpStatusCode.OK);
}
public class RestrictiveMultipartMemoryStreamProvider : MultipartMemoryStreamProvider
{
    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        var extensions = new[] { "pdf", "doc" };
        var filename = headers.ContentDisposition.FileName.Replace("\"", string.Empty);
        if (filename.IndexOf('.') < 0)
            return Stream.Null;
        var extension = filename.Split('.').Last();
        return extensions.Any(i => i.Equals(extension, StringComparison.InvariantCultureIgnoreCase))
            ? base.GetStream(parent, headers)
            : Stream.Null;
    }
}
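As a side note, the manual dot handling in GetStream can also be expressed with Path.GetExtension, which returns the extension including the leading dot, or an empty string when there is none. A small standalone sketch of that check:

```csharp
using System;
using System.IO;
using System.Linq;

class ExtensionCheck
{
    // Returns true when the file name ends in one of the allowed extensions.
    static bool IsAllowed(string fileName)
    {
        var allowed = new[] { ".pdf", ".doc" };
        var extension = Path.GetExtension(fileName); // "" when there is no dot
        return allowed.Contains(extension, StringComparer.OrdinalIgnoreCase);
    }

    static void Main()
    {
        Console.WriteLine(IsAllowed("invoice.PDF"));  // True
        Console.WriteLine(IsAllowed("noextension"));  // False
    }
}
```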

Compare Byte Arrays Before Saving To Database

What is the best way of checking that the image I'm saving to the database isn't different, thus saving I/O?
Scenario:
I'm writing an ASP.NET application in MVC3 using Entity Framework. I have an Edit action method on my UserProfile controller. I want to check whether the image posted back to the method is different: if it is, I want to call ObjectContext.SaveChanges(); if it is the same image, move on.
Here is a cut-down version of my code:
[HttpPost, ActionName("Edit")]
public ActionResult Edit(UserProfile userprofile, HttpPostedFileBase imageLoad2)
{
    Medium profileImage = new Medium();
    if (ModelState.IsValid)
    {
        try
        {
            if (imageLoad2 != null)
            {
                if ((db.Media.Count(i => i.Unique_Key == userprofile.Unique_Key)) > 0)
                {
                    profileImage = db.Media.SingleOrDefault(i => i.Unique_Key == userprofile.Unique_Key);
                    profileImage.Amend_Date = DateTime.Now;
                    profileImage.Source = Images.ImageToBinary(imageLoad2.InputStream);
                    profileImage.File_Size = imageLoad2.ContentLength;
                    profileImage.File_Name = imageLoad2.FileName;
                    profileImage.Content_Type = imageLoad2.ContentType;
                    profileImage.Height = Images.FromStreamHeight(imageLoad2.InputStream);
                    profileImage.Width = Images.FromStreamWidth(imageLoad2.InputStream);
                    db.ObjectStateManager.ChangeObjectState(profileImage, EntityState.Modified);
                    db.SaveChanges();
                }
            }
        }
    }
}
So I save my image as a varbinary(max) into a SQL Server Express DB, which is referenced as a byte array in my entities.
Is it just a case of looping over the byte array from the post and comparing it to the byte array pulled back into the ObjectContext?
Rather than directly comparing the byte arrays, I would compare the hashes of the images. Perhaps something like the following could be extracted into a comparison method:
bool ImagesEqual(byte[] imgBytes1, byte[] imgBytes2)
{
    using (var sha = SHA256.Create())
    {
        byte[] imgHash1 = sha.ComputeHash(imgBytes1);
        byte[] imgHash2 = sha.ComputeHash(imgBytes2);
        // compare the hashes; both are always 32 bytes for SHA-256
        for (int i = 0; i < imgHash1.Length; i++)
        {
            // found a non-match, exit early
            if (imgHash1[i] != imgHash2[i])
                return false;
        }
        return true;
    }
}

Store uploaded document in database using Entity Framework (via MVC)

Does anyone have an example of how to store an uploaded document (whether Word or PDF, etc.) in a SQL Server 2008 DB using Entity Framework?
I think I've figured out the file upload part (see code below), although I'm open to constructive comments.
I used to do a lot of document storing in older DBs, and as SQL Server 2008 has new data types, can someone suggest which I should be using for my uploaded documents (is it still image)?
Furthermore, as you can see from my code, once I have the uploaded document as a byte array (or please suggest otherwise), how do I pass it to my entity?
Here's what I have so far:
[HttpPost]
public ActionResult Create(HttpPostedFileBase fileUpload)
{
    if (fileUpload == null) return View();
    if (fileUpload.ContentLength == 0) return View();

    // not sure whether this is useful or the method below this
    var reader = new StreamReader(fileUpload.InputStream);

    foreach (string file in Request.Files)
    {
        var hpf = Request.Files[file] as HttpPostedFileBase;
        if (hpf.ContentLength == 0) continue;

        var savedFileName = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, Path.GetFileName(hpf.FileName));
        hpf.SaveAs(savedFileName);

        var curFile = System.IO.File.Open(file, FileMode.Open);
        var fileLength = curFile.Length;
        var tempFile = new byte[fileLength];
        curFile.Read(tempFile, 0, Convert.ToInt32(fileLength));

        var jobFile = new JobFile
        {
            UploadDate = DateTime.Now,
            UploadedBy = User.Identity.Name,
            FileName = savedFileName,
            ContentType = hpf.ContentType
        };
        jobFile.FileData =
        _jobFileRepository.Save();
    }
    return RedirectToAction("Index");
}
Any help greatly appreciated.
Assuming you have FileData as a binary (byte array) type:
jobFile.FileData = tempFile;
tempFile is already a byte array, so no conversion is needed.
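On the data-type question: the image type is deprecated in SQL Server 2008 in favour of varbinary(max) (2008 also adds FILESTREAM for very large files). A minimal sketch of what the entity could look like; apart from the property names used in the question's code, the shape is an assumption:

```csharp
using System;

// Hypothetical shape of the JobFile entity. FileData maps to a
// varbinary(max) column, which Entity Framework exposes as byte[].
public class JobFile
{
    public int Id { get; set; }           // assumed key
    public DateTime UploadDate { get; set; }
    public string UploadedBy { get; set; }
    public string FileName { get; set; }
    public string ContentType { get; set; }
    public byte[] FileData { get; set; }  // the uploaded document's bytes
}
```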
