Get a list of all files during multifileupload in the same batch - vaadin

I'm using this upload addon:
<dependency>
<groupId>com.wcs.wcslib</groupId>
<artifactId>wcslib-vaadin-widget-multifileupload</artifactId>
<version>4.0</version>
</dependency>
I'm uploading multiple files and each file upload is processed in:
void handleFile(InputStream stream, String fileName, String mimeType, long length, int filesLeftInQueue);
but it only gives me information about the currently processed file.
I need a list of all files in order to check whether two files with the same name but different extensions were uploaded. I looked at some of the components related to this upload, but the methods that could be useful are private.
How do I get a list of all files that are uploaded in the same batch?

Related

How to get the file path to an asset included in a Dart package?

I am writing a Dart package (not Flutter). I have included a few bitmap images as public assets, e.g., lib/assets/empty.png. When this package is running as a command-line app for an end-user, how can I get the file path to these assets on the user's system?
Use-case: My Dart package calls out to FFMPEG, and I need to tell FFMPEG where to find these asset files on the system that's using my package. For example, the call to FFMPEG might look like:
ffmpeg -i "path/to/lib/assets/empty.png" ...
Accessing a Dart package's assets can happen in two modalities:
Running a Dart CLI app with the dart tool and accessing a dependency's assets, or
Running an executable CLI app
The difference between these two situations is that when you're running a CLI app with the dart tool, all of your dependencies are available as structured packages in a local cache on your system. However, when you're running a compiled executable, all relevant code is compiled into a single binary, so at runtime you no longer have access to your dependencies' packages, only to their tree-shaken, compiled code.
Accessing assets when running with dart
The following code will resolve a package asset URI to a file system path.
import 'dart:cli';
import 'dart:io';
import 'dart:isolate';

String? resolvePackageAsset() {
  final packageUri = Uri.parse('package:your_package/your/asset/path/some_file.whatever');
  final future = Isolate.resolvePackageUri(packageUri);
  // waitFor is strongly discouraged in general, but it is accepted as the
  // only reasonable way to load package assets outside of Flutter.
  // ignore: deprecated_member_use
  final absoluteUri = waitFor(future, timeout: const Duration(seconds: 5));
  if (absoluteUri == null) return null;
  final file = File.fromUri(absoluteUri);
  return file.existsSync() ? file.path : null;
}
This resolution code was adapted from Tim Sneath's winmd package: https://github.com/timsneath/winmd/blob/main/lib/src/metadatastore.dart#L84-L106
Accessing assets when running an executable
When a client app is compiled to an executable, it simply cannot access any asset files that were shipped with a package it depends on. However, there is a workaround that may work for some people (it did for me): you can store Base64-encoded versions of your assets in your Dart code, within your package.
First, encode each of your assets into a Base64 string and store those strings somewhere in your Dart code.
const myAsset = "iVBORw0KGgoAAA....kJggg==";
Then, at runtime, decode the string back to bytes, and then write those bytes to a new file on the local file system. Here's the method I used in my case:
/// Writes this asset to a new file on the host's file system.
///
/// The file is written to [destinationDirectory], or the current
/// working directory if no destination is provided. Assumes the
/// enclosing class exposes the asset's [fileName] and its
/// [base64encoded] content.
String inflateToLocalFile([Directory? destinationDirectory]) {
final directory = destinationDirectory ?? Directory.current;
final file = File(directory.path + Platform.pathSeparator + fileName);
file.createSync(recursive: true);
final decodedBytes = base64Decode(base64encoded);
file.writeAsBytesSync(decodedBytes);
return file.path;
}
This approach was suggested by @passsy.
Have a look at the dcli package. It has a 'pack' command designed to solve exactly this problem: it encodes assets into Dart files that can be unpacked at runtime.

Is there any offline safe method to prevent web-shell uploading in aspnet?

I have a simple page in ASP.NET 5 where users can upload their images. Valid files are *.jpg and *.png, so I'm taking the steps below to validate the files:
Validating filename length: e.g., the file name must be shorter than 50 characters
Validating the filename: replacing any hidden or invalid characters
Validating file size: based on our configuration (e.g., less than 10 MB)
Validating the file extension: based on our white-list: *.jpg, *.png
Validating the MIME type: based on our white-list: IMAGE/JPEG, IMAGE/PNG
Validating the file's first bytes (magic number): based on our white-list for JPG: "FF-D8-FF-DB", "FF-D8-FF-E0", "FF-D8-FF-EE", "FF-D8-FF-E1" and PNG: "89-50-4E-47" (see the sketch after this list)
Saving the file under a random (GUID) filename in a temp folder outside the webroot, without any execute permissions
Scanning the file with an installed AV service (Kaspersky or Norton Security)
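For reference, here is a minimal sketch (not from the original post) of the magic-number check described above; the class name, method name, and stream handling are illustrative:
using System.IO;
using System.Linq;

static class MagicNumbers
{
    // File signatures from the question's white-list.
    private static readonly byte[][] Allowed =
    {
        new byte[] { 0xFF, 0xD8, 0xFF, 0xDB }, // JPG
        new byte[] { 0xFF, 0xD8, 0xFF, 0xE0 }, // JPG
        new byte[] { 0xFF, 0xD8, 0xFF, 0xEE }, // JPG
        new byte[] { 0xFF, 0xD8, 0xFF, 0xE1 }, // JPG
        new byte[] { 0x89, 0x50, 0x4E, 0x47 }, // PNG
    };

    public static bool HasAllowedSignature(Stream stream)
    {
        var header = new byte[4];
        int read = stream.Read(header, 0, header.Length);
        stream.Position = 0; // rewind so later validators can re-read the stream
        return read == header.Length && Allowed.Any(sig => sig.SequenceEqual(header));
    }
}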
But some webshells can bypass these steps, like the Insomnia webshell and others: they keep a valid magic number at the start of the file and inject their code into some other part of it.
So my question is: how can I detect and prevent webshell uploading?
Should I read and check the whole file against a black-list of keywords?
Or what?
By the way: we can't use any online webshell-detection services.
This is a simple shell injected into a PNG file by woanware.co.uk (screenshot omitted).
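One mitigation often recommended for exactly this kind of payload injection (it is not mentioned in the question, so treat it as an assumption about your requirements) is to re-encode every accepted image before storing it: decoding and re-encoding keeps only the pixel data, so bytes appended to the file or hidden in metadata are discarded. A rough sketch using System.Drawing (Windows-only; a library such as ImageSharp works similarly):
using System.Drawing;
using System.Drawing.Imaging;

static void ReencodeToPng(string inputPath, string outputPath)
{
    // Re-encoding rebuilds the file from the decoded pixels, dropping any
    // injected script bytes outside the image data itself.
    using (var image = Image.FromFile(inputPath))
    {
        image.Save(outputPath, ImageFormat.Png);
    }
}
Note this does not defend against payloads encoded into the pixel values themselves, but those cannot execute without a separate loader on the server.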

Stream multiple Excel files as one file

I want to deliver large Excel files using a web service or HTTP handler.
As the Excel files can be very big, I want to split them up into smaller files to decrease the memory footprint.
So I will have a master Excel file that contains the column headers and data,
and further files which will contain only data.
During the download, I want to stream the master Excel file first and then append all other related Excel files as one download stream.
I don't want to zip them! It should be one file in the end.
Is this possible?
The master Excel file has column headers; all other files look the same but without headers (screenshots omitted).
This will indeed return crap:
void Main()
{
CombineMultipleFilesIntoSingleFile();
}
// Define other methods and classes here
private static void CombineMultipleFilesIntoSingleFile(string outputFilePath = @"C:\exceltest\main.xlsx", string inputDirectoryPath = @"C:\exceltest", string inputFileNamePattern = "*.xlsx")
{
string[] inputFilePaths = Directory.GetFiles(inputDirectoryPath, inputFileNamePattern);
Console.WriteLine("Number of files: {0}.", inputFilePaths.Length);
using (var outputStream = File.Create(outputFilePath))
{
foreach (var inputFilePath in inputFilePaths)
{
using (var inputStream = File.OpenRead(inputFilePath))
{
// Buffer size can be passed as the second argument.
inputStream.CopyTo(outputStream);
}
Console.WriteLine("The file {0} has been processed.", inputFilePath);
}
}
}
When the file is requested, do not download it on the first request. Instead:
Request the list of file names to download via an AJAX request.
For each file name received, build its path on the server.
Create a hidden iframe per file path and set its src to that path.
When an iframe's src attribute is set, it navigates to the file path and downloads that single file, so multiple iframes download multiple files.
You cannot download multiple files in a single request: if you append the streams of multiple files, you will create a single garbage file.
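For what it's worth, the reason byte-level concatenation produces garbage here is that .xlsx is a ZIP archive rather than a flat text format, so gluing two archives together yields an invalid file. If the split files could be produced as headerless CSV instead (an assumption, not part of the question), the streaming idea would work as intended; a minimal sketch:
using System.IO;

static class CsvConcat
{
    // Streams the master CSV (with headers) followed by headerless part
    // files into one output stream, without loading any file into memory.
    public static void WriteCombinedCsv(Stream output, string masterPath, string[] partPaths)
    {
        using (var master = File.OpenRead(masterPath))
            master.CopyTo(output);
        foreach (var partPath in partPaths)
        {
            using (var part = File.OpenRead(partPath))
                part.CopyTo(output);
        }
    }
}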

How to download all files in an Azure Container Directory?

I have an ASP.NET app from which I upload files to Azure blobs. I know that Azure doesn't create a real directory structure in containers, just blobs, but you can emulate directories by putting a "/" in the URI.
i.e.
I'd upload a list of files and the URIs look like this:
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName01.jpg
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName02.jpg
http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/FileName03.jpg
My download method:
public RemoteFile Download(DownloadRequest request)
{
var fileFinal = string.Format("{0}/{1}/{2}",request.IDProtocol ,request.IDDocument, request.FileName);
var blobBlock = InitializeDownload(fileFinal);
if (!blobBlock.Exists())
{
throw new FileNotFoundException("Error");
}
var stream = new MemoryStream();
blobBlock.DownloadToStream(stream);
return File(request.FileName);
}
private CloudBlob InitializeDownload(string uri)
{
var blobBlock = _blobClient.GetBlobReference(uri);
return blobBlock;
}
This way, I'm getting just one file. But I need to list and download all files inside http://myaccount.windowsazure.blob.net/MyProtocolID-01/MyDocumentID-01/
Thanks
Adding more details: you will need to use one of the listing APIs provided by the client library: CloudBlobContainer.ListBlobs(), CloudBlobContainer.ListBlobsSegmented(), or CloudBlobContainer.ListBlobsSegmentedAsync() (and their various overloads). You can specify the directory prefix, and the service will only enumerate blobs matching that prefix. You can then download each blob. You may also want to look at the useFlatBlobListing argument, depending on your scenario.
http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.blob.cloudblobcontainer.listblobs.aspx
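A minimal sketch of that approach, using the legacy Microsoft.WindowsAzure.Storage client the answer refers to (the prefix and local path below are illustrative):
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

static void DownloadDirectory(CloudBlobContainer container, string prefix, string localDir)
{
    // A flat listing returns every blob under the prefix instead of
    // stopping at virtual sub-directories.
    foreach (var item in container.ListBlobs(prefix, useFlatBlobListing: true))
    {
        if (item is CloudBlockBlob blob)
        {
            var localPath = Path.Combine(localDir, Path.GetFileName(blob.Name));
            blob.DownloadToFile(localPath, FileMode.Create);
        }
    }
}
// e.g. DownloadDirectory(container, "MyProtocolID-01/MyDocumentID-01/", @"C:\downloads");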
In addition AzCopy (see http://blogs.msdn.com/b/windowsazurestorage/archive/2012/12/03/azcopy-uploading-downloading-files-for-windows-azure-blobs.aspx) also supports this scenario of downloading all blobs in a given directory path.
Since each blob is a separate web resource, the function above will download only one file. One thing you could do is list all blobs using the logic above, download those blobs to your server first, zip them, and then return that zip file to your end user.
Alternatively, use AzCopy; it now supports a lot of scenarios.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
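With AzCopy v10, downloading everything under a virtual directory looks roughly like this (the account, container, and <SAS> token are placeholders):
azcopy copy "https://myaccount.blob.core.windows.net/mycontainer/MyProtocolID-01/MyDocumentID-01?<SAS>" "C:\downloads" --recursive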

Upload all files from local storage to Azure Blob Storage

I am currently struggling to upload multiple files from local storage to Azure Blob Storage; I was wondering if anyone could help me. Below is the code I was previously using to upload a single zip file.
private void SaveZip(string id, string fileName, string contentType, byte[] data)
{
// Create a blob in container and upload image bytes to it
var blob = this.GetContainer().GetBlobReference(fileName);
blob.Properties.ContentType = contentType;
// Create some metadata for this image
var metadata = new NameValueCollection();
metadata["Id"] = id;
metadata["Filename"] = fileName;
}
SaveZip(
Guid.NewGuid().ToString(),
zipFile.FileName,
zipFile.PostedFile.ContentType,
zipFile.FileBytes);
Thanks, Sami.
It's quite straightforward with Set-AzureStorageBlobContent from the Azure Storage PowerShell module.
ls -File -Recurse | Set-AzureStorageBlobContent -Container upload
MSDN documentation : http://msdn.microsoft.com/en-us/library/dn408487.aspx
I don't think there are any built-in methods you can use to upload multiple files to blob storage. What you can do is upload them one by one, or in parallel.
If you're just starting to work with Blob Storage, I'd encourage you to take a look at the "How to" article we've published. Specifically, the section on "How to Upload a Blob into a Container" should be helpful. Beyond that, Shaun is correct - there is no built-in support in the StorageClient library for uploading multiple files at once, but you can certainly upload them one-by-one.
If your need is just to get it done, and not to make an app out of it, you should consider checking out Cloud Storage Studio.
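As the answers above note, uploading a folder is just a loop over the local files. A rough sketch of a parallel version, against the old Microsoft.WindowsAzure.StorageClient API that the question's code appears to use (names are illustrative):
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.StorageClient;

static void UploadDirectory(CloudBlobContainer container, string localDir)
{
    // One blob per local file, uploaded in parallel.
    Parallel.ForEach(Directory.GetFiles(localDir), path =>
    {
        var blob = container.GetBlobReference(Path.GetFileName(path));
        blob.UploadFile(path);
    });
}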
Like CodeThug said, "You never do anything with the byte array."
You have to actually upload the data to the blob, and then you are done.
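Concretely, a minimal sketch of the missing upload step, assuming the old Microsoft.WindowsAzure.StorageClient library (GetContainer() is the question's own helper):
private void SaveZip(string id, string fileName, string contentType, byte[] data)
{
    // Create a blob in the container, as before
    var blob = this.GetContainer().GetBlobReference(fileName);
    blob.Properties.ContentType = contentType;

    // Set metadata directly on the blob so it is persisted with the upload
    blob.Metadata["Id"] = id;
    blob.Metadata["Filename"] = fileName;

    // The step the original code was missing: actually send the bytes
    blob.UploadByteArray(data);
}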
