Is it possible to upload a big file to WildFly chunk by chunk?

Hi there,
We have to upload a big file to a WildFly app server. The internet connection is not stable and may break from time to time, so we need to transfer the big file chunk by chunk, just like this post does: https://scribie.com/blog/2014/09/chunked-upload-with-nginx-and-nodejs/
I have googled for a while but have not found any clues about how to do this on WildFly. Is there an example you could kindly provide?
Thanks in advance.
Li
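
One way to approach this on WildFly (a rough sketch, not a tested solution): WildFly bundles a JAX-RS implementation (RESTEasy), so you can expose an endpoint that receives each chunk together with its byte offset and appends it to a partial file, which lets the client resume after a broken connection. The endpoint path, the X-Chunk-Offset header, and the upload directory below are made-up names for illustration; newer WildFly versions use the jakarta.ws.rs packages instead of javax.ws.rs.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

import javax.ws.rs.Consumes;
import javax.ws.rs.HeaderParam;
import javax.ws.rs.PUT;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// Sketch of a resumable upload endpoint: the client sends the file in small
// chunks (e.g. 1 MB) and tells the server at which byte offset each chunk
// starts, so an interrupted transfer can resume from the last confirmed offset.
@javax.ws.rs.Path("/upload")
public class ChunkedUploadResource {

    // Hypothetical upload directory; adjust for your deployment.
    private static final Path UPLOAD_DIR = Paths.get("/var/uploads");

    @PUT
    @javax.ws.rs.Path("/{fileName}")
    @Consumes(MediaType.APPLICATION_OCTET_STREAM)
    public Response uploadChunk(@PathParam("fileName") String fileName,
                                @HeaderParam("X-Chunk-Offset") long offset,
                                InputStream chunk) throws IOException {
        Files.createDirectories(UPLOAD_DIR);
        Path target = UPLOAD_DIR.resolve(fileName);
        long existing = Files.exists(target) ? Files.size(target) : 0L;
        if (offset != existing) {
            // Client and server disagree on progress; tell the client where to resume.
            return Response.status(Response.Status.CONFLICT)
                           .header("X-Next-Offset", existing)
                           .build();
        }
        // Append this chunk to the end of the partially uploaded file.
        try (OutputStream out = Files.newOutputStream(target,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            chunk.transferTo(out);
        }
        return Response.ok().header("X-Next-Offset", Files.size(target)).build();
    }
}

On the client side you would split the file into small pieces, PUT them in order, and after a reconnect ask the server how many bytes it already has (the X-Next-Offset value above) to know where to resume.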

Related

YAWS crash uploading 7 MB file

I'm using a Raspberry Pi-like board with YAWS 2.0.4 and Erlang 19.
I wrote two web pages to upload a file and save it on the server: with a "larger" file (I mean ~7 MB) the server crashes, while with smaller files everything works fine.
I already tried the example code found on the YAWS site, and another version with the temp_file and binary options, but it doesn't work.
Any suggestions?
Thanks in advance.
After spending a lot of time on it, I found the problem: the partial_post_size parameter in the YAWS configuration was set much too high.
I changed it to something near the default value (10240) and everything works fine.
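
For reference, partial_post_size is set per server in yaws.conf; something along these lines (the surrounding server block is only illustrative, and 10240 is the default value mentioned above):

<server myboard>
        port = 8080
        listen = 0.0.0.0
        docroot = /var/www
        partial_post_size = 10240
</server>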

How can I open or review a large txt file

I have a few large txt files I'm trying to load into a data warehouse. I get an error message with the offending row/line number, but I cannot open the txt file to review it because it says the file is too large (2,413,060 KB). Someone suggested using the cmd option to do this, but I'm unsure how.
You can either use HJSplit to split the data, or use a vim-based program such as gVim. It may also help to free up as much of your PC's RAM as possible.
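
If you only need to look at the offending row rather than open the whole file, streaming it line by line is enough. Here is a rough Java sketch (the file name and line number are placeholders to replace with your own):

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

// Prints a few lines around a given line number without loading the whole
// file into memory, so it also works on multi-GB text files.
public class ShowLine {
    public static void main(String[] args) throws IOException {
        String file = "bigfile.txt";   // placeholder path
        long target = 123456L;         // the offending line number from the error
        long window = 2;               // lines of context to show on each side

        try (BufferedReader reader = Files.newBufferedReader(Paths.get(file))) {
            String line;
            long current = 0;
            while ((line = reader.readLine()) != null) {
                current++;
                if (current >= target - window && current <= target + window) {
                    System.out.println(current + ": " + line);
                }
                if (current > target + window) {
                    break;
                }
            }
        }
    }
}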

var/tmp folder taking up half my storage space?

I know this isn't really code-related, but I don't know where else to ask.
While working yesterday I got a message saying that my startup disk was almost full, which I wasn't too surprised by because it's only a 128 GB Air.
But when I fired up DaisyDisk to see what the issue was, it appears that my computer has stored two files in the private/var/tmp directory, each over 30 GB. Obviously DaisyDisk won't let me erase them because of the directory they are in.
They are called magick-23598T_US4im5XKvQ.pam and magick-23587vell8J7UTKgS.pam
I have no idea where they came from, but I was testing a file upload system for a Rails project when this happened. I was, however, uploading images of no more than 800 KB or so. This seems a little extreme for that.
If anyone has any idea what might have happened, or how I can safely free up this space again, I would be massively grateful.
Looks like ImageMagick temp files -- are you processing the images with ImageMagick? There's a similar problem discussed here although the exact cause may be different.
It is likely a large swap file from ImageMagick that hasn't been cleaned up. You can limit the file sizes by editing your policy.xml config for ImageMagick (/etc/ImageMagick/policy.xml on Ubuntu).
More info here: https://www.imagemagick.org/discourse-server/viewtopic.php?f=3&t=29225&p=130707#p130707
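
If ImageMagick turns out to be the culprit, the resource limits live in policy.xml; the values below are only examples, not recommendations, and once a limit is exceeded the operation fails instead of silently filling the temp directory:

<policymap>
  <!-- cap how much memory and disk ImageMagick may use for its pixel cache -->
  <policy domain="resource" name="memory" value="256MiB"/>
  <policy domain="resource" name="disk" value="1GiB"/>
</policymap>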

Red5 1.0.1: truncated recorded stream .flv.ser: Error -18 truncated box

I've recorded streams using the streamPublishStart callback with the Red5 streaming server, and it works. But a few times the internet connection dropped on the publisher side, and then, in the streams directory, I end up with a .flv.ser file that is not playable. I've tried to repair/fix it with every piece of software that claims to do this, with no success. I've used flvcheck.exe and the report is: Error -18 truncated box. I've seen discussions on the Adobe forums but nothing useful. Could you suggest a technique or a piece of software to solve my problem?
thanks in advance,
Pascal.
Did you get your question answered? The .flv.ser is a temp file that exists until Red5 is done processing the stream; when it is done, there is a new file without the .ser extension. What I had to do was create an AJAX script that looks at the directory for a .flv.ser file and prevents closing the page until the conversion is completed (see the sketch below). Red5 version 1.0 is slow at doing the conversion. I'm testing 1.0.2 RC1 right now, but initial results suggest it is even worse. I hear version 0.8 is the best for recording, so I may have to downgrade to that.
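
To give an idea of the server side of that approach, here is a rough, untested sketch of a tiny status endpoint the browser script could poll until the .flv.ser file has been replaced by the final .flv (the streams directory path and parameter name are just placeholders):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Answers "ready" once Red5 has finished converting the recording, i.e. the
// temporary .flv.ser file is gone and the final .flv exists.
@WebServlet("/recording-status")
public class RecordingStatusServlet extends HttpServlet {

    // Hypothetical Red5 streams directory; adjust to your installation.
    private static final String STREAMS_DIR = "/opt/red5/webapps/oflaDemo/streams";

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Recording name without extension; no validation here for brevity.
        String name = req.getParameter("name");
        boolean stillConverting = Files.exists(Paths.get(STREAMS_DIR, name + ".flv.ser"));
        boolean finished = Files.exists(Paths.get(STREAMS_DIR, name + ".flv"));
        resp.setContentType("text/plain");
        resp.getWriter().write(!stillConverting && finished ? "ready" : "processing");
    }
}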
I'm late to the party but you simply have to concatenate the files. On Linux this works like this:
cat foo.flv foo.flv.ser > playable_foo.flv
I read that somewhere else but I forgot where it was.

How can we upload a large file in chunks in Rails?

I am trying to upload a zip file of 350 MB - 500 MB to the server. It gives an "ENOSPC" error.
Is it possible to upload the file in chunks and receive it on the server as one file?
or
Can I use a custom location for the temp files, so that it is independent of the system tmp? In my case tmp is only 128 MB.
Why not use a web-server upload feature like nginx-upload or apache-upload?
Not sure what it is called in Apache, but I guess Apache has one too.
If you are using Nginx, there is also nginx-upload-progress, which can be helpful if you want to track the progress of the upload.
Hope this helps.
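
If you go the Nginx route, the idea is that Nginx writes the upload to its own store on a big disk and only hands your Rails app the resulting file path, so the app's tmp dir never has to hold the whole upload. A rough sketch using nginx-upload-module (paths and the backend location are placeholders):

# nginx-upload-module: store the upload on disk and pass its path to the backend
location /upload {
    upload_pass          /rails_handler;
    upload_store         /mnt/big_disk/uploads 1;
    upload_store_access  user:rw;

    # expose the stored path and original name to the backend as form fields
    upload_set_form_field $upload_field_name.name "$upload_file_name";
    upload_set_form_field $upload_field_name.path "$upload_tmp_path";
}

location /rails_handler {
    internal;
    proxy_pass http://127.0.0.1:3000;
}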