Lighttpd and buffering of POST uploads to CGI on embedded device

I'm using lighttpd on an embedded device with relatively small amounts of RAM and flash storage, and I'm running into the issue where lighttpd buffers the entire file upload to disk (or RAM) and the system runs out of space. Apache, by contrast, essentially streams the data straight through to the CGI program, which is what I need.
From my research, I haven't been able to find any way to configure lighttpd (or nginx) so that it does not buffer the entire file upload but instead streams it directly to the CGI program that will consume it.
The upload is a system upgrade image that the CGI program writes directly to a particular area of flash, but I simply don't have the space for the kind of buffering/caching that the lightweight web servers I have looked at seem to require.
Does anyone know of a way to avoid this buffering with lighttpd, nginx, or another lightweight web server?

The Nginx Upload Module was written to handle exactly this kind of situation, but it appears to have been abandoned by its author and reportedly does not work with Nginx 1.3.9+.
The Nginx Big Upload Module is an extension to the Nginx Lua Module that handles this.
If you prefer to do things yourself, you can try the Lua Resty Upload extension to the Nginx Lua Module, written by the author of the Lua Module himself.

(Old question, but it came up at the top of a search, so I am updating it with an answer.)
Since lighttpd 1.4.40 (released July 2016) you can set server.stream-request-body = 2, which streams the request body to the backend instead of buffering the whole upload first.
See the lighttpd server.stream-request-body documentation.
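For reference, a minimal lighttpd.conf sketch of that setup (the /cgi-bin/ URL prefix is only an example; the CGI program then reads the request body from its standard input and can write it straight to flash):

server.modules += ( "mod_cgi" )

# 2 = stream the request body to the backend instead of spooling the whole upload first
server.stream-request-body = 2

$HTTP["url"] =~ "^/cgi-bin/" {
    cgi.assign = ( ".cgi" => "" )
}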

Related

Embeddable or web servers requiring no installation

I am working on a C++/CLI program that records live video (I use OpenCV). Now I need to add broadcast support so that other people can watch the video in their web browser. Since video is just a sequence of images, the web page needs to fetch the latest frame grabbed from the webcam and display it about 5 times per second (5 fps). To let others access the web page, I need a web server (running on localhost is OK, because the connected machines can reach it).
I am not up to speed on web technologies, so I have only used Apache Tomcat and Microsoft's default server. But with both of those you have to set up the server, upload the files to it, and much more. There are some programs like this and this which do not require a server you have to install and configure, yet still do the job; I think they are using embedded web servers.
So, are there any servers like that which match my requirements? Please note that my program is in C++/CLI and I need to distribute the server with it as well, just like the two programs above do.
Edit
Please note that I mentioned C++/CLI just to give you a clear understanding of my system, not because I am looking for a server with CGI script support or for a server built in C++. If it can at least run JavaScript, that is enough, because I can do the updating with it.

Writing Minecraft panel in Ruby on Rails

I'm planning on writing a control panel for Minecraft in Rails, but I don't have much experience with Java at all. Minecraft seems to have some standard remote connection and query tools, but most conventional panels don't seem to use them. For example, with McMyAdmin I have disabled the remote connection and the query, yet it still seems able to communicate with the server even after I've edited the server configs to disable those settings and restarted it.
What I'm asking is whether anyone knows how McMyAdmin communicates with the Minecraft server. It comes with a plugin, but I've deleted that as well and it still seems able to communicate with the server. I know McMyAdmin is written in .NET, and I believe it uses Mono as its server, as it's cross-platform.
If anybody could shed some light on this I'd be ever so grateful; I'm just trying to get my head around the communication.
McMyAdmin uses the plugin to open a socket that it can interact with (I'm not sure which features are provided through this plugin). The rest of the features come from the Process instance it creates: it also edits the config files for a few things, and runs commands by writing to the input stream of the process.
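A minimal Ruby sketch of that last idea, since the question mentions Rails (the jar path and flags are just examples, and this is not how McMyAdmin itself is implemented): start the Minecraft server as a child process and drive it by writing console commands to its stdin, reading the console output back from its stdout.

require 'open3'

# launch the server as a child process (path and memory flag are hypothetical)
stdin, stdout, wait_thr = Open3.popen2('java', '-Xmx1G', '-jar', 'minecraft_server.jar', 'nogui')

stdin.puts 'say Panel connected'   # any console command can be written to the process's stdin
stdin.puts 'list'
stdin.puts 'stop'                  # shut the server down so the read below terminates

stdout.each_line { |line| puts line }   # server console output, e.g. relay it to the web UI
wait_thr.value                          # wait for the process to exit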

Flash HTTP Streaming - Multiple Files

With Flash 10.1+ and the ability to use appendBytes on a NetStream, it's possible to use HTTP streaming in Flash for video delivery. But it seems that the delivery method requires the segments to be stored in a single file on disk, which can only be broken into discrete segment files by an FMS or an Apache module. You can cache the individual segment files once they're created, but the documentation indicates that you must still always use an FMS or the Apache module to produce those files in the first instance.
Is it possible to break the single on-disk file into multiple on-disk segments without using an FMS, Wowza product or Apache?
There was an application that decompiled the output of the F4fpackager so it could be hosted anywhere, without the Apache module. Unfortunately, that application has been withdrawn.
It should be possible to use a proxy to cache the fragments. Then you can use these cached files on any webserver.

How to monitor File Uploads without using Flash?

I've been looking for a way to monitor file upload progress without using Flash, probably using AJAX, I suppose. I want to monitor the speed and the percentage of the upload that has completed.
Do you know of any resource that describes how to do that, or what approach I should follow?
In the pre-HTML5 world I believe this requires web-server support. I've used this Apache module successfully in the past:
http://piotrsarnacki.com/2008/06/18/upload-progress-bar-with-mod_passenger-and-apache/
The only way without Flash is to do it on the server. The gist is:
Start the file upload
Open a streaming connection to the server
Have the server read the POST headers to tell you how large the file is going to be
Have the server repeatedly check the file size (generally in /tmp) to see how complete it is
Stream the % done back to the client
I've done it before in other languages, but never in Ruby, so I'm not sure of a project that's done it, sorry.
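A rough Ruby sketch of the server-side check described above (the temp-file paths are hypothetical): the upload handler records the expected size from the Content-Length header, and a progress endpoint compares the temp file's current size against it.

require 'json'

# written by the upload handler: the expected size from Content-Length, and the temp file it appends to
SIZE_FILE   = '/tmp/upload-12345.size'
UPLOAD_FILE = '/tmp/upload-12345.part'

def progress
  expected = File.exist?(SIZE_FILE)   ? File.read(SIZE_FILE).to_i : 0
  written  = File.exist?(UPLOAD_FILE) ? File.size(UPLOAD_FILE)    : 0
  percent  = expected > 0 ? (100.0 * written / expected).round(1) : 0.0
  { expected: expected, written: written, percent: percent }.to_json
end

puts progress   # the page polls this (e.g. via AJAX) and updates the bar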

Limit upload speed for testing on lighttpd

I'm implementing ubr upload. It uses Perl and PHP to upload files with a progress bar. I'm running a lighttpd development server and would like to test it fully. Currently it just transfers the files instantly, since it's really just moving files around on my computer. Is there a way to make it seem like the transfer is actually slow, so I can watch the progress bar?
I tried adding the following to my lighttpd.conf. It may have slowed down loading the pages a little, but uploads are still instantaneous.
$HTTP["host"] == "localhost" {
server.kbytes-per-second = 8
}
Thanks
Instead of throttling things on the server side, you could try throttling your client machine. There's a nice article on how to throttle bandwidth on Macs over at O'Reilly:
Exploring the Mac OS X firewall
ipfw is a BSD thing, but on Linux you could try using the shaper module and shapecfg:
Traffic Shaping Basics
$HTTP["host"] matches the host name used in the request (the Host header), so the conditional may not apply depending on how you reach the server. You could put the config variable in the configuration file without the host check.
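For example, a global setting in lighttpd.conf, outside any conditional (note that, as far as I know, lighttpd's rate limiting applies to the data it sends out, so it may not slow the upload direction at all):

# applies to all connections, no $HTTP["host"] conditional
server.kbytes-per-second = 8

# per-connection variant, if you only want to slow individual transfers
connection.kbytes-per-second = 8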
Thanks for the help! Actually, I'm dual-booting and just tested my exact script on my Apache server. When I transfer a 200 MB file on Apache, it actually displays the progress bar as the file transfers. On my lighttpd server, the page is "busy" while it posts the file in the background, and then the bar pops up at 100% complete.
I think the way the script works is that the Perl CGI handles the posted file, and as it does so it keeps writing the size it has written so far into another file. Then a PHP script is called every second, which opens this file and looks at how much has been written.
It seems like my lighttpd server is not allowing Perl and PHP to work at the same time... I may be wrong though.
On my Windows server I actually installed WAMP and Perl. My lighttpd is using FastCGI for the PHP and just the mod_cgi module for the Perl scripts.
Ah, it looks like other people have had issues with lighttpd and Uber Uploader...
(can't link to it since I'm new)
Now the question is whether lighttpd is worth using, since I'll have to change this on top of all my mod_rewrite stuff.
Try using Charles: http://www.charlesproxy.com/
You can limit your browser bandwidth by using the Sloppy HTTP proxy: http://www.dallaway.com/sloppy/
Sloppy deliberately slows the transfer of data between client and server.
Example usage: you probably build web sites on your local network, which is fast. Using Sloppy is one way to get the "dial-up experience" of your work without the hassle of having to install a modem.
