Invalid Content-Length error while uploading a 2 GB stream - ruby-on-rails

When trying to upload a 2 GB stream I get an invalid Content-Length error.
I am running Apache as a frontend server to Mongrel, with Chrome as my browser.
One more thing: when I do it with Mongrel alone, I am able to upload the 2 GB stream. Could anybody tell me what the problem is, and how do I configure the content length limit in Apache?

I'd imagine the problem is that Apache considers it a denial-of-service attempt and has a cap to prevent you locking up all the server's resources (very reasonable). Whether that cap is configurable I'm not sure; I can't find anything yet, but I'll keep hunting. If it isn't, you can always build your own copy of Apache with the limit removed.
Have you considered sending the data in reasonable-sized chunks?
You might also wish to inspect the outgoing request using packet inspection or browser debugging tools. It's possible the Content-Length header is being malformed by the browser (I doubt they have a 2 GB test case...).
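If the cap does turn out to be configurable in your build, the directive to look at is LimitRequestBody (0 means no limit). One caveat, if I remember correctly: older releases only accept values up to 2147483647 bytes here, and 32-bit Apache builds before 2.2 could not handle request bodies over 2 GB at all, which would match the symptom. A sketch for httpd.conf or a vhost (the path is a placeholder):

# 0 disables Apache's own request-body cap; older releases accept
# values only up to 2147483647 (just under 2 GB).
<Directory "/var/www/app">
    LimitRequestBody 0
</Directory>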

Related

Neo4j Docker: HTTP POST returns EOF error

I wrote a service whose purpose is to archive some sort of data every 10 s.
My problem is that after approx. 3-4 hours the service is no longer capable of writing to the Neo4j Docker database.
I'm confident that I am closing the connection the right way. Strangely, after restarting the Neo4j Docker container everything is back to normal, as it is supposed to be.
I'm wondering if this is some sort of limitation of the community version of Neo4j; if so, it would be really nice to know.
Or do I need to modify a parameter of the Linux image Neo4j is running in, some sort of max connections setting?
I am providing the following params as headers on the POST:
{
"Content-Type": "application/json",
"Connection": "close"
}
After committing the request, I'm closing the connection.
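The error below is in the format Go's net/http client reports, so assuming a Go client, a minimal sketch of the request as described (URL taken from the question, placeholder payload, authentication omitted) might look like this:

package main

import (
    "bytes"
    "fmt"
    "io"
    "net/http"
)

// URL taken from the question; the payload used below is a placeholder.
const txURL = "http://192.168.178.55:7474/db/neo4j/tx/commit"

func commit(payload []byte) error {
    req, err := http.NewRequest(http.MethodPost, txURL, bytes.NewReader(payload))
    if err != nil {
        return err
    }
    req.Header.Set("Content-Type", "application/json")
    req.Close = true // same effect as the "Connection: close" header

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        return err // surfaces as `Post "...": EOF` when the server drops the connection
    }
    defer resp.Body.Close()

    // Drain the body so the connection is fully consumed before closing.
    if _, err := io.Copy(io.Discard, resp.Body); err != nil {
        return err
    }
    if resp.StatusCode != http.StatusOK {
        return fmt.Errorf("unexpected status: %s", resp.Status)
    }
    return nil
}

func main() {
    // Empty transaction payload, just to exercise the endpoint.
    if err := commit([]byte(`{"statements":[]}`)); err != nil {
        fmt.Println(err)
    }
}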
Client Side Error Message:
Post "http://192.168.178.55:7474/db/neo4j/tx/commit": EOF
Neo4j Error Log:
Caused by: java.lang.OutOfMemoryError: Java heap space
2021-02-25 00:44:47.908+0000 WARN /db/neo4j/tx/commit
It's probably not clearing down quickly enough, so you need to look at your memory consumption and may need to allocate more memory to the container. Have a read through "Understanding Memory Consumption" in the knowledge base, which may help. You can also have a look at the memory recommendations section as well.
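As an illustration of that suggestion: with the official Neo4j Docker image (4.x naming conventions, where dots in a setting become underscores and literal underscores are doubled), the heap and page cache can be set through environment variables. The sizes below are placeholders to adjust for your host:

docker run -d \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_dbms_memory_heap_initial__size=2G \
  -e NEO4J_dbms_memory_heap_max__size=2G \
  -e NEO4J_dbms_memory_pagecache_size=1G \
  neo4j:4.2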

Is there a max size for receiving SOAP messages in Delphi?

Is there a limit to the size of incoming SOAP messages in Delphi? I have code that receives very large XML SOAP messages, but it's currently failing on ones that are over 50 MB or so with this error:
EDOMParseError Exception in TServiceWrapper
Not enough storage is available to complete this operation.
Line: 11
With logging I can see that I'm getting past the BeforeExecute event in the RIO with no problem, but I never make it to the AfterExecute event. I'm not actually running out of disk storage, but is there a limit to what the DOM parser is allowed to use, and is it configurable?
EDIT: Some more information I've found. Part of the issue does seem to be related to Delphi threads. I've been able to successfully grab the XML response stream from the WebNode manually and bind it if I do it in a standalone exe, but when it's run in a thread from my main app, I get the same error message about storage.
There are several similar errors that are raised by the operating system. The following was taken from Steve Friedl's site:
"Not enough storage is available to process this command" - raised when running out of memory explicitly, such as via LocalAlloc, or on CreateThread if there are too many threads.
"Not enough storage is available to complete this operation." - raised when running out of indirect non-disk resources (file HANDLEs, etc.).
TechNet suggests the following:
Do one of the following, then retry the operation:
Reduce the number of running programs
Remove unwanted files from the disk the paging file is on and restart the system
Check the paging file disk for an I/O error
Install additional memory in your system.
In addition, make sure you have the latest MSXML parser installed; I remember getting this error running under Windows XP, and upgrading solved the problem.

Trying to load test QC 11.5 using NeoLoad, but recorded response not stored by NeoLoad

I am trying to do a load test on a QC 11.5 application using NeoLoad. While recording, requests are being captured, but the response body is not being stored:
<<body not stored by Neoload >> Error
Please help me resolve this issue.
Hum... it sounds like a well-known project ;> Here is a summary for the other readers. Hewlett-Packard Quality Center 10 or 11 is not a "full" web application. It is a kind of local application, installed through Internet Explorer with .cab and .ocx files, that talks to the server over an HTTP tunnel. The problem with load testing it is that the dialog sent by this fat client is fully encrypted. For NeoLoad, an encrypted conversation (over HTTP or HTTPS) is considered binary and is not stored in the design, but it is clearly shown in the "Check VU" step. Here we are talking about an "alien" encryption service on top of standard layers like SSL, where NeoLoad performs well.
For the readers, to put it in a nutshell: QC cannot be load tested with a network-based approach, which is what all the major professional load testing tools use. This is one of the rare situations where a synchronized functional test could be the solution... with hundreds or thousands of desktops.

ASP.NET MVC pages aren't served over 3G or certain proxy servers

We've just pushed out a new ASP.NET MVC based web-app that works fine on all desktop connections, and all mobile devices such as iPhone, etc. However, when certain pages are viewed over a 3G connection (either via a 3G dongle on a laptop, or directly on a mobile device), a blank, white page is served with no content of any kind. It appears as if we've returned an empty request.
On certain proxied networks, we're getting a similar issue whereby they say our request size is too large. This sort of makes sense, as it only affects certain pages, and I would assume the mobile network providers operate all manner of proxy servers on their side.
However, I've not been able to find any information on what would constitute a request that was too large. I've profiled one of the pages in question, here are some of the stats I thought might be relevant:
HTML content size: 33.04KB compressed, 50.65KB uncompressed
Total size of all stylesheets (4 files): 32.39KB compressed, 181.65KB uncompressed
Total size of all external JS (24 files): 227.82KB compressed, 851.46KB uncompressed
To me, the compressed size of the content is not excessive, but perhaps I'm wrong. Can anyone advise on what I can do to solve this problem? I've had a really hard time finding any definitive information on it.
As far as MVC is concerned, 3G networks are no different from Wi-Fi. However, there is a limit on the size of the files that mobile devices can cache; in that case those files will be requested from the server with each postback.
Since some of the pages do work, I think it's a good idea to isolate the problem to a specific point of failure rather than searching in the wild.
You can debug problems easily with a 3G dongle and Firebug in Firefox, or the Chrome developer tools:
Make sure there are no JavaScript errors in the first place that could be causing the problem.
Confirm the JavaScript/CSS/HTML files are indeed delivered to the client (Firebug on the client). On the server, check the IIS logs, use MS Network Monitor, or create an HTTP proxy where you can monitor the traffic. Try whichever you are comfortable with.
You have nearly 30 requests just for CSS/JavaScript/HTML, and the count may be higher if you have images. Completing all those requests may take forever on 3G. Try bundling the JavaScript files to reduce the request count. Browsers have a limit on the number of simultaneous requests they can make, which adds to the overall time (Firefox, I believe, can make around 10 simultaneous requests).
We did actually resolve this issue, and it was down to the size and number of Set-Cookie items in the response header. We found we had a bug whereby there were nearly 100 Set-Cookie items in the header. Fixing this bug and reducing the size of the cookie values fixed the issue.
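For anyone debugging a similar symptom: oversized response headers are easy to measure from the command line. A quick sketch with curl (the URL is a placeholder); many proxies reject responses once the headers grow past a few kilobytes:

# Dump only the response headers and count their size in bytes.
curl -s -D - -o /dev/null http://example.com/some-page | wc -c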

How to monitor file uploads without using Flash?

I've been looking for a way to monitor file upload information without using Flash, probably using Ajax, I suppose. I want to monitor the speed and the percentage of the upload that has finished.
Do you know of any resource that describes how to do that, or what I should follow to do it?
In the pre-HTML5 world I believe this requires web-server support. I've used this Apache module successfully in the past:
http://piotrsarnacki.com/2008/06/18/upload-progress-bar-with-mod_passenger-and-apache/
The only way without Flash is to do it on the server. The gist is:
Start the file upload.
Open a streaming connection to the server.
Have the server read the POST headers to tell you how large the file is going to be.
Have the server repeatedly check the file size (in /tmp, generally) to see how complete it is.
Stream the % done back to the client.
I've done it before in other languages, but never in Ruby, so I'm not sure of a project that's done it, sorry; a rough sketch of the idea is below.
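As an illustration only (in Go rather than the asker's Ruby stack, and the id query parameter is an invented convention), here is a minimal sketch of those steps on the server side:

package main

import (
    "encoding/json"
    "io"
    "net/http"
    "os"
    "sync"
)

// Progress tables: bytes received so far and the expected total per upload ID.
var (
    mu       sync.Mutex
    received = map[string]int64{}
    totals   = map[string]int64{}
)

// countingReader updates the progress table as the request body is consumed.
type countingReader struct {
    io.Reader
    id string
}

func (c *countingReader) Read(p []byte) (int, error) {
    n, err := c.Reader.Read(p)
    mu.Lock()
    received[c.id] += int64(n)
    mu.Unlock()
    return n, err
}

func upload(w http.ResponseWriter, r *http.Request) {
    id := r.URL.Query().Get("id") // client-chosen upload ID
    mu.Lock()
    totals[id] = r.ContentLength // total size, read from the POST headers
    mu.Unlock()

    f, err := os.CreateTemp("", "upload-*") // lands in /tmp by default
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer f.Close()
    io.Copy(f, &countingReader{Reader: r.Body, id: id})
}

// progress is polled by the client (e.g. via Ajax) for the % done.
func progress(w http.ResponseWriter, r *http.Request) {
    id := r.URL.Query().Get("id")
    mu.Lock()
    pct := 0.0
    if totals[id] > 0 {
        pct = float64(received[id]) / float64(totals[id]) * 100
    }
    mu.Unlock()
    json.NewEncoder(w).Encode(map[string]float64{"percent": pct})
}

func main() {
    http.HandleFunc("/upload", upload)
    http.HandleFunc("/progress", progress)
    http.ListenAndServe(":8080", nil)
}

A real implementation would also track bytes per second for the speed readout and clean up table entries once an upload finishes.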
