mochiweb: disable headers ordering - erlang

I checked mochiweb response headers, and they are always ordered (descending):
< Server: MochiWeb/1.0 (Any of you quaids got a smint?)
< last-modified: Sun, 30 Aug 2015 23:13:04 GMT
< Date: Sun, 30 Aug 2015 23:15:15 GMT
< Content-Type: text/html
< Content-Length: 89
This looks to be because mochiweb stores the headers in an Erlang gb_tree, which is later converted to a list, so the headers come out sorted by key (see the sketch below).
Is there a way to change this? It can sometimes cause problems, like here and here.
My problem is that I am building a service with mochiweb that replicates some requests, but because the headers are reordered it does not replicate the response exactly.
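For illustration, a minimal Erlang sketch (not mochiweb's actual code) of why storing headers in a gb_tree drops the original order: converting the tree back to a list always yields the entries sorted by key.

%% A gb_tree forgets insertion order; to_list/1 returns entries sorted by key.
T0 = gb_trees:empty(),
T1 = gb_trees:insert("Server", "MochiWeb/1.0", T0),
T2 = gb_trees:insert("Last-Modified", "Sun, 30 Aug 2015 23:13:04 GMT", T1),
T3 = gb_trees:insert("Date", "Sun, 30 Aug 2015 23:15:15 GMT", T2),
gb_trees:to_list(T3).
%% => [{"Date", ...}, {"Last-Modified", ...}, {"Server", ...}]  (sorted, not insertion order)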

Fixed here:
mochiweb commit 952087e
It was a problem with formatting the headers in the response, and not in mochiweb_headers.

Related

IdHTTP Increase download speed using ranges

I have a program that downloads files, and I noticed that downloads through the program hover around 300 kbps, while downloading the same file in a browser is twice as fast. I have seen posts about using ranges to split the file into parts and download them in threads, but I could not implement it.
My code:
IdHTTP1.Head(URL);
Range := IdHTTP1.Request.Ranges.Add;
Range.StartPos := 0;
Range.EndPos := Trunc(IdHTTP1.Response.ContentLength / 2);
IdHTTP1.Post(URL, Parameters, FS);
When I run the code it returns the whole file. I'm not sure this code is correct; I tried to do something simple as a test, but it did not work.
When I use IdHTTP.Head() the raw headers return this information:
Date: Mon, 29 May 2017 18:02:11 GMT
Pragma: No-cache
Cache-Control: no-cache, no-store, max-age=0
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Content-Disposition: attachment; filename="pecas.pdf"
Content-Type: application/pdf
Content-Length: 18507774
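For reference, a minimal single-threaded sketch of a ranged download with TIdHTTP (assuming Indy 10, where the range item class is TIdEntityRange). It fetches only the first half of the file with a plain GET; a range-aware server replies 206 Partial Content, but note that the HEAD response above does not advertise Accept-Ranges, so the server may simply return the whole file. The file name is a placeholder.

// uses IdHTTP, IdHTTPHeaderInfo, Classes
var
  Range: TIdEntityRange;
  FS: TFileStream;
begin
  IdHTTP1.Head(URL);  // learn the total Content-Length first
  FS := TFileStream.Create('part1.tmp', fmCreate);
  try
    Range := IdHTTP1.Request.Ranges.Add;  // request only the first half
    Range.StartPos := 0;
    Range.EndPos := IdHTTP1.Response.ContentLength div 2;
    IdHTTP1.Get(URL, FS);  // expect 206 Partial Content if ranges are honoured
  finally
    FS.Free;
  end;
end;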

Firefox stored cached incomplete response

I just found a partial response being cached as complete on one of our customers' machines, which rendered the whole website unusable. And I have absolutely no idea what could possibly have gone wrong there.
So what could have possibly gone wrong in the following setup?
On the server side, we have an ASP.NET application running. One IHttpHandler handles requests for JavaScript files. It basically minifies the files as they are requested and writes the result to the response stream. It also logs the length of the string being written to the response stream:
String javascript = /* Javascript is retrieved here */;
HttpResponse response = context.Response;
response.ContentEncoding = Encoding.UTF8;
response.ContentType = "application/javascript";
HttpCachePolicy cache = response.Cache;
cache.SetCacheability(HttpCacheability.Public);
cache.SetMaxAge(TimeSpan.FromDays(300));
cache.SetETag(ETag);
cache.SetExpires(DateTime.Now.AddDays(300));
cache.SetLastModified(LastModified);
cache.SetRevalidation(HttpCacheRevalidation.None);
response.Headers.Add("Vary", "Accept-Encoding");
Log.Info("{0} characters sent", javascript.Length);
response.Write(javascript);
response.Flush();
response.End();
The content is then normally sent using gzip-encoding with chunked transfer-encoding. Seems simple enough to me.
Unfortunately, I just had a remote session with a user where only about 1/3 of the file was in the cache, which of course broke the file (15k instead of 44k). In the cache entry the Content-Encoding was also set to gzip, and all communication took place over HTTPS.
After having opened the source-file on the user's machine, I just hit Ctrl-F5 and the full content was displayed immediately.
What could have possibly gone wrong?
In case it matters, please find the cache-entry from Firefox below:
Cache entry information
key: <resource-url>
fetch count: 49
last fetched: 2015-04-28 15:31:35
last modified: 2015-04-27 15:29:13
expires: 2016-02-09 14:27:05
Data size: 15998 B
Security: This is a secure document.
security-info: (...)
request-method: GET
request-Accept-Encoding: gzip, deflate
response-head: HTTP/1.1 200 OK
Cache-Control: public, max-age=25920000
Content-Type: application/javascript; charset=utf-8
Content-Encoding: gzip
Expires: Tue, 09 Feb 2016 14:27:12 GMT
Last-Modified: Tue, 02 Jan 2001 11:00:00 GMT
Etag: W/"0"
Vary: Accept-Encoding
Server: Microsoft-IIS/8.0
X-AspNet-Version: 4.0.30319
Date: Wed, 15 Apr 2015 13:27:12 GMT
necko:classified: 1
Your client's browser is most likely caching the JavaScript files, which would mean the src of your scripts isn't changing.
For instance, if you were to request myScripts.js:
<script src="/myScripts.js">
Then the first time, the client would request that file; on any subsequent request the browser would read it from its cache.
You need to append some sort of unique value, such as a timestamp, to the end of your script URLs, so that even if the browser has cached the file, the new value acts like a new file name; for example:
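A minimal illustration, with a made-up version value in the query string:
<script src="/myScripts.js?v=20150428"></script>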
The client receives the new scripts after pressing Ctrl+F5 because this shortcut bypasses the browser's cache.
MVC has a really nice way of doing this which involves appending a unique code that changes every time the application or its app pool is restarted. Check out MVC Bundling and Minification.
Hope this helps!

httpclient with twitter gives me unauthorized 401 strict-transport-security: max-age=631138519

When using HttpClient to connect to Twitter I always get this response:
responseString {StatusCode: 401, ReasonPhrase: 'Unauthorized', Version: 1.1,
  Content: System.Net.Http.StreamContent, Headers:
  {
    strict-transport-security: max-age=631138519
    Date: Fri, 31 Jan 2014 00:35:10 UTC
    Set-Cookie: guest_id=v1%3A139112851013762159; Domain=.twitter.com; Path=/; Expires=Sun, 31-Jan-2016 00:35:10 UTC
    Server: tfe
    Content-Length: 63
    Content-Type: application/json; charset=utf-8
  }}
System.Net.Http.HttpResponseMessage
I googled
strict-transport-security: max-age
and found people suggesting changing the access setting of the Twitter app to Read, Write and Access direct messages. I did so, but nothing changed. If anyone has faced the same problem or has any suggestions, it would be appreciated.
There are multiple reasons this might happen. I address this question in the LINQ to Twitter FAQ with several suggestions on how to debug it:
https://linqtotwitter.codeplex.com/wikipage?title=LINQ%20to%20Twitter%20FAQ

Get server data in Delphi 2010

I have a problem: I'm trying to use the Indy TIdHTTP component in Delphi 2010, and I'm trying to get the following information when using IdHTTP1.Head():
HTTP/1.1 200 OK
Date: Mon, 16 Jun 2003 2:53:29 GMT
Server: Apache/1.3.3 (Unix) (Red Hat/Linux)
Last-Modified: Wed, October 7, 1998 11:18:14 GMT
ETag: "1813-49b-361b4df6"
Accept-Ranges: bytes
Content-Length: 1179
Connection: close
Content-Type: text/html
The problem is that I don't know what I have to do to get this information, because I cannot get it with IdHTTP1.Request.RawHeaders.Values. Could someone tell me what I have to do?
You are looking in the wrong place. You need to look in IdHTTP1.Response.RawHeaders instead. Also, all of those values actually have individual properties associated with them, e.g.:
IdHTTP1.Response.ResponseVersion
IdHTTP1.Response.ResponseCode
IdHTTP1.Response.ResponseText
IdHTTP1.Response.Date
IdHTTP1.Response.Server
IdHTTP1.Response.LastModified
IdHTTP1.Response.ETag
IdHTTP1.Response.AcceptRanges
IdHTTP1.Response.ContentLength (also IdHTTP1.Response.HasContentLength)
IdHTTP1.Response.Connection
IdHTTP1.Response.ContentType
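For example, a small usage sketch (the URL is a placeholder) that issues a HEAD request and reads a few of those parsed properties:

// uses IdHTTP, SysUtils, Dialogs
IdHTTP1.Head('http://www.example.com/index.html');
ShowMessage(Format('Server: %s'#13#10'Content-Type: %s'#13#10'Content-Length: %d',
  [IdHTTP1.Response.Server, IdHTTP1.Response.ContentType,
   IdHTTP1.Response.ContentLength]));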

AFNetworking set image without extension

There is a method in AFNetworking that can set an image conveniently:
- (void)setImageWithURL:(NSURL *)url
placeholderImage:(UIImage *)placeholderImage
but if the image URL has no extension (like http://static.qyer.com/album/user/330/21/QkpVQBsHaA/670), there are problems: sometimes the image is displayed correctly and sometimes it is not displayed.
I found a method
[AFImageRequestOperation addAcceptableContentTypes:<#(NSSet *)contentTypes#>];
How should I set the contentTypes?
If you curl the URL provided, you can see the problem:
curl -i -X HEAD http://static.qyer.com/album/user/330/21/QkpVQBsHaA/670
HTTP/1.0 200 OK
Server: nginx/1.0.11
Date: Fri, 29 Mar 2013 02:03:24 GMT
Content-Type: application/octer-stream
Last-Modified: Tue, 19 Mar 2013 09:40:23 GMT
ETag: "53430075-9814c-4d843e4fc6fc0"
Accept-Ranges: bytes
Content-Length: 622924
Powered-By-ChinaCache: MISS from 060531Q354
Powered-By-ChinaCache: MISS from 060532235y
Connection: close
Content-Type: application/octer-stream (which is, strangely, a misspelling of application/octet-stream) is not a valid image MIME type. If you have any control over the server, I would strongly recommend fixing it to send real MIME types, for the sake of everyone accessing the CDN.
Otherwise, I would recommend adding */* to the list of acceptable content types, which should accept anything thrown at it. You can also manually specify any content types you expect the CDN to serve, including application/octer-stream. For example:
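A short sketch using the class method named in the question (AFNetworking 1.x):

// Accept any MIME type; alternatively pass an NSSet listing the specific
// types the CDN is known to return, e.g. application/octer-stream.
[AFImageRequestOperation addAcceptableContentTypes:[NSSet setWithObject:@"*/*"]];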

Resources