Cookie Expired Indy - Delphi

Recently I found that I am getting an EConvertError after calling TIdHTTP.Get.
I analyzed the traffic and saw that the expiry date on the cookie is in 2000. My question is how to get past this. I am using the Indy 10 that ships with XE3. I know Indy follows the standards strictly when handling cookies, but shouldn't there be a feature to turn this off?
URL: https://graph.facebook.com/me?access_token=ACCESS_TOKEN
StackTrace:
:75a5c41f KERNELBASE.RaiseException + 0x58
System.SysUtils.ConvertErrorFmt($412994,(...))
System.SysUtils.StrToInt('')
IdGlobal.IndyStrToInt('')
IdGlobalProtocols.RawStrInternetToDateTime('',0)
IdGlobalProtocols.GMTToLocalDateTime('')
IdHTTPHeaderInfo.TIdEntityHeaderInfo.ProcessHeaders
IdHTTPHeaderInfo.TIdResponseHeaderInfo.ProcessHeaders
IdHTTP.TIdHTTPProtocol.RetrieveHeaders(???)
Response Headers:
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
Cache-Control: private, no-cache, no-store, must-revalidate
Content-Type: text/javascript; charset=UTF-8
ETag: "676c539ac3cd7161f5492ce95d72d8b620c6fa6c"
Expires: Sat, 01 Jan 2000 00:00:00 GMT
Last-Modified: 2012-12-20T20:08:20+0000
P3P: CP="Facebook does not have a P3P policy. Learn why here: http://fb.me/p3p"
Pragma: no-cache
X-FB-Rev: 702819
X-UA-Compatible: IE=edge,chrome=1
Set-Cookie: m_ts=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/; domain=.facebook.com; httponly
Set-Cookie: reg_ext_ref=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/; domain=.facebook.com
Set-Cookie: reg_fb_gate=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/; domain=.facebook.com
Set-Cookie: reg_fb_ref=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/; domain=.facebook.com
X-FB-Debug: kZmwuLCRhfhJBKfLoQEbTOBJNyKQGUKLEeJ2R2rcxXg=
Date: Fri, 28 Dec 2012 10:02:19 GMT
Connection: keep-alive
Content-Length: 1932

The offending date/time string is not coming from a cookie, as the TIdCustomHTTP.ProcessCookies(), TIdCookieManager.AddServerCookies(), and TIdCookie.ParseServerCookie() methods are not included in the call stack you showed. It is actually the Last-Modified header that is at fault. Facebook is sending an ISO 8601 formatted date, which is not supported by the HTTP specs. That is a bug in Facebook's HTTP server. They should have known better than to use a non-conforming format for an HTTP date header. That bug needs to be reported to Facebook so they can fix it. In the meantime, I have checked in an update to the latest Indy SVN snapshot so TIdHTTP can now parse ISO 8601 dates.
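To make the failure mode concrete, here is a small standalone illustration (in C#, purely for demonstration; the two date strings are taken verbatim from the response above) of why a parser that accepts only RFC 1123 dates throws on Facebook's Last-Modified value, and the kind of ISO 8601 fallback a lenient parser needs:

using System;
using System.Globalization;

class HttpDateDemo
{
    static void Main()
    {
        // The RFC 1123 format an HTTP/1.1 date header is supposed to use:
        string rfc1123 = "Fri, 28 Dec 2012 10:02:19 GMT";
        // What Facebook actually sent in Last-Modified (ISO 8601):
        string iso8601 = "2012-12-20T20:08:20+0000";

        // A strict RFC 1123 parse succeeds on the conforming value...
        Console.WriteLine(DateTime.ParseExact(rfc1123, "r", CultureInfo.InvariantCulture));

        try
        {
            // ...and throws on the ISO 8601 value, the same failure mode
            // as RawStrInternetToDateTime in the stack trace above.
            DateTime.ParseExact(iso8601, "r", CultureInfo.InvariantCulture);
        }
        catch (FormatException)
        {
            // A tolerant parser falls back to ISO 8601 explicitly
            // (normalizing the "+0000" offset to "+00:00" to be safe).
            string normalized = iso8601.Insert(iso8601.Length - 2, ":");
            Console.WriteLine(DateTimeOffset.Parse(normalized, CultureInfo.InvariantCulture));
        }
    }
}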

Related

gzip compression doesn't work and can't get 304 in chrome

I'm working on a compression and caching mechanism in my ASP.NET MVC 5 app.
I'm sending files with the following cache headers:
Response.Cache.SetCacheability(HttpCacheability.Public);
Response.Cache.SetExpires(DateTime.UtcNow.AddYears(1).ToUniversalTime());
Response.Cache.SetLastModified(System.IO.File.GetLastWriteTime(serverPath).ToUniversalTime());
Response.AppendHeader("Vary", "Accept-Encoding");
IE11, Edge, and Firefox all send the If-Modified-Since header on an F5 refresh, but Chrome doesn't. Why is that, and how can I work around it? In Chrome I get a 200 status code and the file is loaded from cache.
The second problem I have is with enabling gzip compression.
I have a standard action filter for this:
public class CompressContentMvcAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        GZipEncodePage();
    }

    private bool IsGZipSupported()
    {
        string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
        if (!string.IsNullOrEmpty(AcceptEncoding) &&
            (AcceptEncoding.Contains("gzip") || AcceptEncoding.Contains("deflate")))
        {
            return true;
        }
        return false;
    }

    private void GZipEncodePage()
    {
        HttpResponse Response = HttpContext.Current.Response;
        if (IsGZipSupported())
        {
            string AcceptEncoding = HttpContext.Current.Request.Headers["Accept-Encoding"];
            if (AcceptEncoding.Contains("gzip"))
            {
                Response.Filter = //new GZipCompressionService().CreateCompressionStream(Response.Filter);
                    new System.IO.Compression.GZipStream(Response.Filter,
                        System.IO.Compression.CompressionMode.Compress);
                Response.Headers.Remove("Content-Encoding");
                Response.AppendHeader("Content-Encoding", "gzip");
            }
            else
            {
                Response.Filter = //new DeflateCompressionService().CreateCompressionStream(Response.Filter);
                    new System.IO.Compression.DeflateStream(Response.Filter,
                        System.IO.Compression.CompressionMode.Compress);
                Response.Headers.Remove("Content-Encoding");
                Response.AppendHeader("Content-Encoding", "deflate");
            }
        }
        // Allow proxy servers to cache encoded and unencoded versions separately
        Response.AppendHeader("Vary", "Content-Encoding");
    }
}
I apply this filter to my action method returning application assets, but every file comes back with Transfer-Encoding: chunked instead of gzipped.
This filter is copied from my previous project, where it still works as expected. Could it be a problem with the IIS server? Locally I have IIS 10 and .NET 4.7; the older app, where it works, is hosted on IIS 8.5 and framework 4.5. I can't think of anything else. I've been googling for two days and can't find any clue.
I'm not interested in compressing in IIS.
[edit]
Headers I got from the response:
HTTP/1.1 200 OK
Cache-Control: public
Content-Type: text/javascript
Expires: Sat, 18 May 2019 08:58:48 GMT
Last-Modified: Thu, 10 May 2018 13:26:02 GMT
Vary: Content-Encoding
Server: Microsoft-IIS/10.0
X-AspNetMvc-Version: 5.2
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Fri, 18 May 2018 08:58:48 GMT
Transfer-Encoding: chunked
I always use Fiddler to inspect these kinds of challenges.
The F5/If-Modified-Since issue
Chrome simply doesn't make a new request as long as the Expires header is set and its value is still in the future, so Chrome is respecting your intended caching behaviour. When you navigate around your website in another browser, you'll see that it doesn't send requests for these assets either. F5 is 'special': it's a forced refresh.
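If you do want browsers to revalidate these assets on every use (so a 304 becomes possible), the standard HTTP approach, sketched here rather than taken from this app, is to drop the far-future Expires and mark the response as cacheable-but-revalidate:

Cache-Control: public, no-cache
Last-Modified: Thu, 10 May 2018 13:26:02 GMT

With no-cache the browser may still store the file, but it must send a conditional request (If-Modified-Since) before reusing it, which is exactly when the server can answer 304 Not Modified.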
The chunked/gzip issue
Clear your browser cache and inspect the first response. Fiddler will show 'Response body is encoded', which means compressed (gzip or deflate).
Whether you see Transfer-Encoding: chunked depends on whether the Content-Length header is present. See the difference between the two responses below. If you don't want the chunked Transfer-Encoding, set the Content-Length header.
With Content-Length, no chunking:

Content-Type: text/javascript; charset=utf-8
Content-Encoding: gzip
Expires: Sat, 25 May 2019 13:14:11 GMT
Last-Modified: Fri, 25 May 2018 13:14:11 GMT
Vary: Accept-Encoding
Server: Microsoft-IIS/10.0
Content-Length: 5292

Without Content-Length, chunked:

Content-Type: text/javascript; charset=utf-8
Transfer-Encoding: chunked
Content-Encoding: gzip
Expires: Sat, 25 May 2019 13:14:11 GMT
Last-Modified: Fri, 25 May 2018 13:14:11 GMT
Vary: Accept-Encoding
Server: Microsoft-IIS/10.0
Because you are handling the serving of assets via your own code, instead of via the IIS static files module, you have to deal with all response headers yourself.
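To illustrate that last point, here is a minimal sketch of the buffer-then-measure pattern: compress the asset into memory first so the exact Content-Length is known before the response is written. The controller name, route, and file location are invented for the example; it is a sketch, not a drop-in replacement for the filter above.

using System.IO;
using System.IO.Compression;
using System.Web.Mvc;

public class AssetsController : Controller
{
    // Hypothetical action serving a script file with an explicit
    // Content-Length so IIS has no reason to fall back to chunking.
    public ActionResult Script(string name)
    {
        string serverPath = Server.MapPath("~/Scripts/" + name); // assumed location
        byte[] content = System.IO.File.ReadAllBytes(serverPath);

        string acceptEncoding = Request.Headers["Accept-Encoding"] ?? "";
        if (acceptEncoding.Contains("gzip"))
        {
            // Compress into a buffer instead of wrapping Response.Filter,
            // so the compressed size is known up front.
            using (var buffer = new MemoryStream())
            {
                using (var gzip = new GZipStream(buffer, CompressionMode.Compress, leaveOpen: true))
                {
                    gzip.Write(content, 0, content.Length);
                }
                content = buffer.ToArray();
            }
            Response.AppendHeader("Content-Encoding", "gzip");
        }

        Response.AppendHeader("Vary", "Accept-Encoding");
        Response.AppendHeader("Content-Length", content.Length.ToString());
        return File(content, "text/javascript");
    }
}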

Outlook REST API expiration

I implemented the Outlook REST API in my Rails app following this official tutorial: https://dev.outlook.com/restapi/tutorial/ruby
By the way, the tutorial needs to be updated: Outlook now requires an additional scope ('profile') to get the user's email in the auth controller:
SCOPES = [ 'openid', 'profile', 'https://outlook.office.com/mail.read' ]
Anyhow, I am storing the email and token after authenticating, but that token is very short-lived. I need a way to permanently store authentication for the user.
When I run things as suggested and try to get the emails, the response is:
...
response_headers: !ruby/hash-with-ivars:Faraday::Utils::Headers
  elements:
    content-length: '0'
    server: Microsoft-IIS/8.5
    set-cookie: exchangecookie=afaeef5a8a6747aab24dad1ddb97a8fb; expires=Fri, 16-Jun-2017
      00:07:54 GMT; path=/; HttpOnly
    www-authenticate: Bearer client_id="00000002-0000-0ff1-ce00-000000000000", trusted_issuers="00000001-0000-0000-c000-000000000000#*",
      token_types="app_asserted_user_v1 service_asserted_app_v1", authorization_uri="https://login.windows.net/common/oauth2/authorize",
      error="invalid_token",Basic Realm="",Basic Realm=""
    request-id: c59488ab-62b5-4a9f-a3f5-43bda739c9ab
    x-calculatedbetarget: BLUPR07MB260.namprd07.prod.outlook.com
    x-backendhttpstatus: '401'
    x-ms-diagnostics: '2000010;reason="ErrorCode: ''PP_E_RPS_REASON_TIMEWINDOW_EXPIRED''.
      Message: ''Failed the Validate call, reason: Time window expired.%0d%0a''";error_category="invalid_msa_ticket"'
    x-diaginfo: BLUPR07MB260
    x-beserver: BLUPR07MB260
    x-powered-by: ASP.NET
    x-feserver: BN3PR16CA0057
    x-msedge-ref: 'Ref A: 3E2E02429DE244F7A738A7BE6CF9E06B Ref B: CAA6321D024D5B725BA8FFE7DAC85411
      Ref C: Wed Jun 15 17:07:54 2016 PST'
    date: Thu, 16 Jun 2016 00:07:54 GMT
    connection: close
  ivars:
    :@names:
      content-length: content-length
      server: server
      set-cookie: set-cookie
      www-authenticate: www-authenticate
      request-id: request-id
      x-calculatedbetarget: x-calculatedbetarget
      x-backendhttpstatus: x-backendhttpstatus
      x-ms-diagnostics: x-ms-diagnostics
      x-diaginfo: x-diaginfo
      x-beserver: x-beserver
      x-powered-by: x-powered-by
      x-feserver: x-feserver
      x-msedge-ref: x-msedge-ref
      date: date
      connection: connection
status: 401
How do I adjust this code to store a permanent token?
OK, I found a few answers on Stack Overflow for different stacks. I would normally delete my post, but for any Rails developer who may need this, the answer to my question is simply adding 'offline_access' to the scopes:
SCOPES = [ 'openid', 'profile', 'offline_access', 'https://outlook.office.com/mail.read' ]
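With 'offline_access' in the scopes, the token response also contains a refresh_token, which can be stored and later exchanged for a fresh access token. Assuming the v2.0 endpoint the tutorial uses, the refresh call is the standard OAuth 2.0 flow (parameter values elided):

POST https://login.microsoftonline.com/common/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

client_id=...&client_secret=...&grant_type=refresh_token&refresh_token=...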

httpclient with twitter gives me unauthorized 401 strict-transport-security: max-age=631138519

When using HttpClient to connect to Twitter I always get this response:

responseString {StatusCode: 401, ReasonPhrase: 'Unauthorized', Version: 1.1, Content: System.Net.Http.StreamContent, Headers:
{
  strict-transport-security: max-age=631138519
  Date: Fri, 31 Jan 2014 00:35:10 UTC
  Set-Cookie: guest_id=v1%3A139112851013762159; Domain=.twitter.com; Path=/; Expires=Sun, 31-Jan-2016 00:35:10 UTC
  Server: tfe
  Content-Length: 63
  Content-Type: application/json; charset=utf-8
}}
System.Net.Http.HttpResponseMessage
I googled
strict-transport-security: max-age
and found people suggesting to change the access setting of the Twitter app to 'Read, Write and Access direct messages'. I did so, but nothing changed. If anyone has faced the same problem or has any suggestions, it would be appreciated.
There are multiple reasons this might happen. I have this question on the LINQ to Twitter FAQ with several suggestions on how to debug:
https://linqtotwitter.codeplex.com/wikipage?title=LINQ%20to%20Twitter%20FAQ

How to make use of jsessionid together with basic authentication

I am using JBoss 7.1 and have secured my web application with Basic authentication, but I want only the first call to require the Basic authentication header; subsequent calls should use the jsessionid for authentication. How do I accomplish this?
So far I have created a REST servlet enforcing the creation of a session with a call to request.getSession():
#Path("/rest/HelloWorld")
public class HelloWorld {
#GET()
#Produces("text/plain")
public String sayHello(#Context HttpServletResponse response,
#Context HttpServletRequest request) {
HttpSession session = request.getSession();
return "Hello World! " + request.getUserPrincipal().getName();
}
My idea was that any subsequent call should only require the jsessionid cookie. Looking in Fiddler, the first call behaves as expected: I get a 401, the client re-sends the request with the Basic Authorization header, and a jsessionid is returned. But on the second call the jsessionid cookie is included and I still get a 401, which triggers the client to re-send the Basic Authorization header again.
These are the headers returned from the successfully authenticated first call:
HTTP/1.1 200 OK
Server: Apache-Coyote/1.1
Pragma: No-cache
Cache-Control: no-cache
Expires: Thu, 01 Jan 1970 01:00:00 CET
Set-Cookie: JSESSIONID=AFDFl2etiUNkn-mpM+DXr3KE; Path=/Test
Content-Type: text/plain
Content-Length: 18
Date: Tue, 29 Jan 2013 09:12:48 GMT
Hello World! test1
When I make a second call, the jsessionid is included:
GET /Test/index.html HTTP/1.1
Host: cwl-rickard:8080
Cookie: JSESSIONID=AFDFl2etiUNkn-mpM+DXr3KE
and I get a 401, forcing the client to re-send the request with the Basic Authorization header:
HTTP/1.1 401 Unauthorized
Server: Apache-Coyote/1.1
Pragma: No-cache
Cache-Control: no-cache
Expires: Thu, 01 Jan 1970 01:00:00 CET
WWW-Authenticate: Basic realm="ApplicationRealm"
Content-Type: text/html;charset=utf-8
Content-Length: 958
Date: Tue, 29 Jan 2013 09:12:48 GMT
Any ideas what I am missing?

Problems embedding certain YouTube Videos into a UIWebView due to lack of eurl parameter

I am viewing an HTML page in a standard iOS UIWebView. Inside this page I have a standard YouTube embed, something like this:
<iframe id="video-play" width="624" height="350" src="http://www.youtube.com/embed/hBLf_N-T0vI" allowfullscreen class="hide fade"></iframe>
This works fine in all cases in the browser, and in most cases in a UIWebView. But for some videos I get the (I think misleading) message:
"The uploader has not made this video available in your country."
The problem is somewhere in the get_video_info call, and related to the fact that from a UIWebView the eurl parameter seems to be set to "unknown".
So this request works from within a UIWebView:
http://www.youtube.com/get_video_info?html5=1&video_id=hBLf_N-T0vI&eurl=unknown&ps=native&el=embedded&hl=en_GB
This one fails with errorcode 150 and the error message I mentioned above (it works fine in a browser, so use the raw HTTP request below to reproduce):
http://www.youtube.com/get_video_info?html5=1&video_id=DldaCQby3j4&eurl=unknown&ps=native&el=embedded&hl=en_GB
If I change eurl=unknown to eurl=http://rubbish.com/ then it works again:
http://www.youtube.com/get_video_info?html5=1&video_id=DldaCQby3j4&eurl=http://rubbish.com&ps=native&el=embedded&hl=en_GB
I've tried to look at the properties of the videos that are failing but can't figure out what is causing the difference between the good ones and the bad ones. I've looked at embed settings, privacy and tracking settings and anything else I can think of.
My other option is to figure out how to add an eurl parameter to the request.
Help!
Jon
If anyone wants the full HTTP requests/response:
BAD REQUEST
GET /get_video_info?html5=1&video_id=DldaCQby3j4&eurl=unknown&ps=native&el=embedded&hl=en_GB HTTP/1.1
Host: www.youtube.com
BAD RESPONSE
HTTP/1.1 200 OK
Date: Fri, 23 Nov 2012 15:42:47 GMT
Server: gwiseguy/2.0
X-Content-Type-Options: nosniff
Access-Control-Allow-Origin: *
Set-Cookie: use_hitbox=d5c5516c3379125f43aa0d495d100d6ddAEAAAAw; path=/; domain=.youtube.com
Set-Cookie: VISITOR_INFO1_LIVE=fH943IGDAFc; path=/; domain=.youtube.com; expires=Sun, 21-Jul-2013 15:42:47 GMT
Expires: Tue, 27 Apr 1971 19:44:06 EST
Cache-Control: no-cache
P3P: CP="This is not a P3P policy! See //support.google.com/accounts/bin/answer.py?answer=151657&hl=en-US for more info."
Content-Type: application/x-www-form-urlencoded
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Transfer-Encoding: chunked
status=fail&errorcode=150&reason=The+uploader+has+not+made+this+video+available+in+your+country.&storyboard_spec=http%3A%2F%2Fi1.ytimg.com%2Fsb%2FDldaCQby3j4%2Fstoryboard3_L%24L%2F%24N.jpg%7C48%2327%23100%2310%2310%230%23default%23w3GCNZfS0BvXcAQIB1BBnUJRlrY%7C80%2345%23112%2310%2310%232000%23M%24M%23VDmT14lMI4g0sdAxTbIujmEIkkI%7C160%2390%23112%235%235%232000%23M%24M%23c1uKcYcKallke_fsXgoUOkSHnwA&errordetail=0
GOOD REQUEST
GET /get_video_info?html5=1&video_id=DldaCQby3j4&eurl=http%3A%2F%2Frubbish.com%2F&ps=native&el=embedded&hl=en_GB HTTP/1.1
Host: www.youtube.com
GOOD RESPONSE
HTTP/1.1 200 OK
Date: Fri, 23 Nov 2012 15:42:34 GMT
Server: gwiseguy/2.0
X-Content-Type-Options: nosniff
Access-Control-Allow-Origin: *
Set-Cookie: use_hitbox=d5c5516c3379125f43aa0d495d100d6ddAEAAAAw; path=/; domain=.youtube.com
Set-Cookie: VISITOR_INFO1_LIVE=zG2n4ZwVAdY; path=/; domain=.youtube.com; expires=Sun, 21-Jul-2013 15:42:33 GMT
Expires: Tue, 27 Apr 1971 19:44:06 EST
Cache-Control: no-cache
P3P: CP="This is not a P3P policy! See //support.google.com/accounts/bin/answer.py?answer=151657&hl=en-US for more info."
Content-Type: application/x-www-form-urlencoded
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Transfer-Encoding: chunked
account_playback_token .... (chopped for brevity).
It could be something to do with "claimed" videos. A contact at YouTube got back to us:
"The videos [that worked] are claimed and the others aren't, so
something in the claiming process is causing the issue."
