Chrome losing cookies - asp.net-mvc

I am getting an error on my live site that I am not seeing in my dev environment, and it seems to only happen with Chrome. I have looked around for a solution, but I only find issues relating to the auth cookie (I actually raised a question about Chrome and the auth cookie in the past); this is different.
I store the user's cart in a cookie. I set the cookie like so:
HttpCookie responseCookie = HttpContext.Response.Cookies[CartHelper.CART];
responseCookie.PackCartCookie(vm.Cart);
where the extension method PackCartCookie sets the cookie value like so:
cookie.Value = HttpUtility.UrlEncode(cookieValue);
This results in a cookie being stored with the following settings:
Domain = www.foo.com
RawSize = 230b
Path = /
Expires = Session
HttpOnly = HttpOnly
Value = Encrypted
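The pack/unpack extension methods above boil down to roughly the following (a simplified sketch; the real cart serialization is more involved, and JavaScriptSerializer only stands in for it here):
using System.Web;
using System.Web.Script.Serialization;

public static class CartCookieExtensions
{
    // Simplified sketch - the actual cart serialization is omitted.
    public static void PackCartCookie<T>(this HttpCookie cookie, T cart)
    {
        string cookieValue = new JavaScriptSerializer().Serialize(cart);
        cookie.Value = HttpUtility.UrlEncode(cookieValue);
        cookie.HttpOnly = true;
        cookie.Path = "/";
    }

    public static T UnpackCartCookie<T>(this HttpCookie cookie) where T : class
    {
        if (cookie == null || string.IsNullOrEmpty(cookie.Value))
            return null;

        string cookieValue = HttpUtility.UrlDecode(cookie.Value);
        return new JavaScriptSerializer().Deserialize<T>(cookieValue);
    }
}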
While a user is interacting with the site, the cart cookie is created but is lost or dropped from time to time. When I look at the Elmah error and review HTTP_COOKIE, I can see all the other cookies (I have others set in the same way which work fine), but I do not see the cart cookie.
I have had to make the code more defensive because of this issue. As you can imagine, the cart cookie is used throughout the purchase process, and I have had failures when responding to a purchase: I accept payment, but the system crashes because the cart is gone and the user is never notified of a successful purchase. Luckily I caught this early and refunded the users affected.
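The defensive change is essentially to never assume the cart cookie survived the round trip (a sketch inside a controller action; CartHelper.CART and the Cart type are from the code above, UnpackCartCookie as sketched earlier):
HttpCookie cartCookie = HttpContext.Request.Cookies[CartHelper.CART];
Cart cart = cartCookie.UnpackCartCookie<Cart>(); // tolerates a null cookie

if (cart == null)
{
    // The cookie was dropped somewhere along the way; send the user somewhere
    // sensible (hypothetical redirect target) instead of crashing mid-purchase.
    return RedirectToAction("Index", "Cart");
}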
User agents where I have seen the issue:
Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.62 Safari/537.36
Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.57 Safari/537.36
Mozilla/5.0 (Windows NT 6.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/29.0.1547.62 Safari/537.36

Let me give you a solution. I use cookies for storing most of my values this way; it works in all browsers, and the value is kept for the specified lifetime. I use a static class so it is accessible everywhere.
I also encode and decode the value here, but you can store it as-is by removing the encoding and decoding and passing the plain value. Here's my code.
Below is my class with the static methods. I use HttpSecureCookie, with Encode and Decode based on machine-key cryptography, which may not be available by default in your case; you can put the value in directly instead.
If you are particular about using HttpSecureCookie, use this link for building up your class (a rough sketch is also shown after the code below).
public class CookieStore
{
    public static void SetCookie(string key, string value, TimeSpan expires)
    {
        HttpCookie encodedCookie = HttpSecureCookie.Encode(new HttpCookie(key, value));
        if (HttpContext.Current.Request.Cookies[key] != null)
        {
            var cookieOld = HttpContext.Current.Request.Cookies[key];
            cookieOld.Expires = DateTime.Now.Add(expires);
            cookieOld.Value = encodedCookie.Value;
            HttpContext.Current.Response.Cookies.Add(cookieOld);
        }
        else
        {
            encodedCookie.Expires = DateTime.Now.Add(expires);
            HttpContext.Current.Response.Cookies.Add(encodedCookie);
        }
    }

    public static string GetCookie(string key)
    {
        string value = string.Empty;
        HttpCookie cookie = HttpContext.Current.Request.Cookies[key];
        if (cookie != null)
        {
            // The value was encoded when it was set, so decode it before returning it.
            HttpCookie decodedCookie = HttpSecureCookie.Decode(cookie);
            value = decodedCookie.Value;
        }
        return value;
    }
}
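If you don't have that class available, here is a rough sketch of what an HttpSecureCookie helper could look like on .NET 4.5+ using MachineKey.Protect/Unprotect (an approximation, not the actual class from the link):
using System.Text;
using System.Web;
using System.Web.Security;

public static class HttpSecureCookie
{
    private const string Purpose = "CookieStore"; // arbitrary purpose string for MachineKey

    public static HttpCookie Encode(HttpCookie cookie)
    {
        byte[] protectedBytes = MachineKey.Protect(Encoding.UTF8.GetBytes(cookie.Value), Purpose);
        cookie.Value = HttpServerUtility.UrlTokenEncode(protectedBytes);
        return cookie;
    }

    public static HttpCookie Decode(HttpCookie cookie)
    {
        byte[] protectedBytes = HttpServerUtility.UrlTokenDecode(cookie.Value);
        byte[] plainBytes = MachineKey.Unprotect(protectedBytes, Purpose);
        return new HttpCookie(cookie.Name, Encoding.UTF8.GetString(plainBytes));
    }
}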
Using these methods, you can easily store values in a cookie and fetch them whenever required. Using them is as simple as:
For setting a cookie:
CookieStore.SetCookie("currency", "GBP", TimeSpan.FromDays(1)); // the cookie lives for 1 day
For getting a cookie:
string currency = CookieStore.GetCookie("currency");

Related

Open URL using Groovy receives status 403

I am trying to read the contents of a web page using a Groovy script. The page contains the readings from one of my temperature sensors that I want to save regularly. I have tried the simplest variant:
def url = "https://measurements.mobile-alerts.eu/Home/MeasurementDetails?deviceid=021B5594EAB5&vendorid=60122a8b-b343-49cb-918b-ad2cdd6dff16&appbundle=eu.mobile_alerts.mobilealerts&fromepoch=1674432000&toepoch=1674518400&from=23.01.2023%2000:00&to=24.01.2023%2000:00&command=refresh"
def res = url.toURL().getText()
println( res)
The result is:
Caught: java.io.IOException: Server returned HTTP response code: 403 for URL: (my url)
In any browser, this URL works without problems.
I would be very grateful for any tips on how to solve this problem.
HTTP code 403 means the client is forbidden from accessing the (otherwise valid) URL. In this case the server is rejecting requests that do not appear to come from a web browser. To get around this restriction, you need to specify a User-Agent request header.
For example:
def url = 'https://measurements.mobile-alerts.eu/Home/MeasurementDetails?deviceid=021B5594EAB5&vendorid=60122a8b-b343-49cb-918b-ad2cdd6dff16&appbundle=eu.mobile_alerts.mobilealerts&fromepoch=1674432000&toepoch=1674518400&from=23.01.2023%2000:00&to=24.01.2023%2000:00&command=refresh'
def res = url.toURL().getText(requestProperties:
        ['User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:47.0) Gecko/20100101 Firefox/47.0'])
println res
You can switch to other valid user-agent values.

Initiate SingleSignOn by Saml2PostBinding

I am having an issue when using ITfoxtec for ASP.NET Core 3.0.
For context, I am trying to establish a connection between a web application and a third-party login service. To cover some of the obvious possibilities up front: the third party has access to our metadata URL and has configured their services for our web application.
Desired user workflow:
User enters the web application;
User clicks a button which redirects them to the login service;
User logs in at the service and is redirected back to the given returnUrl;
Afterwards the web application determines permissions based on the provided SSO cookie.
Steps taken so far:
Added a Saml2 section in appsettings.json containing our metadata.xml and issuer. The issuer name equals the EntityID given in the metadata.xml. It has been anonymised here, like so:
"Saml2": {
"IdPMetadata": "wwwroot/SAML/Metadata.xml",
"Issuer": "myIssuerName",
"SignatureAlgorithm": "http://www.w3.org/2000/09/xmldsig#rsa-sha1",
"CertificateValidationMode": "ChainTrust",
"RevocationMode": "NoCheck",
"SigningCertificateFile": "\\licenses\\certificate.pfx",
"SigningCertificatePassword": "password1"
},
Added the Saml2Configuration in Startup.cs:
services
    .Configure<Saml2Configuration>(Configuration.GetSection("Saml2"))
    .Configure<Saml2Configuration>(configuration =>
    {
        configuration.SigningCertificate = CertificateUtil.Load(
            $"{Environment.WebRootPath}{Configuration["Saml2:SigningCertificateFile"]}",
            Configuration["Saml2:SigningCertificatePassword"]);
        configuration.AllowedAudienceUris.Add(configuration.Issuer);

        var entityDescriptor = new EntityDescriptor();
        entityDescriptor.ReadIdPSsoDescriptorFromFile(Configuration["Saml2:IdpMetadata"]);
        if (entityDescriptor.IdPSsoDescriptor == null) throw new Exception("Failed to read the metadata.");

        configuration.SignAuthnRequest = true;
        configuration.SingleSignOnDestination = entityDescriptor.IdPSsoDescriptor.SingleSignOnServices
            .Where(ed => ed.Binding.ToString() == "urn:oasis:names:tc:SAML:2.0:bindings:HTTP-POST")
            .First().Location;
        configuration.SignatureValidationCertificates.AddRange(entityDescriptor.IdPSsoDescriptor.SigningCertificates);
    });
Here comes the tricky part. By default, the SSO initiation uses a redirect binding, i.e. a GET request to the SSO service. However, the service I am trying to reach expects the SAMLRequest as a POST request. So I changed the code to initiate with a post binding, whose form is then submitted directly, like so:
public IActionResult Initiate([FromQuery(Name = "returnUrl")] string returnUrl = "")
{
    var binding = new Saml2PostBinding();
    binding.SetRelayStateQuery(new Dictionary<string, string> { { "ReturnUrl", returnUrl } });
    binding.Bind(new Saml2AuthnRequest(_saml2configuration)
    {
        ForceAuthn = false,
        IsPassive = false,
        NameIdPolicy = new NameIdPolicy() { AllowCreate = true },
        AssertionConsumerServiceUrl = new Uri("https://localhost:44366/api/Authentication/Process"),
    });
    return binding.ToActionResult();
}
Issue:
However, after sending the base64-encoded AuthnRequest as the SAMLRequest, I receive a 403 Forbidden from the third-party login service. At this stage I am not certain whether the identity provider is not configured properly or my request is lacking something. What am I doing wrong?
Below are the (anonymised) request headers.
Assume that the SAMLRequest is provided in the form data, base64 encoded.
:authority: myEntityDescriptorName
:method: POST
:path: mySsoURL
:scheme: https
accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3
accept-encoding: gzip, deflate, br
accept-language: nl-NL,nl;q=0.9,en-US;q=0.8,en;q=0.7
cache-control: no-cache
content-length: 3582
content-type: application/x-www-form-urlencoded
cookie: JSESSIONID=3D5FE88D55674C2F1E3646E6D8A0FFBE
origin: https://localhost:44366
pragma: no-cache
referer: https://localhost:44366/
sec-fetch-mode: navigate
sec-fetch-site: cross-site
sec-fetch-user: ?1
upgrade-insecure-requests: 1
user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.97 Safari/537.36
It is correct to change the AuthnRequest to a post binding if that is what the service requires.
Your application is a Service Provider (also called a Relying Party), which needs to be configured at the Identity Provider with a unique Issuer name.
I think the problem is that the Issuer name you have configured ("Issuer": "myIssuerName") is incorrect. The Issuer should be your Service Provider's own issuer name, not the Identity Provider's issuer name from the metadata.xml.
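In other words, set Issuer to your own entity ID, either in appsettings.json or in Startup.cs; a sketch along these lines (the URL is only a placeholder for your own entity ID):
.Configure<Saml2Configuration>(configuration =>
{
    // Your own Service Provider entity ID, exactly as registered at the Identity Provider
    // (placeholder value - not the IdP EntityID from Metadata.xml).
    configuration.Issuer = "https://your-app.example.com/saml/metadata";
    configuration.AllowedAudienceUris.Add(configuration.Issuer);
    // ... the rest of your existing configuration stays the same
})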

How to show specific information from a User Agent string saved in my database

I am working with geolocation and I have a User Agent string saved in my database, as shown below:
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36
For the string above, the information I would like to output in my views is Macintosh; Intel Mac OS X 10_13_6. So, in the frontend/views, I don't want to output the whole value as it is in the database; all I want to show is the device the user used to log into the application, and I do not know how to show only that part. Any help or pointer on what I should do will be appreciated.
view
.row
  .panel.panel-primary
    .panel-heading
      span = t('admin_header.traffics')
    .panel-body
      = table_for(@traffic, class: 'table table-condensed table-hover') do |t|
        - t.column :ua, 'Device Used', class: 'col-xs-1' # But this shows the whole string; I only want specific details from it.
Here is the code that saves the User Agent string into the database:
def save_signup_history(member_id)
  SignupHistory.create(
    member_id: member_id,
    email: @member.email,
    ip: request.remote_ip,
    accept_language: request.headers["Accept-Language"],
    ua: request.headers["User-Agent"], # Here is the User Agent
    login_location: get_ip_location
  )
end
The only thing I can think of is to use the .remove method, but I don't think it's the best solution to my problem.
How about using the user_agent gem?
In the specific example you gave you could use:
user_agent = request.env['HTTP_USER_AGENT']
user_agent.match(/\(.*?\)/)[0]
However, that may not cover every case; either using a gem or writing code that accounts for the various formats is your best bet.
You can use the scan method to get the string you need.
Example:
user_agent = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36'
Then
user_agent.scan(/Mac.*_6/)
should give you an array containing your desired string. You can modify the pattern as per your requirements.

files cannot be read after WINDEV mobile upload using microsoft graph api

I am trying to use the Graph API upload feature (in my case from WINDEV Mobile 21).
The files appear in the app folder. They are the right size and have the right extensions, but they cannot be opened.
sCd1 is ANSI string = "Authorization: Bearer"+" "+gsTokenAcc
HTTPCreateForm("driveEnq")
sContentType is ANSI string = "image/jpeg"
HTTPAddFile("driveEnq","File",sAd,sContentType)=False
sEnding is ANSI string
sHTTPRes is string
sUserAgent is ANSI string = "'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.79 Safari/537.36 Edge/14.14393'"
IF HTTPSendForm("driveEnq",sEnding,httpPut,sUserAgent ,sCd1)=True THEN
bufResHTTP is Buffer = HTTPGetResult(httpResult)
I am convinced this has something to do with the content type or the format in which the files are added.
The key to this appears to be adding the file (or fragment) via an HTTPAddParameter call with an empty second argument:
HTTPAddParameter("form_name","",sFrag)
Sending the form as application/xml also appears to help with the boundaries of each field.
Hope this helps.

youtube-mp3.org api not working properly, only downloads cached videos

Previously I was able to download YouTube videos as MP3 via youtube-mp3.org using this request:
http://www.youtube-mp3.org/api/pushItem/?item=http%3A//www.youtube.com/watch%3Fv%3D<VIDEOID>&xy=_
It returned the video ID, and they started converting the video on their servers. This request would then return a JSON string with info about the video and the current conversion status:
http://www.youtube-mp3.org/api/itemInfo/?video_id=<VIDEOID>&adloc=
After repeating that request until the value for status was 'serving', I started the last request, taking the value of the key h from the previous JSON response; this would download the MP3 file.
http://www.youtube-mp3.org/get?video_id=<VIDEOID>&h=<JSON string value for h>
Now the first request always returns nothing. The second and third requests only succeed if the requested video is already cached on their servers (like popular music videos). If that's not the case, the second request returns nil, so the third request can't be started because the h value from the second request is missing. Could anybody help me get the website to start a conversion? Something must be wrong with the first URL, I just don't know what. Thanks.
I just tested it. For the first request, you need to send a header of:
Accept-Location: *
Otherwise, it will return a 500 (Internal Server Error). With that header, it returns a string containing the YouTube video ID, and you can then use the second API call to check the progress.
Here's the C# code I used for testing:
HttpWebRequest wr = (HttpWebRequest)WebRequest.Create("FIRST_API_URL");
wr.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.75 Safari/535.7";
wr.Headers.Add("Accept-Location", "*");
string res = (new StreamReader(wr.GetResponse().GetResponseStream())).ReadToEnd();
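From there, polling the itemInfo endpoint from the question until it reports 'serving' could look roughly like this (a sketch; the exact JSON format may differ, so parse it properly rather than relying on the crude string check used here):
string ua = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.7 (KHTML, like Gecko) Chrome/16.0.912.75 Safari/535.7";
string infoUrl = "http://www.youtube-mp3.org/api/itemInfo/?video_id=VIDEOID&adloc=";
string info;
do
{
    System.Threading.Thread.Sleep(2000); // give the conversion some time between polls
    HttpWebRequest infoReq = (HttpWebRequest)WebRequest.Create(infoUrl);
    infoReq.UserAgent = ua;
    info = new StreamReader(infoReq.GetResponse().GetResponseStream()).ReadToEnd();
} while (!info.Contains("serving"));
// Once the status is "serving", read the "h" value from the JSON and call the /get URL from the question.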
Btw, you can keep track of the headers in the browser's Network (Chrome) debug tab.
Regards
