How can I know the time zone of my website's host?

I know this is the weirdest of all the weird questions I have asked to date, but I have a reason.
The problem is that I have a number of websites hosted on different servers (I don't own these servers), and every website has some pages where I have to enter an execution date and time. The date/time I enter should be in the time zone of the server hosting that website.
I just want to know whether there is a good utility/website that can tell me the time zone/location of my web host if I provide the website's address.

You can put your own "system" page on each of those servers that displays the current time:
<%= DateTime.Now.ToString() %>
You would then access it at a URL like: http://site-no-x.com/timecheck
Something like this. Simple and effective.
ADDED: Keep in mind that web servers and database servers can have different time settings. If a hosting company (theoretically) keeps its web server farm on the East Coast and its database servers on the West Coast, you will see different values returned by .NET's DateTime.Now and SQL Server's GETDATE().
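Here is a minimal sketch of such a diagnostic page, assuming the sites run on ASP.NET (the page name and markup are placeholders; adapt it to whatever stack each host actually runs):

<%@ Page Language="C#" %>
<%-- timecheck.aspx: report the server's local time, UTC, and OS time zone --%>
Server time: <%= DateTime.Now %><br />
UTC time: <%= DateTime.UtcNow %><br />
Time zone: <%= TimeZoneInfo.Local.DisplayName %> (offset <%= TimeZoneInfo.Local.BaseUtcOffset %>)

Comparing DateTime.Now against DateTime.UtcNow also gives you the server's UTC offset directly, which is exactly the value you need when entering execution dates and times.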

This isn't likely to be possible from the outside, as time zones are set at the OS level. (I.e., you'd need to check programmatically on each server itself using your language of choice.)
As such, it might be easier in the long run to fix the scripts.

So you can't modify the code to have it use UTC?

Related

Malicious bots waking up Heroku free app and using up all dyno hours

I have an app hosted on Heroku which for the last 5 years has done fine on free dyno hours. There's a single user, and it doesn't get much use throughout the day.
For the last couple of months, we seem to be targeted by bots that create fake accounts. We are getting so many of these bots now that they wake our app so often it has consumed our free dyno hours.
Does anyone know how to get rid of them? I tried using invisible_captcha, but that did not seem to help.
You should consider using Rack::Attack: https://github.com/kickstarter/rack-attack
It's middleware that allows you to block or throttle requests.
For example, if the bots use the same email domain for each new registration, you could accept only ten registrations per hour from that domain (since it's not a big website) until they calm down.
Or, if they come from the same place, you can rate-limit requests from that country based on the IP address.
EDIT:
If you check the country based on the IP address, the dyno will be woken up anyway (because you'll call an external service to get that information), so it's not a good solution in this case.
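A sketch of what the email-domain throttle might look like with Rack::Attack, assuming registrations are POSTed to /signup with a user[email] parameter (both the path and the domain names are placeholders for your actual routes and bot traffic):

# config/initializers/rack_attack.rb (add gem "rack-attack" to the Gemfile)
class Rack::Attack
  # Allow at most 10 sign-ups per hour per email domain.
  throttle("signups/email-domain", limit: 10, period: 1.hour) do |req|
    if req.post? && req.path == "/signup"
      req.params.dig("user", "email").to_s.split("@").last
    end
  end

  # Or block a domain outright once you know only bots use it.
  blocklist("block-bot-domain") do |req|
    req.post? && req.path == "/signup" &&
      req.params.dig("user", "email").to_s.end_with?("@bots.example")
  end
end

Returning nil from the throttle block (as happens for any non-signup request) leaves the request untouched, so normal traffic is unaffected.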
Simple solution:
If you have only a single user, consider hosting the app on a secret subdomain that will not get crawled. Every time you run into an issue like this, you can just change the subdomain and inform your user.
A more complex solution:
Use a service like Cloudflare to stop unwanted traffic before it reaches the application.

Are there any situations (e.g. failures) in which a browser clears cookies on its own?

We have two sites on different subdomains. Sometimes our employees lose their cookies (they are just gone) on both domains at the same time, so they get logged out.
I don't really see how our app could be responsible, because the two sites have different server configurations (and each site runs on multiple servers). I believe only the nginx version (1.10.3) is the same. Besides, that would not explain why they get logged out of both sites at the same time.
If it helps, we use Rails (3/5) and Unicorn (4.8.3/5.3.0); in the older app, sessions are stored in Redis, and in the new one, in cookies.
So I wonder whether there are browser (security) policies under which cookies get cleared, perhaps on some SSL connection error, an IP change, or the like.
I understand that this is not a definitive problem description, but it seems like magic to us at the moment, so I hope someone has encountered something like this.
P.S. We tried asking one of our employees to use Firefox instead of Chrome (which all of them use), but it does not seem to make any difference (he wasn't logged out for a week, but then he was, roughly every 20 minutes).

Can a Google Apps Script Web App get the user's language and time zone?

Is there any possibility for a GAS published as a Web App, executing under the identity of the active user and using the Ui Service for its user interface, to get the user's preferred language and time zone?
Session.getActiveUser() works, but it only gives you the email address via Session.getActiveUser().getEmail().
Session.getTimeZone() returns the time zone of the script, not of the user.
Could there be a trick to get the web browser's ID string with the language preference?
Session.getActiveUserLocale() was introduced in 2014 to provide this capability.
This is a very interesting question. I think the short answer is that there is no good way for now, and you have to ask the users for their locale/language.
I don't see a way to do this on the server side using the APIs you've already discussed. However, I was thinking there might be a clever way to do it on the client side: get the locale information from the navigator.language JS call and send it up to the server using the google.script API.
Unfortunately, since the HTML/JS you have in your web app gets sanitized for security through Caja, only a portion of the normal window.navigator properties are exposed. It seems the only useful properties are userAgent and platform. Language seems innocuous enough to expose, so this is worth logging a request in the issue tracker.
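For reference, a sketch of both approaches, assuming the web app is served with HtmlService in the modern IFRAME sandbox (where the Caja restrictions above no longer apply; the function names are placeholders):

// Code.gs (server side)
function doGet() {
  return HtmlService.createHtmlOutputFromFile('index');
}

// The user's preferred language (e.g. "en"), available since 2014.
function getServerSideLocale() {
  return Session.getActiveUserLocale();
}

// Receives whatever the browser reports about the client.
function logClientInfo(info) {
  Logger.log(info.language + ' / ' + info.timeZone);
}

<!-- index.html (client side) -->
<script>
  google.script.run.logClientInfo({
    language: navigator.language,
    timeZone: Intl.DateTimeFormat().resolvedOptions().timeZone
  });
</script>

The client-side call is the only way shown here to get the user's actual time zone; everything server-side reports the script's own settings.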

SEO Destroyed By URL Forwarding - Can't figure out another way

We design and host websites for our clients/sales force. We have our own domain: http://www.firstheartland.com
Our agents fill out a series of forms on our website that are loaded into a database, and the database then renders each agent's site as a database-driven website:
/repwebsites/repSite.cfm?link=&rep=rick.higgins
/repwebsites/repSite.cfm?link=&rep=troy.thompson
/repwebsites/repSite.cfm?link=&rep=david.kover
The database application reads which "rep" the site is for and the appropriate page to display from the query string. The page then outputs the content and the appropriate CSS to style the page and give it its own individual branding.
We have told the agents to use domain name forwarding to get users to their spot on our server. However, everyone seems to be getting indexed under our domain instead of their own. We could in theory assign a new IP address to each of them; the cost is not the issue.
The issue is how we would possibly accomplish this.
With all of that said, being indexed under our domain would still be OK as long as the agents actually showed up high in the rankings for their search terms.
For instance, an agent owns TroyLThompson.com. If I search for "Troy L Thompson", it does not show up in my search; only "troy thompson first heartland" works (they show up third).
Apart from scrapping the whole system, I don't know what to do. I'm very open to ideas.
I'm sure you can get this to work as most hosting companies will host hundreds of websites on a single server (i.e. multiple domains on one IP).
I think you need your clients to update the nameservers for their domains (i.e. DNS) to return the IP address of your hosting server. Then you need to configure your server to return the right website based on the domain that was originally requested.
That requires your "database driven website" to look in the HTTP request and check which domain was originally requested, then it can handle the request accordingly.
- If you are using Apache, see how to configure Apache to host multiple domains on one IP address.
- If you are using Microsoft IIS, maybe Host-Header Routing is what you need.
You will likely need code changes on your "database driven website" to cope with these changes.
I'm not sure that having a dedicated IP address per domain will help much, as you would then have to find a way to host all of those IP addresses from a single web server. However, if your architecture already supports a shared database and multiple web servers, that approach might work well for you, especially if you expect the load from some domains to be heavy enough to need a dedicated web server.
Google does not include URLs that return a 301 status code in its index. The reason is pretty obvious on second thought: the redirect tells Google, "Whatever was here before has moved there, please update your references." One solution I can see is setting up Apache virtual hosts on your server for each external domain and having each rep configure their domain's DNS A record to point to the IP address of your server.
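A sketch of what those name-based virtual hosts might look like, assuming Apache (the domain names and paths are placeholders; IIS offers the same idea as host-header routing):

# httpd.conf: both sites share one IP; Apache picks the vhost from the Host header.
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName troylthompson.com
    ServerAlias www.troylthompson.com
    DocumentRoot /var/www/repwebsites
    # The ColdFusion app can still read CGI.HTTP_HOST to decide which
    # rep's content and branding to render for this domain.
</VirtualHost>

<VirtualHost *:80>
    ServerName davidkover.com
    ServerAlias www.davidkover.com
    DocumentRoot /var/www/repwebsites
</VirtualHost>

Because the browser's address bar stays on the agent's domain (no redirect is issued), each site gets indexed under its own domain instead of yours.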

Ruby on rails (based on Mephisto) - Unable to contact server

I am completely new to Ruby, and I inherited a Ruby system for a product catalogue. Most of my users are able to view everything as they should, but overseas users (specifically in Mexico) cannot contact the server once logged in, even though they are active users. I'm sorry I cannot be more specific; the system is private, so I cannot grant access.
Has anyone had any issues similar to this before? Is it a user-end issue or a system error?
Speaking as somebody who regularly ends up on your users' side of the fence, the number one culprit for this symptom is the clueless administrator. There are many, many sites which generically block large blocks of IP space, or which geolocate and carve out big portions of the world.
For example, a surprising number of American blogs block Asian countries (including Japan) out of a misplaced effort to avoid DDoS attacks (which more likely originated in Russia or China, but hey, this species of administrator isn't very good at fine-tuning solutions). I have to hop over to my American proxy server to access those sites.
So the first thing I'd do to diagnose your problems is to see whether your Mexican users are making it to the server at all, or whether they're being blocked somewhere earlier (router? firewall? etc). Then, to determine whether the problem is on your end or their end, I'd try to replicate the issue with you proxying your connection through a Mexican proxy and repeating the actions they took to cause the issue.
The fact that they get blocked after logging in could indicate that you have HTTPS issues, for example with an HTTPS accelerator installed [1], or it could be that your frontend server is properly serving up the static content but doing the checking only on dynamic requests.
[1] We've seen some really weird bugs at work caused by a malfunctioning HTTPS accelerator.
If it's working for everyone else, then it would appear that the problem is not with Ruby or Rails themselves, since they are working...
My first thought would be to check for a network issue: are the Mexican users all behind the same proxy server and/or firewall?
Is login handled within the Rails application or via some other resource? Can you see any evidence that requests from Mexican users are reaching your web server at all?
Login is handled by the Rails app. I am currently trying to hunt down the logs; it's taking some time as, again, I am new to this system.
Cheers guys
Maybe INS is cracking down on cyber-immigration.
