Read on before you say this is a duplicate, it's not. (as far as I could see)
I want to get the country code in PHP from the client.
Yes, I know you can do this using external sites or with the likes of geoip_record_by_name, but I don't want to be dependent on an external site, and I can't install PEAR for PHP as I'm using shared Dreamhost hosting.
I thought I could just do something like this:
$output = shell_exec('whois '.$ip.' -H | grep country | awk \'{print $2}\'');
echo "<pre>$output</pre>";
But Dreamhost seems to have an old version of whois (4.7.5), so I get this error on a lot of IPs:
Unknown AS number or IP network. Please upgrade this program.
So unless someone knows how to get a binary of a newer version of whois onto Dreamhost, I'm stuck.
Or is there another way I could get the country code from the client who is loading the page?
Whois is just a client for the whois service, so technically you are still relying on an outside site. For the queries that fail, you could try falling back to another site for the query, such as hostip.info, who happen to have a decent API and seem friendly:
http://api.hostip.info/country.php?ip=4.2.2.2
returns
US
Good luck,
--jed
EDIT: @Mint: Here is the link to the API on hostip.info: http://www.hostip.info/use.html
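If it helps, here is a rough sketch of that kind of fallback in PHP (the helper name and the whois pipeline are just illustrative, and hostip.info answers with a bare two-letter code, or "XX" when it has no data):

<?php
// Try the local whois client first and only fall back to hostip.info
// when whois cannot resolve the network (e.g. the old-client error above).
function country_code_for($ip)
{
    $out  = shell_exec('whois -H ' . escapeshellarg($ip) . " | grep -i country | awk '{print \$2}'");
    $code = strtoupper(trim((string) $out));

    if (strlen($code) !== 2) {
        // Remote fallback: hostip.info returns the plain country code as text.
        $code = strtoupper(trim((string) @file_get_contents(
            'http://api.hostip.info/country.php?ip=' . urlencode($ip)
        )));
    }

    return ($code !== 'XX' && strlen($code) === 2) ? $code : null;
}

echo country_code_for('4.2.2.2'); // US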
MaxMind provide a free PHP GeoIP country lookup class (there is also a free country+city lookup one).
The bit you want is what is mentioned under "Pure PHP module". This doesn't require you to install anything, or be dependent on them, nor does it need any special PHP modules installed. Just save the GeoIP data file somewhere, then use their provided class to interact with it.
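Usage of their pure-PHP class is roughly like this (a sketch based on the legacy geoip.inc API; adjust the paths to wherever you saved the class and the data file):

<?php
// MaxMind's legacy pure-PHP GeoIP class (geoip.inc) plus the GeoIP.dat country
// data file, both stored locally, so no PECL/PEAR modules and no remote calls.
include 'geoip.inc';

$gi      = geoip_open('/path/to/GeoIP.dat', GEOIP_STANDARD);
$country = geoip_country_code_by_addr($gi, $_SERVER['REMOTE_ADDR']);
geoip_close($gi);

echo $country; // e.g. "US"; empty/false if the address is not in the database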
Can you just install a copy of whois into your home directory and pass the full path into shell_exec? That way you're not bound to their upgrade schedule.
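Something along these lines, assuming the binary ends up in ~/bin (the path is just an example):

<?php
// Call a privately installed whois binary by absolute path so the shared
// host's outdated /usr/bin/whois is never involved.
$whois  = '/home/youruser/bin/whois';
$ip     = $_SERVER['REMOTE_ADDR'];
$output = shell_exec($whois . ' -H ' . escapeshellarg($ip) . " | grep -i country | awk '{print \$2}'");
echo '<pre>' . htmlspecialchars((string) $output) . '</pre>';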
An alternative, somewhat extreme solution to your problem would be to:
Download the CSV format version of MaxMind's country database
Strip out the information you don't need from the CSV with a script and ...
... generate a standard PHP file which contains a data structure mapping each IP range to its country code.
Include the resulting file in your usual project files and you now have a completely internal IP => country code lookup table.
The disadvantage is that you would regularly need to regenerate the PHP file from the latest version of the database. It's also a pretty nasty way of doing it in general, and performance might not be the best :)
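For what it's worth, a rough sketch of the generator and lookup, assuming the legacy country CSV's columns are start IP, end IP, start number, end number, country code, country name, sorted by start address (check your copy of the file before trusting the column indexes):

<?php
// build_table.php -- one-off generator: turn the country CSV into a plain PHP
// file holding sorted array(startNum, endNum, countryCode) ranges.
$ranges = array();
$fh = fopen('GeoIPCountryWhois.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    $ranges[] = array((float) $row[2], (float) $row[3], $row[4]);
}
fclose($fh);
file_put_contents('ip_table.php', '<?php return ' . var_export($ranges, true) . ';');

// lookup side -- binary search over the generated ranges, no external calls.
function ip_to_country($ip, array $ranges)
{
    $n  = (float) sprintf('%u', ip2long($ip)); // unsigned, also safe on 32-bit PHP
    $lo = 0;
    $hi = count($ranges) - 1;
    while ($lo <= $hi) {
        $mid = (int) (($lo + $hi) / 2);
        if ($n < $ranges[$mid][0]) {
            $hi = $mid - 1;
        } elseif ($n > $ranges[$mid][1]) {
            $lo = $mid + 1;
        } else {
            return $ranges[$mid][2];
        }
    }
    return null;
}

$ranges = require 'ip_table.php';
echo ip_to_country($_SERVER['REMOTE_ADDR'], $ranges);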
Consider ipcountryphp (my site, my code, my honour), as it provides a local, freely updated database for the lifetime of the internet. It's fast and fully self-contained, pluggable into anything running PHP 5.3 with SQLite3, and beyond. Very fast seeks and no performance penalties.
Enough with shameless self-promotion, let's get serious:
Relying on querying remote services in real-time to get visitor country can become a major bottleneck for your site's functionality depending on the response speed of the queried server. As a rule of thumb you should never query external services for real-time site functionality (like page loading). Using APIs in the background is great but when you need to query the country of each visitor before the page is rendered, you open yourself up to a world of pain. And do keep in mind you're not the only one abusing free services :)
So queries to third-party services stay in the background, while only local functionality that relies on no third party goes into the layers where users interact. Just my slightly performance-paranoid take on this :)
PS: Above mentioned script I wrote has IPv6 support too.
Here is a site with a script I just used. The only problem is that you would probably need to regenerate the IP data yourself every now and then, which can be a pain, and that's why everyone is telling you to use an external API. But for me that wasn't a solution, as I was pulling around 50 IPs at once, which means I would probably get banned. So the solution was to use my own script or to save the results to a DB, but then I was again pulling images from external sites. Anyway, here is the site I found the script on:
http://coding-talk.com/f29/country-flag-script-8882/
Here are a few:
http://api.hostip.info/get_html.php?ip=174.31.162.48&position=true
http://geoiplookup.net/geoapi.php?output=json&ipaddress=174.31.162.48
http://ip-api.com/json/174.31.162.48?callback=yourfunction
http://ipinfo.io/174.31.162.48
All return slightly different results.
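For example, consuming the ip-api.com one from PHP could look like this (the "status" and "countryCode" field names follow that service's JSON response; the other services use different field names, so check whichever you pick):

<?php
// Query ip-api.com's JSON endpoint and pull out the two-letter country code.
$ip   = '174.31.162.48';
$json = @file_get_contents('http://ip-api.com/json/' . urlencode($ip));
$data = json_decode((string) $json, true);

if (is_array($data) && isset($data['status']) && $data['status'] === 'success') {
    echo $data['countryCode']; // e.g. "US"
} else {
    echo 'lookup failed';
}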
Here is another one; just change the IP to your variable:
http://api.codehelper.io/ips/?callback=codehelper_ip_callback&ip=143.3.87.193
Question :
Where do I start writing an application which can work without an internet connection? Exactly like this.
Explanation :
Say we have a web application which is already deployed. Since internet access is not great in India, I would like to create an offline version of the same web application which users can access without the internet as well. I want them to experience a similar web interface without many changes.
One idea that came to my mind is to create a tarball of the application's contents and ship it to users. Users would then have to use that tarball to install/configure the application on their machines. What to enclose in that tarball is also debatable: Apache, the technology stack, etc.
I will be happy to write more in case I have not been precise. My question is not tied to any particular technology stack, but it might be of interest to everyone. Since I am not sure which tag is right to append here, can anybody from the Stack Overflow community help with the tagging? :)
My application is actually in RoR, so I'm tagging the Ruby on Rails community. Maybe they can help here?
As long as your web application contains only flat files (HTML, CSS, JS, text data, etc.) and does not depend on any components that need to be installed, then you can simply distribute those files in an archive (.zip will be more cross-platform-friendly) and the user could open the application by opening the front page in a browser. To make it better for the user, a small application which invokes the user's browser with the local URI should also be included.
I am using ejabberd, a jabber daemon written in Erlang. It is connected to our Active Directory using its LDAP interface and Erlang's eldap library.
Everything works so far with a small limitation causing a big problem:
A normal LDAP query receives up to 1000 elements and then stops.
We have more than 1000 employees and therefore receive only a part of the whole result.
Using the *nix ldapsearch tool, I can use the option -E pr=1000/noprompt to receive multiple pages (which finally get concatenated into a single one) without any limitation.
How could I use this function using Erlang's eldap library?
I already read through the source code, but don't seem to find anything obvious.
#erlang had some nice ideas about this:
emauton: It looks to me like you're out of luck. Paged results are supplied by an LDAP extension, described in http://www.rfc-editor.org/rfc/rfc2696.txt. If you look at ldapsearch, you can see this being added to the query at http://goo.gl/lemNOS
emauton: Reading through the eldap source, this extension doesn't make an appearance. The good news is that it shouldn't be too hard to add, I think, by messing about with the 'controls' part of the LDAPMessage.
emauton: You should be able to set up your request according to the RFC using the right controlType & contents (referencing the ldapsearch code) and use it to make a paginating version of eldap:search
I have a website for exchanging links, files... to put it quickly, it's my 'version' of Twitter + Megaupload.
Users add links all the time and so on, but I would like users to be able to sync their browser bookmarks with the ones they have in their profile on my website.
Where should I look?
Basically I need to be able to:
- access the bookmarks file (1)
- send the URLs to my service (2)
- maybe add a login feature (in the future)
I was googling this for ages a few weeks ago and I kind of gave up, because I'm OK with PHP and JS, but with these plugin languages I'm very lost. So I decided to post here, which always brings positive answers.
(1) -> I don't even know where to start.
(2) -> I was thinking of having website.com/auto_import_no_confirm.php?url=[URL] and putting it in a foreach.
How many different languages and extension file formats do I have to work with? I really need some kind of tip for point (1).
-edit-
Just found this -> https://developer.mozilla.org/En/Code_snippets/Bookmarks
which really looks like what I need, but where do I place this code?
Thanks!
Might not be a bad question, but there are too many subtopics raised to answer that. (And there is too much tagspam as well. Break up your question into PHP- and Javascript-specific tasks, when you have devised the general application scheme.)
But to get started, download similar Firefox extensions (.xpi) and unzip them to inspect the general structure. You'll find example code for bookmark handling and invoking remote APIs pretty quickly. And basically you only need JavaScript for the extension itself. (It sounds like your extension does not need much UI.)
And there are many tutorials on designing Firefox addons: http://roachfiend.com/archives/2004/12/08/how-to-create-firefox-extensions/ or http://www.google.com/search?q=firefox+develop+an+xpi
The good news first: you won't need much more than JavaScript if you just want to access bookmarks and send them to a server, neither on Firefox nor on Chrome.
But you'll still have to familiarize yourself with the browsers' APIs and learn how to develop extensions.
However, both Mozilla and Google provide all necessary information on their developer sites.
For Chrome, this is a good place to start; you'll find the API for bookmark access here.
The corresponding site for Firefox can be found here, with information on bookmark access here.
I'm trying to find the best method to gather URLs. I could create my own little crawler, but it would take my servers decades to crawl all of the Internet, and the bandwidth required would be huge. The other thought would be using Google's Search API or Yahoo's Search API, but that's not really a great solution as it requires a search to be performed before I get results.
Other thoughts include asking DNS servers and requesting a list of URLs but DNS servers can limit/throttle my requests or even ban me all together. My knowledge of asking DNS servers is quite limited at the moment, so I don't know if this is the best method or not.
I just want a massive list of URLs, but I want to build this list without running into brick walls in the future. Any thoughts?
I'm starting this project to learn Python but that really has nothing to do with the question.
$ wget http://s3.amazonaws.com/alexa-static/top-1m.csv.zip
You can register to get access to the entire .com and .net zone files at Verisign
I haven't read the fine print for terms of use, nor do I know how much (if anything) it costs. However, that would give you a huge list of active domains to use as URLs.
How big is massive? A good place to start is http://www.alexa.com/topsites. They offer a download of the top 1,000,000 sites (by their ranking mechanism). You could then expand this list by going to Google and scraping the results of the query link:url for each URL in the list.
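A small pass over that download can flatten it into plain URLs; a sketch (it assumes each line of the unzipped CSV is rank,domain, which is how the Alexa dump is laid out):

<?php
// Turn the unzipped top-1m.csv ("rank,domain" per line) into one URL per line.
$in  = fopen('top-1m.csv', 'r');
$out = fopen('urls.txt', 'w');
while (($row = fgetcsv($in)) !== false) {
    if (isset($row[1])) {
        fwrite($out, 'http://' . trim($row[1]) . "/\n");
    }
}
fclose($in);
fclose($out);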
The modern terms now are URI and URN; URL is the shrunken/outdated one. I'd scan for sitemap files, which contain many addresses in one file, and study the classic texts on spiders, wanderers, brokers and bots, as well as RFC 3986 (Appendix B, p. 50), which defines a URI regex.
I'm wondering if there is an easy way to look up a user's local time zone in Rails using only an IP address. I don't want users to have to input their time zone themselves. Do I have to use JavaScript or is there a different way?
The MaxMind GeoLite IP->city database seems to support time zones, and there's a FAQ on their site referring to this. You could do a two-step process of IP->location then location->time zone using the MaxMind GeoLite City database, and then use one of the solutions provided in the FAQ.
Or, for a simple one-step JavaScript approach, getTimezoneOffset() seems to be the crux of the solution.
There appear to be several vendors offering APIs and callable services to go from IP address to location, and clearly once you have that, determining the time zone is only a further lookup.
Your alternative of using JavaScript to ask the browser "where am I, what's the time zone?" and Ajaxing that down to your server also sounds plausible.
Of course a sufficiently determined user can probably spoof their way to appearing to be at a different IP address, but presumably that doesn't matter too much to you ... their choice.
You can use the IP address-to-time API to find the time by IP address.
Look here http://worldtimeengine.com/ for more details.
Dan