Enable Query Strings in CodeIgniter

I am trying to implement Twitter's OAuth into my CodeIgniter web application, where the callback URL is /auth/, so once you have authenticated with Twitter you are taken to /auth/?oauth_token=SOME-TOKEN.
I want to keep the nice clean URLs the framework provides, using the /controller/method/ style, but I also want to enable query strings. There will only ever be one parameter name, oauth_token, so it's fine if it has to be hard-coded.
Any ideas?
I have tried plenty of the suggested solutions, but none of them work :(
PS: I'm using the .htaccess method of URL rewriting.

There are several ways to handle this.
Most people, and Elliot Haughin's Twitter lib, extend the CI_Input library with a MY_Input library that sets allow_get_array to TRUE.
You will also need to add ? to $config['permitted_uri_chars'] in config/config.php and set $config['uri_protocol'] to 'PATH_INFO'.
see here: Enable GET in CodeIgniter
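For reference, a minimal sketch of that MY_Input override, assuming a CodeIgniter 1.7.x install (the version Elliot Haughin's lib targeted); in 2.x the property is named $_allow_get_array and the file lives in application/core/ instead:

    <?php if ( ! defined('BASEPATH')) exit('No direct script access allowed');

    // application/libraries/MY_Input.php (CI 1.7.x)
    class MY_Input extends CI_Input {

        // Let the $_GET array survive before the parent clears it
        // during global sanitization.
        function _sanitize_globals()
        {
            $this->allow_get_array = TRUE;
            parent::_sanitize_globals();
        }
    }

And the matching changes in application/config/config.php:

    $config['uri_protocol']        = 'PATH_INFO';
    $config['permitted_uri_chars'] = 'a-z 0-9~%.:_\-?=';  // ? and = added for oauth_token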

CodeIgniter Reactor lets you access $_GET directly or via $this->input->get(). You don't need MY_Input or even to change your config.php. This method leaves the query string in the URL, however.
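For example, inside a controller method under Reactor (2.x) — a sketch; the token-exchange step is left as a comment:

    // e.g. in the Auth controller's index() method
    $token = $this->input->get('oauth_token', TRUE);  // TRUE = run XSS cleaning

    if ($token !== FALSE)
    {
        // exchange the request token for an access token here
    }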

I used a hacked index.php to recognise users coming back from Twitter, check for valid and safe values, and then redirect them to a CodeIgniter-friendly URL.
It may not be to everyone's taste, but I preferred it over allowing query strings throughout the entire application when only one particular circumstance needs them.
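The hack boils down to something like this at the very top of index.php, before CodeIgniter bootstraps — a sketch, where the /auth/callback/ target route and the token pattern are assumptions to adapt to your app:

    // index.php, before any CodeIgniter code runs
    if (isset($_GET['oauth_token'])
        && preg_match('/^[A-Za-z0-9_-]+$/', $_GET['oauth_token']))
    {
        // Re-issue the request as a clean segment-based URL
        // (/auth/callback/ is a hypothetical route).
        header('Location: /auth/callback/' . $_GET['oauth_token'], TRUE, 302);
        exit;
    }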

Related

URL structure for multilingual websites

I'm developing an SPA web app that will support various languages. It is built with AngularJS, and I am using angular-translate to provide i18n.
But I am struggling a little with how the URL structure should look. I do not plan on using either gTLDs or ccTLDs, so that leaves me with three options:
Use query params: ?locale=en-us
Use url paths: /en-us/page
Store the chosen locale in localStorage or a cookie
The first option is a no-go according to Google's guidelines for web-app SEO. So that leaves me with the last two options.
I have a hard time deciding which is more beneficial, though I am inclined to believe that using URL paths would be more crawler-friendly.
P.S.: Not sure if this is the best place to ask such a question either.
The second option is your safest bet: according to https://webmasters.stackexchange.com/questions/59652/what-happens-if-i-try-to-set-a-cookie-on-a-bot, cookies set on bots are ignored. You can test this yourself by fetching your website in the Google Search Console.
As of now, most crawlers ignore cookies and DO NOT execute JavaScript. This means they usually just download the HTML and make their judgements from there.
Some developers get around the no-JavaScript problem by pre-rendering parts of their content. I haven't done it personally, but you might want to check out https://prerender.io/
Edit: as rolandjitsu mentioned, Google now crawls and executes JavaScript content.
You should go with the second option: provide the language tag (and, optionally, region subtags) in the URL path as the first segment.
For the simple reason that it allows you, your visitors, and bots to link to specific translations.

How to restrict access to non-SEF URLs in Joomla 3?

I've converted all my URLs to SEF (SEO-friendly) URLs.
But I want to restrict access to my non-SEF URLs.
For example, you can currently reach www.example.com/article-1 via http://www.example.com/index.php?option=com_content&view=article&id=76&Itemid=113. I don't want this; I want the article to be reachable only at http://www.example.com/article-1
I hope I have explained clearly what I need.
I don't think it's possible, for the simple reason that Joomla always uses the non-SEF links internally. That's why they always work.
There are also links which are never converted to SEF links, because the user will not see them and Google will not index them, such as links used by AJAX scripts and similar things.
If you block non-SEF URLs in your .htaccess file, expect your page to break sooner rather than later. Don't blame the extension developer then :-)

How to create a URL shortener in Ruby

I am developing a web application in which I have implemented Facebook and Twitter connectivity. I want to shorten the URL when a user posts to Facebook or Twitter from my application.
For example, if the URL is http://www.MyDomain.com/user/234545 then it should become something like http://M.D/n2b
How can I do that? Please help, and also give me more info about how a URL shortener actually works and how to implement one in Rails.
For starters, you would need to purchase the "M.D" domain, and I don't even think that exists. So your next option would be to use a subdomain of your "MyDomain.com", like "short.MyDomain.com" and stick a Rails app there that could map your shorter URLs. Ironically, the URL would be nearly as long.
It wouldn't surprise me if some of the URL shorteners already out there have some kind of HTTP API. If so, you're probably better off using them.
Use bit.ly or TinyURL, or create your own method for shortening URLs. Note that with your own method you cannot change the domain name; to shorten the domain itself you have to use some sort of API.
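As for how a shortener actually works: you store the long URL in a database row and encode the row's auto-increment ID in base 62 (0-9, a-z, A-Z) to get the short code; resolving a code means decoding it back to the ID, looking up the row, and answering with a 301 redirect. A language-agnostic sketch of the encode/decode step (shown in PHP; porting it to Ruby is a few lines):

    // Map a numeric database ID to a short base-62 code and back.
    function encode_base62($n)
    {
        $chars = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ';
        $code = '';
        do {
            $code = $chars[$n % 62] . $code;
            $n = (int) ($n / 62);
        } while ($n > 0);
        return $code;
    }

    function decode_base62($code)
    {
        $chars = '0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ';
        $n = 0;
        foreach (str_split($code) as $c) {
            $n = $n * 62 + strpos($chars, $c);
        }
        return $n;
    }

    echo encode_base62(234545);   // "Z0Z" -- the ID from the question's example URL
    echo decode_base62('Z0Z');    // 234545

The short domain (M.D in the question) is then just a second domain whose only job is to route every request through this lookup and redirect; as the first answer notes, you would still have to own it.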

What is the advantage of putting the language indicator into the URL?

I'm building a site which supports multiple languages. At the moment I'm putting /en/ in the URL path and using .htaccess to determine which language the user is on. This is very common for multilingual sites: either http://en.example.com or http://example.com/en/.
My question is: Why is it so common to show in the URL which language the user is viewing? I can't see any technical advantages. Is it for optimizing user experience?
After all, you could easily just use sessions/cookies and hide it from the user, which is what I'm leaning towards at the moment.
Thanks in advance :)
For easy bookmarking probably.
Specifying the language in the URL is one way to indicate that you want to view that particular language, ignoring your current locale.
Wrapping this information in the URL is better than using a cookie, for example, because some users delete all cookies after each browsing session.
And because of this pseudo-REST-like URL, /en/, it is easily bookmarkable and search-engine friendly.
I think it's used as a substitute for not owning the domain within each TLD (i.e. company.co.uk and company.com).
It's also useful because the URI itself can be localised: ikea.com/se/stolar could be the localised variant of ikea.com/en/chairs, which helps both the end user and SEO.
It is usually not a real directory but mod_rewrite; a URL such as:
http://google.pl/en
gets rewritten server-side to:
http://google.pl?lang=en
and handling it this way scales neatly to every language.
Why? Because if a visitor saves a link to our page in his favourites and sends it to a friend, he also passes along the language of the page he was viewing. If the default language was, for example, Polish, and he had switched it to English, he saves his friend the time of searching for and clicking the language switch.
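To make that concrete, the PHP behind such a rewritten URL only has to read the query parameter; a sketch, where the language list and the load_translations() helper are hypothetical:

    // mod_rewrite has already turned /en into ?lang=en
    $supported = array('pl', 'en', 'de');    // hypothetical language list
    $lang = (isset($_GET['lang']) && in_array($_GET['lang'], $supported, TRUE))
        ? $_GET['lang']
        : 'pl';                              // Polish default, as in the example above
    load_translations($lang);                // hypothetical loader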
If you put it in the URL, the search engines will index every page in every language. If you use cookies, they will only index one. So I think it's more of an SEO advantage.

How to get (scrape) the contents of a site that requires logging in through YQL?

Is it possible to get (scrape) data from a site that requires logging in using YQL? If so, please describe the procedure.
You'll need the user to authorize your access via OAuth, as YQL's docs mention. In addition to the docs pointed to by links from the URL I just mentioned, you can learn all about OAuth here, then get libraries to help you use OAuth, depending of course on the programming language you want to use, from the links listed here.
Depending on how the remote site is set up, you could use a simple POST (there is an open data table for that [1]) or you could create your own small, custom data table and use <execute> [2] to send whatever headers (including Cookie:) you need over one or more GET/POST requests.
[1] htmlpost data table (example)
[2] YQL Execute
