Youtube gdata and search term having "&" - ruby-on-rails

I have the following code that lets this program search the YouTube GData API.
class Youtube
  def search_url(term)
    url = "https://gdata.youtube.com/feeds/api/videos"
    url += "?q=#{term}&alt=json&restriction=US&max-results=50&orderby=viewCount"
    url += "&fields=entry(id,title,yt:noembed,media:group(media:description),author(name),yt:statistics(@viewCount))"
    url += "&key=#{DEV_KEY}"
    url
  end
end
However, when I tested this program, it fails when the search term contains "&", as in the name of the popular duo "Macklemore & Ryan Lewis".
"&" might not be the cause of the failure, but I suspect it is. If you think "&" is not the cause, what do you think is? If it is the cause, how can I fix it?

You need to escape the term before sending it as a URL parameter:
require 'cgi'

def search_url(term)
  term = CGI.escape(term)
  url = "https://gdata.youtube.com/feeds/api/videos"
  url += "?q=#{term}&alt=json&restriction=US&max-results=50&orderby=viewCount"
  url += "&fields=entry(id,title,yt:noembed,media:group(media:description),author(name),yt:statistics(@viewCount))"
  url += "&key=#{DEV_KEY}"
  url
end
Escaping with CGI.escape results in a URL-safe parameter:
CGI.escape('Macklemore & Ryan Lewis')
# => "Macklemore+%26+Ryan+Lewis"

Related

Handling & in Youtube API query

I first tried
class Youtube
  def search_url(term)
    url = "https://gdata.youtube.com/feeds/api/videos"
    url += "?q=#{term}&alt=json&restriction=US&max-results=50&orderby=viewCount"
    url += "&fields=entry(id,title,yt:noembed,media:group(media:description),author(name),yt:statistics(@viewCount))"
    url += "&key=#{DEV_KEY}"
    url
  end
end
But it wasn't able to handle queries like "Macklemore & Ryan Lewis".
Then someone suggested
require 'cgi'

class Youtube
  def search_url(term)
    term = CGI.escape(term)
    url = "https://gdata.youtube.com/feeds/api/videos"
    url += "?q=#{term}&alt=json&restriction=US&max-results=50&orderby=viewCount"
    url += "&fields=entry(id,title,yt:noembed,media:group(media:description),author(name),yt:statistics(@viewCount))"
    url += "&key=#{DEV_KEY}"
    url
  end
end
But it didn't solve the problem; it just made queries that had worked before stop working.
Opening these two links in Firefox (other browsers don't seem to be able to open them)
https://gdata.youtube.com/feeds/api/videos?q=Macklemore+%26+Ryan+Lewis&restriction=US&max-results=50&orderby=viewCount
https://gdata.youtube.com/feeds/api/videos?q=Macklemore & Ryan Lewis&restriction=US&max-results=50&orderby=viewCount
tells me that "&" in the query really does break the request.
How can this be really solved?
You can't use & in q= without escaping it; escaping & as %26 in the URL is correct.
For comparison, when I search on youtube.com for the single term &, my browser shows the URL: http://www.youtube.com/results?search_query=%26
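One related pitfall worth noting (a sketch, not a diagnosis of the code above): escape only the term, never the already-assembled query string, because that would also encode the & and = characters that separate the parameters:

require 'cgi'

term = 'Macklemore & Ryan Lewis'

# Escape only the value; the separators stay intact.
good = "q=#{CGI.escape(term)}&alt=json"
# => "q=Macklemore+%26+Ryan+Lewis&alt=json"

# Escaping the assembled string encodes the separators too and breaks the URL.
bad = CGI.escape("q=#{term}&alt=json")
# => "q%3DMacklemore+%26+Ryan+Lewis%26alt%3Djson"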

String.Replace with a URL as parameter

Below I have this code:
string _strTemplate = _strDownloadTemplate + IDReq + "/" + _strFileName;
Uri url = new Uri(_strTemplate);
As you can see, I'm converting _strTemplate (which carries the link of a page that I need to send by email to the user) to a Uri. My email body has several fields that I'm replacing with the correct values:
strMailMessage = strMailMessage.Replace("_LinkTemplate", url);
I'm getting an error because the String.Replace method takes only strings as parameters.
Is there a way to get around this?
I was thinking about passing the URL value through my page (page.aspx), but if there's a way to do it through this method, that would be better for me.
Thanks!
Assuming this is C# and .NET, yes, String.Replace() works with strings.
Did you try:
strMailMessage = strMailMessage.Replace("_LinkTemplate", url.ToString());

URL Encode - oauth_signature

I have successfully set up OAuth authentication to access my Dropbox using SharpBox. SharpBox is an open source "front end" that handles the nuts and bolts of the process. Using it, I can return file info for a particular folder in my account.
I bind the filename and a generated URI to a GridView in a VS 2010 web app. I have a hyperlink with the text set to the name and the DataNavigateUrlFields set to the unique URL. It works great IF there is no "+" character in the oauth_signature part of the URL string. If the plus is there, it returns "{"error": "Invalid signature. Expected signature base string:"
Thanks for your consideration.
Thank you for your help. Here is my code:
Public Sub MakeURL()
dbOpen()
Dim myfolder As ICloudDirectoryEntry = dropBoxStorage.GetFolder("/DIR/SUBDIR/")
Filename = Filename & "_POID_" & poid & ".pdf"
pdfurl = dropBoxStorage.GetFileSystemObjectUrl(Filename, myfolder).ToString
dbClose()
pdfurl = pdfurl.Replace("+", "%2B")
Response.Redirect(pdfurl)
End Sub
OAuth 1 signatures use percent encoding (see RFC 5849). The specification clearly states that a space should not be encoded as +; instead it should be encoded as %20. Replace your + with %20.
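The distinction here is between form encoding, which maps a space to +, and percent encoding, which maps it to %20. The question's code is VB.NET, but the rule is language-independent; a minimal Ruby sketch of the difference:

require 'cgi'
require 'erb'

CGI.escape('a b')            # form encoding:    "a+b"
ERB::Util.url_encode('a b')  # percent encoding: "a%20b"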

JSON API request app (rails), rendering results

I need to develop a small Rails app that makes a request to a JSON API, entering the parameters in an initial form, checking whether we get a real response, and then rendering the results in a view (html.erb).
Do you know where I can get good material on these steps? Any help is welcome.
I'm reading through a similar example:
require 'net/http'
require 'digest/sha1'

params_string = "whatever"
params_string_with_api_key = params_string + "&" + @@API_KEY
hashkey = Digest::SHA1.hexdigest(params_string_with_api_key)
params_string += "&hashkey=#{hashkey}"
res = Net::HTTP.get_response("api.somecompany.com", "/some/url.json?#{params_string}")
res_sha1 = Digest::SHA1.hexdigest(res.body + @@API_KEY)
@verified = res["X-Somecompany-Response-Signature"] == res_sha1
parsed_json = ActiveSupport::JSON.decode(res.body)
@results = parsed_json["results"]
Is it always necessary to encode the parameters string when you make the Net::HTTP request? Is there another way?
What exactly does params_string += "&hashkey=#{hashkey}" do?
Thank you!
What exactly does params_string += "&hashkey=#{hashkey}" do?
params_string is a string that looks like ?param1=val&param2=val2.... Your last piece of code just appends another parameter to the string. If your issue is with the #{} fragment: in a Ruby double-quoted string, this syntax interpolates the value of a variable.
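A tiny illustration of both points, with made-up values:

params_string = "?param1=val1&param2=val2"
hashkey = "abc123"
params_string += "&hashkey=#{hashkey}"  # #{} interpolates hashkey into the string
# => "?param1=val1&param2=val2&hashkey=abc123"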
Is it always necessary to encode the parameters string when you make the Net::HTTP request? Is there another way?
I don't see the parameters string being encoded here. All I see is a check on the results, done by comparing a response header with the SHA1 of the response body.
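If you do need to encode parameter values (for instance when they contain & or spaces), the Ruby standard library can build the query string for you; a small sketch with made-up parameters:

require 'uri'

params_string = URI.encode_www_form("q" => "Macklemore & Ryan Lewis", "page" => "2")
# => "q=Macklemore+%26+Ryan+Lewis&page=2"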
Not really related to your questions: I moved away from Net::HTTP a while back after having trouble with segfaults. I now use Typhoeus for all requests over the network.
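For what it's worth, a minimal Typhoeus request against a made-up endpoint looks roughly like this:

require 'typhoeus'

response = Typhoeus.get("https://api.somecompany.com/some/url.json",
                        params: { q: "whatever" })
puts response.code
puts response.body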

How would you parse a url in Ruby to get the main domain?

I want to be able to parse any URL with Ruby to get the main part of the domain without the www (just the example.com)
Please note that there is no algorithmic method of finding the highest level at which a domain may be registered for a particular top-level domain (the policies differ with each registry); the only method is to create a list of all top-level domains and the levels at which domains can be registered.
This is the reason why the Public Suffix List exists.
I'm the author of PublicSuffix, a Ruby library that decomposes a domain into the different parts.
Here's an example
require 'uri/http'
require 'public_suffix'
uri = URI.parse("http://toolbar.google.com")
domain = PublicSuffix.parse(uri.host)
# => "toolbar.google.com"
domain.domain
# => "google.com"
uri = URI.parse("http://www.google.co.uk")
domain = PublicSuffix.parse(uri.host)
# => "www.google.co.uk"
domain.domain
# => "google.co.uk"
This should work with pretty much any URL:
# URL always gets parsed twice
def get_host_without_www(url)
url = "http://#{url}" if URI.parse(url).scheme.nil?
host = URI.parse(url).host.downcase
host.start_with?('www.') ? host[4..-1] : host
end
Or:
# Only parses twice if url doesn't start with a scheme
def get_host_without_www(url)
uri = URI.parse(url)
uri = URI.parse("http://#{url}") if uri.scheme.nil?
host = uri.host.downcase
host.start_with?('www.') ? host[4..-1] : host
end
You may have to require 'uri'.
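Both versions above then behave like this (example URLs only):

get_host_without_www("http://www.Example.com/path")  # => "example.com"
get_host_without_www("example.com/path")             # => "example.com"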
Just a short note: to avoid the second parse of the URL in Mischa's second example, you could do a string comparison instead of calling URI.parse.
# Only parses once
def get_host_without_www(url)
url = "http://#{url}" unless url.start_with?('http')
uri = URI.parse(url)
host = uri.host.downcase
host.start_with?('www.') ? host[4..-1] : host
end
The downside of this approach is that it limits the URL to http(s)-based URLs, which is by far the most common case. If you want to use it more generally (e.g. for FTP links), you have to adjust accordingly.
Addressable is probably the right answer in 2018, especially since it uses the PublicSuffix gem to parse domains.
However, I need to do this kind of parsing in multiple places, from various data sources, and found it a bit verbose to use repeatedly. So I created a wrapper around it, Adomain:
require 'adomain'
Adomain["https://toolbar.google.com"]
# => "toolbar.google.com"
Adomain["https://www.google.com"]
# => "google.com"
Adomain["stackoverflow.com"]
# => "stackoverflow.com"
I hope this helps others.
Here's one that works better with .co.uk and .com.fr type domains:
domain = uri.host[/[^.\s\/]+\.([a-z]{3,}|([a-z]{2}|com)\.[a-z]{2})$/]
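Applied to a parsed URI (example hosts only):

require 'uri'

uri = URI.parse("http://www.google.co.uk/search")
uri.host[/[^.\s\/]+\.([a-z]{3,}|([a-z]{2}|com)\.[a-z]{2})$/]
# => "google.co.uk"

uri = URI.parse("http://toolbar.google.com")
uri.host[/[^.\s\/]+\.([a-z]{3,}|([a-z]{2}|com)\.[a-z]{2})$/]
# => "google.com"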
If the URL is in the format http://www.google.com, then you could do something like:
a = 'http://www.google.com'
puts a.split(/\./)[1] + '.' + a.split(/\./)[2]
Or
a =~ /http:\/\/www\.(.*?)$/
puts $1
