Ruby: failed to make a successful GET request - ruby-on-rails

I am trying to send an HTTP GET request to different websites. Here is the code I am using:
def makePing
  begin
    url = URI.parse(@URI)
    req = Net::HTTP::Get.new(url.to_s)
    res = Net::HTTP.start(url.host, url.port) { |http|
      http.read_timeout = @request_timeout_limit
      http.request(req)
    }
    # debugger
  rescue Exception => echo
    puts "Error is: Failed to open TCP connection to #{@URI}"
  end
end
It returns a 200 result for 'http://www.example.com', but for http://www.google.com or http://www.facebook.com it returns
#<Net::HTTPNotFound 404 Not Found readbody=true>
1. Why does it happen like this?
2. How can I get the body of the response?
3. I expect the request to expire exactly after @request_timeout_limit and stop trying, but it is not working that way?
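For points 2 and 3, here is a minimal sketch, assuming @URI and @request_timeout_limit are instance variables set elsewhere as in the question: the body is available on res.body, and Net::HTTP.start accepts open_timeout and read_timeout options so the request gives up after that many seconds. Passing url.request_uri (path plus query) to Net::HTTP::Get.new instead of the full URL string is also worth trying for the 404s, since some servers reject a full URL as the request target.

require 'net/http'
require 'uri'

# Sketch only: @URI and @request_timeout_limit are assumed to be set elsewhere.
def make_ping
  url = URI.parse(@URI)
  req = Net::HTTP::Get.new(url.request_uri)     # path (+ query) rather than the full URL string
  res = Net::HTTP.start(url.host, url.port,
                        open_timeout: @request_timeout_limit,
                        read_timeout: @request_timeout_limit) do |http|
    http.request(req)
  end
  puts res.code   # e.g. "200"
  puts res.body   # the response body asked about in question 2
rescue Net::OpenTimeout, Net::ReadTimeout => e
  puts "Error is: request to #{@URI} timed out (#{e.class})"
rescue SystemCallError, SocketError => e
  puts "Error is: failed to open TCP connection to #{@URI} (#{e.message})"
end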

Related

How do you retry an HTTP request to an API if the response code is other than "200 OK"?

I'm making requests to an external API that 9 out of 10 times returns a JSON string that I use to create records in my own App.
That one time though, it will return an "Internal Server Error" (code 500), crashing the rest of my App as it tries to parse a nil JSON String.
How can I retry the external API call if the response.code is other than "200 OK" ?
External API @connector (HTTP request):
def fetch_client(client_identification)
  url = URI("#{BASE_URL}/clients/#{client_identification}")
  https = Net::HTTP.new(url.host, url.port)
  https.use_ssl = true
  request = Net::HTTP::Get.new(url)
  request["Authorization"] = "Bearer 1234567890qwertyuiopasdfghjklzxcvbnm"
  response = https.request(request)
end
My Adapter that does something with the response (it fails as it tries to parse an empty string)
def get_client(client_identification)
  response = @connector.fetch_client(client_identification)
  # how to retry if response.code != "200" ?
  JSON.parse(response.body) # crash
end
I've tried something like this, but my code fails with an "Invalid retry" syntax error:
retry if response.code != "200"
First, you are missing a lot of error handling, such as Net::ReadTimeout, Errno::ECONNRESET, Errno::ECONNABORTED, Errno::EPIPE, OpenSSL::SSL::SSLError, Timeout::Error, and probably others. For those, Net::HTTP has max_retries, but you'll still need to rescue if the response is not OK after max_retries is reached.
In the simplest case you could call get_client again (recursion), BUT you need to set a retry limit so you don't overflow the stack. Or, if you only want to retry once and don't care about DRY:
def get_client(client_identification)
  response = @connector.fetch_client(client_identification)
  if response.code != "200"
    response = @connector.fetch_client(client_identification)
  end
  response.code != "200" ? {} : JSON.parse(response.body)
end
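If you do want to use the retry keyword, note that it is only valid inside a rescue clause, which is why the bare retry if ... raises "Invalid retry". A sketch of a bounded retry along those lines (MAX_ATTEMPTS and BadResponse are made-up names for illustration; @connector as above):

MAX_ATTEMPTS = 3                        # hypothetical retry limit
BadResponse  = Class.new(StandardError) # hypothetical error for non-200 responses

def get_client(client_identification)
  attempts = 0
  begin
    attempts += 1
    response = @connector.fetch_client(client_identification)
    raise BadResponse, response.code if response.code != "200"
    JSON.parse(response.body)
  rescue BadResponse, Net::ReadTimeout, Errno::ECONNRESET, OpenSSL::SSL::SSLError
    retry if attempts < MAX_ATTEMPTS    # retry is legal here because it sits inside a rescue clause
    {}                                  # give up and return an empty hash
  end
end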

Net::HTTP on Rails: any way to increase timeout

I am making a Net::HTTP call to a third-party payment API using the post_form method from Net::HTTP in Rails.
res = Net::HTTP.post_form(uri, payment_details)
logger.info "RESPONSE: #{res}"
descrypted_res = JSON.parse(res.body)
logger.info "RESPONSE: #{descrypted_res}"
Sometimes I get the response back saying HTTPOK; sometimes I don't get anything at all. I suspect the payment server did not respond to my request in time and the connection was terminated. My question is: is there any way to increase the post_form timeout duration?
I want to increase it to 60 seconds.
Thanks.
You can adapt the Ruby method:
https://docs.ruby-lang.org/en/2.0.0/Net/HTTP.html#method-c-post_form
like this:
def _post_form(url, params)
  req = Net::HTTP::Post.new(url)
  req.form_data = params
  req.basic_auth url.user, url.password if url.user
  Net::HTTP.start(
    url.hostname,
    url.port,
    use_ssl: url.scheme == 'https',
    read_timeout: 600
  ) do |http|
    http.request(req)
  end
end
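A call site would then mirror the original post_form usage, for example (a sketch; uri and payment_details as in the question, and read_timeout above can be lowered to 60 to match the 60-second requirement):

# Sketch: same flow as the original, but through the adapted helper.
res = _post_form(uri, payment_details)
logger.info "RESPONSE: #{res}"
decrypted_res = JSON.parse(res.body)
logger.info "RESPONSE: #{decrypted_res}"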

Bad Request trying to call service with Digest Auth from Ruby

I'm trying to call a service with Digest Auth from a rails application and it always returns a 400 bad request error.
I've used the net-http-digest_auth gem to create the headers, but I think I've missed something.
def get_digest(url)
  uri = URI.parse(url)
  http = Net::HTTP.new uri.host, uri.port
  http.use_ssl = true
  http.verify_mode = OpenSSL::SSL::VERIFY_PEER
  req = Net::HTTP::Get.new(uri.request_uri)
  # First call to get the 401 and auth headers
  digest_response = http.request(req)
  digest_auth_request = Net::HTTP::DigestAuth.new
  uri.user = digest_auth[:user]
  uri.password = digest_auth[:password]
  auth = digest_auth_request.auth_header uri, digest_response['www-authenticate'], 'GET', true
  req.add_field 'Authorization', auth
  response = http.request(req)
  # Response is always #<Net::HTTPBadRequest 400 Bad Request readbody=true>
  if response.code.to_i == 200
    response_body = response.body
  else
    error
  end
  response_body
end
The request's headers look like this:
Digest username=\"myuser#mydomain.com\", realm=\"Digest\", algorithm=MD5-sess, qop=\"auth\", uri=\"/path/WS/my%20user/path/path/path/path/service.svc\", nonce=\"+Upgraded+v1e3f88bce1c32bd15avn421e440ca6622ebadd4522f7ed201fab1421c39d8fd15b771b972c9eb59894f8879307b9e6a5544476bc05cc7885a\", nc=00000000, cnonce=\"d42e6ea8a37aadsasdbea1231232456709\", response=\"7fbfc75cc3aasdasd342230ebf57ac37df\""
I can't figure out what's happening. Is there any other gem that makes this easier?
Finally found the problem by comparing the browser header vs the Ruby header.
I wasn't calculating "nc" (the request counter) correctly. After adding +1 it started to return a 401 error (now I have a different problem ;)).
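For reference, a minimal sketch of the two-request flow with net-http-digest_auth, roughly following the gem's documented usage (placeholder URL and credentials; it builds a fresh Get for the authenticated call instead of reusing the first request object):

require 'uri'
require 'net/http'
require 'net/http/digest_auth'

uri = URI.parse('https://example.com/service.svc')  # placeholder URL
uri.user = 'myuser@mydomain.com'                     # placeholder credentials
uri.password = 'secret'

http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true

# First request: expect a 401 carrying the WWW-Authenticate challenge.
challenge = http.request(Net::HTTP::Get.new(uri.request_uri))

# Build the Authorization header from the challenge and send a fresh request.
digest_auth = Net::HTTP::DigestAuth.new
auth = digest_auth.auth_header(uri, challenge['www-authenticate'], 'GET')
req = Net::HTTP::Get.new(uri.request_uri)
req.add_field('Authorization', auth)
response = http.request(req)

puts response.code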

Ruby Net::HTTP::Get weird behavior with JSON

I am using Net::HTTP to send a GET request to an API server to fetch some data based on some configurations, as below:
# method to fetch shirt sample details from api server
def fetch_shirt_sample(shirt, query_params)
  path = "http://myapiserver.com/shirts/#{shirt.id}/shirt_sample"
  url = URI.parse(path)
  req = Net::HTTP::Get.new(url.path + '?' + query_params)
  res = Net::HTTP.start(url.host, url.port) { |http| http.request(req) }
  JSON.parse(res.body)
end
What I found weird is that the above method works for the following query params:
"&name=medium_shirt&color=red&configs=[\"full_length\",\"no_collar\",\"casual\"]"
but doesn't even send the GET request for the following query params:
"&name=medium_shirt&color=red&configs=[\"full_length\",\"15\",\"43\",\"30\"]"
Could anyone help me to understand what is wrong in the above setup?
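A likely culprit is that the unescaped brackets and quotes in the second query string make an invalid request target. Here is a sketch of the same method with the query built from a hash via URI.encode_www_form (the shape of the params hash is an assumption):

require 'json'
require 'net/http'
require 'uri'

# Sketch: same endpoint, but the query is assembled and percent-escaped here.
def fetch_shirt_sample(shirt, params)
  url = URI.parse("http://myapiserver.com/shirts/#{shirt.id}/shirt_sample")
  query = URI.encode_www_form(
    name:    params[:name],                  # e.g. 'medium_shirt'
    color:   params[:color],                 # e.g. 'red'
    configs: Array(params[:configs]).to_json # e.g. ["full_length", "15", "43", "30"]
  )
  req = Net::HTTP::Get.new("#{url.path}?#{query}")
  res = Net::HTTP.start(url.host, url.port) { |http| http.request(req) }
  JSON.parse(res.body)
end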

How to specify a read timeout for a Net::HTTP::Post.new request in Ruby 2

I have a POST happening to a Rails application from a Ruby script. The script creates a variable request as
request = Net::HTTP::Post.new(url.path)
which is then used as follows
request.content_type = "application/json"
request.body = JSON.generate( params )
response = Net::HTTP.start(url.host, url.port) {|http| http.request(request)}
There is quite a lot of processing happening on the server side, and I'm getting a Net::ReadTimeout error.
I tried to specify a timeout period
request.read_timeout = 500
as per this Stack Overflow answer, but I got an
undefined method `read_timeout=' for #<Net::HTTP::Post POST> (NoMethodError)
error. I assume I'm missing something simple somewhere. All clues gratefully received.
Technical info:
Ruby 2.0.0p247
Rails 4.0.0
Windows 7 32 bit ruby
Solved via this Stack Overflow answer.
I've changed my
response = Net::HTTP.start(url.host, url.port) {|http| http.request(request)}
line to be
response = Net::HTTP.start(url.host, url.port, :read_timeout => 500) {|http| http.request(request)}
and this seems to have got around this problem.
The read_timeout is available with a plain Net::HTTP object:
url = URI.parse('http://google.com')
http = Net::HTTP.new(url.host, url.port)
http.read_timeout = 5 # seconds
http.request_post(url.path, JSON.generate(params)) do |response|
  # do something with response
  p response
end
One thing to keep in mind is that if read_timeout is set to a small value such that a timeout does occur, Net::HTTP will "helpfully" retry the request. For a slow HTTP server, a timeout error may not be raised to the code calling Net::HTTP until 2x the read_timeout value.
This certainly was not the behavior I expected.
More info on this topic and how possible solutions differ for Ruby < 2.5 and >= 2.5 may be found here:
https://stackoverflow.com/a/59186209/5299483
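On Ruby 2.5 and later, one way to opt out of that transparent retry is max_retries; a sketch, assuming url is a parsed URI as in the answer above:

http = Net::HTTP.new(url.host, url.port)
http.read_timeout = 5
http.max_retries = 0   # Ruby >= 2.5: do not transparently retry idempotent requests after a timeout

begin
  response = http.request(Net::HTTP::Get.new(url.request_uri))
rescue Net::ReadTimeout
  # the timeout now surfaces after roughly read_timeout seconds instead of ~2x
end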
I catch both OpenTimeout and ReadTimeout and it works. Tested in Ruby 2.6.5:
def ping(host, port)
  begin
    url = URI.parse("http://#{host}:#{port}/ping")
    req = Net::HTTP::Get.new(url.to_s)
    # setting both OpenTimeout and ReadTimeout
    res = Net::HTTP.start(url.host, url.port, :open_timeout => 3, :read_timeout => 3) { |http|
      http.request(req)
    }
    if JSON.parse(res.body)["ok"]
      # return true
      STDERR.puts "#{host}:#{port} is reachable"
    else
      STDERR.puts "#{host}:#{port} is NOT reachable"
    end
  rescue Net::ReadTimeout => exception
    STDERR.puts "#{host}:#{port} is NOT reachable (ReadTimeout)"
  rescue Net::OpenTimeout => exception
    STDERR.puts "#{host}:#{port} is NOT reachable (OpenTimeout)"
  end
end

ping("#{ENV['FIRST_HOST']}", 2345)
ping("#{ENV['SECOND_HOST']}", 2345)
If anyone is still facing the timeout setting issue and Net::HTTP's timeout is not working as expected, you may follow the approach below as well:
begin
  Timeout::timeout(10) {
    ####
    ## YOUR REQUEST CODE WILL BE HERE
    ####
  }
rescue
  408
end
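As a concrete illustration, the hard deadline could wrap a Net::HTTP call like this (a sketch; the 10-second limit and the 408 fallback follow the snippet above, while the method name and arguments are made up):

require 'timeout'
require 'net/http'
require 'json'

# Sketch: hard 10-second cap around the whole request, returning 408 on timeout.
def post_with_deadline(url, params)
  Timeout.timeout(10) do
    Net::HTTP.start(url.host, url.port) do |http|
      request = Net::HTTP::Post.new(url.path)
      request.content_type = "application/json"
      request.body = JSON.generate(params)
      http.request(request)
    end
  end
rescue Timeout::Error
  408
end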
