I have coded a Rake file to monitor and fetch data from a website that exposes this data in JSON format. The following is the actual source of this data:
https://www.thegazette.co.uk/company/07877158/filings/data.json
The Rake task monitors the "total_count" in the above JSON, and when it changes the task will fetch and save any new information.
The issue I have is that after the first time it monitors that page, it simply doesn't update. As a real-world current example, the above JSON source was updated overnight with two new records, and consequently the "total_count" increased from 40 to 42, but my Rake task is still telling me there are 40 (and subsequently doing nothing, because it thinks nothing has changed).
I think it is a cache issue, but I have cleared my Rails cache with no success. It is strange because I don't have this issue with other similar Rake tasks I have created for other sites.
My Rake code is as follows:
desc "Monitor"
task :S_01 => :environment do
require 'rubygems'
require 'open-uri'
require 'openssl'
def g_api(url)
uri = URI.parse(url)
request = Net::HTTP::Get.new(uri)
request.content_type = "application/json"
req_options = {
use_ssl: uri.scheme == "https",
}
response = Net::HTTP.start(uri.hostname, uri.port, req_options) do |http|
http.request(request)
end
data = JSON.parse(response.body)
end
  company = CompanyBorrower.where(id: 43)
  company.each do |f|
    begin
      # scrape source
      tg_fh_url = "https://www.thegazette.co.uk/company/" + f.ch + "/filings/data.json"
      gf_scrape = g_api(tg_fh_url)
      ch_s = gf_scrape.fetch('total_count', nil) # scrape
      puts ch_s
      if not f.filing_count == ch_s # has the count changed? if not, skip
        f.update_attributes(cwdetail1: ch_s, filing_update: ch_fh3)
        gf_scrape['items'].first(3).each_with_index do |f1, index|
          # fetch & save data here
        end
      end
    rescue
      next # note: a bare rescue silently swallows any error raised above
    end
  end
end
EDIT
Added the following to the code, but now I get an error:
response["Cache-Control: no-cache"]
NoMethodError: undefined method `fetch' for nil:NilClass
def g_api(url)
  uri = URI.parse(url)
  request = Net::HTTP::Get.new(uri)
  request.content_type = "application/json"
  req_options = {
    use_ssl: uri.scheme == "https",
  }
  response = Net::HTTP.start(uri.hostname, uri.port, req_options) do |http|
    http.request(request)
  end
  data = JSON.parse(response.body)
  response["Cache-Control: no-cache"]
end
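For what it's worth, response["Cache-Control: no-cache"] looks up a response header that doesn't exist and, being the last expression in the method, makes g_api return nil, which appears to be why fetch then raises the NoMethodError. If the intent is to ask intermediate caches not to serve a stale copy, the header would go on the request instead, and the parsed JSON needs to stay as the return value. A minimal sketch (whether the upstream cache actually honours these headers is not guaranteed):

require 'net/http'
require 'json'

def g_api(url)
  uri = URI.parse(url)
  request = Net::HTTP::Get.new(uri)
  request.content_type = "application/json"
  request["Cache-Control"] = "no-cache"   # request header: ask caches not to serve a stored copy
  request["Pragma"] = "no-cache"          # HTTP/1.0 equivalent for older proxies
  response = Net::HTTP.start(uri.hostname, uri.port, use_ssl: uri.scheme == "https") do |http|
    http.request(request)
  end
  JSON.parse(response.body)               # keep the parsed JSON as the method's return value
end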
Related
Hoping for some help as this one has me baffled...
I created a user account and API credentials at FTX.com.
They have an interesting auth setup, which is detailed here: https://docs.ftx.com/?python#authentication
They only provide code examples for Python, JavaScript and C#, but I need to implement the integration in a Ruby on Rails app.
Here's a link which also provides an example for both GET and POST calls: https://blog.ftx.com/blog/api-authentication/
I'm using:
ruby '3.0.1'
gem 'rails', '~> 6.1.4', '>= 6.1.4.1'
also,
require 'uri'
require 'net/https'
require 'net/http'
require 'json'
I got the authentication working for GET calls as follows:
def get_market
  get_market_url = 'https://ftx.com/api/markets/BTC-PERP/orderbook?depth=20'
  api_get_call(get_market_url)
end

def api_get_call(url)
  ts = (Time.now.to_f * 1000).to_i
  signature_payload = "#{ts}GET/api/markets"
  key = ENV['FTX_API_SECRET']
  data = signature_payload
  digest = OpenSSL::Digest.new('sha256')
  signature = OpenSSL::HMAC.hexdigest(digest, key, data)
  headers = {
    'FTX-KEY': ENV['FTX_API_KEY'],
    'FTX-SIGN': signature,
    'FTX-TS': ts.to_s
  }
  uri = URI.parse(url)
  http = Net::HTTP.new(uri.host, uri.port)
  http.read_timeout = 1200
  http.use_ssl = true
  rsp = http.get(uri, headers)
  JSON.parse(rsp.body)
end
This works great and I get the correct response:
=>
{"success"=>true,
 "result"=>
  {"bids"=>
    [[64326.0, 2.0309],
     ...
     [64303.0, 3.1067]],
   "asks"=>
    [[64327.0, 4.647],
     ...
     [64352.0, 0.01]]}}
However, I can't seem to authenticate correctly for POST calls (even though as far as I can tell I am following the instructions correctly). I use the following:
def create_subaccount
  create_subaccount_url = 'https://ftx.com/api/subaccounts'
  call_body = {
    "nickname": "sub2",
  }.to_json
  api_post_call(create_subaccount_url, call_body)
end

def api_post_call(url, body)
  ts = (Time.now.to_f * 1000).to_i
  signature_payload = "#{ts}POST/api/subaccounts#{body}"
  key = ENV['FTX_API_SECRET']
  data = signature_payload
  digest = OpenSSL::Digest.new('sha256')
  signature = OpenSSL::HMAC.hexdigest(digest, key, data)
  headers = {
    'FTX-KEY': ENV['FTX_API_KEY'],
    'FTX-SIGN': signature,
    'FTX-TS': ts.to_s
  }
  uri = URI.parse(url)
  http = Net::HTTP.new(uri.host, uri.port)
  http.read_timeout = 1200
  http.use_ssl = true
  request = Net::HTTP::Post.new(uri, headers)
  request.body = body
  response = http.request(request)
  JSON.parse(response.body)
end
I also tried passing the headers via request[] directly:
def api_post_call(url, body)
  ts = (Time.now.to_f * 1000).to_i
  signature_payload = "#{ts}POST/api/subaccounts#{body}"
  key = ENV['FTX_API_SECRET']
  data = signature_payload
  digest = OpenSSL::Digest.new('sha256')
  signature = OpenSSL::HMAC.hexdigest(digest, key, data)
  uri = URI.parse(url)
  http = Net::HTTP.new(uri.host, uri.port)
  http.read_timeout = 1200
  http.use_ssl = true
  request = Net::HTTP::Post.new(uri)
  request['FTX-KEY'] = ENV['FTX_API_KEY']
  request['FTX-SIGN'] = signature
  request['FTX-TS'] = ts.to_s
  request.body = body
  response = http.request(request)
  JSON.parse(response.body)
end
This is the error response:
=> {"success"=>false, "error"=>"Not logged in: Invalid signature"}
My feeling is that the issue is somewhere in how the body is added to signature_payload before generating the signature via HMAC here:
signature_payload = "#{ts}POST/api/subaccounts#{body}"
I think this because, if I leave out #{body} here, like so:
signature_payload = "#{ts}POST/api/subaccounts"
the response is:
=> {"success"=>false, "error"=>"Missing parameter nickname"}
I have tried several iterations of setting up the POST call method using various different net/https examples, but have had no luck.
I have also contacted FTX support but have had no response.
I would truly appreciate it if anyone has some insight into what I am doing wrong here.
Try these headers:
headers = {
  'FTX-KEY': ENV['FTX_API_KEY'],
  'FTX-SIGN': signature,
  'FTX-TS': ts.to_s,
  'Content-Type' => 'application/json',
  'Accepts' => 'application/json',
}
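For reference, a minimal sketch of how these extra headers would slot into the request[]-style call from the question (assuming the same uri, http, signature, ts and body variables as there):

request = Net::HTTP::Post.new(uri)
request['FTX-KEY'] = ENV['FTX_API_KEY']
request['FTX-SIGN'] = signature
request['FTX-TS'] = ts.to_s
request['Content-Type'] = 'application/json'
request['Accepts'] = 'application/json'
request.body = body               # must be the exact JSON string that was signed
response = http.request(request)
JSON.parse(response.body)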
Here's a working example of a class to retrieve FTX subaccounts. Modify for your own purposes. I use HTTParty.
class Balancer
  require 'uri'
  require "openssl"
  include HTTParty

  def get_ftx_subaccounts
    method = 'GET'
    path = '/subaccounts'
    url = "#{ENV['FTX_BASE_URL']}#{path}"
    return HTTParty.get(url, headers: headers(method, path, ''))
  end

  def headers(*args)
    {
      'FTX-KEY' => ENV['FTX_API_KEY'],
      'FTX-SIGN' => signature(*args),
      'FTX-TS' => ts.to_s,
      'Content-Type' => 'application/json',
      'Accepts' => 'application/json',
    }
  end

  def signature(*args)
    OpenSSL::HMAC.hexdigest(digest, ENV['FTX_API_SECRET'], signature_payload(*args))
  end

  def signature_payload(method, path, query)
    payload = [ts, method.to_s.upcase, "/api", path].compact
    if method == :post
      payload << query.to_json
    elsif method == :get
      payload << ("?" + URI.encode_www_form(query))
    end unless query.empty?
    payload.join.encode("UTF-8")
  end

  def ts
    @ts ||= (Time.now.to_f * 1000).to_i
  end

  def digest
    @digest ||= OpenSSL::Digest.new('sha256')
  end
end
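Usage would then be something along these lines (assuming FTX_BASE_URL points at https://ftx.com/api and the key/secret ENV variables are set):

subaccounts = Balancer.new.get_ftx_subaccounts
puts subaccounts.parsed_response  # HTTParty parses the JSON response for you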
I'm attempting to pull JSON data from an API, iterate through a hash of JSON data, append the new data to an empty hash, and then check the pulled data for whether the key 'next_page' has any value, breaking the loop if the value returned is nil. I'm pulling the first batch of data from the API successfully, but my loop seems to be incorrect, as no data is being inserted into the empty hash I've created.
Any ideas would be deeply appreciated. Thanks for giving my question a read!
require 'net/http'
require 'uri'
require 'json'

# imports APR User data from the zendesk api and populates the database with it.
uri = URI.parse("https://tester.zendesk.com/api/v2/users.json")
request = Net::HTTP::Get.new(uri)
request.content_type = "application/json"
request.basic_auth("cbradford@tester.com", "tester32")
req_options = {
  use_ssl: uri.scheme == "https",
}
@response = Net::HTTP.start(uri.hostname, uri.port, req_options) do |http|
  http.request(request)
end
puts @response.body
puts @response.message
puts @response.code
res = @response.body
users = res["users"]
data = {}
if users["next_page"]
  newUri = users.fetch('next_page')
  uriLoop = URI.parse(newUri)
  request = Net::HTTP::Get.new(uriLoop)
  request.content_type = "application/json"
  request.basic_auth("cbradford@tester.com", "tester32")
  req_options = {
    use_ssl: uriLoop.scheme == "https",
  }
  @responseLoop = Net::HTTP.start(uriLoop.hostname, uriLoop.port, req_options) do |http|
    http.request(request)
  end
  resLoop = JSON.parse(@responseLoop.body)
  puts resLoop
  users = resLoop["users"]
  data.concat(users)
end
puts data
puts "hash created Successfully!"
puts "hash created Successfully!"
You always omit the first fetched users. Before checking for the presence of next_page you never store the fetched users, and thus data remains empty.
See here:
users = res["users"]
data = {} ## ---data is initialised but never filled?
if users["next_page"]
Secondly, I suspect that res["users"] is an array, and since you write data.concat(users) later, data should actually also be an array.
So your code should be fixed by replacing the three lines mentioned above as follows:
data = res["users"]
if users["next_page"]
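Putting both fixes together, a rough sketch of a full pagination loop (assuming, as in the question, that each page of the Zendesk response is JSON with "users" and "next_page" keys, and that next_page is nil on the last page):

require 'net/http'
require 'uri'
require 'json'

def fetch_page(url)
  uri = URI.parse(url)
  request = Net::HTTP::Get.new(uri)
  request.content_type = "application/json"
  request.basic_auth("cbradford@tester.com", "tester32")
  response = Net::HTTP.start(uri.hostname, uri.port, use_ssl: uri.scheme == "https") do |http|
    http.request(request)
  end
  JSON.parse(response.body)
end

data = []
url = "https://tester.zendesk.com/api/v2/users.json"
while url
  page = fetch_page(url)
  data.concat(page["users"])   # accumulate this page's users
  url = page["next_page"]      # nil on the last page, which ends the loop
end

puts data.length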
I'm using Rails 4.2.7 and this code for making a Net::HTTP GET request:
req = Net::HTTP::Get.new(url)
if !headers.nil?
  headers.each do |k, v|
    req[k] = v
  end
end
res = Net::HTTP.new(uri.host, uri.port).start do |http|
  http.use_ssl = (uri.scheme == "https")
  http.request(req)
end
status = res.code
content_type = res['content-type']
content_encoding = res['content-encoding']
content = res.body
However, when I make one in which the scheme is "https", I get the following error:
Error during processing: use_ssl value changed, but session already started
/Users/davea/.rvm/rubies/ruby-2.3.0/lib/ruby/2.3.0/net/http.rb:758:in `use_ssl='
/Users/davea/Documents/workspace/myproject/app/helpers/webpage_helper.rb:118:in `block in get_content'
/Users/davea/.rvm/rubies/ruby-2.3.0/lib/ruby/2.3.0/net/http.rb:853:in `start'
How do I set https while still being able to make my GET request?
According to the docs, use_ssl must be set before starting the session.
This is my usual flow:
uri = URI 'some endpoint with encoded params'
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
headers = headers.each_with_object({}) { |(k, v), hash| hash[k] = v }
http.get(uri.request_uri, initheader = headers)
See the docs on get.
Sidenote on your
if !headers.nil?
It would be more readable if you just check for presence:
if headers.present?
Or even shorter:
if headers # would return true unless it's nil or false
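Alternatively, the question's code can be kept almost as-is by passing use_ssl to Net::HTTP.start instead of setting it inside the block. A sketch, assuming uri, url and headers are as in the question:

req = Net::HTTP::Get.new(url)
headers.each { |k, v| req[k] = v } if headers

res = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == "https") do |http|
  http.request(req)
end

status = res.code
content_type = res['content-type']
content_encoding = res['content-encoding']
content = res.body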
I'm working with the Microsoft Emotion API for processing emotions in video in a Rails app. I was able to make the call to the API to submit an operation, but now I have to query another API to get the status of the operation; once it's done, it will provide the emotions data.
My issue is that when I query the results API, the response is that my operation is not found. As in, it doesn't exist.
I first sent the below request through my controller, which worked great:
# static controller
uri = URI('https://api.projectoxford.ai/emotion/v1.0/recognizeinvideo')
uri.query = URI.encode_www_form({})
request = Net::HTTP::Post.new(uri.request_uri)
request['Ocp-Apim-Subscription-Key'] = ENV['MEA_SubscriptionKey1']
request['Content-Type'] = 'application/octet-stream'
request.body = File.read("./public/mark_zuck.mov")
response = Net::HTTP.start(uri.host, uri.port, :use_ssl => uri.scheme == 'https') do |http|
  http.request(request)
end

# Get response headers
response.each_header do |key, value|
  p "#{key} => #{value}"
end

# Get operation location and id of operation
operation_location = response["operation-location"]
oid = operation_location.split("/")[6]
The response of this first call is:
"operation-location => https://api.projectoxford.ai/emotion/v1.0/operations/e7ef2ee1-ce75-41e0-bb64-e33ce71b1668"
The protocol is to grab the end of the "operation-location" URL, which is the operation ID, and send it back to the results API URL as below:
# parse operation ID from url and add it to results API url
url = 'https://api.projectoxford.ai/emotion/v1.0/operations/' + oid
uri = URI(url)
uri.query = URI.encode_www_form({})
request = Net::HTTP::Get.new(uri.request_uri)
request['Ocp-Apim-Subscription-Key'] = ENV['MEA_SubscriptionKey1']
response = Net::HTTP.start(uri.host, uri.port, :use_ssl => uri.scheme == 'https') do |http|
  http.request(request)
end

# Get response headers
response.each_header do |key, value|
  p "#{key} => #{value}"
end
The result I get is:
"{\"error\":{\"code\":\"Unspecified\",\"message\":\"Operation not found.\"}}"
I get the same result when I query the Microsoft online API console with the operation id of an operation created through my app.
Does anyone have any ideas or experience with this? I would greatly appreciate it.
You do not need to parse the "oid" out of the "operation-location" header, as it is already the URL from which you should GET the status.
The following code works for me. Use it to see if you still see the issue.
require 'net/http'
require 'uri'
uri = URI('https://api.projectoxford.ai/emotion/v1.0/recognizeinvideo')
uri.query = URI.encode_www_form({})
request = Net::HTTP::Post.new(uri.request_uri)
request['Ocp-Apim-Subscription-Key'] = '<your key>'
request['Content-Type'] = 'application/octet-stream'
videoFile = File.open("c:\\1mb.mp4", "rb")
request.body = videoFile.read
videoFile.close
response = Net::HTTP.start(uri.host, uri.port, :use_ssl => uri.scheme == 'https') do |http|
  http.request(request)
end
puts response.message
puts response.read_body
# Get response headers
response.each_header do |key, value|
  p "#{key} => #{value}"
end
# Get operation location url for subsequent calls
operation_location = response["operation-location"]
operation_url = operation_location
uri = URI(operation_url)
uri.query = URI.encode_www_form({})
loop do
  request = Net::HTTP::Get.new(uri.request_uri)
  request['Ocp-Apim-Subscription-Key'] = '<your key>'
  response = Net::HTTP.start(uri.host, uri.port, :use_ssl => uri.scheme == 'https') do |http|
    http.request(request)
  end
  puts response.read_body
  response_msg = response.read_body
  break if response_msg.include?("Succeeded") or response_msg.include?("Failed")
  sleep 20
end
puts response.message
puts response.read_body
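Once the loop exits, the final body can be parsed like any other JSON. A minimal sketch: the "status" field is what the loop above checks for, while treating "processingResult" as a JSON string holding the emotions data is an assumption based on the Emotion API docs.

require 'json'

result = JSON.parse(response.body)
if result["status"] == "Succeeded"
  emotions = JSON.parse(result["processingResult"])  # assumption: the result payload is itself a JSON string
  puts emotions
else
  puts "Operation did not succeed: #{result}"
end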
I'd like to open my stackoverflow.com page via ruby.
And I'd like to see it as if I am authenticated.
I took the usr cookie from Google Chrome and created the following snippet:
require 'net/http'
require 'cgi'
url = "http://stackoverflow.com/users/1650525/alex-smolov"
uri = URI(url)
http = Net::HTTP.new(uri.host, 80)
request = Net::HTTP::Get.new(uri.request_uri)
cookie = CGI::Cookie.new("usr", "[my cookie is here]")
request['Cookie'] = cookie
r = http.request(request)
puts r.body
It does output a page, but I'm not authenticated there.
Is it possible to make a Net::HTTP::Get request in Ruby with a cookie?
You need to call the CGI::Cookie#to_s method.
request['Cookie'] = cookie.to_s
Try the following code with / without .to_s.
require 'net/http'
require 'cgi'
uri = URI("http://httpbin.org/cookies")
http = Net::HTTP.new(uri.host, 80)
request = Net::HTTP::Get.new(uri.request_uri)
cookie1 = CGI::Cookie.new('usr', 'blah')
request['Cookie'] = cookie1.to_s # <---
r = http.request(request)
puts r.body
UPDATE
As the other answer mentioned, the resulting string is meant for server output. You need to strip out the ; path= part.
CGI::Cookie.new('usr', 'value').to_s.sub(/; path=$/, '')
The accepted answer is IMHO incorrect. CGI::Cookie#to_s generates the string which the SERVER should send to the client, not something Net::HTTP should use. This can be easily demonstrated:
[1] pry(main)> require 'cgi'
=> true
[2] pry(main)> CGI::Cookie.new('usr', 'value').to_s
=> "usr=value; path="
Code like this should work better.
require 'net/http'
require 'cgi'
uri = URI("http://httpbin.org/cookies")
http = Net::HTTP.new(uri.host, uri.port)
request = Net::HTTP::Get.new(uri.request_uri)
request['Cookie'] = "usr=#{CGI.encode cookie_value}"
r = http.request(request)
puts r.body
Or in case you have multiple cookies in a hash:
h = {'cookie1' => 'val1', 'cookie2' => 'val2'}
req['Cookie'] = h.map { |k, v| "#{k}=#{CGI.escape v}" }.join('; ')
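Put together, a small self-contained sketch with several cookies (httpbin.org/cookies simply echoes back the cookies it received, which makes it handy for checking the header):

require 'net/http'
require 'cgi'

uri = URI("http://httpbin.org/cookies")
http = Net::HTTP.new(uri.host, uri.port)

cookies = { 'cookie1' => 'val1', 'cookie2' => 'val2' }

request = Net::HTTP::Get.new(uri.request_uri)
request['Cookie'] = cookies.map { |k, v| "#{k}=#{CGI.escape(v)}" }.join('; ')

puts http.request(request).body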