I have a Rails 3 application. One of its controller methods parses a large number of Twitter search results and stores them in the database. Everything works fine if the number of URLs to parse is small, but once the URLs reach 1000 or more, I get the following error after a few seconds (copied from the log file):
Address: http://search.twitter.com/search?q=+Chas%20Salon+near:%22Baltimore%22+within:15mi
Completed in 111436ms
OpenURI::HTTPError (420 unused):
app/controllers/twitter_reviews_controller.rb:41:in `block in new'
app/controllers/twitter_reviews_controller.rb:20:in `each'
app/controllers/twitter_reviews_controller.rb:20:in `new'
I am using Hpricot to parse the Twitter search results.
Line 41 in the message above is this:
doc = Hpricot(open(address))
Does anyone know where the problem is?
Thank you very much for any help.
Cheers,
Tony.
You are being rate limited by the Twitter server. Read more here: http://dev.twitter.com/pages/rate-limiting
Throttle your code so it does not exceed the hourly rate limit, and look at the response codes documented at http://apiwiki.twitter.com/w/page/22554652/HTTP-Response-Codes-and-Errors
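One way to cope with the 420 (a sketch; the retry count and the 2-second backoff base are my own choices, not values Twitter documents) is to rescue the error and back off before retrying:

```ruby
require 'open-uri'

# Pure helper so the backoff schedule is easy to verify: 2, 4, 8 seconds.
def backoff_delay(attempt, base: 2)
  base ** attempt
end

# Rescue a 420 response and retry with exponential backoff; any other
# HTTP error is re-raised untouched.
def fetch_with_backoff(address, max_retries: 3)
  attempt = 0
  begin
    URI.open(address).read
  rescue OpenURI::HTTPError => e
    status_code = e.io.status.first # e.g. ["420", "unused"]
    raise unless status_code == '420' && attempt < max_retries
    attempt += 1
    sleep backoff_delay(attempt)
    retry
  end
end
```

Spacing requests out in the first place (e.g. one per second) is still better than retrying after the server has already pushed back.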
I am trying to get some basic information from the Steam Community via the steam-condenser gem, and so far SteamId.new seems to work just fine with all the player's information.
However, when I do this (example):
player = SteamId.new("tiger")
stats = player.fetch_games
I get the following error message
Traceback (most recent call last):
1: from lib/assets/ruby/test.rb:15:in `<main>'
/home/zigs/.rbenv/versions/2.6.6/lib/ruby/gems/2.6.0/gems/steam-condenser-1.3.11/lib/steam/community/steam_id.rb:326:in `fetch_games': undefined method `[]' for nil:NilClass (NoMethodError)
A lot of the information I need seems to be connected to the fetch_games (for example the method total_playtime(id))
Not sure why this is not working. I am lost. Any help or ideas are highly appreciated! Thank you!
TLDR; it looks like this gem no longer works.
The particular method you're having trouble with is:
def fetch_games
  games_data = parse "#{base_url}/games?xml=1"

  @games = {}
  @recent_playtimes = {}
  @total_playtimes = {}

  games_data['games']['game'].each do |game_data|
    app_id = game_data['appID'].to_i
    @games[app_id] = SteamGame.new app_id, game_data

    recent = game_data['hoursLast2Weeks'].to_f
    total = (game_data['hoursOnRecord'] || '').delete(',').to_f
    @recent_playtimes[app_id] = (recent * 60).to_i
    @total_playtimes[app_id] = (total * 60).to_i
  end

  true
end
with the failing expression being games_data['games']['game'].each: games_data['games'] comes back nil, which is what triggers the NoMethodError.
If we were looking to get information for a particular user, it downloads an XML document about the user from a URL looking like:
http://steamcommunity.com/id/demomenz?xml=1
and this file does not seem to contain any games objects in it.
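A defensive check before that line would at least have surfaced a clearer error. A sketch using plain Hash access (Hash#dig, Ruby 2.3+; the helper name is mine, not the gem's):

```ruby
# games_data is the Hash parsed from the profile XML; dig returns nil
# instead of raising when the 'games' list is absent, so we can fail
# with a message that names the likely cause.
def extract_games_list(games_data)
  games_list = games_data.to_h.dig('games', 'game')
  raise 'no game data in profile XML (private profile or API change?)' if games_list.nil?
  games_list
end
```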
Having looked at the codebase for the steam-condenser gem; it hasn't really been updated in about 6 years. I can only assume that the XML format has been modified since this time and that the gem will no longer work.
Valve has added more privacy options to Steam Community profiles which are not reflected in the old XML APIs.
Apparently, the profile in question (tiger) has its game details set to "Friends Only" or "Private", as the games are also unavailable in the browser.
The code from the released 1.x versions is no longer guaranteed to work when it comes to Steam Community. Valve deprecated the old XML APIs several years ago. Sadly, the modern Web API hasn't gotten much attention from Valve's side either, so development of Steam Condenser has mostly come to a halt, too.
You might have more luck using the code from the master branch of the GitHub repository which uses Web API for most of the Community features.
You will have to register for a Steam Web API key, though: https://steamcommunity.com/dev/apikey
I have been using the following URL for the past 3 years without issue. However, it has stopped returning results.
URL:
https://query.yahooapis.com/v1/public/yql?q=select * from yahoo.finance.xchange where pair in ("ARSARS")&env=store://datatables.org/alltableswithkeys
Now returns the following:
<?xml version="1.0" encoding="UTF-8"?>
<query xmlns:yahoo="http://www.yahooapis.com/v1/base.rng" yahoo:count="0" yahoo:created="2017-11-02T09:33:25Z" yahoo:lang="en-AU">
<results/>
</query><!-- total: 9 -->
Notice how there are no results; the response simply contains an empty results tag.
I have tried on this many different computers and browsers. I have also tried changing the currency combinations, but no luck.
Can anybody spot what I am doing wrong?
Yes, it looks like Yahoo has discontinued their Currency Converter API service.
I found a different site : https://currencylayer.com/
They let you request currency rates (1 USD against 168 other currencies). It is free if you make fewer than 1,000 requests per month (if you need more, they have paid subscriptions: https://currencylayer.com/product).
You just need to sign up to receive your own access key. Once you have it, you simply call http://apilayer.net/api/live?access_key= and it returns the other currency rates as JSON.
They also have code examples in PHP, JavaScript, and Java at https://currencylayer.com/documentation
I found it very easy to get started using their API right away.
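For reference, the call described above can be made with the Ruby standard library alone. This is a sketch; the response shape assumed here (a "quotes" hash keyed by currency pair) follows currencylayer's documentation, and the sample values in the test are purely illustrative:

```ruby
require 'net/http'
require 'json'
require 'uri'

# Fetch live rates from currencylayer; access_key is your own key from
# the sign-up page.
def fetch_rates(access_key)
  uri = URI("http://apilayer.net/api/live?access_key=#{access_key}")
  parse_rates(Net::HTTP.get(uri))
end

# Split out so the JSON handling can be exercised without a network call.
def parse_rates(body)
  JSON.parse(body).fetch('quotes', {})
end
```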
I think the API is down.
I am similarly receiving "results" => null for the query:
http://query.yahooapis.com/v1/public/yql?q=select+%2A+from+yahoo.finance.xchange+where+pair+in+%28%22GBPEUR%22%29&format=json&env=store%3A%2F%2Fdatatables.org%2Falltableswithkeys
Setting diagnostics=true in the request yields:
[execution-start-time] => 7
[execution-stop-time] => 12
[execution-time] => 5
[http-status-code] => 999
[http-status-message] => Request denied
[content] => http://download.finance.yahoo.com/d/quotes.csv?s=GBPEUR=X&f=snl1d1t1ab
If anyone knows more than me about what this might imply I'd be glad to hear it!
Yahoo is aware of this issue and their engineers are working on it:
ref: https://forums.yahoo.net/t5/Yahoo-Finance-help/http-download-finance-yahoo-com-d-quotes-csv-s-GOOG-amp-f/td-p/387096
Very sadly, and quite outrageously, Yahoo decided to stop this service without any warning.
See admin message here
So many services depend on it; it's like Google saying they would suddenly stop their Maps API. At this point I am blocking Yahoo in our DNS so no one in our company will ever use Yahoo again, since they are not a reliable entity.
I am sorry if this doesn't belong here, but I don't know where it does.
I tried to play a YouTube video and got an HTTP 500 error with the following text:
Sorry,
something went wrong.
A team of highly trained monkeys has been dispatched to deal with this situation. If you see them, send them
this information as text (screenshots frighten them):
> AB38WEMQgYMwAsSndv0qRVwB2h6TTDLZbomJEWU6qMCSOlXSieCCjveU
7hDWK8oeERuik9Fw-jLjNczrdngnBDUZpExmTWm6BdtwUW26ecXKhmYi
NnZ91eCdo8ejZ5bzzhv9pYvCPlTDFN6vq6pHgGMqBsfH7EMrHYFjlwq_
7yWIObcAM2Ni1tTP1fR8EkhGYIwD1gpYuNLBnVT-imW3RBgFijeA8iLA
v0TEZ_HkJP7rRGS0txolsJEP_vMnhJZKdEd3VP1zfhoB6Vl_cUhtr69
yB6be9FMd6kRpnSiItgbxPvyrFa75aJlkBe0H72CdAXeRifbGaDc1_q2
44Tms5HH6dfasPa9kVkzhy3GkpTbXZGIPRynzhRdrb0R8uMv-Y7kyhC6
JgEAQoe5zld02-rlMu6DpN6fFtkGWZ3Lt-e-hECLRuYqHX-eCW567f8c
mmxd8YEGRS4qURfh4eFl1uvQ_b0rEOZTHrkzUNlTLbSIW6FesFccjX83
ybB5fP-21S60JudEj1ZzM6m50GPFTV8gRFDz5r7-cFR1uMkKfnq1U0FC
b1Tehcova6OabgROHDljpM3J8jJlLEmQKqxOSugMzmelDZLVw30pMyjb
Wm4U4fin2J4sTIN9AYFo-aGzmv4JvbSURzKGamVPXt2R6v2PvB5SPeBb
yJ8oVAv5j34bClZksFlShm5xTCUEAB3q9FfoUSvqqT6pOOoeHqFEmiGl
ggJcl79sAFfjSVeSRpU5qjJ4np7ve3-jImsQNLO7lCC-FDq3Ao8peKPc
wVQDNTXcXXTSpMkk5mWJG4LRgaVQG4mFg-gcRTphCK_hWeFsoEgrdX_D
j_bdM13tBeQFFv8lIXNcMX6bME36U8PaBWCMjGBlhZVpZ9vArH0vehC0
FO5SPxWJsAjCJU1XY2f6AMPQioQS9h2OOZA6p6EFmGFNw-d5jeqvLIfx
Cp3BHa_cddgBHQabZI77Db62hSoAzI-8FrJWy9dDTWjuj_MntCZcfUNY
URSrXvXsiKRaz_5sJ-S2rSX2aCxJSne_gsgT1Y-YIzO1xBD9wuLts20G
4YX2Scpyj8mui7I8bn_Sx6H3iyFMrEVJwhneCKbeaXMiIOk6CiJNgS3s
jdcLUB_Nv5tR35j8BvAbW5ZgNT57SKp9SDFDdTZyDfJ0Eg9aCjxARVBc
T4V6Hs_c31yZ5SDnu0e9drBS5ynf_ZydZAJR9AEiEAQfzayUjJgy9NCB
viA0_EPY2dVPOlW3VWJ7C5HhKksSXeWQ45fE0IS1WOwOlq50yvKF0SoB
asoT4ziflWe92IA56Ds3_XXeWMBwcOjXIb2KCkqE_hdChOyZpiufLB4t
Ngl5WAglVNfV1Z45omJv4MGiBgUcpICvYmkJ_OcTVi03BbS9cuXlBbE7
ivuqu1AfHPnKp8RBGpY_mKpLrYKYhJ5faXi2INdZ-9xh6MAqehN6cKQj
SgweGtclPuX0teQye3h8P5vp425ARfQEzbFYmRULRQiITz16az-JgVmP
Di3XaE4Qq7rFpumWPuuHGqlTjmurFIuh0SZ2sklMxPQFGOYIEjNNsEtE
UPuL0DApAqnP1AH3mHMqHKzdgzFvg7k9Z86M5yt4CDatYmjktsuNzdBJ
wArAWFRfr86x7XBdkr9eVa8Hp-6x86CzR38nypBBddU2Jr4E8RsG8-7h
iCvxlHwRoPXoGouIXflUIe5mHZveEAtQGYZzeJKFvZ8YojTLLx8bKFgu
zsJ0Cll3bWmHZS5ZPHzppuvZauwaVKFxLX0EjjdMOiFaueno1pXfp4jm
3QqdllBQSzai4Z2wXfhU9Ql_cFzqpzfPMBNxt6mge-0ARsMsEeR5PDF
2wNglV05GkaVp4JgWSTo7lQ3OnJKQNmECTGaYNbGR7IsGigTMc3QM0iH
9ueR3-l75e3YYMlRl9tZWquJd54eGh5vYTLrbW60CrBZQLSKpxqpxX_W
9x_aaEhESZSXVx0
Do you know the meaning of the text? I have a feeling there is something special about it, because when I do a Google search for it I get an error 400.
Anyone know? Thanks.
After upgrading to Ruby 1.9.3-p392 today, REXML throws a RuntimeError when attempting to retrieve an XML response over a certain size. Everything works fine and no error is thrown when receiving fewer than 25 XML records, but once a certain response-length threshold is reached, I get this error:
Error occurred while parsing request parameters.
Contents:
RuntimeError (entity expansion has grown too large):
/.rvm/rubies/ruby-1.9.3-p392/lib/ruby/1.9.1/rexml/text.rb:387:in `block in unnormalize'
I realize this was changed in the most recent Ruby version:
http://www.ruby-lang.org/en/news/2013/02/22/rexml-dos-2013-02-22/
As a quick fix, I've raised REXML::Document.entity_expansion_text_limit to a larger number and the error goes away.
Is there a less risky solution?
This error is raised when a single text node in the XML response expands to too much content.
To fix it, restrict the data in each individual node to under 10 KB (instead of sending the whole content, send truncated data and provide a separate link to view the full content).
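The truncation step might look like this (a sketch; the 10 KB cap mirrors REXML's default per-node limit, and byteslice keeps the cut within the byte budget, though it can split a multibyte character at the boundary):

```ruby
# Cap a node's text at the 10 KB default expansion limit, leaving room
# for a '...' marker; the caller would add a link to the full content.
NODE_LIMIT = 10_240

def truncate_node_text(text, limit = NODE_LIMIT)
  return text if text.bytesize <= limit
  text.byteslice(0, limit - 3) + '...'
end
```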
The error is raised from this file:
ruby-2.1.2/lib/ruby/2.1.0/rexml/text.rb
# Unescapes all possible entities
def Text::unnormalize( string, doctype=nil, filter=nil, illegal=nil )
  sum = 0
  string.gsub( /\r\n?/, "\n" ).gsub( REFERENCE ) {
    s = Text.expand($&, doctype, filter)
    if sum + s.bytesize > Security.entity_expansion_text_limit
      raise "entity expansion has grown too large"
    else
      sum += s.bytesize
    end
    s
  }
end
The limit in ruby-2.1.2/lib/ruby/2.1.0/rexml/text.rb defaults to 10240, which means 10 KB of expanded text per node.
REXML also defaults to allowing only 10,000 entity substitutions per document, so the maximum amount of text that can be generated by entity substitution is around 98 megabytes. (See https://www.ruby-lang.org/en/news/2013/02/22/rexml-dos-2013-02-22/)
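The questioner's quick fix can be spelled out explicitly (a sketch; 100 KB is an arbitrary figure, and raising the limit re-opens part of the DoS exposure the default was added to close):

```ruby
require 'rexml/document'

# Raise the per-text-node entity-expansion limit from the 10 KB default.
# On Ruby 1.9.3 this setter lived on REXML::Document; newer rexml
# releases expose it on REXML::Security as well.
REXML::Security.entity_expansion_text_limit = 100 * 1024
```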
That sounds like a LOT of XML. Do you really need to fetch all of it? Maybe you can request only certain fields from the remote server. One option might be to try another XML parser (Nokogiri, for example); another is to use something other than XML as the transport (JSON? Binary?).
I'm implementing a Google Talk listener that updates me with all my contact list items' presence.
require 'xmpp4r'
require 'xmpp4r/roster'
require 'xmpp4r/roster/helper/roster'

sender_jid = Jabber::JID.new('email')
client = Jabber::Client.new(sender_jid)
client.connect('talk.google.com')
client.auth('password')
client.send(Jabber::Presence.new.set_type(:available))

# Presence updates:
client.add_presence_callback do |pres|
  puts pres.from.to_s.split("/")[0] unless pres.nil?
  puts pres.show.to_s.inspect unless pres.nil?
end

Thread.stop
client.close
The code works fine and the thread continues to listen on one Gmail account, but it gives me this error after a few contacts appear:
client.rb:33:in `stop': deadlock detected (fatal)
from client.rb:33:in `<main>'
The other account, for which this error appears, has a lot more contacts with varying statuses. I can't seem to figure out why this is happening. Any help would be amazing.
Thanks.
Solved the problem through the logger: it was throwing a deadlock because of a parsing error. Some of my contacts have characters in their details that couldn't be parsed.
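The deadlock mechanics make sense with plain threads: Thread.stop parks the main thread, and if the only other thread (here, the parser/callback thread) dies from an exception, nothing can ever wake it, so Ruby aborts with "deadlock detected (fatal)". A Queue-based wait avoids that failure mode (a sketch with a stand-in worker):

```ruby
# Instead of Thread.stop, block on a queue the worker can always signal,
# even on its way out after an error.
shutdown = Queue.new

worker = Thread.new do
  begin
    # ... presence handling would happen here ...
  ensure
    shutdown << :done # runs even if the thread dies from an exception
  end
end

reason = shutdown.pop # blocks until the worker signals
worker.join
```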
It seems xmpp4r has not been updated in a while, and my solution was to move over to a fork that some people have updated.
If anyone is having a similar problem check out:
https://github.com/whitehat101/xmpp4r
The parsing there is done through Nokogiri.
UPDATE:
There are a bunch of new maintainers who have forked over many of the updates from the above repo and fixed other issues:
https://github.com/xmpp4r/xmpp4r