I've just started using HTTParty, and I've encountered a problem in the way it builds the Hash from the XML returned by the server.
If I set up the following Builder template on the server:
xml.thunt :sendSubscriptionResult, :"xmlns:thunt" => "http://example.com/thunt", :status => @status
everything works well, i.e. the Hash built by HTTParty matches the XML generated by Builder (the latter can be observed by making the same request via curl):
curl Request
curl -s -H "Accept: text/xml" -d "xml=`cat vendor/testxml/requests/sendsubscription.xml`" $SERVER/${name}
Reply as seen by curl
'<thunt:sendSubscriptionResult xmlns:thunt="http://example.com/thunt" status="alreadySubscribed" />'
HTTParty request
TreasureHunt.post('/sendsubscription', :query => { :xml => sub } )
Reply in HTTParty
{"thunt:sendSubscriptionResult"=>{"status"=>"alreadySubscribed", "xmlns:thunt"=>"http://example.com/thunt"}}
But if in the Builder template I specify that I want the sendSubscriptionResult element to have a text node:
xml.thunt :sendSubscriptionResult, "Hello, World", :"xmlns:thunt" => "http://example.com/thunt", :status => @status
(note the "Hello, World" addition) the two tools suddenly disagree.
curl
'<thunt:sendSubscriptionResult xmlns:thunt="http://example.com/thunt" status="alreadySubscribed">Hello, World</thunt:sendSubscriptionResult>'
HTTParty
{"thunt:sendSubscriptionResult"=>"Hello, World"}
As you can see, HTTParty has stripped all of the element's attributes and put only the text node in the resulting Hash.
Is this a bug in HTTParty or am I doing something wrong?
Thanks!
Check out my post on the GitHub issue for a resolution to your problem: http://github.com/jnunemaker/httparty/issues/#issue/14
I would go ahead and post your problem on the issues page for their GitHub project:
http://github.com/jnunemaker/httparty/issues
It looks like there are already some open issues around XML parsing, but it's definitely the best way to communicate directly with the developers and give them feedback.
Under the hood, httparty currently uses multi_xml for XML parsing. multi_xml picks the fastest available parsing gem, in this order: Ox, LibXML, Nokogiri, REXML. That is, it will choose Ox first if you have it installed. You can also specify which parser to use.
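For instance, to pin the parser explicitly rather than rely on autodetection (MultiXml.parser= and MultiXml.parse are part of the multi_xml API; the sample XML and output here are illustrative):

require 'multi_xml'

MultiXml.parser = :nokogiri  # or :ox, :libxml, :rexml

# Attributes and the text node are both preserved; in recent multi_xml
# versions the text ends up under a "__content__" key.
MultiXml.parse('<result status="ok">Hello</result>')
# => {"result"=>{"status"=>"ok", "__content__"=>"Hello"}}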
There were some bugs recently resolved in multi_xml, particularly with regard to arrays.
I suggest you point Bundler at the GitHub repo to get the very latest version of multi_xml, in your Gemfile, like this:
gem 'multi_xml', :git => 'https://github.com/sferik/multi_xml'
gem 'ox'
gem 'httparty'
Then, wherever you are going to use httparty (for example, in your Sinatra server), do this:
require 'bundler/setup'
Note that with this setup, multi_xml will not show up in your "gem list" output, but it will work.
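Putting it together, a minimal hedged sketch of the client side (the base_uri, endpoint, and fixture path are assumptions taken from the question):

require 'bundler/setup'
require 'httparty'

class TreasureHunt
  include HTTParty
  base_uri 'http://localhost:4567'  # assumption: your server's address
  format :xml
end

sub = File.read('vendor/testxml/requests/sendsubscription.xml')
response = TreasureHunt.post('/sendsubscription', :query => { :xml => sub })
p response.parsed_response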
Related
I have a website on Ruby on Rails 3.2.11 and Ruby 1.9.3.
What can cause the following error:
(JSON::ParserError) "{N}: unexpected token at 'alihack<%eval request(\"alihack.com\")%>
I have several errors like this in the logs. All of them try to eval request(\"alihack.com\").
Part of the log file:
"REMOTE_ADDR" => "10.123.66.198",
"REQUEST_METHOD" => "PUT",
"REQUEST_PATH" => "/ali.txt",
"PATH_INFO" => "/ali.txt",
"REQUEST_URI" => "/ali.txt",
"SERVER_PROTOCOL" => "HTTP/1.1",
"HTTP_VERSION" => "HTTP/1.1",
"HTTP_X_REQUEST_START" => "1407690958116",
"HTTP_X_REQUEST_ID" => "47392d63-f113-48ba-bdd4-74492ebe64f6",
"HTTP_X_FORWARDED_PROTO" => "https",
"HTTP_X_FORWARDED_PORT" => "443",
"HTTP_X_FORWARDED_FOR" => "23.27.103.106, 199.27.133.183".
199.27.133.183 is a CloudFlare IP.
"REMOTE_ADDR" => "10.93.15.235", "10.123.66.198", and the others are, I think, fake proxy IPs.
Here's a link to a guy who has the same issue with his web site, from the same IP address (23.27.103.106).
To sum up, the common IP across all the errors is 23.27.103.106, and they try to run a script using Ruby's eval.
So my questions are:
What type of vulnerability are they trying to find?
What should I do? Block the IP?
Thank you in advance.
Why does it happen?
It seems like an attempt to at least test for, or exploit, a remote code execution vulnerability, potentially a generic one (targeting a platform other than Rails) or one that existed in earlier versions.
The actual error, however, stems from the fact that the request is an HTTP PUT with application/json headers, but the body isn't valid JSON.
To reproduce this on your dev environment:
curl -D - -X PUT --data "not json" -H "Content-Type: application/json" http://localhost:3000
More details
Rails' ActionDispatch tries to parse any JSON request by passing the body to be decoded:
# lib/action_dispatch/middleware/params_parser.rb
def parse_formatted_parameters(env)
  ...
  strategy = @parsers[mime_type]
  ...
  case strategy
  when Proc
    ...
  when :json
    data = ActiveSupport::JSON.decode(request.body)
    ...
In this case the body isn't valid JSON, the error is raised, and the server reports a 500.
Possible solutions
I'm not entirely sure what the best strategy is to deal with this. There are several possibilities:
you can block the IP address using iptables
filter (PUT or all) requests to /ali.txt within your nginx or apache configs.
use a tool like the rack-attack gem and apply the filter there (see this rack-attack issue, and the sketch after this list)
use the request_exception_handler gem to catch the error and handle it from within Rails (see this SO answer and this GitHub issue)
block PUT requests within Rails' routes.rb to all URLs but those that are explicitly allowed. It looks like the error is raised even before the request reaches Rails' routing, so this might not be possible.
Use the rack-robustness middleware and catch the json parse error with something like this configuration in config/application.rb
Write your own middleware, something along the lines of the stuff in this post.
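As a sketch of option #3 (method names per the rack-attack README; versions before 5.0 called this blacklist rather than blocklist, and older versions also need config.middleware.use Rack::Attack in your Rails config):

# config/initializers/rack_attack.rb
Rack::Attack.blocklist('block /ali.txt probes') do |req|
  # Reject the scanner's probe path outright
  req.path == '/ali.txt'
end

Rack::Attack.blocklist('block known scanner IP') do |req|
  req.ip == '23.27.103.106'
end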
I'm currently leaning towards options #3, #4 or #6, all of which might come in handy for other types of bots/scanners or other invalid requests that might pop up in the future.
Happy to hear what people think about the various alternative solutions.
I saw some weird log entries on my own site (which doesn't use Ruby) and Google took me to this thread. The IP on my entries was different (120.37.236.161).
After poking around a bit more, here is my mostly speculation/educated guess:
First, in my own logs I saw a reference to http://api.alihack.com/info.txt. I checked this link out; it looked like an attempt at a PHP injection.
There's also a "register.php" page there; submitting it takes you to an "invite.php" page.
Further examination of this domain took me to http://www.alihack.com/2014/07/10/168.aspx (the page is in Chinese, but Google Translate helped me out here).
I expect this "Black Spider" tool has been modified by script kiddies and is being used as a carpet bomber to find any sites that are "vulnerable."
It might be prudent to just add an automatic denial of any request containing the "alihack" substring to your configuration.
I had a similar issue show up in my Rollbar logs: a PUT request to /ali.txt.
Best just to block that IP; I only saw one request on my end with this error. The request I received came from France: http://whois.domaintools.com/37.187.74.201
If you use nginx, add this to your nginx conf file:
deny 23.27.103.106/32;
deny 199.27.133.183/32;
For Rails 3 there is a special workaround gem: https://github.com/infopark/robust_params_parser
I'm optimizing some slow transactions in our Rails application and I see significant time spent rendering JSON views:
Rendered welcome/index.json.rabl (490.5ms)
Completed 200 OK in 1174ms (Views: 479.6ms | ActiveRecord: 27.8ms)
Assuming that the API call is returning exactly the data it needs to return, what is the fastest way to render JSON in Rails?
We are using Rabl because of the ability to share code easily, but we aren't tied to it.
Currently Oj seems to be the fastest renderer, beating Yajl (according to the Oj author's comparison).
Oj is used by default in the latest multi_json (and Rails uses multi_json by default), so swapping to Oj should be as simple as adding the following to your Gemfile:
# Gemfile
gem "oj"
Then each time you call render, it will now use oj.
render :json => { ... } # uses multi_json which uses oj
Oj also provides additional specific interfaces, if you want even more performance, but sticking to multi_json makes it easier to swap out gems in the future.
Note that if you have any { ... }.to_json calls, these will not be upgraded to use Oj unless you call Oj.mimic_JSON in an initializer.
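For completeness, a minimal initializer for that (Oj.mimic_JSON is part of Oj's public API):

# config/initializers/oj.rb
require 'oj'

# Patch the JSON gem's methods (JSON.parse, JSON.generate, to_json, ...)
# so existing { ... }.to_json call sites also go through Oj.
Oj.mimic_JSON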
Rails 3 uses multi_json, but only for JSON decoding, not encoding. JSON encoding/rendering/generation uses ActiveSupport's to_json method and is therefore always slow (even if you use the Oj gem).
You can explicitly render using MultiJson by doing:
render :json => MultiJson.dump(@posts)
Or you can try the rails-patch-json-encode gem (by me), which will use multi_json by default. It will affect all built-in to_json methods, so make sure all your tests pass.
Rabl uses multi_json for compatibility across platforms, so it doesn't use the quite fast Yajl library by default. Rabl's config documentation explains the solution:
# Gemfile
gem 'yajl-ruby', :require => "yajl"
In the event that still isn't performant enough, you might want to explore a different JSON serializer like oj. You could also instrument your render and see where the bottleneck exists.
Netflix recently released a new JSON rendering library which is supposedly 25-40 times faster than the default library. Announcement. Code. You'll need to create a new Serializer to take advantage of it, but for people who are impacted, that doesn't seem to be a big hurdle.
I've written functional tests for API endpoints built in Rails using the shoulda testing framework.
An example looks like the following:
setup do
  authenticated_xml_request('xml-file-name')
  post :new
end

should respond_with :success
authenticated_xml_request is a test helper method that sets @request.env['RAW_POST_DATA'] to the XML content.
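For context, a hypothetical sketch of what such a helper might look like (the fixture path and the extra env keys are assumptions; the question only states that it sets RAW_POST_DATA):

def authenticated_xml_request(fixture_name)
  # Load the XML fixture and hand it to the test request as the raw body
  @request.env['RAW_POST_DATA'] = File.read("test/fixtures/xml/#{fixture_name}.xml")
  @request.env['CONTENT_TYPE'] = 'application/xml'
  @request.accept = 'text/xml'
end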
After upgrading the app from Rails 2.3.3 to Rails 2.3.8, the functional tests are failing because the XML content received is not merged into the params hash.
I'm setting the request with the correct MIME type via @request.accept = "text/xml".
I'm able to inspect the content of the request using request.raw_post, but I'd like to keep the current setup working.
Also, when running a test from the terminal using cURL or any other library (rest_http) in development mode, the API works perfectly well.
Any tips or help is much appreciated.
Now it's simpler:
post "/api/v1/users.xml", user.to_xml, "CONTENT_TYPE" => 'application/xml'
Note that you have to specify the appropriate "CONTENT_TYPE". Otherwise your request will go out as 'application/x-www-form-urlencoded' and the XML won't be parsed properly.
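A fuller hedged sketch of that call inside an integration test (Rails 4-era syntax; the route, model, and attributes are assumptions):

class UsersApiTest < ActionDispatch::IntegrationTest
  test "XML body is merged into params" do
    user = User.new(:login => "alice")
    post "/api/v1/users.xml", user.to_xml, "CONTENT_TYPE" => "application/xml"
    assert_response :success
  end
end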
I solved the issue by adding a custom patch to Rails (the test_process.rb file) to convert the incoming XML to a hash, then merge it into the parameters hash.
On line 439:
parameters ||= {}
parameters.merge!(Hash.from_xml(@request.env['RAW_POST_DATA'])) if @request.env['RAW_POST_DATA'] && @request.env['CONTENT_TYPE'] == 'application/xml'
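For reference, Hash.from_xml comes from ActiveSupport (already loaded in a full Rails app) and turns an XML body into a nested hash, which is what gets merged into the params above:

require 'active_support/core_ext/hash/conversions'

Hash.from_xml('<user><name>Alice</name></user>')
# => {"user"=>{"name"=>"Alice"}}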
I am using the linkedin gem https://github.com/pengwynn/linkedin
I authorize using OmniAuth and store the access token and secret.
I then authorize the client from that stored access token.
I appear to get something useful when I call client.profile, but it looks like mostly Nokogiri output wrapped in a LinkedIn::Profile class.
How do I access specific fields, and will I be able to use method calls from the view in Rails, or do I need to do all the parsing in the controller and pass those values to the view from there?
An example of how to access the profile image URL, title, name, company, that sort of thing, once I have established client.profile, would be great.
When I use :fields => I get back something like this:
#<LinkedIn::Profile:0xb682c72c #doc=#<Nokogiri::XML::Document:0x..fdb41630a name="document" children=[#<Nokogiri::XML::Element:0x..fdb415fae name="person" children=[#<Nokogiri::XML::Text:0x..fdb415d88 "\n ">, #<Nokogiri::XML::Element:0x..fdb415d24 name="picture-url" children=[#<Nokogiri::XML::Text:0x..fdb415aae "http://media.linkedis:
I just want the string associated with the "picture-url" node. How do I do that?
From the controller:

def show
  @user = User.find(params[:id])
  @client = LinkedIn::Client.new(ENV["LINKEDIN_KEY"], ENV["LINKEDIN_SECRET"])
  @client.authorize_from_access(@user.atoken, @user.asecret)
  @client.profile(:id => @user.uid, :fields => ["picture-url", "headline"])
end
New error:
undefined method `downcase' for nil:NilClass
Here is a related question: https://stackoverflow.com/questions/5821549/how-do-i-pass-a-a-tag-through-ruby-to-linkedin-using-the-gem
I did it by adding:
client.profile(:fields => [:positions]).positions
This then allows me to access specific positions or fields without going into the raw XML, just using the methods in the gem. The gem works nicely once I get the format...
I suggest you get the latest version of the linkedin gem from GitHub. It uses Hashie::Mash syntax, which is much simpler than dealing with Nokogiri output and XPath.
If you're using Bundler, add this to your Gemfile (removing any other linkedin gem reference):
gem 'linkedin', :git => "git://github.com/pengwynn/linkedin.git"
This version of the gem basically sticks the output of your LinkedIn search into a hash, so you would access your picture-url string as follows: profileHash["picture-url"]
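For illustration, a hedged sketch of that access pattern (field names follow the answer above; exact behavior depends on the gem version):

profile = client.profile(:fields => ["picture-url", "headline"])

profile["picture-url"]  # hash-style access to the picture URL
profile.headline        # Mash-style objects also allow method access
# Note: some versions expose dashed fields as underscored methods
# instead, e.g. profile.picture_url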
P.S. If you do decide to stick with your version of the linkedin gem, get familiar with XPath syntax; you will need it. Based on the information you provided, the picture URL string will be available via profileXML.xpath("//person/picture-url").first.text
Rails 2.3.6 started using the fast new json library, yajl-ruby, "if available".
In the "JSON gem Compatibility API" section of the yajl-ruby readme it outlines a method to just drop in yajl-ruby inclusion and have the rest of the app seamlessly pick it up.
So, ideally, I'd like
Rails to use it
My gems to use it
My application code to use it
What's the easiest way to achieve this? My guess:
config.gem 'yajl-ruby', :lib => 'yajl/json_gem'
As the very first gem in environment.rb. Doing this doesn't result in any errors, but I'm not sure how to tell whether Rails is picking it up for its own use.
Thanks!
John
I'd recommend using yajl-ruby's API directly instead of the JSON gem compatibility API, mainly because the JSON gem's to_json method conflicts with ActiveSupport and they've had long-standing issues working together.
If you just do config.gem 'yajl-ruby', :lib => 'yajl' instead, you'll need to use Yajl::Parser and Yajl::Encoder directly to parse/encode objects. The advantage of this is you'll be certain there won't be any conflicts with method overrides and as such, be guaranteed your JSON encoding/parsing code will work as expected.
The disadvantage is that any gems using the JSON gem will continue to do so, but your own code will use yajl-ruby.
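For example, the direct calls look like this (Yajl::Encoder.encode and Yajl::Parser.parse, per the yajl-ruby README):

require 'yajl'

# Encode a Ruby object to a JSON string
json = Yajl::Encoder.encode({ "name" => "John" })
# => "{\"name\":\"John\"}"

# Parse it back into a Ruby hash
Yajl::Parser.parse(json)
# => {"name"=>"John"}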
If you wanted to, you could use your config.gem line, then require 'yajl' in an initializer so you'd have both APIs loaded. The yajl/json_gem include will override anything that's using the JSON gem with Yajl; to ensure it overrides those methods, try to make sure require 'yajl/json_gem' happens last.
If you're using Rails 3, you can add this to an initializer:
ActionController::Renderers.add :json do |json, options|
  json = Yajl.dump(json) unless json.respond_to?(:to_str)
  json = "#{options[:callback]}(#{json})" unless options[:callback].blank?
  self.content_type ||= Mime::JSON
  self.response_body = json
end
To make sure render :json => ... calls use yajl-ruby as well.
Sorry if this isn't really answering your question but I wanted to at least give the suggestion of using yajl-ruby's API directly :)