[selenoid][capybara][rails] Can we use Selenoid with a Rack app?

In Rails, when we use Capybara with Selenium, Capybara by default runs a Rack server for our Rails app, so we can test it without running the actual app.
Let me explain what that means: when we configure Capybara, RSpec, and Selenium for Rails, we don't need to run the Rails server separately. When we run the specs and visit any URL of our app, it works, because Capybara creates a Rack server in the background.
Now I use Selenoid instead of Selenium, but the Rack server is not working.
Does Selenoid only work with a remote URL? Does it not create a Rack app?
Let me know if any other info is required.
Thanks!

Selenoid is a replacement for Selenium Grid, and manages the browser instances you're using for testing. It has nothing to do with running the application under test.
The issue you're running into is that Capybara runs the application on the machine you're running the tests on, but when using Selenoid the browsers are running on other machines (containers). This means that when Capybara starts up the application and tells the browser to visit http://localhost:<some port>/some/path, the localhost reference is no longer correct for browsers running on other machines.
To correct that, you need to set Capybara.app_host to the URL of the test machine as seen from the machines/containers the browser instances are running on. Depending on exactly how your container network is configured, you'll also need to either fix the port Capybara uses to run the app or set the Capybara.always_include_port option.
Capybara.app_host = "http://local_machine_as_seen_from_containers"
Capybara.always_include_port = true
or
Capybara.server_port = 1234 # some port number
Capybara.app_host = "http://local_machine_as_seen_from_containers:#{Capybara.server_port}"
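Putting those pieces together, a minimal sketch of a Selenoid setup: the Selenoid hub URL and the hostname are placeholders for your own network, and this assumes the selenium-webdriver 3.x-style capabilities API used elsewhere on this page.

```ruby
require "capybara/rspec"
require "selenium-webdriver"

Capybara.register_driver :selenoid_chrome do |app|
  Capybara::Selenium::Driver.new(app,
    :browser => :remote,
    # Placeholder: the Selenoid hub as reachable from the test machine
    :url => "http://selenoid-host:4444/wd/hub",
    :desired_capabilities => Selenium::WebDriver::Remote::Capabilities.chrome
  )
end

Capybara.javascript_driver = :selenoid_chrome
Capybara.server_host = "0.0.0.0" # listen on all interfaces so containers can reach the app
Capybara.server_port = 1234
# Placeholder: the test machine's address as seen from the browser containers
Capybara.app_host = "http://local_machine_as_seen_from_containers:#{Capybara.server_port}"
```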

Related

Setting up Capybara to have an alias for the server it boots up

So I have some features I'm wanting to test.
For a portion of the features I'm wanting to have requests placed to a different host.
When I'm testing my code locally (outside of Capybara) I'm able to use localhost:3000 for a portion of the requests, and 0.0.0.0:3000 for another portion of the requests.
Any thoughts on the best approach to replicating this approach with Capybara?
If I set the default host to localhost:3000 and spin up a server myself it works, but I haven't yet been able to get it to work otherwise.
FWIW I'm using poltergeist as the js driver.
The two Capybara settings that control the server it starts for the app under test are Capybara.server_host (defaults to 127.0.0.1) and Capybara.server_port (defaults to a random free port). One way to get what you want is to fix the port and then pick a domain that resolves everything to 127.0.0.1 for all hostnames (.test on your machine, possibly, or add a new name to your hosts file that resolves to 127.0.0.1):
Capybara.server_port = 9999
visit('/') # will go to http://127.0.0.1:9999/
visit('http://127.0.0.1:9999')
visit('http://my_app.test:9999')
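For the hosts-file route mentioned above, the addition is a single line (my_app.test is just an example name, and editing /etc/hosts requires root):

```shell
# Resolve an example test hostname to the loopback interface
echo "127.0.0.1 my_app.test" | sudo tee -a /etc/hosts
```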
The other option is to set Capybara.always_include_port = true, which adds the port Capybara has run its server on to any visit call that doesn't specifically include a port (so the server port can remain a random one Capybara picks):
Capybara.always_include_port = true
visit('/') # http://127.0.0.1:<server_port>/
visit('http://my_app.test') # http://my_app.test:<server_port>
visit('http://google.com') # http://google.com:<server_port>
visit('http://google.com:80/') # http://google.com

How to run Capybara specs on Nginx+Node+Rails servers?

An application is running on Rails (localhost:3000) and Node (localhost:8080) servers, and Nginx is used to route requests to either the Rails server or Node.
In the Nginx config file we have set a fixed hostname (nginx.localhost) to send requests to Nginx.
The Rails and Node server addresses are also fixed for Nginx.
Now there is a problem running tests in this environment.
I need to run tests with this nginx.localhost as the root_path or Capybara.app_host so that I can test the application via Nginx.
Do you have any ideas? All that came to mind was to update the Nginx config file with the port and server generated by RSpec at suite start.
But this is a bit awkward.
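One way to avoid rewriting the Nginx config at suite start is the fixed-port approach shown earlier on this page: pin the port Capybara's server uses so the upstream in nginx.conf can stay constant, and point Capybara.app_host at the Nginx name. A sketch, assuming the hostname and ports from the question:

```ruby
# spec_helper.rb -- pin Capybara's server so the Nginx upstream stays valid
Capybara.server_host = "0.0.0.0" # let Nginx reach the test server
Capybara.server_port = 3000      # must match the Rails upstream in nginx.conf
Capybara.app_host    = "http://nginx.localhost"
```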

Cucumber failing in chrome with "Retry later" message

I'm trying to run my Cucumber tests and they seem to stop randomly. A new page is visited but nothing renders on the page except Retry later, as in the following screenshot.
I'm on OS X 10.9.3, Chrome 35.0.1916.114, and running with bundle exec cucumber. It also happens in Firefox if I change the JavaScript driver.
The problem was not with Chrome, Cucumber, or Capybara. It was with Rack::Attack. 127.0.0.1 was whitelisted, but according to this GitHub issue it wasn't whitelisting IPv6 and transitional loopback IP addresses.
To simplify things I just moved Rack::Attack to be production-only.
tl;dr
Rack::Attack was to blame. Unless you need it in your test environment, just make the gem production-only.
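A minimal sketch of making the gem production-only in a standard Rails Gemfile:

```ruby
# Gemfile -- Rack::Attack is loaded (and its middleware inserted) only in production
gem "rack-attack", group: :production
```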

Capybara / Poltergeist Internals: Running the Server in a Separate Process/Environment

I'm attempting to use Capybara and Poltergeist to automate taking screenshots of my Rails application. I already have this sort of working, and I've integrated the functionality with Rails' asset pipeline. (See this question for more on that.)
While testing my current setup, however, I've noticed lots of weird issues that seem to be caused by Capybara and my application running in the same process. Is there a way to make Capybara run its server in a separate process, in a different environment?
You can make Capybara run against an external server by following the directions in the "Calling remote servers" section of the README. I think your best bet would be to fork an external server yourself before calling Capybara (e.g., in a rake task or a hook in your testing framework) and then treat it as a remote server in Capybara.
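A sketch of that fork-and-treat-as-remote approach, assuming a Rails app bootable with rails server; the port and environment name are illustrative, not taken from the question:

```ruby
# Boot the app in a separate process, then point Capybara at it as a remote server
port = 3001
server_pid = spawn({ "RAILS_ENV" => "test" },
                   "bundle", "exec", "rails", "server", "-p", port.to_s)

Capybara.run_server = false                  # don't start an in-process server
Capybara.app_host   = "http://127.0.0.1:#{port}"

at_exit { Process.kill("TERM", server_pid) } # clean up the forked server
```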

cucumber javascript enabled features inside vagrant

I am a big fan of both Cucumber and Vagrant. I'm using Cucumber to drive development BDD-style, and have configured and set up my dev environment once in a Vagrant VM so it can be easily distributed to other developers.
I've hit a snag when it comes to testing JS on the Vagrant VM. I'm using Capybara alongside Cucumber and have tried both Selenium and capybara-webkit as JS drivers.
Selenium wanted me to install Firefox, so I packaged up the Vagrant box, fully expecting it not to work, and installed Firefox on the VM. After that it complained about being unable to get a stable connection with Firefox. Exact error message: 'unable to obtain stable firefox connection in 60 seconds'.
capybara-webkit complained with 'webkit_server: cannot connect to X server'. I installed xserver-xorg and still no dice.
I would much prefer not to have to install my whole environment on my host in order to do testing, as that would kind of defeat the purpose of having a distributable Vagrant VM with everything a dev needs to work on the app.
Any ideas? I've encountered a similar problem with the notifications from Guard, but that seems like not nearly as big a deal as this issue.
I think all the drivers require X to be installed, so on a Linux server there is no direct way to do it.
Another way is to use Firefox from the host machine via Selenium remote. That means the JavaScript will run in a remote Firefox on another machine.
Remote Selenium WebDriver not responding to Cucumber tests
http://code.google.com/p/selenium/wiki/RubyBindings
It works.
Run the Selenium server on the host:
java -jar selenium-server-standalone.jar
Changes in spec_helper.rb:
require "selenium-webdriver"

profile = Selenium::WebDriver::Firefox::Profile.new
profile["network.http.use-cache"] = false

Capybara.register_driver :firefox_host do |app|
  Capybara::Selenium::Driver.new(app,
    :browser => :remote,
    :url => 'http://10.0.2.2:4444/wd/hub',
    :desired_capabilities => Selenium::WebDriver::Remote::Capabilities.firefox(:firefox_profile => profile)
  )
end

Capybara.server_port = 9300
Capybara.app_host = 'http://localhost:9300'
Capybara.javascript_driver = :firefox_host
Changes in Vagrantfile:
config.vm.forward_port 9300, 9300
You may want to try running Firefox headless.
http://www.alittlemadness.com/2008/03/05/running-selenium-headless/
This way, you can run tests without seeing them as well.
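On a Linux VM, one common headless route (assuming the xvfb package is available on your distribution) is to run the suite under a virtual framebuffer:

```shell
# Run the suite under a virtual X display instead of a real one
sudo apt-get install -y xvfb
xvfb-run --auto-servernum bundle exec cucumber
```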
I ran into a similar issue. Try SSHing into your VM and executing xhost + from the command line. This disables access control and allows clients to connect from any host to access your display.
You might also need to export DISPLAY=:0, because WebDriver will default to using this display when it launches Firefox.
