Realtime logging with Capybara and the Chrome browser

It seems that the Chrome browser, when run under chromedriver with Capybara, is capable of logging everything to the console it was launched from.
How can we stream that log in real time while running tests with Capybara (perhaps under the Cucumber test runner)? The log should include network requests as well as the info/warn/error messages that appear in the Chrome console window.
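One approach (a sketch, not from the original thread): ChromeDriver can expose the browser console and DevTools network events as Selenium logs when the driver is registered with the `goog:loggingPrefs` capability; you can then drain and print those logs after each step. The driver-side setup below is shown only in comments because it needs a running Chrome; the parsing helpers are plain Ruby, and the hook and helper names are assumptions.

```ruby
require "json"

# Assumed Capybara setup (needs the capybara and selenium-webdriver gems and a
# real Chrome, so it is only sketched here):
#
#   Capybara.register_driver :logging_chrome do |app|
#     options = Selenium::WebDriver::Chrome::Options.new
#     options.add_option("goog:loggingPrefs", browser: "ALL", performance: "ALL")
#     Capybara::Selenium::Driver.new(app, browser: :chrome, options: options)
#   end
#
# In a Cucumber AfterStep hook you could then drain the two log streams:
#   page.driver.browser.manage.logs.get(:browser)      # console messages
#   page.driver.browser.manage.logs.get(:performance)  # DevTools events as JSON

# Format a console entry for printing (here a plain hash; the real Selenium
# entry object responds to #level and #message instead of hash keys).
def format_console_entry(entry)
  "[#{entry['level']}] #{entry['message']}"
end

# Pull "url -> status" lines out of raw :performance log messages.
def network_lines(raw_messages)
  raw_messages.filter_map do |raw|
    msg = JSON.parse(raw)["message"]
    next unless msg["method"] == "Network.responseReceived"
    response = msg.dig("params", "response")
    "#{response['url']} -> #{response['status']}"
  end
end
```

Printing the result of these helpers after each step gives the "streaming" effect, at the cost of logs arriving per step rather than per event.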

Related

Rails headless browser testing without installing browser

We're doing some testing in Rails using a headless browser for our feature tests, with RSpec and the webdrivers gem (https://github.com/titusfortner/webdrivers) to fetch the Chrome driver.
However we've had some issues whereby some developers don't have Chrome installed (and don't intend to) and also we have our app running on a Jenkins pipeline and we want to avoid having to install Chrome on the server for the tests...
Is it possible to run a headless browser without installing the actual browser, avoiding a hard dependency on the system the tests run on?
I've read conflicting articles: some state that headless mode lets you test even when the browser isn't installed, but the documentation for both Chrome and Firefox states that the browser must be installed to use their drivers.
I also came across https://github.com/rubycdp/cuprite, which bypasses the need for WebDriver, but it still seems to have a hard dependency on Chrome being installed on the system the tests run on.
Headless or not, the browser itself must be installed; the webdrivers gem only downloads the matching driver binary. You have to choose a browser:
Chrome: chromedriver
Firefox: geckodriver
Edge: edgedriver
Internet Explorer: iedriver
See https://github.com/titusfortner/webdrivers
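For completeness, a minimal headless registration sketch (assuming the capybara, selenium-webdriver, and webdrivers gems are already loaded, e.g. in rails_helper.rb). Note that headless mode only hides the window; the Chrome binary itself must still be present on the machine.

```ruby
# Sketch: register a headless Chrome driver for Capybara. The driver name
# :headless_chrome is arbitrary.
Capybara.register_driver :headless_chrome do |app|
  options = Selenium::WebDriver::Chrome::Options.new
  options.add_argument("--headless")
  options.add_argument("--disable-gpu")
  Capybara::Selenium::Driver.new(app, browser: :chrome, options: options)
end

Capybara.javascript_driver = :headless_chrome
```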

Capybara Timeout when waiting for S3 images

I'm running tests with Capybara and Chrome (not headless, to make debugging easier), and some of my tests fail with a timeout error:
Net::ReadTimeout
However, those pages work fine outside of tests.
When the testing browser is opened on those failing pages, I can always see it waiting for S3 images, which might be causing the timeout.
This issue appeared not long ago, it used to work perfectly fine.
Any idea on how to fix this?
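One thing to try, a sketch under the assumption that the slow S3 responses are simply exceeding Selenium's HTTP client timeout: raise the read timeout when registering the driver. The capybara and selenium-webdriver gems are assumed to be loaded, and the driver name is made up.

```ruby
# Sketch: give Selenium's HTTP client more time before Net::ReadTimeout fires,
# so slow external image loads don't abort the command.
Capybara.register_driver :chrome_long_timeout do |app|
  client = Selenium::WebDriver::Remote::Http::Default.new
  client.read_timeout = 120 # seconds; the default is 60

  Capybara::Selenium::Driver.new(app, browser: :chrome, http_client: client)
end

Capybara.javascript_driver = :chrome_long_timeout
```

A longer timeout only hides the slowness, of course; blocking or stubbing the external S3 requests in the test environment would address the cause.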

Performance testing UI with YSlow and Jenkins

I have a web application on which I would like to run YSlow, and the tests need to be integrated with Jenkins. The application has a login page: if I point YSlow at the application's URL without a valid user logged in, only the login page is displayed. So how do I test performance using YSlow and Jenkins? Is it possible to automate the login part?
Since YSlow can generate a performance report from an input HAR file, I would use a proxy server to record the performance data while navigating the web-site with Selenium.
This way, you can independently measure the performance on a real browser (Chrome, Firefox, Safari...) or on a headless one like PhantomJS.
To proceed, first download browsermob proxy and unzip it:
https://github.com/lightbody/browsermob-proxy/releases
Then write the code to launch the proxy server and run your scenario with a Selenium client. This example is written in Python, but you could write it the same way in Java, Ruby, JavaScript or PHP.
from browsermobproxy import Server
from selenium import webdriver
import json, sys

# set up the proxy server
server = Server(r"C:\Download\browsermob-proxy-2.1.0-beta-5\bin\browsermob-proxy")
server.start()
proxy = server.create_proxy()

driver = None
try:
    # set up the browser to route its traffic through the proxy
    profile = webdriver.FirefoxProfile()
    profile.set_proxy(proxy.selenium_proxy())
    driver = webdriver.Firefox(firefox_profile=profile)

    # start collecting the data
    proxy.new_har("login")

    # log in to a twitter account
    driver.get("https://twitter.com/login?lang=en")
    driver.find_element_by_css_selector(".js-username-field").send_keys("my name")
    driver.find_element_by_css_selector(".js-password-field").send_keys("my password")
    driver.find_element_by_css_selector("button.submit").click()

    # save the collected HAR data to a file
    with open(sys.argv[1], 'w') as har_file:
        har_file.write(json.dumps(proxy.har, indent=2))
finally:
    if driver:
        driver.quit()
    server.stop()
Finally, run the script and generate the performance report from the command line:
python perf-login.py perf-login.har
yslow perf-login.har
Yes, you can, but you are still missing the automated login part, and you need more than YSlow and Jenkins for that. The right approach depends heavily on the application's network architecture and on whether you run the tests locally or remotely.
Robotframework + Selenium Server
With Robot Framework's selenium2library you can use the Selenium Server bindings to manipulate the browser's DOM. In other words, you can create very simple automated login tests and actions. Afterwards, the SSHLibrary or Terminal library can run the YSlow commands; you just need to write the output files to a location Jenkins can read. It's a fairly complex solution, suited to advanced network architectures with many dependencies.
PhantomJS + CasperJs
If you need to run your performance tests locally, you can take advantage of the headless browser PhantomJS. In combination with CasperJS you can manipulate the DOM to automate the login process. PhantomJS is also compatible with YSlow, so once again you just need to define the output file location for Jenkins.
For both solutions (or any other) you will need additional Jenkins plugins to read the output files generated by YSlow.

Performance monitoring of production site using Shell script and Selenium Web Drivers

Let me briefly explain what I am trying to do here. I need to periodically check the response time of my site by logging into the system and noting the time it takes to load the welcome page.
I am doing this using Selenium WebDriver and Java. I currently measure the response time with org.apache.commons.lang3.time.StopWatch, which starts when the user hits the login button and stops when the welcome page has rendered completely. I check whether the response time is above a threshold and, if the system is responding slowly, send a mail alerting the admin.
Currently, I have created an executable jar file that opens the web browser using Selenium WebDriver and checks the response time. I have also created a job in Jenkins, using DOS commands, that runs periodically on a cron schedule. I'm doing this on my Windows 7 PC, with Jenkins installed on localhost. The scheduled Jenkins job builds periodically, but I can't see any activity such as the browser opening and performing the tasks described above. It runs perfectly when I use the Windows scheduler to execute the batch file. My ultimate goal is to run the Selenium WebDriver tests on a Linux system via Jenkins, with the Jenkins server installed on a Linux machine.
Any help will be great! Also let me know if anybody wants to see the code.
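The measure-and-compare step described above (a StopWatch around the login, then a threshold check) boils down to something like the following. This is a Ruby sketch of the idea, not the poster's Java code, and `load_welcome_page` is a hypothetical stand-in for the Selenium steps.

```ruby
# Time a block with a monotonic clock (immune to wall-clock adjustments).
def measure_seconds
  start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
  yield
  Process.clock_gettime(Process::CLOCK_MONOTONIC) - start
end

# Decide whether the admin alert mail should be sent.
def alert_admin?(elapsed_seconds, threshold_seconds)
  elapsed_seconds > threshold_seconds
end

# Usage sketch (load_welcome_page and send_alert_mail are hypothetical):
#   elapsed = measure_seconds { load_welcome_page }
#   send_alert_mail if alert_admin?(elapsed, 5.0)
```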

Cucumber failing in chrome with "Retry later" message

I'm trying to run my cucumber tests and they seem to stop randomly. A new page is visited but nothing renders on the page except Retry later as in the following screenshot.
I'm on OS X 10.9.3, Chrome 35.0.1916.114, and running with bundle exec cucumber. It also happens in Firefox if I change the JavaScript driver.
The problem was not with Chrome, Cucumber, or Capybara. It was with Rack::Attack. 127.0.0.1 was whitelisted, but according to this GitHub issue
it wasn't whitelisting IPv6 and transitional loopback IP addresses.
To simplify things I just moved Rack::Attack to be production only.
tl;dr
Rack::Attack was to blame. Unless you need it in your test environment, just make the gem production only.
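If you do want Rack::Attack in the test environment, the fix generalizes: safelist every loopback form, including IPv6 `::1` and the IPv4-mapped transitional form `::ffff:127.0.0.1`, instead of only the literal `127.0.0.1`. A small helper using Ruby's stdlib IPAddr; the `Rack::Attack.safelist` usage in the comment is a sketch, not tested here.

```ruby
require "ipaddr"

# True for 127.0.0.1 (and the whole 127/8 range), ::1, and IPv4-mapped
# loopback addresses such as ::ffff:127.0.0.1.
def loopback?(raw_ip)
  ip = IPAddr.new(raw_ip)
  ip = ip.native if ip.ipv6? && ip.ipv4_mapped?
  ip.loopback?
rescue IPAddr::InvalidAddressError
  false
end

# Sketch of how this could plug into Rack::Attack in an initializer:
#   Rack::Attack.safelist("allow loopback") { |req| loopback?(req.ip) }
```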
