Capybara with RSpec in Jenkins - how to output to the console

I'm trying to output text to the console during tests, so I can see what is happening and keep a history of the test runs, but nothing seems to work: neither printf nor $stdout.write.
Should I just use a text log file and be done with it, or is it possible to output to the Jenkins console?

As explained in https://content.pivotal.io/blog/what-happened-to-stdout-on-ci and https://github.com/ci-reporter/ci_reporter#environment-variables, you need to set CI_CAPTURE=off so that ci_reporter stops capturing standard output and your test output reaches the console again.

Here is a copy of my Jenkins job config for running RSpec tests of a Rails app:
[ -d jenkins ] && rm -rf jenkins
mkdir jenkins
cp ~/configs/yourApp/default-db-config config/database.yml
rake db:migrate
rake db:test:prepare
export RAILS_ENV=test
export SPEC_OPTS="--no-drb --format documentation --format html --out jenkins/rspec.html"
rake spec
First it deletes any previous test history from the workspace if it exists.
Next it creates a jenkins directory in the workspace for storing the test output.
Then it sets up the app for testing with a working DB config (I don't store DB config files in my git repo).
Finally it migrates the dev DB if required, prepares the test DB, sets the RAILS_ENV to test and runs the tests with the specified SPEC_OPTS.
The important bits are as follows:
--format documentation ... this sends sensible output of test progress to your console log. Write your tests properly and this will be infinitely more useful than any puts commands you might have considered using (see the example spec below).
--format html ... this writes an HTML report of the test results to the jenkins directory created earlier, at the path given by the --out option. Add the following to your job description to show those results on the main page of this job:
<iframe src='http://jenkins.your-domain.com/job/your-tests/ws/jenkins/rspec.html' width="100%" height="600" frameborder="0"></iframe>
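As an illustration (the spec below and its strings are made up, not from the question), descriptive describe/it strings are what make the documentation format readable in the console:
# spec/features/sign_in_spec.rb - hypothetical example
require 'rails_helper'

describe 'Signing in' do
  it 'shows the dashboard after a successful login' do
    # ...
  end

  it 'rejects an invalid password with an error message' do
    # ...
  end
end
which appears in the Jenkins console log as:
Signing in
  shows the dashboard after a successful login
  rejects an invalid password with an error message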
Hopefully that should get you up and running with a more useful Jenkins test job for RSpec.

Not sure if Jenkins will change this (I don't think so), but in RSpec you can write
puts response.body
or
puts "My mommy made me mash my M&M's"
or whatever else you want, and it will show up in the console/results.

Related

Exporting RSpec tests from Docker

I am testing using Docker to run my Ruby on Rails RSpec tests. This will allow me greater flexibility to test against different databases etc.
In our Bamboo pipeline it's all working - except I assume the rspec.xml file is being placed inside the Docker container, and not into the Bamboo working directory. If any tests fail, the Bamboo job fails, and the number of tests is no longer reported in Bamboo; I assume that's because of the 'missing' rspec.xml file.
We have a JUnit XML parser task which now also fails since it cannot find the XML output, and since the Docker container is deleted at the end of the tests I assume the file is deleted with it.
Is there any way to output this file to the Bamboo working directory?
Running the specs like this:
docker run --volume /home/bamboo/bamboo-agent-home/xml-data/build-dir/DIR-ABS2711-UTJGB:/usr/src/app --rm --env RAILS_ENV=test bond:latest bundle exec rake db:migrate rspec_tests:model_tests:run
Note that I am using a rake task to run the specs.
Thanks
So in order to have RSpec output results in XML, you need something like this:
rspec -r rspec_junit_formatter --format RspecJunitFormatter -o rspec.xml
But since I am using a rake task I cannot use that directly; instead I need to modify the rake task to output XML. Assuming the container's working directory is the mounted /usr/src/app, the relative rspec.xml path then lands in the Bamboo build directory once the run finishes:
require 'rspec/core/rake_task'

RSpec::Core::RakeTask.new(:spec) do |t|
  # Don't fail the rake process on spec failures; Bamboo judges the build from the JUnit XML instead.
  t.fail_on_error = false
  # Same options as the command line above: load the JUnit formatter and write rspec.xml.
  t.rspec_opts = "--no-drb -r rspec_junit_formatter --format RspecJunitFormatter -o rspec.xml"
end

Shortcut to run all rails tests?

I have to type
bundle exec rspec spec lib/crucible_kit/spec
every time I want to run all 700 of my RSpec tests for my Rails application. Is there any way I could shorten this down to just typing "rr" to run all the tests?
If so, where would I put this file in my Rails application? And would I be able to push it to a git branch so my teammates can use it?
Add the line below to your ~/.bashrc (typing it at the command line only defines the alias for the current shell session; it is not saved anywhere automatically):
alias rr="bundle exec rspec spec lib/crucible_kit/spec"
Then reload your shell or run source ~/.bashrc. Note that an alias lives in your home directory, not in the Rails app, so it can't be pushed to the repository for your teammates by itself.
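If you want something the whole team can use from the repository, a small committed script is one option. This is only a sketch (the bin/rr name is an assumption, not an established convention; the spec directories come from the question):
#!/usr/bin/env ruby
# bin/rr - runs the full suite. Commit this file and make it executable once with: chmod +x bin/rr
exec "bundle", "exec", "rspec", "spec", "lib/crucible_kit/spec"
After that, everyone can run ./bin/rr from the app root.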

Minitest Exit on first failure?

I'm using Minitest with Rails.
How do I get it to exit on the first failure if I'm running a bunch of tests? I want this to happen when I'm writing the tests because it's a waste of time for later tests to run after the failed one.
I just found this while searching for a solution, but at least in Rails 5.1 there is a built-in option:
rails test -h
Usage: bin/rails test [options] [files or directories]
You can run a single test by appending a line number to a filename:
bin/rails test test/models/user_test.rb:27
You can run multiple files and directories at the same time:
bin/rails test test/controllers test/integration/login_test.rb
By default test failures and errors are reported inline during a run.
Rails options:
-w, --warnings Run with Ruby warnings enabled
-e, --environment Run tests in the ENV environment
-b, --backtrace Show the complete backtrace
-d, --defer-output Output test failures and errors after the test run
-f, --fail-fast Abort test run on first failure or error
-c, --[no-]color Enable color in the output
So you just need to run rails test -f
You can probably use the minitest-fail-fast gem. You may also modify the Minitest reporters engine to exit after reporting a failure.
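A minimal sketch of the gem route, assuming the gem's documented setup of a Gemfile entry plus a require in the test helper (double-check the require path against the gem's README):
# Gemfile
group :test do
  gem 'minitest-fail-fast'
end

# test/test_helper.rb (after the usual Rails/Minitest requires)
require 'minitest/fail_fast'  # assumed require path; once loaded, the run aborts on the first failure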

Scheduling rake task with cron

I'm trying to set up a daily cron job to update my site stats, but it looks like it doesn't work.
Cron entry (for deployer user):
0 0 * * * cd /var/www/my_site/current && rake RAILS_ENV=production stats:update
I'm running an Ubuntu server, with rbenv.
Any idea what's wrong?
Many times $PATH is defined differently when cron runs compared to when you are working in your own shell. Run "whereis rake" to find the full path to rake, then replace "rake" in the crontab entry with that full path. (I am assuming that the "cd" command is working, so I am focusing on whether "rake" is found and runs properly.)
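For example, if whereis rake points at an rbenv shim, the entry would become something like this (the shim path below is illustrative, not taken from the question):
0 0 * * * cd /var/www/my_site/current && /home/deployer/.rbenv/shims/rake RAILS_ENV=production stats:update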
Has cron sent you any emails with error messages after you added your command to your crontab?
You might want to run "crontab -l" under the proper user account to make sure that your cron command is actually registered within the crontab, especially if you aren't receiving any emails.
The presence of a Gemfile can also affect the ability to properly run rake. See, for example, Error: "Could not find rake", yet Rake is installed

Why Doesn't My Cron Job Work Properly?

I have a cron job on an Ubuntu Hardy VPS that only half works and I can't work out why. The job is a Ruby script that uses mysqldump to back up a MySQL database used by a Rails application, which is then gzipped and uploaded to a remote server using SFTP.
The gzip file is created and copied successfully but it's always zero bytes. Yet if I run the cron command directly from the command line it works perfectly.
This is the cron job:
PATH=/usr/bin
10 3 * * * ruby /home/deploy/bin/datadump.rb
This is datadump.rb:
#!/usr/bin/ruby
require 'yaml'
require 'logger'
require 'rubygems'
require 'net/ssh'
require 'net/sftp'
APP = '/home/deploy/apps/myapp/current'
LOGFILE = '/home/deploy/log/data.log'
TIMESTAMP = '%Y%m%d-%H%M'
TABLES = 'table1 table2'
log = Logger.new(LOGFILE, 5, 10 * 1024)
dump = "myapp-#{Time.now.strftime(TIMESTAMP)}.sql.gz"
ftpconfig = YAML::load(open('/home/deploy/apps/myapp/shared/config/sftp.yml'))
config = YAML::load(open(APP + '/config/database.yml'))['production']
cmd = "mysqldump -u #{config['username']} -p#{config['password']} -h #{config['host']} --add-drop-table --add-locks --extended-insert --lock-tables #{config['database']} #{TABLES} | gzip -cf9 > #{dump}"
log.info 'Getting ready to create a backup'
`#{cmd}`
# Strongspace
log.info 'Backup created, starting the transfer to Strongspace'
Net::SSH.start(ftpconfig['strongspace']['host'], ftpconfig['strongspace']['username'], ftpconfig['strongspace']['password']) do |ssh|
  ssh.sftp.connect do |sftp|
    sftp.open_handle("#{ftpconfig['strongspace']['dir']}/#{dump}", 'w') do |handle|
      sftp.write(handle, open("#{dump}").read)
    end
  end
end
log.info 'Finished transferring backup to Strongspace'
log.info 'Removing local file'
cmd = "rm -f #{dump}"
log.debug "Executing: #{cmd}"
`#{cmd}`
log.info 'Local file removed'
I've checked and double-checked all the paths and they're correct. Both sftp.yml (SFTP credentials) and database.yml (MySQL credentials) are owned by the executing user (deploy) with read-only permissions for that user (chmod 400). I'm using the 1.1.x versions of net-ssh and net-sftp. I know they're not the latest, but they're what I'm familiar with at the moment.
What could be causing the cron job to fail?
When scripts run correctly interactively but not under cron, the problem is usually the environment settings in place ... for example the PATH, as already mentioned by @Ted Percival, but it may be other environment variables too.
This is because cron will not invoke .bash_profile, .bashrc or /etc/profile before executing.
The best way to avoid this is to ensure any scripts invoked by cron do not make assumptions about the environment when executing. Overcoming this can be as simple as including a few lines in your script to make sure the environment is set up properly. For example, in my case I have all the significant settings in /etc/profile (for RHEL), so I include the following line in any script to be run under cron:
source /etc/profile
Looks like your PATH is missing a few directories, most importantly /bin (for /bin/rm). Here's what my system's /etc/crontab uses:
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
Are you sure the temporary file is being created correctly when running as a cron job? The working directory for your script will either be specified in the HOME environment variable, or the /etc/passwd entry for the user that installed the cron job. If deploy does not have write permissions for the directory in which it is executing, then you could specify an absolute path for the dump file to fix the problem.
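For instance, a sketch of that change in datadump.rb (the /home/deploy/backups directory is hypothetical; any absolute path writable by deploy will do), together with a guard so an empty dump is never uploaded silently:
BACKUP_DIR = '/home/deploy/backups'  # hypothetical absolute, writable location
dump = File.join(BACKUP_DIR, "myapp-#{Time.now.strftime(TIMESTAMP)}.sql.gz")

`#{cmd}`
# Stop before the SFTP upload if mysqldump/gzip produced nothing.
if !File.exist?(dump) || File.zero?(dump)
  log.error "Dump #{dump} is missing or empty; aborting"
  exit 1
end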
Is cron sending you emails with logs? If not, pipe the output of the cron job to a log file, and make sure STDERR is redirected to the log as well.
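For example (the log path is illustrative), the crontab entry could redirect both streams like this:
10 3 * * * ruby /home/deploy/bin/datadump.rb >> /home/deploy/log/cron.log 2>&1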
