Repo: https://github.com/samsoncms/cms
I am trying to run automated Behat tests for a web application. Locally everything works fine, but when I install Apache on Travis and set the default host, it always uses "PHP Version 5.3.10-1ubuntu3.18".
What should I configure in .travis.yml to make Apache use the PHP version of the current build environment?
The manual at travis-ci.com/user/languages/php/#Apache-%2B-PHP does not work.
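For reference, the recipe in that manual boils down to roughly the sketch below. The key idea is that Apache has to serve PHP through php-fpm from the phpenv-managed PHP that Travis selects, rather than through the distribution's libapache2-mod-php (which is where the system PHP 5.3 comes from). The PHP version and the travis/vhost.conf file (a vhost you add to your repository that proxies .php requests to php-fpm on 127.0.0.1:9000) are placeholders to adapt:
language: php
php:
  - 5.6
before_script:
  - sudo apt-get update
  - sudo apt-get install -y apache2 libapache2-mod-fastcgi
  # run php-fpm from the phpenv-selected PHP so Apache uses that version
  - sudo cp ~/.phpenv/versions/$(phpenv version-name)/etc/php-fpm.conf.default ~/.phpenv/versions/$(phpenv version-name)/etc/php-fpm.conf
  - ~/.phpenv/versions/$(phpenv version-name)/sbin/php-fpm
  # enable the modules the fastcgi vhost needs and point the default site at the build dir
  - sudo a2enmod rewrite actions fastcgi alias
  - sudo cp -f travis/vhost.conf /etc/apache2/sites-available/default
  - sudo sed -e "s?%TRAVIS_BUILD_DIR%?$(pwd)?g" --in-place /etc/apache2/sites-available/default
  - sudo service apache2 restart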
I am facing an issue while running JMeter in a Docker container. The script works fine when I run it through the GUI or CLI on my local machine, but when I execute the same script in the container it fails.
Below is the issue.
So I am using a Beanshell PostProcessor to capture response cookies. Below is the code for the same:
props.put("MyCookie1","${COOKIE_one}");
props.put("MyCookie2","${COOKIE_two}");
props.put("MyCookie3","${COOKIE_three}");
These parameterized values work fine on my local machine (Windows 10), but when I run the same script in the container they do not get resolved.
I am using the "alpine:3.12" base image for the container.
NOTE: On my local machine the JMeter version is 5.4.1 and the Java version is Java 11. In the Docker container the JMeter version is 5.3 and the Java version is Java 8. The API I am hitting is hosted on AWS Lambda.
You forgot the most important detail: your Dockerfile
Blind shot: in order to be able to access cookies as COOKIE_one, etc., you need to add an extra property, CookieManager.save.cookies=true, either to the user.properties file or to pass it to the JMeter startup script via the -J command-line argument, like:
./jmeter -JCookieManager.save.cookies=true -n -t test.jmx -l result.jtl
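Or, for the user.properties route, add the same line once to the user.properties file in JMeter's bin directory so it applies to every run:
CookieManager.save.cookies=true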
Also according to JMeter Best Practices:
Since JMeter 3.1 you should be using JSR223 Test Elements and Groovy language for scripting
You should always be using the latest version of JMeter
So it may be worth considering a migration to Groovy; you will only need to amend your code from:
props.put("MyCookie1","${COOKIE_one}")
to
props.put("MyCookie1",vars.get("COOKIE_one"))
where vars stands for the JMeterVariables class instance; see Top 8 JMeter Java Classes You Should Be Using with Groovy for more information if needed.
Also, update your Dockerfile to use the latest stable version of JMeter.
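As an illustration of that migration, here is a minimal JSR223 PostProcessor sketch in Groovy; the property and variable names are taken from the question, while the null check and logging are additions:
// JSR223 PostProcessor, language: Groovy
// Copies the cookie variables saved by the HTTP Cookie Manager
// (requires CookieManager.save.cookies=true) into JMeter properties.
[MyCookie1: 'COOKIE_one', MyCookie2: 'COOKIE_two', MyCookie3: 'COOKIE_three'].each { propName, varName ->
    def value = vars.get(varName)
    if (value != null) {
        props.put(propName, value)
    } else {
        log.warn(varName + ' is not defined - check that CookieManager.save.cookies=true is set')
    }
}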
Technology used:
- Windows 10
- Docker for Windows
- DevilBox
- Drupal 8.6.4
- Cygwin (optional, to simulate Linux commands)
When attempting to add a new module via URL or file upload in Drupal 8, the site asks me for FTP credentials and I have no more ideas where to find or set them.
I have a basic install of Devilbox running a brand new installation of Drupal 8. (Devilbox is a dockerized php stack).
To solve my problem I bypassed finding the FTP credentials.
I will change the accepted answer to the first correct answer that is not mine and is not a bypass.
First step: stop using Cygwin and start using PowerShell.
Next step, navigate to the site's installation within devilbox:
/devilbox/data/www/<yoursite>/htdocs
Then run the command: composer self-update
Followed by:
composer require drupal/<drupal module to add>
Magically, the module now appears on the Modules page in Drupal 8.
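Put together, the workaround is just (the module name is a placeholder):
cd /devilbox/data/www/<yoursite>/htdocs
composer self-update
composer require drupal/<module to add>
It sidesteps the FTP prompt because Composer writes the module directly to the filesystem as your user, so Drupal's FTP-based installer is never invoked.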
TL;DR: Any idea about how to properly configure capybara to be able to drive a remote selenium browser in a docker container with default Rails minitest system test?
I'm running Rails in a dockerized environment.
Now I want to start some "system tests", but since I'm running inside Docker I run into some issues.
I'm using the default test suite (minitest?) with capybara and selenium-webdriver gems.
I've already installed the chromedriver package in the container using the following:
RUN apt-get install -y chromedriver \
&& ln -s /usr/lib/chromium-browser/chromedriver /usr/local/bin
But running rails test:system outputs the following error: Selenium::WebDriver::Error::WebDriverError: Unable to find chromedriver.
In fact I don't know whether Chrome itself is installed or not:
which chrome outputs nothing.
which chromium outputs /usr/bin/chromium.
I also tried with xvfb without success.
So (since I had no clue) I tried to go further and actually go with a dockerized system test environment as well.
I found some Docker images from Selenium, so alongside my Rails and database containers I ran a selenium-standalone-chrome container (the actual docker-compose.yml I'm using is here).
Then I found some useful information about the configuration needed to let Capybara drive the remote Selenium browser.
All the examples I found on the internet use RSpec, but since I'm using the default minitest I tried to adapt the Capybara driver to it, though I had some doubts about how to do it and where to put the configuration.
For system tests I guessed that the best location is the file application_system_test_case.rb. I also found and tried many different Capybara configurations and ended up with the following, which seems to be the most complete (available here).
At that point the tests seem to run, since I get no configuration error, but they always fail.
They fail regardless of whether I call the driver configuration (the setup_remote method where I defined the server host and port) before the test case.
With or without the call I got the "site can't be reached" error (here is the screenshot)
Here is the test file I used. (Testing some react dynamic display)
However, I can access the selenium container at the given URL from a browser on my host machine, and both containers see each other (I ran some pings from within the containers' shells).
The following SO questions, while helpful, don't work for me:
Dockerized selenium browser cannot access Capybara test url
How can I run headless browser system tests in Rails 5.1?
Any idea about how to properly configure capybara to be able to drive a remote selenium browser in a docker container with default Rails minitest system test?
Thank you very much.
You have to override the host method so Capybara uses the container's IP address. Check out this post: https://medium.com/@pacuna/using-rails-5-1-system-tests-with-docker-a90c52ed0648
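For illustration, a minimal sketch of that idea in application_system_test_case.rb with minitest system tests; the "selenium" service name and the fixed port 3001 are assumptions to adapt to your docker-compose.yml. The point is that the browser runs in another container, so the test app must be served on an interface that container can reach, and Capybara must point the browser at that address instead of 127.0.0.1:
require "test_helper"
require "socket"

class ApplicationSystemTestCase < ActionDispatch::SystemTestCase
  # Drive the remote standalone-chrome container instead of a local chromedriver
  driven_by :selenium, using: :chrome, screen_size: [1400, 1400], options: {
    url: "http://selenium:4444/wd/hub" # docker-compose service name (assumption)
  }

  def setup
    # Serve the app on all interfaces and tell Capybara which address the
    # remote browser should visit (this container's IP, not localhost)
    Capybara.server_host = "0.0.0.0"
    Capybara.server_port = 3001 # arbitrary fixed port (assumption)
    Capybara.app_host = "http://#{IPSocket.getaddress(Socket.gethostname)}:#{Capybara.server_port}"
    super
  end
end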
I am new to Ruby on Rails. My question is:
Is it possible to set up Redmine and install the plugins on my local machine (macOS Sierra), test the Redmine application on localhost, and only once everything works deploy it to a Linux server?
If it is possible, which part of the code should I modify in order to deploy it to the Linux server successfully? (Both my local machine and the Linux server run a MySQL database.)
Yes, it is possible, and you don't need to change any part of the Redmine code to do so. Deployment of Rails apps is often done with a tool called Capistrano (http://capistranorb.com/), which connects to your server over SSH, checks out the code and runs any additional installation steps necessary. This approach requires you to have your app (Redmine and plugins in your case) in a git repository (or Subversion, etc.). In the simplest case, fork Redmine on GitHub and add any plugins as git submodules.
As you're unfamiliar with the platform, I'd suggest starting with a simple Rails app that you create locally. Once you have worked out deploying that to a remote server, tackle Redmine.
It sounds like a lot of upfront effort, but it's worth it, since it enables you to work on your local machine, make changes and then deploy the changed code with a single command.
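For illustration, a minimal Capistrano 3 sketch; the repository URL, server name, deploy path and linked files/dirs below are placeholders, not taken from the question. After adding the capistrano gem and running cap install, the generated files are edited roughly like this:
# config/deploy.rb
lock "~> 3.11"                     # match the Capistrano version you installed
set :application, "redmine"
set :repo_url,    "git@github.com:youruser/redmine.git"  # your fork of Redmine
set :deploy_to,   "/var/www/redmine"
append :linked_files, "config/database.yml", "config/configuration.yml"
append :linked_dirs,  "files", "log", "tmp"

# config/deploy/production.rb
server "your.linux.server", user: "deploy", roles: %w[app db web]
Deploying then becomes a single command from your local machine: cap production deploy.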
If the Redmine installation on your local host has the same installation path as on the production server, then you can just copy the installation files to the production server. You will also have to copy the database to the production server.
If the installation path differs between your local host and the production server, then you will have to install Redmine and the plugins on the production server.
I'd like to set up a Rails development environment where everything runs in an Ubuntu VM, but I use Windows 7 for the display.
Under Ubuntu (which is a VirtualBox VM):
ruby
rails environment (rvm, rails, rails webserver...)
git
Under Windows (the native OS):
browser
IDE
I SSH into the VM to execute commands, and files are shared over NFS.
Everything works great, but I would like to try RubyMine as my editor. So in this setup I have two choices:
Install RubyMine on Windows and configure it so it executes its commands in the VM via SSH.
Install RubyMine on Ubuntu and display it on Windows with an X server running on Windows (such as Xming): rubymine --display windows_ip:0
So my question is: is it possible to configure RubyMine so that it executes commands via SSH?
I have seen this feature request, so my guess is that it's not currently possible.
We're currently working on this feature and it will be available in the next 5.0 EAP.
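In the meantime, a sketch of option 2; the launcher path is an assumption that depends on how RubyMine was installed, and Xming must be running on the Windows host and accepting connections from the VM:
# inside the Ubuntu VM
export DISPLAY=windows_ip:0        # address of the Windows host running Xming
/opt/rubymine/bin/rubymine.sh &    # or just `rubymine` if a launcher is on the PATH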