How to see stdout on Travis CI? - travis-ci

I want to see the table names in a database created via the .travis.yml file. Locally I just type
mysql myapp_test -e 'show tables;'
In the .travis.yml file I put
- mysql myapp_test -e 'show tables;'
but this doesn't show a list of tables on the Travis Console.
How do I get stdout to show up in the Travis console?
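A minimal .travis.yml sketch of one approach that typically works, assuming the command is placed under one of the build lifecycle phases (the CREATE DATABASE line is illustrative and may differ from your setup):
services:
  - mysql
before_script:
  - mysql -e "CREATE DATABASE IF NOT EXISTS myapp_test;"   # illustrative setup step
  - mysql myapp_test -e 'show tables;'                      # its stdout appears in the job log
Commands listed under phases such as before_script or script are executed by the build, and their standard output is written to the job log, so the table list should show up there.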

Related

Set elasticsearch initial password via docker-compose

Can I know how to set the initial password for an Elasticsearch database using docker-compose?
bin/elasticsearch-setup-passwords auto -u "http://192.168.2.120:9200"
See this:
The initial password can be set at start up time via the ELASTIC_PASSWORD environment variable:
docker run -e ELASTIC_PASSWORD=MagicWord docker.elastic.co/elasticsearch/elasticsearch-platinum:6.1.4
Also, for newer images (docker.elastic.co/elasticsearch/elasticsearch:7.14.0), an ELASTIC_PASSWORD_FILE environment variable was added, as mentioned in Configuring Elasticsearch with Docker:
For example, to set the Elasticsearch bootstrap password from a file, you can bind mount the file and set the ELASTIC_PASSWORD_FILE environment variable to the mount location. If you mount the password file to /run/secrets/bootstrapPassword.txt, specify:
-e ELASTIC_PASSWORD_FILE=/run/secrets/bootstrapPassword.txt
So adding these environment variables to your docker-compose.yaml should, I think, work for you.
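For reference, a minimal docker-compose.yaml sketch of that suggestion; the image tag, password value, and single-node settings here are assumptions to keep the example self-contained, and on the default 7.x image the password is only enforced when security is switched on:
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.14.0
    environment:
      - discovery.type=single-node      # run as a single node for local testing
      - xpack.security.enabled=true     # the password is only enforced when security is enabled
      - ELASTIC_PASSWORD=MagicWord      # placeholder; use your own secret
    ports:
      - "9200:9200"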

Could not find Docker hostname on Gitlab CI

I have an app inside a Docker container based on the Elixir image that needs to connect to a database and run tests using a GitLab runner.
The build stage works fine, but there is a problem connecting to the database to run the tests. I tried both connecting to a service and running another database container, but from the logs it looks like the problem is with the Phoenix app:
** (RuntimeError) :database is nil in repository configuration
lib/ecto/adapters/postgres.ex:121: Ecto.Adapters.Postgres.storage_up/1
lib/mix/tasks/ecto.create.ex:40: anonymous fn/3 in Mix.Tasks.Ecto.Create.run/1
(elixir) lib/enum.ex:675: Enum."-each/2-lists^foreach/1-0-"/2
(elixir) lib/enum.ex:675: Enum.each/2
(mix) lib/mix/task.ex:301: Mix.Task.run_task/3
(mix) lib/mix/cli.ex:75: Mix.CLI.run_task/2
This is what the config/test.exs file looks like:
config :app, App.Repo,
  adapter: Ecto.Adapters.Postgres,
  username: System.get_env("POSTGRES_USER"),
  password: System.get_env("POSTGRES_PASSWORD"),
  database: System.get_env("POSTGRES_DB"),
  hostname: System.get_env("POSTGRES_HOSTNAME"),
  pool: Ecto.Adapters.SQL.Sandbox
This is the output from the runner:
$ docker run --rm -t $CONTAINER echo $MIX_ENV $POSTGRES_USER $POSTGRES_HOSTNAME $POSTGRES_DB
test username db test_db
I'm trying to figure out why I get this :database is nil error, and whether it is related to GitLab, Ecto, or Phoenix.
Edit
I hard-coded static values in the config/*.exs files (for some reason the environment variables weren't picked up), but now the app can't resolve the PostgreSQL hostname, even though the PostgreSQL instance is running (I checked with docker ps).
Based on the message :database is nil in repository configuration it seems to me like your POSTGRES_DB variable is not set. You can try to change that configuration line to
database: System.get_env("POSTGRES_DB") || "postgres"
to see whether you still get the same error. If you don't, you can debug from there.
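One more thing worth checking, as a hedged guess based on the docker run output above: variables exported in the GitLab job's shell are not passed into a container started with docker run automatically (the echo above is expanded by the runner's shell before Docker starts). If the tests run inside $CONTAINER, the variables have to be forwarded explicitly, roughly like this (mix test is assumed as the test command):
test:
  stage: test
  script:
    - docker run --rm -e MIX_ENV -e POSTGRES_USER -e POSTGRES_PASSWORD -e POSTGRES_DB -e POSTGRES_HOSTNAME $CONTAINER mix test
Passing -e VAR without a value copies that variable from the calling environment into the container.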

GitLab CI for Rails App Using Postgres and Elasticsearch (searchkick gem)

How does one go about configuring a .gitlab-ci.yml file for a Rails app that depends on PostgreSQL and Elasticsearch (via the searchkick gem) to run my tests when I push it to GitLab?
I wanted to post this question, as it took me far too long to find the answer and I don't want others to feel my pain. The example below not only builds my application, but also runs all my specs.
Setup
Rails 5+
PostgreSQL 9.6
RSpec gem
Searchkick gem (handles Elasticsearch queries and configuration)
Configuration
Add the following files to your Rails app with the configurations listed.
config/gitlab-ci/gitlab-database.yml
test:
  adapter: postgresql
  encoding: unicode
  pool: 5
  timeout: 5000
  host: postgres
  database: test_db
  user: runner
  password: ""
.gitlab-ci.yml
image: ruby:2.4.1
services:
  - postgres:latest
  - elasticsearch:latest
variables:
  POSTGRES_DB: test_db
  POSTGRES_USER: runner
  POSTGRES_PASSWORD: ""
  ELASTICSEARCH_URL: "http://elasticsearch:9200"
stages:
  - test
before_script:
  - bundle install --without postgres production --jobs $(nproc) "${FLAGS[@]}"
  - cp config/gitlab-ci/gitlab-database.yml config/database.yml
  - RAILS_ENV=test bundle exec rails db:create db:schema:load
test:
  stage: test
  script:
    - bundle exec rspec
And that's it! You're now configured to auto-run your specs on GitLab for each push.
Further Explanation
Let's start with PostgreSQL. When we start our new runner, the application we're copying in won't know how to connect to Postgres properly. So we create a new database.yml file, which we prefix with gitlab- so it doesn't conflict with our actual configuration, and copy it into the runner's config directory. The cp command not only copies the file, but also replaces it if it already exists.
The values that have to line up are database:, user:, and password:. They must match the POSTGRES_DB, POSTGRES_USER, and POSTGRES_PASSWORD environment variables we set for the postgres service, which ensures everything connects properly.
Okay, connecting to PostgreSQL is well explained and documented on GitLab's website. So how did I get Elasticsearch working, which isn't explained very well anywhere?
The magic happens again in variables. We needed to set the ELASTICSEARCH_URL environment variable, which Searchkick picks up, because it looks for Elasticsearch at http://localhost:9200 by default. Since we're running Elasticsearch as a service, we have to tell it explicitly not to use the default and to use our service's hostname instead. So we replace http://localhost:9200 with http://elasticsearch:9200, which maps correctly to our service.
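One caveat, as an assumption beyond the original setup: newer Elasticsearch images (7.x) may refuse to start as a plain service because they expect cluster discovery settings. If you hit that, recent GitLab versions let you pass a startup command through the expanded services syntax, roughly:
services:
  - postgres:latest
  - name: elasticsearch:7.14.0
    alias: elasticsearch
    command: ["bin/elasticsearch", "-Expack.security.enabled=false", "-Ediscovery.type=single-node"]
The alias keeps the http://elasticsearch:9200 URL from the variables above working.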

How to set up Travis CI and postgresql using custom db credentials?

I'm trying to set up custom Postgres credentials with Travis CI, because I'd like to avoid changing the credentials already defined in the code being tested.
The testing code defines that the database should be accessed on:
'sqlalchemy.url': 'postgresql://foo:bar@localhost/testing_db'
I've therefore created a database.travis.yml file:
postgresql: &postgresql
  adapter: postgresql
  username: foo
  password: bar
  database: testing_db
...and added the following to my .travis.yml:
services:
  - postgresql
before_script:
  - psql -c 'create database stalker_test;' -U postgres
  - mkdir config && cp database.travis.yml config/database.yml
However, I am still getting this during testing:
OperationalError: (psycopg2.OperationalError) FATAL: role "foo" does not exist
What am I doing wrong?
Adding the following to .travis.yml solved my issue. No need for a database.travis.yml file.
before_script:
  - psql -c "CREATE DATABASE testing_db;" -U postgres
  - psql -c "CREATE USER foo WITH PASSWORD 'bar';" -U postgres
database.yml seems to be a Ruby on Rails thing. Travis CI started out with Rails/Ruby testing, so the docs may still reflect that.
You most probably need to do your setup in a separate script or migration step, and not rely on Travis for anything except running the service.

Deploying local database to production and the other way around

I have a Rails app with a model called 'Opportunity'. Say this model has several records stored in the database in my local development environment, and I now want to deploy the app. With it, I want all the data in my local database to be deployed as well.
Is this possible? I have looked at rake tasks and seeding, but neither seems to be quite what I want.
Thanks so much for your help.
For Postgres, the command to export your database is pg_dump.
To dump a database:
$ pg_dump mydb > db.out
To reload this database:
$ psql -d database -f db.out
Why can't you take a dump of the table from the development db and import it into the production db?
For MySQL:
In development:
mysqldump -u username -p developmentdb tablename > for_production.sql
In production:
mysql -u username -p productiondb < for_production.sql
