Unable to query in code, but artisan migrate works

I have set up a new Laravel 5.4 project served by php artisan serve with a MySQL database, all on a Windows machine (I'm not using Homestead or a VM). I am able to run database migrations just fine using php artisan migrate, but if I try to query the database from a controller (for example, DB::connection()->select('select * from users');) I receive the following error:
PDOException in Connector.php line 68: SQLSTATE[HY000] [2002] No connection could be made because the target machine actively refused it.
I can't figure out why Laravel is unable to query the database. I have tried the following things:
I can connect to the database through Sqlyog with the same credentials in my .env file.
I can run php artisan migrate:reset and php artisan migrate successfully.
I've tried pointing to both a Microsoft SQL Server running on my local machine as well as a MySQL server running on my local machine. In both cases php artisan migrate works fine but I'm not able to run queries through my app.
I've tried switching the DB_HOST variable in my .env file between 127.0.0.1 and localhost without luck.
I've tried running php artisan config:clear without luck.
Is there a step I'm missing somewhere?

Related

How to dockerize Foxx services?

I use ArangoDB as a backend server for my web application. So far I have used the Foxx CLI to deploy my code to the ArangoDB server. I wanted to deploy my entire application using Docker, but I can't figure out how to add my Foxx service source code to an ArangoDB container. Is it possible? If so, what would be the correct way to do this?
So far I have tried a docker-compose approach: running the official ArangoDB image and building another image equipped with the Foxx CLI to install the source files, but I got a "connection refused" error from the database server when I ran the foxx install command from the container. (The ArangoDB server was working fine, and I could run the foxx install command successfully outside the containers.)
For development purposes, I just keep the Foxx services in development mode and map the Foxx folder in my ArangoDB container (/var/lib/arangodb3-apps/_db/) to a folder in my machine using docker-compose volume definition.
Here is what a sample docker-compose service for ArangoDB could look like:
services:
  arangodb_dev:
    image: arangodb
    container_name: my_arangodb_dev
    environment:
      - ARANGO_ROOT_PASSWORD=XXXXXX
    ports:
      - "8529:8529"
    volumes:
      - ./Arango/db:/var/lib/arangodb3
      - ./Arango/apps_db_system:/var/lib/arangodb3-apps/_db/
Above we map both the Foxx service directory and the DB files directory to local folders for persistence.
Beyond development, you probably want to copy the files to the correct folder in the container instead of mapping the folder.
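One way to sketch that is a small helper image that copies the sources in and installs them over the Docker network. This is untested: the service path, mount point, and database/credentials below are placeholders, and the foxx-cli option names should be double-checked against foxx help install:

```dockerfile
# Hypothetical helper image that installs a Foxx service into the
# arangodb_dev container from the compose file above.
FROM node:alpine
RUN npm install -g foxx-cli
COPY ./my-foxx-service /srv/my-foxx-service
# Install the service at mount point /my-service. Run this container only
# after ArangoDB is accepting connections, or the install will be refused.
CMD foxx install /my-service /srv/my-foxx-service \
    --server http://arangodb_dev:8529 \
    --database _system \
    --username root
```

Starting this container only after the database is actually accepting connections (for example via a healthcheck plus a retry loop) avoids the "connection refused" error described in the question.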

Rails container cannot connect to mysql container with gitlab ci

I am setting up a simple gitlab ci for a Rails app with build, test and release stages:
build:
  stage: build
  script:
    - docker build --pull -t $TEST_IMAGE .
    - docker push $TEST_IMAGE

test:
  stage: test
  services:
    - docker:dind
  script:
    - docker pull $TEST_IMAGE
    - docker run -d --name mysql -e MYSQL_ROOT_PASSWORD=mysql_strong_password mysql:5.7
    - docker run -e RAILS_ENV=test --link mysql:db $TEST_IMAGE bundle exec rake db:setup
The build stage succeeds in building the Docker image and pushing it to the registry.
The test stage launches another MySQL container, which I use as my host DB, but it fails when establishing the connection to MySQL:
Couldn't create database for {"host"=>"db", "adapter"=>"mysql2", "pool"=>5, "username"=>"root", "encoding"=>"utf8", "timeout"=>5000, "password"=>"mysql_strong_password", "database"=>"my_tests"}, {:charset=>"utf8"}
(If you set the charset manually, make sure you have a matching collation)
rails aborted!
Mysql2::Error: Can't connect to MySQL server on 'db' (111 "Connection refused")
I also tried creating a separate Docker network using --network instead of the --link approach, but that did not help.
This happens only on the GitLab runner instance. When I perform those steps on my local machine it works fine.
After much reading, I am starting to think it is a bug in the Docker executor. Am I missing something?
"Connection refused" indicates that the containers can reach each other, but the target container has nothing accepting connections on the selected port. This most likely means you are starting your application before the database has finished initializing. My recommendation is to add an entrypoint to your application container that polls the database until it is up, failing after a few minutes if it never starts. I'd also recommend using networks rather than links, since links are deprecated and do not gracefully handle containers being recreated.
The behavior you're seeing is documented in the mysql image:
No connections until MySQL init completes
If there is no database initialized when the container starts, then a default database will be created. While this is the expected behavior, this means that it will not accept incoming connections until such initialization completes. This may cause issues when using automation tools, such as docker-compose, which start several containers simultaneously.
If the application you're trying to connect to MySQL does not handle MySQL downtime or waiting for MySQL to start gracefully, then putting a connect-retry loop before the service starts might be necessary. For an example of such an implementation in the official images, see WordPress or Bonita.
From the linked wordpress example, you can see their retry code:
$maxTries = 10;
do {
    $mysql = new mysqli($host, $user, $pass, '', $port, $socket);
    if ($mysql->connect_error) {
        fwrite($stderr, "\n" . 'MySQL Connection Error: (' . $mysql->connect_errno . ') ' . $mysql->connect_error . "\n");
        --$maxTries;
        if ($maxTries <= 0) {
            exit(1);
        }
        sleep(3);
    }
} while ($mysql->connect_error);
A sample entrypoint script to wait for mysql without changing your application itself could look like:
#!/bin/sh
wait-for-it.sh mysql:3306 -t 300
exec "$@"
The wait-for-it.sh script comes from vishnubob/wait-for-it, and the exec "$@" at the end replaces PID 1 with the command you passed (e.g. bundle exec rake db:setup). The downside of this approach is that the database could be listening on the port before it is really ready to accept connections, so I still recommend having your application do a full login in a retry loop.
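A generic retry helper in the entrypoint is one way to do that full-login loop without touching the application. This is a sketch, and the probe command in the usage comment (mysqladmin ping) is an assumption about what is installed in your image:

```shell
#!/bin/sh
# retry N CMD...: run CMD up to N times, sleeping 1 second between
# attempts; returns 0 on the first success, 1 if every attempt fails.
retry() {
  tries="$1"; shift
  i=0
  while [ "$i" -lt "$tries" ]; do
    if "$@"; then
      return 0
    fi
    i=$((i + 1))
    [ "$i" -lt "$tries" ] && sleep 1
  done
  return 1
}

# Example entrypoint usage (the probe command is an assumption):
# retry 100 mysqladmin ping -h db --silent && exec "$@"
```

Because the probe is an arbitrary command, you can swap in anything from a TCP check to a real application login.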

Artisan migrate docker error

I have Laradock set up and serving a website in Laravel, but when I try to run php artisan migrate I get this error:
SQLSTATE[HY000] [2002] No such file or directory (SQL: select * from information_schema.tables where table_schema = yt and table_name = migrations)
DB_CONNECTION=mysql
DB_HOST=localhost
DB_PORT=3306
DB_DATABASE=yt
DB_USERNAME=root
DB_PASSWORD=root
I can not seem to find a solution to my issue.
First, check which container runs the mysql service:
sudo docker ps
Maybe the mysql container's port is not exposed to localhost (127.0.0.1), so Laravel can't connect to it.
Find the mysql container's name, then change DB_HOST. Let's take an example:
app-container 172.0.0.1
mysql-container 172.0.0.2
When Docker starts the containers, it creates a virtual network for them, which is then exposed to your computer. So if you want Laravel to work with MySQL, you should change DB_HOST to 172.0.0.2 in this example.
I had the same issue with Laradock on macOS: I couldn't connect to the MariaDB container.
My way:
Get correct name for MariaDB container:
docker ps
Inspect the container (for example, if the container name is container_mariadb_1):
docker inspect container_mariadb_1
At the very bottom of the long list of parameters you can see the IPAddress:
"IPAddress": "172.26.0.3"
I put this IP in Laravel's .env config file as DB_HOST, and that was it.
Of course I'm not sure this approach is really correct, but I know it has worked for me at least twice.
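As a shortcut, docker inspect can print just the address with a --format template instead of dumping the whole parameter list (container name taken from the example above):

```shell
# Print only the container's IP address on its attached network(s).
docker inspect \
  --format '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' \
  container_mariadb_1
```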
UPDATE: Also, in my case Laravel connects to MariaDB normally if I use DB_HOST=mariadb in the .env file.
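For reference, that is the question's .env from above with only the host changed, assuming the MariaDB service keeps the default service name mariadb:

```
DB_CONNECTION=mysql
DB_HOST=mariadb
DB_PORT=3306
DB_DATABASE=yt
DB_USERNAME=root
DB_PASSWORD=root
```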

How to run auto-upgrade of ArangoDB 3.1 to 3.2 on docker image

When trying to upgrade a docker container with ArangoDB 3.1 to 3.2 I run into the issue with the database needing upgrade:
FATAL Database '_system' needs upgrade. Please start the server with the --database.auto-upgrade option
FATAL Database '_system' upgrade failed. Please inspect the logs from the upgrade procedure
How do I actually pass the setting? I tried setting command: arangod --database.auto-upgrade true in my docker-compose.yml, but that does nothing.
I also use docker-compose for my system, in which ArangoDB runs under the service name database, like this:
version: '2.1'
services:
  database:
    image: arangodb:3.1.3
    ports:
      - 8529:8529
    volumes:
      - /opt/my-system/Database/arangodb:/var/lib/arangodb3
      - /opt/my-system/Database/arangodb-apps:/var/lib/arangodb3-apps
    restart: always
    healthcheck:
      test: curl -f my-system:8529/_api/version || exit 1
  # ... other services
Before upgrading, I also have to stop my system.
I have just upgraded my arangodb container with the following steps (3.1.3 => 3.2.5)
docker pull arangodb:3.2.5 => get the image you want to upgrade to
docker-compose stop => stop my system which uses the database
backup the database volumes (I just make a copy of /opt/my-system/Database folder)
docker-compose rm -f database => remove the container running old arangodb
update the docker-compose.yml file with the new arangodb image => so image: arangodb:3.1.3 becomes image: arangodb:3.2.5
docker-compose run --rm database arangod --database.auto-upgrade => this will create the database container running v3.2.5, upgrade the database files, then remove the container when it is done.
docker-compose up -d database => start the upgraded database to see if everything is OK
docker-compose start => start the rest of the system, which now uses the upgraded database
If there had been errors during the upgrade, I could easily have rolled back to v3.1.3, as I always keep the previous image and the database files.
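The steps above as a command transcript (paths and versions taken from this answer; run from the directory containing docker-compose.yml):

```shell
docker pull arangodb:3.2.5
docker-compose stop
cp -a /opt/my-system/Database /opt/my-system/Database.bak   # back up the volumes
docker-compose rm -f database
# edit docker-compose.yml: image: arangodb:3.1.3 -> image: arangodb:3.2.5
docker-compose run --rm database arangod --database.auto-upgrade
docker-compose up -d database
docker-compose start
```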
Hope this helps!

Laravel 5 application can't connect MariaDB engine in docker container

I created a new Laravel 5 application in a Docker container. I can access the home URL and get the welcome message. I tried creating new routes and they work too. Then I ran a MariaDB Docker container to link to the Laravel 5 application. Here is where the problems began.
When I'm trying to run migrations in Laravel 5 with the following command:
php artisan migrate --force
And I get the following error message:
Can't connect to MySQL server on '127.0.0.1'
My .env file looks like this:
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_DATABASE=blog
DB_USERNAME=blog
DB_PASSWORD=123456
I know that these variables are used by Laravel to connect to the database, because they appear in the Laravel log file like this:
PDO->__construct('mysql:127….', 'blog', '123456', Array)
The database engine is MariaDB and it is running in a Docker container. This container exposes port 3306 and is linked to the container that runs Laravel. To link the containers I use the following docker command:
docker run -i -t --link mariadb:mysql miguelbgouveia/laravel:v3 /bin/bash
I also know that my MariaDB container is running with the correct configuration, because I use a phpMyAdmin container linked to it and I can connect to the database successfully. I link the MariaDB container to the phpMyAdmin container in the same manner as to the Laravel container (--link mariadb:mysql).
Why can't I connect to the database? Is there any configuration or PHP module missing?
It turns out to be very simple: if I use mysql as the host in my environment variables, it just works, without having to know the IP address of the MariaDB container.
The .env file goes like this:
DB_CONNECTION=mysql
DB_HOST=mysql
DB_DATABASE=blog
DB_USERNAME=blog
DB_PASSWORD=123456
Now I can connect to the MariaDB engine successfully.
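Since Docker has deprecated --link, a user-defined network gives the same name-based resolution. A sketch using the names from this question (the network name appnet is made up; naming the database container mysql keeps DB_HOST=mysql working):

```shell
docker network create appnet
docker run -d --name mysql --network appnet \
  -e MYSQL_ROOT_PASSWORD=123456 mariadb
docker run -i -t --network appnet miguelbgouveia/laravel:v3 /bin/bash
# inside the Laravel container, the hostname mysql now resolves
# to the MariaDB container via the network's built-in DNS
```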
