Generate a build automatically and upload it to an FTP server using Bitbucket Pipelines

I have been trying to generate the build using bitbucket pipelines and upload it to my FTP server.
But every time I try to run the pipeline, I get this error.
git ftp init --user $FTP_USERNAME --passwd $FTP_PASSWORD $FTP_HOST_PATH
fatal: Can't access remote 'ftp://ubuntu:***@mywebsite.com', exiting...
My whole pipeline script looks like this
image: samueldebruyn/debian-git

pipelines:
  default:
    - step:
        caches:
          - composer
        script:
          - apt-get update
          - apt-get -qq install git-ftp
          - git ftp init --user $FTP_USERNAME --passwd $FTP_PASSWORD sftp://$SFTP_HOST:22/$SFTP_FOLDER

I used rsync instead of git-ftp and it worked fine.
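For reference, a minimal sketch of what such an rsync step could look like, assuming an SSH key for the server is already configured in Pipelines; $SSH_USER, $SSH_HOST and $DEPLOY_PATH are illustrative variables, not ones from the question:

image: samueldebruyn/debian-git

pipelines:
  default:
    - step:
        script:
          # rsync may not be present in the base image
          - apt-get update && apt-get -qq install rsync
          # Mirror the working tree to the server over SSH;
          # --delete removes remote files that no longer exist locally
          - rsync -avz --delete --exclude '.git' ./ $SSH_USER@$SSH_HOST:$DEPLOY_PATH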

Related

SonarScanner fails with apt-get not found

I have installed SonarQube on an Ubuntu machine via a Docker image. Everything is working fine and I'm able to log in without issues.
I have connected it to our GitLab installation and can see all available projects, but when I try to configure the existing pipeline with the following, I get stuck.
I have the following pipeline.yml in use (partially shown here):
sonarqube-check:
  stage: sonarqube-check
  image: mcr.microsoft.com/dotnet/core/sdk:latest
  variables:
    SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar"  # Defines the location of the analysis task cache
    GIT_DEPTH: "0"  # Tells git to fetch all the branches of the project, required by the analysis task
  cache:
    key: "${CI_JOB_NAME}"
    paths:
      - .sonar/cache
  script:
    - "apt-get update"
    - "apt-get install --yes openjdk-11-jre"
    - "dotnet tool install --global dotnet-sonarscanner"
    - "export PATH=\"$PATH:$HOME/.dotnet/tools\""
    - "dotnet sonarscanner begin /k:\"my_project_location_AYDMUbUQodVNV6NM7qxd\" /d:sonar.login=\"$SONAR_TOKEN\" /d:\"sonar.host.url=$SONAR_HOST_URL\" "
    - "dotnet build"
    - "dotnet sonarscanner end /d:sonar.login=\"$SONAR_TOKEN\""
  allow_failure: true
  only:
    - master
All looking good, but when it runs it gives me this error:
$ apt-get update
bash: apt-get: command not found
I just don't know how to fix this and can't find a solution anywhere on the internet.
The dotnet/core/sdk image has apt (not apt-get):
$ docker run -ti --rm mcr.microsoft.com/dotnet/core/sdk:latest sh
# apt update
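So a minimal fix, assuming the rest of the job stays unchanged, is to swap the two package commands in the script; note that apt warns that its CLI is not guaranteed stable in scripts, but it works here:

  script:
    - "apt update"
    - "apt install --yes openjdk-11-jre"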
Following the SonarQube documentation, you can use their Docker image with the CLI already installed:
image:
  name: sonarsource/sonar-scanner-cli:latest

variables:
  SONAR_TOKEN: "your-sonarqube-token"
  SONAR_HOST_URL: "http://your-sonarqube-instance.org"
  SONAR_USER_HOME: "${CI_PROJECT_DIR}/.sonar"  # Defines the location of the analysis task cache
  GIT_DEPTH: 0  # Tells git to fetch all the branches of the project, required by the analysis task

cache:
  key: ${CI_JOB_NAME}
  paths:
    - .sonar/cache

sonarqube-check:
  stage: test
  script:
    - sonar-scanner -Dsonar.qualitygate.wait=true
  allow_failure: true
  only:
    - master
apt / apt-get command not found - problem fixed:
I think your /usr/bin has no apt or apt-get; you can download it from https://packages.debian.org/stretch/apt and install it, like this:
wget http://ftp.cn.debian.org/debian/pool/main/a/apt/apt_1.4.9_amd64.deb
dpkg -i apt_1.4.9_amd64.deb

jailshell: composer: command not found

When I SSH into my server and run composer install, it works without any issues. However, when the command is issued from my CI/CD, which SSHes into my server to pull the changes, I get this error:
jailshell: composer: command not found
How do I fix this?
My CI/CD:
deploy:
  stage: deploy
  script:
    - ssh -tt user@domain.com
      "cd /path/to/public_html/ &&
      git checkout master &&
      git pull &&
      composer install &&
      exit"
Edit
To answer @NicoHaase's question: there was a guide about adding an alias in my .bashrc, but that still didn't work, so I ended up downloading composer.phar, adding it to my repository, and running php composer.phar install instead of composer install. It works now, but that isn't a real answer to this question, so I'm not writing it up as one.
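For completeness, here's a sketch of how that workaround slots into the CI/CD script above (everything else unchanged from the question; composer.phar is assumed to be committed at the repository root):

deploy:
  stage: deploy
  script:
    - ssh -tt user@domain.com
      "cd /path/to/public_html/ &&
      git checkout master &&
      git pull &&
      php composer.phar install &&
      exit"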

How to run bitbucket pipeline to deploy php based app on nanobox

I am trying to set up a Bitbucket pipeline for a PHP-based (Laravel-Lumen) app intended to be deployed on nanobox.io. I want this pipeline to deploy my app as soon as code changes are committed.
My bitbucket-pipelines.yml looks like this
image: php:7.1.29

pipelines:
  branches:
    staging:
      - step:
          name: Publish to staging version
          deployment: staging
          caches:
            - composer
          script:
            - apt-get update && apt-get install -y unzip
            - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
            - composer install
            # - vendor/bin/phpunit
            - bash -c "$(curl -fsSL https://s3.amazonaws.com/tools.nanobox.io/bootstrap/ci.sh)"
            - nanobox deploy
This gives Following error
+ nanobox deploy
Failed to validate provider - missing docker - exec: "docker": executable file not found in $PATH
Using nanobox with native requires tools that appear to not be available on your system.
docker
View these requirements at docs.nanobox.io/install
I then followed this page and changed the second-last line to look like this:
sudo bash -c "$(curl -fsSL https://s3.amazonaws.com/tools.nanobox.io/bootstrap/ci.sh)"
when done that, I am getting following error
+ sudo bash -c "$(curl -fsSL https://s3.amazonaws.com/tools.nanobox.io/bootstrap/ci.sh)"
bash: sudo: command not found
I ran out of tricks here, also I don't have experience in this area. Any help is very much appreciated.
First, you can't use sudo in Pipelines, but that's probably not relevant here. The issue is that the nanobox CLI wants to execute docker, which isn't installed. You should enable the docker service for your step.
image: php:7.1.29

pipelines:
  branches:
    staging:
      - step:
          name: Publish to staging version
          deployment: staging
          # Enable docker service
          services:
            - docker
          caches:
            - composer
          script:
            - docker version
You might want to have a look at the Pipelines docs as well: Run Docker commands in Bitbucket Pipelines
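Putting the two together, a sketch of the original step with the docker service enabled (the bootstrap URL and commands are the ones from the question):

image: php:7.1.29

pipelines:
  branches:
    staging:
      - step:
          name: Publish to staging version
          deployment: staging
          # Enable the docker service so the nanobox CLI can find docker
          services:
            - docker
          caches:
            - composer
          script:
            - apt-get update && apt-get install -y unzip
            - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
            - composer install
            - bash -c "$(curl -fsSL https://s3.amazonaws.com/tools.nanobox.io/bootstrap/ci.sh)"
            - nanobox deploy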

Using BitBucket Pipelines to Deploy onto VPS via SSH Access

I have been trying to wrap my head around how to utilise BitBucket's Pipelines to auto-deploy my (Laravel) application onto a Vultr Server instance.
I have the following steps I do manually, which I am trying to replicate autonomously:
I commit my changes and push to BitBucket repo
I log into my server using Terminal: ssh root@ipaddress
I cd to the correct directory: cd /var/www/html/app/
I then pull from my BitBucket repo: git pull origin master
I then run some commands: composer install, php artisan migrate, etc.
I then log out: exit
My understanding is that you can use Pipelines to automate this. Is this true?
So far, I have set up a SSH key pair for pipelines and my server, so my server's authorized_keys file contains the public key from BitBucket Pipelines.
My pipelines file bitbucket-pipelines.yml is as follows:
image: atlassian/default-image:latest
pipelines:
default:
- step:
deployment: staging
caches:
- composer
script:
- ssh root#ipaddress
- cd /var/www/html/app/
- git pull origin master
- php artisan down
- composer install --no-dev --prefer-dist
- php artisan cache:clear
- php artisan config:cache
- php artisan route:cache
- php artisan migrate
- php artisan up
- echo 'Deploy finished.'
When the pipeline executes, I get the error: bash: cd: /var/www/html/app/: No such file or directory.
I read that each script step is run in its own container:
Each step in your pipeline will start a separate Docker container to
run the commands configured in the script
The error I get makes sense if it's not executing cd /var/www/html/app within the VPS after logging into it using SSH.
Could someone point me in the right direction?
Thanks
The commands you are defining under script are going to be run in a Docker container, not on your VPS.
Instead, put all your commands in a bash file on your server.
1 - Create a bash file pull.sh on your VPS, to do all your deployment tasks
# /var/www/html
php artisan down
git pull origin master
composer install --no-dev --prefer-dist
php artisan cache:clear
php artisan config:cache
php artisan route:cache
php artisan migrate
php artisan up
echo 'Deploy finished.'
2 - Create a script deploy.sh in your repository, like so
echo "Deploy script started"
cd /var/www/html
sh pull.sh
echo "Deploy script finished execution"
3 - Finally update your bitbucket-pipelines.yml file
image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        deployment: staging
        script:
          - cat ./deploy.sh | ssh <user>@<host>
          - echo "Deploy step finished"
I would recommend already having your repo cloned on your VPS in /var/www/html, and testing your pull.sh file manually first.
The problem with the answer marked as the solution is that the sh process won't exit if any of the commands inside it fails.
This command, php artisan route:cache, for instance, can fail easily, not to mention the pull!
Even worse, the sh script will execute the rest of the commands without stopping if any of them fail.
I can't use any docker command, because after each one the CI process stops, and I can't figure out how to keep those commands from exiting the CI process. I'm using sh, but I'll start adding some conditionals based on the exit code of the previous command, so we know if anything went wrong during the deploy.
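One way to mitigate this, as a sketch: have pull.sh abort on the first failing command with set -e, so a failed pull or artisan call stops the deploy instead of letting the remaining commands run:

#!/bin/bash
# Exit immediately if any command returns a non-zero status
set -e

cd /var/www/html
php artisan down
git pull origin master
composer install --no-dev --prefer-dist
php artisan cache:clear
php artisan config:cache
php artisan route:cache
php artisan migrate
php artisan up
echo 'Deploy finished.'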
I know this may be an old thread, but Bitbucket does provide a pipe that does everything mentioned above in a much cleaner way.
Please have a look at https://bitbucket.org/product/features/pipelines/integrations?p=atlassian/ssh-run
Hope this helps.
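For illustration, a minimal sketch of a step using that pipe; the version number and variable values are placeholders, so check the pipe's page for the current ones:

pipelines:
  default:
    - step:
        script:
          - pipe: atlassian/ssh-run:0.4.1
            variables:
              SSH_USER: 'root'
              SERVER: 'ipaddress'
              # Run the deploy script that already lives on the VPS
              COMMAND: 'bash /var/www/html/pull.sh'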

Bitbucket Pipelines buffers some files but doesn't upload them

So I have set up a pipeline on my Bitbucket to push to my demo server:
image: php:7.1.1

pipelines:
  default:
    - step:
        caches:
          - composer
        script:
          - echo "Pipeline Init"
          - apt-get update
          - apt-get -qq install git-ftp
          - echo "Initiating Push site:Source."
          - git config git-ftp.syncroot wordpress/wp-content/themes/ip-callcenters/
          - git ftp init --user $FTP_USER --passwd $FTP_PASSWORD ftp://myip/ipcc/wp-content/themes/myfolder/
So far everything works.
It tells me There are 143 files to sync: and starts [1 of 143] Buffered for upload, and so on.
But for some reason it stops buffering after 26-30 files; it doesn't continue, then says Uploading.... and after some minutes I get a
fatal error: Could not upload files., exiting...
Any idea how I can get this working?
- git ftp push --user $FTP_USER --passwd $FTP_PASSW ftp://myip/ipcc/wp-content/themes/myfolder/
use "ftp push" when you sync repository.
and "ftp init" when you clone repository to FTP for first time.
