I have been trying to wrap my head around how to utilise Bitbucket Pipelines to auto-deploy my (Laravel) application onto a Vultr server instance.
I have the following
The commands you define under script
are run inside a Docker container, not on your VPS.
Instead, put all your commands in a bash file on your server.
1 - Create a bash file pull.sh on your VPS to do all your deployment tasks:
#!/bin/bash
# Runs from /var/www/html (deploy.sh cds there before calling this script)
php artisan down                            # put the app into maintenance mode
git pull origin master                      # pull the latest code
composer install --no-dev --prefer-dist    # install production dependencies
php artisan cache:clear
php artisan config:cache
php artisan route:cache
php artisan migrate
php artisan up                              # bring the app back online
echo 'Deploy finished.'
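To sanity-check it before involving the pipeline at all, you can run it once by hand on the VPS (assuming the repo is already cloned in /var/www/html, as recommended at the end of this answer):

cd /var/www/html
sh pull.sh    # should finish with 'Deploy finished.' if every step succeeded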
2 - Create a script deploy.sh in your repository, like so:
echo "Deploy script started"
cd /var/www/html
sh pull.sh
echo "Deploy script finished execution"
3 - Finally, update your bitbucket-pipelines.yml file:
image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        deployment: staging
        script:
          - cat ./deploy.sh | ssh <user>@<host>
          - echo "Deploy step finished"
I would recommend already having your repo cloned on your VPS in /var/www/html
and testing your pull.sh
file manually first.
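If the repository is not on the server yet, the one-time setup could look roughly like this (the clone URL, paths and Laravel bootstrap steps are assumptions to adapt to your project):

cd /var/www
git clone git@bitbucket.org:<workspace>/<repo>.git html    # clone into /var/www/html
cd html
composer install
cp .env.example .env && php artisan key:generate           # standard Laravel env bootstrap
sh pull.sh                                                  # once pull.sh from step 1 is in place, dry-run it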
The problem with the answer marked as the solution is that the sh process won't exit if any of the commands inside it fails.
A command like php artisan route:cache, for instance, can fail easily, not to mention the pull itself!
Even worse, the sh script keeps executing the rest of the commands after one of them fails.

I can't use any docker commands, because after each one the CI process stops and I haven't figured out how to keep those commands from ending the CI process. I'm sticking with sh, but I'll start adding some conditionals based on the exit code of the previous command, so we know if anything went wrong during the deploy.
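As a sketch of that idea (assuming bash and the same /var/www/html layout as the accepted answer), pull.sh can be made to stop at the first failing command and still bring the app back up:

#!/bin/bash
set -e                        # abort on the first failing command

cd /var/www/html
php artisan down

# Whatever happens next, leave maintenance mode and propagate the real exit
# code, so a failed step also fails the Bitbucket pipeline step.
trap 'code=$?; php artisan up; exit $code' EXIT

git pull origin master
composer install --no-dev --prefer-dist
php artisan cache:clear
php artisan config:cache
php artisan route:cache
php artisan migrate --force   # --force skips the interactive production prompt

echo 'Deploy finished.'

For the failure to actually reach Bitbucket, deploy.sh needs the same treatment (e.g. set -e, or sh pull.sh || exit 1), otherwise its final echo resets the exit code to 0; the cat ./deploy.sh | ssh step then fails whenever the remote script exits non-zero.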