Over the past few years I’ve been trying to get away from using FTP and deploying an entire directory just to update a web app. I’ve used services like Springloops and even tried tools like Capistrano, but nothing was both 1) simple and 2) free until I wrote my own bash script. Here’s basically what you do:
- Create SSH keys between your local machine and your remote box.
- Create a bash script with the following code.
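For the first bullet, the key setup is usually just two commands before you touch the script below (this assumes OpenSSH’s `ssh-keygen` and `ssh-copy-id`, and reuses the `user@remotesite.com` placeholder):

```shell
# Generate an ed25519 key pair if you don't already have one.
# -N "" sets an empty passphrase so deploys can run unattended; use a
# passphrase plus ssh-agent if you'd rather not keep an unprotected key.
[ -f ~/.ssh/id_ed25519 ] || ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519 -N ""

# Append the public key to ~/.ssh/authorized_keys on the remote host.
ssh-copy-id user@remotesite.com
```

After this, `rsync` and `ssh` in the deploy script run without password prompts.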
```bash
#!/bin/bash

LOCAL="YOUR LOCAL DIRECTORY"
REMOTE=user@remotesite.com:/home/useraccount/public_html

rsync -rtv --progress --exclude-from 'exclude-list.txt' "$LOCAL" "$REMOTE"

ssh user@remotesite.com 'chown -R user:nobody /home/useraccount/public_html'
ssh user@remotesite.com 'chmod 750 /home/useraccount/public_html'
```
Set LOCAL to your local directory path, and REMOTE to the remote path, using the user account that owns your deployed files.
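For example, the two variables might look like this (the paths here are hypothetical; substitute your own project and server layout). One rsync detail worth knowing: a trailing slash on the source means “the contents of this directory”, so without it you’d end up with an extra nested directory on the remote.

```shell
# Hypothetical paths -- substitute your own project and server layout.
# The trailing slash on LOCAL matters: "dir/" syncs the *contents* of dir,
# while "dir" would create public_html/dir/ on the remote side.
LOCAL="$HOME/Sites/myapp/"
REMOTE="user@remotesite.com:/home/useraccount/public_html"
```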
Note the rsync flags: I deliberately avoided -a (archive mode) because it would also copy my local permissions, owners, and groups over the remote ones.
Finally, I’ve added an exclude list, to which I add entries like “.git/” so the git repository doesn’t get synced and copied.
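A minimal exclude-list.txt might look like this (the entries besides .git/ are my own suggestions; rsync matches each pattern against the paths it transfers):

```
.git/
.DS_Store
*.log
```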
Now, if you’re stuck using FTP and just want to export what’s changed, I use this little script:
```bash
tar czf ~/Temp/new-files.tar.gz $(git diff --name-only SHA1 SHA2 | grep '\.php$')
```
This creates a tar.gz of just the files that changed between two commits; in the case above, it grabs only .php files. You can remove everything from | grep onward if you want every changed file.
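If you run this often, the one-liner can be wrapped in a small function (changed_files_tarball is a name I made up). Piping the file list into `tar -T -` also copes with long file lists and filenames that backtick expansion would mangle:

```shell
# Sketch of a reusable wrapper around the one-liner above.
# Usage: changed_files_tarball <old-commit> <new-commit> [output.tar.gz]
changed_files_tarball() {
    old=$1
    new=$2
    out=${3:-changed-files.tar.gz}
    # --diff-filter=d drops deleted files, which tar could not read anyway;
    # tar -T - reads the file list from stdin instead of the command line.
    git diff --name-only --diff-filter=d "$old" "$new" | tar czf "$out" -T -
}
```

Add `| grep '\.php$'` before the `tar` if you still want only the .php files.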