Automated Salesforce Metadata Backup Using Jenkins and Git

A year ago, we were looking for a more comprehensive way to track changes and preserve the history of everything in our Salesforce environments. The approach we ended up with was to use the Force.com Migration Tool to retrieve the latest metadata from a Salesforce environment, a git repository to track historical changes to that metadata, and a server running Jenkins to automate the whole thing. I figured others might be interested in setting up their own version of this, so I decided to whip up a little how-to.

First, you will need access to a server running Jenkins. There are tons of resources on the interwebs to help you get started. I wrote a post last year about setting up a Jenkins server on an Amazon EC2 instance for continuous integration; head on over and check it out, as the same setup applies here. There’s also a great article on the Salesforce developer site that walks you through the process of creating a Jenkins server on EC2 step by step.

Now that you have that taken care of, the fun can begin. There are probably several different ways to set up a job that retrieves all of your metadata from an org and pushes those files into a git repository. Jenkins comes with git integration, so you can enter your credentials and have a job’s workspace linked to the repository. You can define a build step that calls an Ant retrieve command using the Force.com Migration Tool, then add another step that executes a shell script to commit and push those changes to your repository.

I’ll give you a quick glimpse into the setup that we are currently using. I’m going to stay fairly high level here, as we can get into the weeds pretty quickly. In our backup jobs we define build parameters: URLs for the repository and the Salesforce org, plus a username and password. Those build parameters are turned into environment variables that we can access at build time. Moving the configuration out of the build script allows us to call the same script for any job. We’ve put together a shell script that retrieves all metadata, then commits and pushes those changes to a Bitbucket repository. Here is a simplified, untested version of what we are using.

[gist id="f1d7ac22c42c9235e57a" file="synch.sh"]
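
In case the embedded gist doesn’t render, here’s a rough sketch of what a script like that can look like. This is only an illustration of the general approach, not the contents of our gist; the Ant target name, the branch, and the commit message are all assumptions.

#!/bin/bash
# Sketch of an automated metadata backup script (untested; the target name,
# branch, and commit message are placeholders, not our actual gist).
set -e

# Pull the latest metadata from the org via the Ant "retrieve" target
# defined in build.xml, which wraps the Force.com Migration Tool's sf:retrieve.
ant retrieve

# Stage any metadata files that changed since the last build.
git add -A

# Commit and push only when something actually changed, using the $REPO
# build parameter that Jenkins exposes as an environment variable.
if ! git diff --cached --quiet; then
    git commit -m "Automated metadata backup $(date)"
    git push "$REPO" HEAD:master
fi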

You can see the use of the $REPO environment variable that we established in the build parameters while configuring the job. Our build action boils down to one simple “Execute shell” action that runs “bash synch.sh”.

Along with this, we have an extremely simple build.properties file that also reads the environment variables to get login information.

[gist id="9c1386be1195629894d0" file="build.properties"]
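
Again, if the gist doesn’t render, a file along those lines might look like the following. The sf.* property names follow the standard Force.com Migration Tool sample; the environment variable names are assumptions based on our build parameters.

# Sketch of a build.properties that reads credentials from environment
# variables. This assumes build.xml loads <property environment="env"/>
# before this file, and the SF_* variable names are placeholders.
sf.username = ${env.SF_USERNAME}
sf.password = ${env.SF_PASSWORD}
sf.serverurl = ${env.SF_SERVERURL}
sf.maxPoll = 20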

And a build.xml file that defines a basic retrieve target that calls the Force.com Migration Tool’s sf:retrieve target.
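
We haven’t included that file here, but a minimal version could look something like this. The sf:retrieve attributes are the standard ones from the Migration Tool; the jar location, directory names, and target layout are assumptions.

<project name="Salesforce Metadata Backup" default="retrieve"
         xmlns:sf="antlib:com.salesforce">

    <!-- Load environment variables first so build.properties can reference them. -->
    <property environment="env"/>
    <property file="build.properties"/>

    <!-- Wire up the Force.com Migration Tool tasks (jar location is an assumption). -->
    <taskdef resource="com/salesforce/antlib.xml" uri="antlib:com.salesforce">
        <classpath>
            <pathelement location="lib/ant-salesforce.jar"/>
        </classpath>
    </taskdef>

    <!-- Retrieve everything listed in package.xml into the src directory. -->
    <target name="retrieve">
        <mkdir dir="src"/>
        <sf:retrieve username="${sf.username}"
                     password="${sf.password}"
                     serverurl="${sf.serverurl}"
                     maxPoll="${sf.maxPoll}"
                     retrieveTarget="src"
                     unpackaged="src/package.xml"/>
    </target>
</project>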

When we put all of that together and set the job to build periodically, we have an automated metadata backup system that stores changes in a git repository. Piece of cake! Well, it may not be quite that easy, but this should get you on your way. We’ve been running this setup for almost a year and, barring a few minor hangups, it has proven to be extremely helpful. I can’t tell you how many times I’ve been thankful that I could dig through the history of an Apex class or pull up a previous version of a validation rule.

Please leave a comment if you have questions about our specific setup or if you have a similar setup of your own and would like to collaborate.
