The aim of this handbook is to describe a very simple Salesforce CI/CD scenario involving several tools: Bitbucket Pipelines, Source Tree, Force.com Migration Tool, Mavensmate.
As a result of reading this doc, you will be able to configure a CI/CD environment that brings all the above tools together.
Notice: the goal of this handbook is not to cover the theory behind Bitbucket Pipelines, such as the model behind Bitbucket, what Git is, what Source Tree is, the concepts of forking, cloning, and pipelines, or how Apache Ant works. It is rather a step-by-step guide to setting up the environment easily. I would strongly suggest that you first watch the following videos/tutorials as prerequisites to successfully go through these topics:
Please refer to https://www.sales4k.com/sfdc/salesforce-automating-deployments-svn-ant-migration-tool/ to configure/install at least Java and Apache Ant on your local machine.
We are going to set up our local repository (a Mavensmate project) linked to an existing sandbox, from which we make any kind of change and push it to another sandbox from Source Tree through Bitbucket Pipelines. This model uses repo forks for the different Salesforce environments: the Production repo is the main repo, a fork from Prod is created for UAT/Staging, and lastly a fork for dev is created from the UAT/Staging fork.
The prerequisite is that, at the beginning, the sandbox environments we would like to play with need to be the same. In the following paragraphs, I’m going to simulate the workflow from a personal DEV to another DEV (shared Sandbox).
The first thing you need to do is create a Bitbucket account and install Source Tree (one of the best Git clients, offering a graphical interface for Git repositories):
Our assumption is that we are going to fork the following repository https://bitbucket.org/mklinski/salesforce-bitbucket-pipelines which is our starting point to set up the environment. Why do we start from this repo instead of starting from scratch? Well, this repository has a predefined directory structure for Pipelines to work properly. In a real scenario you would start from scratch by creating a new repo linked to Prod, then fork that repo and link it to the UAT env, then fork again, and so on down to the shared Dev repo.
From the Actions menu on the left → click Fork
Once you have created your Fork, you are ready to Clone it to your local repository (a Mavensmate project).
Locate your Mavensmate default Workspace directory and create an empty directory in that.
From the Actions menu on the left → click Clone, choose the HTTPS protocol, and click the Clone in Source Tree button.
Then, in Source Tree, you should see a screen like the following.
Please change the Destination Path to the directory you have just created in your Mavensmate workspace and click Clone:
From the left menu in Source Tree, locate the repo you’ve just created → Right click → Show in Explorer
Go to the parent dir → Drag the whole directory into Mavensmate. You should see something like this:
Right Click on the parent directory → Mavensmate → Create Mavensmate Project.
NOTICE: Please insert your personal DEV credentials (the sandbox designed for development)
After creating the project, the directory structure should be similar to the following:
Let’s see the key files of this structure:
The Force.com Migration Tool settings for executing the deployment to Salesforce. Most of this file uses merge variables, which we set via the Bitbucket Pipelines environment variables. Note the <target> tags, which contain the core tasks telling Bitbucket how to perform the deployment.
Here are some samples:
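As a sketch (target names, property names, and paths here are assumptions; adapt them to your own repo), a minimal build.xml with one validate-only target and one deploy target could look like this:

```xml
<!-- Minimal sketch of a build.xml for the Force.com Migration Tool.
     Target names (DeployEmptyCheck, deployCode) follow the ones used
     in this handbook; credentials come from Pipelines environment variables. -->
<project name="Salesforce Deploy" default="deployCode" basedir="."
         xmlns:sf="antlib:com.salesforce">
  <!-- Expose SFDC_USERNAME, SFDC_PASS, SFDC_SERVERURL as ${env.*} properties -->
  <property environment="env"/>

  <!-- Validation only: checkOnly="true" compiles and runs local tests
       without committing anything to the target org -->
  <target name="DeployEmptyCheck">
    <sf:deploy username="${env.SFDC_USERNAME}"
               password="${env.SFDC_PASS}"
               serverurl="${env.SFDC_SERVERURL}"
               deployRoot="src"
               checkOnly="true"
               testLevel="RunLocalTests"/>
  </target>

  <!-- Real deployment to the target org -->
  <target name="deployCode">
    <sf:deploy username="${env.SFDC_USERNAME}"
               password="${env.SFDC_PASS}"
               serverurl="${env.SFDC_SERVERURL}"
               deployRoot="src"
               testLevel="RunLocalTests"/>
  </target>
</project>
```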
This is a manifest in which you define the subset of metadata you’d like to retrieve. If you’ve written an Apex Class, Visualforce Page, Object, Field, or any of the other Metadata types you’re familiar with, its name gets listed in here. Please find a reference here: https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/manifest_samples.htm
This YAML file describes your repo’s Pipelines jobs at the highest level. Pipelines will look for it in the root of your Git repo whenever a commit happens. It allows you to define a default job, different behaviors for your branches by name (or by matching), and even which environments to use when doing the job.
Note: In this file you can find the reference to the Docker Image (but in this context you don’t have to care about it) and the reference to the target functions defined in the build.xml.
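To make the structure concrete, here is a sketch of what such a file can look like; the branch patterns and Ant target names are assumptions that mirror the build.xml targets discussed in this handbook, so check them against the actual file in the forked repo:

```yaml
# Sketch of a bitbucket-pipelines.yml: the Docker image on line 2,
# plus one behavior per branch pattern.
image: mklinski/salesforce

pipelines:
  branches:
    master:
      - step:
          script:
            - ant deployCode        # real deployment to the target org
    feature/*:
      - step:
          script:
            - ant DeployEmptyCheck  # validation only (checkOnly="true")
```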
Docker Image (mentioned for completeness)
Pipelines uses a default Docker image that does not include Apache Ant, which is a requirement of the Force.com Migration Tool. For the purpose of this document, we can consider this piece of functionality as “given”, so you don’t have to do anything.
For the sake of completeness, please find the definition of the Docker image in the bitbucket-pipelines.yml file at line 2. The original source is https://hub.docker.com/r/mklinski/salesforce/~/dockerfile/.
(Note: the Pipelines are still turned off)
In Mavensmate, right click on the src folder → Mavensmate → Refresh From Server
Note: the subset of metadata you are actually retrieving from the server are specified in package.xml file.
After refreshing metadata in Mavensmate, switch to Source Tree and you should see a screen similar to the image below: a '3' badge on the left side next to the repo you are currently using, and Uncommitted changes in the main screen. This means Source Tree has detected that something changed in the local repository.
Click the Commit button in the upper part of the screen → select all pending files (in other words, stage them) → add a comment → click Commit
After committing, you should see a popup number on the Push button as in the figure:
Click the Push button → choose the branches (in this case choose master) in the window opened → click Push again
Now everything in Source Tree is clean; no badges are shown.
Please switch to Bitbucket and click the Commits tab in the left panel to see the items you have just pushed to the remote repo.
We have just reached the first checkpoint, congratulations!
The local repository and the remote repository are aligned (they are sharing the chosen metadata retrieved by the source DEV).
Notice: our goal is to deploy some changes from one environment to another, so we are just in the middle of our E2E tour.
Remember the assumption: all the sandboxes have been refreshed (in our case just two: the source and the target sandboxes).
Once you have everything aligned, we are ready to simulate our CI/CD activities.
Let’s say we would like to push some changes to an Apex Class adding a reference to a custom field we have just created in our source sandbox.
How do we do the deployment to the destination environment? Using the pipelines of course!
Go to your Bitbucket account → Locate Pipelines tab on the left panel → Click on Enable Pipelines
In the YAML template picklist, choose Other → click Next
After turning on the Pipelines, one more thing is still missing: set up the Environment Variables.
From Bitbucket → click Settings on the left panel → locate the Pipelines tab → click Environment variables and configure SFDC_PASS, SFDC_SERVERURL, and SFDC_USERNAME with the target environment's information.
In the previous paragraph, we said that we would like to push some changes (i.e.: an Apex Class adding a reference to a new custom field) against the target environment from our source sandbox.
Let us take a closer look at the .yml file:
We will use both branches/feature/* and branches/master to see how the deployment behaves depending on which branch is pushed.
Are you ready to see the E2E tour from the beginning to the deployment? Let’s GO!
Before doing some changes, please be sure that the src/package.xml looks more or less like this:
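If the original figure is not handy, a package.xml covering the metadata types used in this handbook might look roughly like the sketch below (the API version and the Admin profile member are assumptions; adjust them to your org):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of src/package.xml for the metadata discussed in this handbook -->
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexComponent</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexPage</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexTrigger</name>
    </types>
    <types>
        <members>SVMXC__Service_Order__c</members>
        <name>CustomObject</name>
    </types>
    <types>
        <members>Admin</members>
        <name>Profile</name>
    </types>
    <version>39.0</version>
</Package>
```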
Hence, the metadata shown above are: Apex Classes, Apex Components, Apex Pages, Apex Triggers, the SVMXC__Service_Order__c custom object, and some Profiles.
For the purpose of this handbook, let us consider the Apex Classes, the SVMXC__Service_Order__c custom object, and some Profiles. So please be sure to retrieve at least these metadata types in your package.xml file.
Please do some changes to one wanted Apex Class (from Mavensmate) and create a new Custom Field from Salesforce UI (let’s say in SVMXC__Service_Order__c object) and set the FLS accordingly in your dev source instance.
After you have done so, please right click on src → Mavensmate → Refresh From Server to be sure that all the changes have been retrieved into your local repo.
If everything is fine, you should see some badges in Source Tree; these correspond to the metadata files we have just changed in our source environment.
At this point, we just need to stage each file, commit the changes, and finally push them to the remote repo's branches/master (as described in the previous paragraph).
Switch to Bitbucket → go to the Pipelines tab on the left side panel → you will see the new incoming commit; expand it and you should see an image like the following:
Please take a look at your target sandbox and you will find the successful deployment of the Apex class and the custom field, with FLS set up according to the profiles you have migrated.
The workflow is almost the same as described in the previous paragraph.
From Bitbucket → go to the Create Branch tab on the left side panel → be sure the title starts with feature/<something>, create the branch as shown below, and check it out in Source Tree (using HTTPS).
After making all the changes to the Apex class and so on, please stage each file, commit, and push to branches/feature/<something> as in the following image:
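For completeness, the Source Tree clicks above map to plain Git commands. The sketch below is self-contained, so it fakes a local "origin" in place of Bitbucket, and the file and branch names are placeholders:

```shell
# Self-contained sketch: create a feature/* branch, commit a change, push it.
# A throwaway local bare repo stands in for the Bitbucket remote.
set -e
tmp=$(mktemp -d)
git init --bare -q "$tmp/origin.git"
git clone -q "$tmp/origin.git" "$tmp/work"
cd "$tmp/work"
git config user.email "you@example.com"
git config user.name "Your Name"
echo "public class MyClass {}" > MyClass.cls       # stand-in for a real Apex class
git add MyClass.cls
git commit -q -m "initial commit"
git push -q origin HEAD                            # first push creates the default branch
git checkout -q -b feature/new-custom-field        # name must match feature/* for Pipelines
echo "// reference to the new custom field" >> MyClass.cls
git add MyClass.cls
git commit -q -m "Add reference to new custom field"
git push -q origin feature/new-custom-field        # on Bitbucket this push would trigger the pipeline
git ls-remote --heads origin feature/new-custom-field
```

On a real Bitbucket remote, the final push is what triggers the feature/* pipeline step.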
Now switch to the Bitbucket Pipelines tab and watch the DeployEmptyCheck job in progress.
Notice: as already discussed, the definition of the job behavior has been defined in the build/build.xml file within the <target> tags.
As you can see in the related image, the checkonly flag is indeed set to "true" and the testLevel is "RunLocalTests", which means we are performing a validation only; no deployment will occur.
In addition, please find some screenshots below:
Once the validation has been performed, we just need to merge the feature branch into the master branch.
Then go to Branches in the left panel in Bitbucket and click the Merge button.
As a result of the previous action, a Pipelines job (deployCode) starts running to perform the deployment; please check the Pipelines tab on the left panel.
Once you’ve deployed the commit to the shared dev environment and reviewed accordingly, your commit is now ready to be deployed against the UAT staging sandbox.
How do you push the changes to another environment? The answer is: by creating a Pull Request.
After clicking the Create pull request button (from the DEV shared repo, which in a real scenario would be a fork of the UAT repo) against the UAT repo, you will automatically be stepped into the UAT repo (not shown for the purposes of this document), and you will then see a Pipelines job running to perform the deployment accordingly.
In conclusion, the same workflow works for deployment to the Production environment: create a pull request from the UAT repo (which would be a fork of the Prod repo) against the Prod repo and watch the Pipelines do their job.