Salesforce: Automating Deployments using Bitbucket Pipelines


Contents

Introduction

First things first

Repository Setup

Create a Bitbucket account and install Source Tree

Forking the template stub repo

Cloning the Fork to your local repo

Setup the local repository as a Mavensmate Project

Step into the directory workspace

Refresh Mavensmate Project and Push Metadata to Bitbucket

Enable Pipelines and set the Environment Variables

Perform a deployment to the master branch vs. a feature branch

Make some changes and see the Pipelines in action

What if I wanted to push to the feature branch instead of the master branch?

Create a Pull Request

Introduction

The aim of this handbook is to describe a very simple Salesforce CI/CD scenario involving several tools: Bitbucket Pipelines, Source Tree, the Force.com Migration Tool, and Mavensmate.

After reading this doc, you will be able to configure a CI/CD environment that puts all of the above pieces together:

  • Starting from a stub repo: git@bitbucket.org:mklinski/salesforce-bitbucket-pipelines.git
  • Making changes through the Salesforce UI / Mavensmate in your dev sandbox, working on separate branches and merging to master, or committing directly to master
  • Pushing your changes to the remote repo from Source Tree
  • Finally, seeing Pipelines perform the validation and the deployment against the configured environments

Notice: the goal of this handbook is not to give you the theoretical concepts behind Bitbucket Pipelines (the model behind Bitbucket, what Git is, what Source Tree is, the concepts of forking, cloning, and pipelines, how Apache Ant works, and so on); rather, it is a step-by-step guide to setting up the environment easily. I strongly suggest you first watch and read the following videos and tutorials as prerequisites to successfully work through these topics:

Salesforce Deployments Made Easy with Bitbucket Cloud Pipelines

https://www.davidcdean.com/salesforce-ci-with-bitbucket-pipelines-part-1/

https://www.cloudinit.nz/single-post/2016/11/25/Automated-Salesforce-Deployments-with-BitBucket-Pipelines

https://www.atlassian.com/git/tutorials/learn-git-with-bitbucket-cloud

https://confluence.atlassian.com/bitbucket/tutorial-learn-sourcetree-with-bitbucket-cloud-760120235.html

First things first

Please refer to the following guide https://www.sales4k.com/sfdc/salesforce-automating-deployments-svn-ant-migration-tool/ to install and configure at least Java and Apache Ant on your local machine.

Repository Setup

We are going to set up our local repository (a Mavensmate project) linked to an existing sandbox, in which we make any kind of change and push it to another sandbox from Source Tree through Bitbucket Pipelines. This model uses repo forks for the different Salesforce environments: the Production repo is the main repo; a fork of Prod is created for UAT/Staging; lastly, a fork for dev is created from the UAT/Staging fork.

The prerequisite is that, at the beginning, the sandbox environments we want to work with are identical. In the following paragraphs, I am going to simulate the workflow from a personal DEV to another DEV (a shared sandbox).

image1

Create a Bitbucket account and install Source Tree

The first thing you need to do is create a Bitbucket account and install Source Tree (one of the best Git clients, offering a graphical interface for Git repositories):

https://bitbucket.org/account/signup/

https://confluence.atlassian.com/get-started-with-sourcetree/install-sourcetree-847359094.html

Forking the template stub repo

Our assumption is that we are going to fork the repository https://bitbucket.org/mklinski/salesforce-bitbucket-pipelines, which is our starting point for setting up the environment. Why do we start from this repo instead of starting from scratch? Because this repository has a predefined directory structure that Pipelines needs to work properly. In a real scenario you would start from scratch by creating a new repo linked to Prod, then fork the repo and link it to the UAT environment, then fork again, and so on down to the shared dev repo.

From the Actions menu on the left → click Fork

image2

image3

Cloning the Fork to your local repo

Once you have created your Fork, you are ready to Clone it to your local repository (a Mavensmate project).

Locate your Mavensmate default workspace directory and create an empty directory inside it.

From the Actions menu on the left → click Clone, choose HTTPS, and click the Clone in SourceTree button.

image4

Then, in Source Tree, you should see a screen like the following.

Please change the Destination Path to the directory you have just created in your Mavensmate workspace and click Clone:

image5

Setup the local repository as a Mavensmate Project

From the left menu in Source Tree, locate the repo you’ve just created → Right click → Show in Explorer

Go to the parent directory → drag the whole directory into Mavensmate. You should see something like this:

image6

Right Click on the parent directory → Mavensmate → Create Mavensmate Project.

NOTICE: Please insert your personal DEV credentials (the sandbox designated for development)

image7

After creating the project, the directory structure should be similar to the following:

image8

Step into the directory workspace

Let’s look at the key files in this structure:

build/build.xml

The Force.com Migration Tool settings for executing the deployment to Salesforce. Most of the values are merge variables, which we set via the Bitbucket Pipelines environment variables. Note the <target> tags, which contain the core tasks telling Bitbucket how to perform the deployment.

Here are some sample targets (a minimal Ant sketch follows this list):

  • getCode retrieves an unpackaged set of metadata from the org. The file src/package.xml lists what is to be retrieved.
  • deployEmptyCheckOnly only validates and runs local tests against the instance; it never actually saves anything to the server.
  • deployCode actually deploys items to the instance, without running tests (if the instance is not Prod).
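Below is a minimal, illustrative sketch of such targets using the Force.com Migration Tool's Ant tasks. It is an assumption of what the template's build.xml roughly contains, not its exact content; the ${env.*} properties map to the Pipelines environment variables configured later in this guide.

<project name="Salesforce deploy sketch" default="deployCode" basedir="." xmlns:sf="antlib:com.salesforce">
    <property environment="env"/>

    <!-- Retrieve the metadata listed in src/package.xml from the org -->
    <target name="getCode">
        <sf:retrieve username="${env.SFDC_USERNAME}" password="${env.SFDC_PASS}"
                     serverurl="${env.SFDC_SERVERURL}" retrieveTarget="src"
                     unpackaged="src/package.xml"/>
    </target>

    <!-- Validation only: compiles and runs local tests, never saves to the org -->
    <target name="deployEmptyCheckOnly">
        <sf:deploy username="${env.SFDC_USERNAME}" password="${env.SFDC_PASS}"
                   serverurl="${env.SFDC_SERVERURL}" deployRoot="src"
                   checkOnly="true" testLevel="RunLocalTests" maxPoll="200"/>
    </target>

    <!-- Actual deployment, skipping tests (acceptable for non-Prod instances) -->
    <target name="deployCode">
        <sf:deploy username="${env.SFDC_USERNAME}" password="${env.SFDC_PASS}"
                   serverurl="${env.SFDC_SERVERURL}" deployRoot="src"
                   testLevel="NoTestRun" maxPoll="200"/>
    </target>
</project>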

src/package.xml

This is the manifest in which you define the subset of metadata you’d like to retrieve. If you’ve written an Apex class, Visualforce page, object, field, or any of the other metadata types you’re familiar with, its name gets listed in here. Please find a reference here: https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/manifest_samples.htm
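For example, a minimal manifest retrieving all Apex classes, one custom object, and one profile could look like the following (the API version here is an assumption; use whatever your org supports):

<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>*</members>   <!-- the wildcard pulls every Apex class -->
        <name>ApexClass</name>
    </types>
    <types>
        <members>SVMXC__Service_Order__c</members>
        <name>CustomObject</name>
    </types>
    <types>
        <members>Admin</members>
        <name>Profile</name>
    </types>
    <version>38.0</version>
</Package>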

bitbucket-pipelines.yml

This YAML file describes your repo’s Pipelines jobs at the highest level. Pipelines looks for it in the root of your Git repo whenever a commit happens. It allows you to define a default job, different behaviors for your branches by name (or by pattern matching), and even which environments to use when doing the job.

Note: in this file you can find the reference to the Docker image (which, in this context, you don’t have to care about) and the references to the targets defined in build.xml.
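As a rough sketch (assuming the structure of the template repo, with target names following the build.xml samples above), the file looks something like this:

image: mklinski/salesforce

pipelines:
  default:            # any branch without a more specific rule: validate only
    - step:
        script:
          - ant -buildfile build/build.xml deployEmptyCheckOnly
  branches:
    feature/*:        # feature branches: validation only, nothing is saved
      - step:
          script:
            - ant -buildfile build/build.xml deployEmptyCheckOnly
    master:           # master: actually deploy to the target org
      - step:
          script:
            - ant -buildfile build/build.xml deployCode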

Docker Image (mentioned for completeness)

Pipelines uses a default Docker image that does not include Apache Ant, which the Force.com Migration Tool requires; the template repo therefore points to a custom image that includes it. For the purposes of this document, we can consider this piece of functionality as given, so you don’t have to do anything.

For the sake of completeness, the image is declared on line 2 of the bitbucket-pipelines.yml file (as in the sketch above). The original source is https://hub.docker.com/r/mklinski/salesforce/~/dockerfile/.

Refresh Mavensmate Project and Push Metadata to Bitbucket

(Note: the Pipelines are still turned off)

In Mavensmate, right click on the src folder → Mavensmate → Refresh From Server

Note: the subset of metadata you actually retrieve from the server is specified in the package.xml file.

After refreshing the metadata in Mavensmate, switch to Source Tree and you should see something similar to the image below: a ‘3’ badge on the left side of the screen next to the repo you are currently using, and Uncommitted changes in the main view. This means Source Tree has detected that something changed in the local repository.

image9

Click the Commit button in the upper part of the screen → select all pending files (in other words, stage them) → enter a commit message → click Commit

After committing, you should see a badge on the Push button, as in the figure:

image10

Click the Push button → choose the branch (in this case master) in the window that opens → click Push again

Now everything in Source Tree is clean; no badges are shown.

Please switch to Bitbucket and click the Commits tab in the left panel to see the items you have just pushed to the remote repo.

image11

We have just reached the first checkpoint, congratulations!

The local repository and the remote repository are aligned (they share the metadata retrieved from the source DEV sandbox).

Notice: our goal is to deploy some changes from one environment to another, so we are just in the middle of our E2E tour.

Remember the assumption: all sandboxes have been refreshed (in our case just two: the source and the target sandboxes).

Enable Pipelines and set the Environment Variables

Once you have everything aligned, we are ready to simulate our CI/CD activities.

Let’s say we would like to push some changes to an Apex class, adding a reference to a custom field we have just created in our source sandbox.

How do we deploy to the destination environment? Using Pipelines, of course!

Go to your Bitbucket account → Locate Pipelines tab on the left panel → Click on Enable Pipelines

In the YAML template picklist, choose Other → click Next

image12

After turning on Pipelines, one more thing is still missing: setting up the environment variables.

From Bitbucket → click Settings in the left panel → locate the Pipelines section → click Environment variables and configure SFDC_PASS, SFDC_SERVERURL, and SFDC_USERNAME with the target environment’s info (an illustrative set of values follows the screenshot below):

image13
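For illustration, the values could look like the following (the username and password are placeholders; https://test.salesforce.com is the standard login URL for sandboxes, while Production uses https://login.salesforce.com). Mark SFDC_PASS as Secured so it is masked in the Pipelines logs.

SFDC_USERNAME   deploy.user@yourcompany.com.uat
SFDC_PASS       yourPasswordPlusSecurityToken
SFDC_SERVERURL  https://test.salesforce.com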

Perform a deployment to the master branch vs. a feature branch

In the previous paragraph, we said that we would like to push some changes (i.e., an Apex class referencing a new custom field) from our source sandbox to the target environment.

Let us take a closer look at the .yml file:

image14

We will use both branches/feature/* and branches/master to see how the deployment behaves depending on which branch you push to.

Are you ready to see the E2E tour from the beginning to the deployment? Let’s GO!

Make some changes and see the Pipelines in action

Before doing some changes, please be sure that the src/package.xml looks more or less like this:

image15

Hence, the metadata shown above are: Apex classes, an Apex component, an Apex page, an Apex trigger, the SVMXC__Service_Order__c custom object, and some profiles.

For the purposes of this handbook, let us consider Apex classes, the SVMXC__Service_Order__c custom object, and some profiles. So please be sure your package.xml retrieves at least these metadata types.

Please make some changes to an Apex class of your choice (from Mavensmate), create a new custom field from the Salesforce UI (say, on the SVMXC__Service_Order__c object), and set the FLS accordingly in your source dev instance.

After you’ve done so, please right click on src → Mavensmate → Refresh from Server to be sure that all the changes have been retrieved into your local repo.

If everything is fine, you should see some pending changes in Source Tree, related to the metadata files we have just changed in our source environment.

At this point, we just need to stage each file, commit the changes, and finally push them to the remote branches/master (as described in the previous paragraph).
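For reference, the equivalent Git commands from the command line would be something like this (the commit message is just an example):

git add -A                                    # stage every pending change
git commit -m "Add custom field reference"    # commit with a message
git push origin master                        # push to the remote master branch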

Switch to Bitbucket → go to the Pipelines tab in the left side panel → see the new incoming commit and expand it → you should see something like the following:

image16

image17

Take a look at your target sandbox and you will find the Apex class and the custom field successfully deployed, with the FLS set according to the profiles you migrated.

What if I wanted to push to the feature branch instead of the master branch?

The workflow is almost the same as described in the previous paragraph.

From Bitbucket → go to the Create branch tab in the left side panel → be sure to use a feature/<something> name, create the branch as shown below, and check it out in Source Tree (using HTTPS).

im1

After making all the changes to the Apex class and so on, please stage each file, commit, and push to branches/feature/<something>, as in the following image:

im2

Then switch to the Bitbucket Pipelines tab and watch the deployEmptyCheckOnly job in progress.

Notice: as already discussed, the job behavior is defined in the build/build.xml file within the <target> tags.

As you can see in the related image, the checkOnly flag is indeed “true” and the testLevel is “RunLocalTests”, which means we are performing a validation only; no deployment will be performed.

In addition, please find some screenshots below:

im3

la1

la2

la3

Once the validation has been performed, we just need to merge the feature branch into the master branch.

Then go to Branches in the left panel in Bitbucket and click the Merge button:

la4

la5

At this point, a Pipelines job (deployCode) runs to perform the deployment. As a result of the previous action, check the Pipelines tab in the left panel:

la6

Create a Pull Request

Once you’ve deployed the commit to the shared dev environment and reviewed it accordingly, it is ready to be deployed to the UAT/staging sandbox.

How do you push the changes to another environment? The answer: by creating a pull request.

la7

After clicking the Create pull request button (from the shared DEV repo, which would be a fork of the UAT repo in a real scenario) against the UAT repo, you will automatically be taken to the UAT repo (not shown for the purposes of this document), and you will then see a Pipelines job running to perform the deployment accordingly.

In conclusion, the same workflow works for deployment to the Production environment: create a pull request from the UAT repo (which would be a fork of the Prod repo) against the Prod repo and watch the Pipelines do their job.


11 thoughts on “Salesforce: Automating Deployments using Bitbucket Pipelines”

  1. Love the article. Can you please post the repo that you referenced (or at least the bitbucket-pipelines.yml and build.xml)? The person at Atlassian who created the repository left the company and the repository is no longer available. Thanks.

    1. Hi TexasUser, 

      Thanks for your comment. It’s a pity that mklinski left his company… however, I would like to suggest the following repo as a “template”: https://bitbucket.org/benedwards44/salesforce-pipelines

      Even though it is slightly different, the structure behind it is the same.

      Here are the files you requested from the mklinski repo:

      bitbucket-pipelines.yml

      build.xml

  2. Do you have a link to an active repo? The link provided in the article is no longer valid. It would be nice to see the actual files. Thank you.

    1. Hi John Smith,

      As I said to TexasUser, I would like to suggest the following repo as a good starting point: https://bitbucket.org/benedwards44/salesforce-pipelines

      The concepts behind it are exactly the same.

      Best Regards
      Antonio

  3. Hi. How does an individual developer update the package.xml when they want to push their changes from their dev org to the UAT org? Currently our package.xml is set to pull all metadata (e.g., using *) for most of the various metadata components. If a developer only has a change for a new Apex class and some custom fields, how do we handle updating the package.xml? Again, excellent article.

    1. Hi TexasUser,

      From your Sublime workspace, locate the package.xml file and simply edit the <types>, <members>, and <name> tags, adding the specific Apex classes and custom fields you like. Please refer to the following https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/manifest_samples.htm to find some samples.

      Click Save, then Source Tree will detect the changes accordingly and you can do your job as described in the article (see “Make some changes and see the Pipelines in action” paragraph).

      Regards,
      Antonio

  4. Hi. Whenever changes are made and need to be deployed, the package.xml file will need to be updated. Does that mean the package.xml file will always be updated/changed with every deploy? E.g., sometimes there could be one entry and sometimes there could be ten entries to reflect the updated metadata that needs to be deployed.

    The original package.xml that was present when the repo was first created (to pull down all metadata from Salesforce and push it to Bitbucket) will no longer look the same, right? Since it will be constantly updated with every new deployment. If this is the case, when UI config changes are made in production, how do we pull that new metadata down so that we can push the updates to Bitbucket? (We no longer have the original package.xml that was pulling all metadata, e.g., using “*”.) The admin that made the changes won’t know how to use the dev flow, or they made changes and forgot to document them.

    Thank you again. Sorry for the long reply.  We just want to make sure we are following the correct flow.

  5. Yes, even I am curious about the same thing. Ideally we should be able to deploy whatever files we have committed to Bitbucket and not the whole repository. How can we achieve that?

    Say we have committed one custom object and one trigger. How do we deploy only those two components?

  6. Nice article!

    I would like to know if there is a way to deploy only the changed files. If my source code is really massive, then I don’t want to deploy everything in every build.

    Also, is there a way to skip the build? Like, I don’t want to run any build if it’s my feature branch. We can add [skip ci] to the commit message, but what I am looking for is a solid solution for this.

    What would be the best solution for both of the above?

    Thanks

    AnuTroy

  7. Hi,

    I was searching a lot to find a solution for deploying only committed files. I found some AppExchange tools like Copado with an option to deploy just the committed files.

    Is there any potential solution on Bamboo/Bitbucket for implementing this approach?

  8. Hi everybody. In my opinion, you can take a look at Salesforce DX for CI strategies. We will write an article about that soon.
