Integrate CFS into Bitbucket CI/CD Pipeline
Learn how to add CFS and ORT to your existing Bitbucket CI/CD Pipeline.
Overview
The following instructions can be used to integrate either Canvass for Compliance (CFC), formerly known as LiAnORT, or Canvass for Security (CFS) with the Bitbucket Pipelines CI/CD product. For clarity, only CFS will be mentioned in the examples.
In order to integrate CFS into a Bitbucket CI/CD Pipeline, the following steps are needed:
- Create a Canvass Labs account. As CFS is a cloud-based product, a Canvass Labs account is required to submit CFS jobs.
- Enable Bitbucket Pipelines. Bitbucket Pipelines is an integrated CI/CD service built into Bitbucket. If you are new to Bitbucket Pipelines, you can use this guide to set up a Pipeline.
- Install the necessary utilities in the current Pipeline. The utilities `curl`, `git`, and `unzip` need to be installed in the current Pipeline.
- Install Java 11 or higher in the current Pipeline. CFS uses the OSS Review Toolkit (ORT) to assist with software dependency detection. As a result, Java 11 or higher must be installed in the current Pipeline for ORT to work correctly.
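Before editing the Pipeline, it can be useful to confirm these prerequisites inside your container image. The following standalone sketch is our own addition, not part of the official CFS setup; it only warns rather than failing, so it is safe to drop into a scratch step:

```shell
# Warn about any missing command-line tools required by CFS and ORT.
missing=""
for tool in curl git unzip; do
  command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
[ -z "$missing" ] || echo "warning: missing required tools:$missing" >&2

# Extract the major version from `java -version` style output.
# Java 9+ prints e.g. 'openjdk version "11.0.20"'; Java 8 prints '"1.8.0_292"'.
parse_java_major() {
  printf '%s\n' "$1" | sed -n 's/.*version "\([0-9][0-9]*\)\..*/\1/p' | head -n 1
}

JAVA_MAJOR=""
if command -v java >/dev/null 2>&1; then
  JAVA_MAJOR=$(parse_java_major "$(java -version 2>&1)")
fi
if [ "${JAVA_MAJOR:-0}" -lt 11 ]; then
  echo "warning: ORT needs Java 11 or higher (detected major: ${JAVA_MAJOR:-none})" >&2
fi
```

Note that a Java 8 runtime reports itself as version `1.8`, so its parsed major version is 1, which correctly fails the check.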
Modifying the bitbucket-pipelines.yml file
ORT needs to perform its scan in the project's operating, build, or testing environment to properly ascertain your software's dependencies and their correct version numbers. Because of this, we cannot offer our CI/CD integration as a pre-packaged Pipe, as this would require having ORT run within our own container instead of yours.
Instead, you will create a Pipeline step that runs CFS and ORT in your container. To create this step, edit the file `bitbucket-pipelines.yml`, which defines your Pipeline's build configuration. The actions performed in this step remain isolated, so the step can run in parallel with other steps or sequentially.
Note: this article assumes an Ubuntu Linux Docker container image for this Pipeline. Software package installation may be different in your environment.
- Begin your step definition. Define a meaningful name for this step. Here, we named our step Source Code Vulnerability Check.

```yaml
- step:
    name: 'Source Code Vulnerability Check'
    script:
```
- Ensure that the utilities `curl`, `git`, and `unzip` are installed. If you haven't installed them (or you are not sure), you can add the following lines to the step to install them. Also, install the `software-properties-common` package, which provides the `add-apt-repository` command that will be needed below.

```yaml
- apt update -y
- apt install -y curl unzip git default-jdk software-properties-common
```
- Install and enable the necessary package managers in your container. You are likely to have the required managers (e.g. npm for Node/JavaScript-based projects, Maven for Java, pip for Python) already installed to support your build, test, or deployment process. However, you can include a script that ensures that they are correctly installed and configured. For our example, we are analyzing a Python-based project. For ORT to process this project, we will need to perform some additional steps to enable Python 3 in our Ubuntu 18.04 container. If you already have Python 3 installed, or derive your container from an official Python 3 container, you can omit these next steps:

```yaml
# Adding the deadsnakes/ppa repository is required to upgrade to Python 3.8.
- add-apt-repository -y ppa:deadsnakes/ppa
# Set terminal settings to support UTF-8, which is the default output for Python 3.
# This is essential for ORT to work correctly with Python 3.
- export LC_ALL=C.UTF-8
- export LANG=C.UTF-8
# virtualenv is used by ORT to build a testing environment.
- apt install -y virtualenv python3.8 python3-pip
- python3.8 -m pip install --upgrade pip
```
- Temporarily move your software project's contents into a subdirectory. ORT requires the software being scanned to be in a subdirectory; storing the software in the container root along with CFS and ORT itself will not work. Pick a unique name for the temporary subdirectory.

```yaml
- mkdir unique_name
# shopt and mv !(unique_name) work together to prevent mv from returning
# a non-zero status code when it fails to move 'unique_name' into itself,
# which would cause the Pipeline to fail immediately.
- shopt -s extglob
- mv !(unique_name) unique_name
```
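The `extglob` move can be tried out safely before committing it to your Pipeline. This standalone sketch (our own illustration, not part of the Pipeline) runs it in a scratch directory and shows that `!(unique_name)` matches everything except the subdirectory itself:

```shell
# Scratch-directory demo of the extglob move used in the step above.
scratch=$(mktemp -d)
cd "$scratch"
touch a.txt b.txt
mkdir unique_name
# `bash -O extglob` enables the !(...) glob before the command line is parsed;
# inside a Pipeline script, `shopt -s extglob` on an earlier line does the same.
bash -O extglob -c 'mv !(unique_name) unique_name'
ls unique_name
```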
- Download both ORT and the CFS executable file into the container.
The ort_binaries_only zip file is pre-compiled and packaged by Canvass Labs.
As a result, it contains only the binaries needed to run command-line operations plus licensing and copyright information.
```yaml
- curl -o cfs.zip https://rivera.canvasslabs.com:5000/canvass_for_security/client/download/linux
- unzip cfs.zip
- rm cfs.zip
- curl -o ort_binaries_only.zip https://rivera.canvasslabs.com:5000/client/download/ort
- unzip ort_binaries_only.zip
- rm ort_binaries_only.zip
```
- Use ORT and CFS to analyze and submit the required metadata to Canvass Labs' CFS servers.
With CFS and ORT downloaded, and with everything moved into its proper place, it takes only a few command lines to analyze and submit the required metadata to our servers.
```yaml
# Create and activate a testing environment for ORT and Python.
- virtualenv --python=python3.8 venv
- source venv/bin/activate
- cd ./ort
# Turn off error trapping in Bitbucket Pipelines.
# ORT returns 1 on fatal errors; other return codes (0, 2, or higher) are non-fatal.
# We don't want the workflow to fail if ORT returns 2 or higher.
- set +e
# Generate a dependency analysis results file, capturing ORT's exit code
# before any later command overwrites $?.
- bin/ort analyze --input-dir ../unique_name --output-dir ../analyzer_results
- ORT_EXIT_CODE=$?
# Turn error trapping back on.
- set -e
# Raise an error only if ORT exited with a return code of 1.
- if [ "$ORT_EXIT_CODE" -eq 1 ]; then exit 1; fi
- cd ..
# Extract metadata from source code comments and string literals.
# Submit that metadata, along with the dependency analysis report, to the Cloud.
- ./CanvassForSecurity scan ./analyzer_results/analyzer-result.yml
```
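The error-trapping pattern is easy to get wrong, because any command that runs after the ORT invocation (including `set -e` itself) resets `$?`, so the exit code must be captured immediately. This standalone sketch demonstrates the pattern, using `sh -c 'exit 2'` as a stand-in for `bin/ort analyze`:

```shell
set +e
sh -c 'exit 2'        # stand-in for an ORT run that returns a non-fatal code
ORT_EXIT_CODE=$?      # capture right away, before $? is overwritten
set -e
if [ "$ORT_EXIT_CODE" -eq 1 ]; then
  echo "fatal ORT error" >&2
  exit 1
fi
echo "exit code $ORT_EXIT_CODE tolerated"
```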
- Return the directory structure to its original state.
Delete both ORT and the CFS client. Then move everything out of the temporary subdirectory and delete the temporary subdirectory.
```yaml
- rm -rf ./CanvassForSecurity ort
- mv unique_name/* .
- rmdir unique_name
```
In the end, `bitbucket-pipelines.yml` should look like the following:

```yaml
image: ubuntu:18.04

pipelines:
  default:
    - parallel:
        - step:
            name: "Source Code Vulnerability Check"
            script:
              - apt update -y
              - apt install -y curl unzip git default-jdk software-properties-common
              # Adding the deadsnakes/ppa repository is required to upgrade to Python 3.8.
              - add-apt-repository -y ppa:deadsnakes/ppa
              # Set terminal settings to support UTF-8, which is the default output
              # for Python 3. This is essential for ORT to work correctly with Python 3.
              - export LC_ALL=C.UTF-8
              - export LANG=C.UTF-8
              # virtualenv is used by ORT to build a testing environment.
              - apt install -y virtualenv python3.8 python3-pip
              - python3.8 -m pip install --upgrade pip
              - mkdir unique_name
              # shopt and mv !(unique_name) work together to prevent mv from
              # returning a non-zero status code when it fails to move
              # 'unique_name' into itself, which would cause the Pipeline to
              # fail immediately.
              - shopt -s extglob
              - mv !(unique_name) unique_name
              - curl -o cfs.zip https://rivera.canvasslabs.com:5000/canvass_for_security/client/download/linux
              - unzip cfs.zip
              - rm cfs.zip
              - curl -o ort_binaries_only.zip https://rivera.canvasslabs.com:5000/client/download/ort
              - unzip ort_binaries_only.zip
              - rm ort_binaries_only.zip
              # Create and activate a testing environment for ORT and Python.
              - virtualenv --python=python3.8 venv
              - source venv/bin/activate
              - cd ./ort
              # Turn off error trapping in Bitbucket Pipelines. ORT returns 1 on
              # fatal errors; other return codes (0, 2, or higher) are non-fatal.
              # We don't want the workflow to fail if ORT returns 2 or higher.
              - set +e
              # Generate a dependency analysis results file, capturing ORT's
              # exit code before any later command overwrites $?.
              - bin/ort analyze --input-dir ../unique_name --output-dir ../analyzer_results
              - ORT_EXIT_CODE=$?
              # Turn error trapping back on.
              - set -e
              # Raise an error only if ORT exited with a return code of 1.
              - if [ "$ORT_EXIT_CODE" -eq 1 ]; then exit 1; fi
              - cd ..
              # Extract metadata from source code comments and string literals.
              # Submit that metadata, along with the dependency analysis report,
              # to the Cloud.
              - ./CanvassForSecurity scan ./analyzer_results/analyzer-result.yml
              - rm -rf ./CanvassForSecurity ort
              - mv unique_name/* .
              - rmdir unique_name
```
Configure Environment Variables
CFS will need your Canvass Labs credentials to submit jobs to the CFS cloud. CFS uses the file `.lian_credentials` in your home directory to connect successfully. You can learn how to define them here.
Alternatively, credential information can be specified using environment variables. These can be defined directly through the Pipeline editor, or via the Bitbucket Repository Settings > Repository Variables section. In the editor, click Add variables in the selection menu and define the variables that you want to use as login credentials.
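As a hypothetical illustration of wiring repository variables into the credentials file, the sketch below writes `.lian_credentials` from two environment variables. The variable names (`CFS_USERNAME`, `CFS_PASSWORD`) and the key=value layout are placeholders of our own invention; consult the Canvass Labs credential documentation linked above for the real file format:

```shell
# Placeholder values stand in for Bitbucket repository variables here;
# in a real Pipeline, define CFS_USERNAME and CFS_PASSWORD as secured
# repository variables instead.
CFS_USERNAME="${CFS_USERNAME:-example-user}"
CFS_PASSWORD="${CFS_PASSWORD:-example-secret}"
umask 077    # keep the credentials file readable by the owner only
printf 'username=%s\npassword=%s\n' "$CFS_USERNAME" "$CFS_PASSWORD" \
  > "$HOME/.lian_credentials"
```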

After this, you are ready to save and exit the editor.
Running Your Pipeline
Now that Pipelines has been enabled and CFS integrated into a step via the `bitbucket-pipelines.yml` file, the Pipeline should automatically begin running. The Pipeline will run every time you commit a change to the project or modify the `bitbucket-pipelines.yml` file.
The progress of your Pipeline can be reviewed from the Pipelines section of Bitbucket. There, you can click on the Pipeline to bring up a detailed view of the Pipeline process. For more details, see Bitbucket Pipeline documentation.
Retrieving Results
CFS returns immediately after job submission instead of waiting for job completion to minimize the impact on your Pipeline execution time. An email will be sent to you upon job completion with a URL for retrieving your results.
You can review the status of your job by visiting the CFS jobs page.