January 24, 2021

Cost-efficient SonarCloud Integration for Bitbucket

Learn how you can configure SonarCloud for Bitbucket Pipelines cost-efficiently by controlling memory consumption. With examples for TypeScript and Python.

SonarCloud is SonarQube-as-a-Service: a tool that inspects your code and tracks its quality over time, making it an excellent aid for managing technical debt. It covers code smells, bugs, security issues and test coverage reports. Be aware that you still need to generate the coverage reports yourself; they are uploaded to SonarCloud, where they are visualized alongside the other metrics and used to determine whether the Quality Gate passes. There is a great integration with Bitbucket, but do keep in mind that the pipeline images can run into memory issues which, when solved sub-optimally, can double your build minutes. Keep reading to see how you can get around those issues and keep your build minutes in check, using example configurations for TypeScript- and Python-based projects.

Reduce Memory Footprints and Build Time

In Bitbucket Pipelines you can configure up to 100 steps in a single pipeline, and each step gets 4GB of memory. The Docker service that underlies the step, however, gets only 1GB by default. This means that your build step will fail if the Docker process exceeds that 1GB, and you will see the dreaded: "Container 'docker' exceeded memory limit".

The SonarCloud integration is itself a pipe, which runs as a Docker container and therefore shares that 1GB default. To increase the Docker memory allocation, use:

definitions:
  services:
    docker:
      memory: 2048
runTests: &runTests
  image: <your_image>
  name: My Test Step
  size: 2x
  services:
    - docker

As the example above shows, it is also possible to double the whole step to 8GB with size: 2x. If you do, however, the build minutes for that step are counted twice, doubling its cost.

It is generally a better strategy to choose a smaller image and break up your build into smaller steps where possible. For the SonarCloud integration, however, there is a limit to how far you can break it up: you will want to run your tests with coverage in the same step, so the coverage report can be passed along via the scanner to your SonarCloud account.

Here are some tips for saving costs by reducing build time and memory footprint; a sketch combining them follows the list:

  • Use a smaller image to begin with (e.g. no full-blown Linux distribution required; look at Alpine)
  • Cache the Docker layers by adding docker to the step's caches (caches: - docker)
  • Cap the scanner's JVM heap with the option SONAR_SCANNER_OPTS: -Xmx256m
  • Explicitly set the language and source dirs: EXTRA_ARGS: '-Dsonar.sources=src -Dsonar.language=python'
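
As a rough sketch, the tips above can be combined in a single test-and-scan step. It mirrors the examples later in this post; the image tag, the 1x size and the source directory are placeholder assumptions to adapt to your own project:

runTests: &runTests
  name: Test and Scan
  image: node:lts-alpine          # placeholder: any small Alpine-based image with your toolchain
  size: 1x                        # stay on the default 4GB step instead of doubling to 2x
  caches:
    - docker                      # reuse Docker layers between runs
  services:
    - docker                      # the Docker service with the increased memory from above
  script:
    - npm ci
    - npm run test -- --coverage
    - pipe: sonarsource/sonarcloud-scan:1.2.0
      variables:
        SONAR_TOKEN: ${SONAR_TOKEN}
        SONAR_SCANNER_OPTS: -Xmx256m                           # cap the scanner's JVM heap
        EXTRA_ARGS: '-Dsonar.sources=src -Dsonar.language=js'  # limit what gets scanned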

Start Your Integration

Now that you're aware of the pitfalls, let's get started integrating this tool!

  1. Install the SonarCloud app from the Bitbucket marketplace
  2. Create a SonarCloud account or link your existing one
  3. Go to SonarCloud and import the Bitbucket account you want to integrate
  4. Select the project you want to configure
  5. Set the SONAR_TOKEN repository variable as given by the wizard
  6. Configure bitbucket-pipelines.yml appropriately (see below)
  7. Integrate the quality gate (if your project doesn’t pass it, the build will fail)
  8. For a nice widget at the top of your repository page, go to repository settings → SonarCloud Settings → check “Show repository overview widget”
  9. Merge your branch into master (the overview won’t work if you don’t; find your feature branch via Administration > Branches & Pull Requests)
  10. Set the New Code Definition (e.g. Previous Version)

TypeScript

The recommended image here is alpine-chrome with node, which contains a headless Chrome installation.
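
For reference, plugging such an image into the test step could look roughly like the snippet below; the zenika/alpine-chrome:with-node tag is one publicly available build of that image, but treat the exact image and tag as an assumption to verify for your project:

runTests: &runTests
  name: My Test Step
  image: zenika/alpine-chrome:with-node   # assumed tag of an Alpine image with headless Chrome and Node
  services:
    - docker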

Testing with Angular Test

Modify the bitbucket-pipelines.yml as follows. Notice that the lcov path contains the project name. Furthermore, the coverage reports still need to be generated: add the --code-coverage flag to the appropriate test script in package.json.

caches:
  - node
  - docker
clone:
  depth: full   
script:
  - npm install
  - npm run test   # the --code-coverage flag is set on the test script in package.json
  - pipe: sonarsource/sonarcloud-scan:1.2.0 
    variables:
      SONAR_TOKEN: ${SONAR_TOKEN}
      SONAR_SCANNER_OPTS: -Xmx256m
      EXTRA_ARGS: '-Dsonar.sources=src -Dsonar.language=js -Dsonar.tests=src -Dsonar.test.inclusions="**/testing/**,**/*.spec.ts" -Dsonar.javascript.lcov.reportPaths=coverage/<project-name>/lcov.info'
  - pipe: sonarsource/sonarcloud-quality-gate:0.1.4

Testing with Jest

This link has an excellent tutorial. The gotchas here are that the lcov path does not contain the project name, that the flag is --coverage, and that you use the jest-sonar-reporter. Other than that it’s very similar to the example above; a sketch of the differences follows.
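
As a minimal sketch of those differences, assuming a default Jest setup where the coverage report ends up in coverage/lcov.info and jest-sonar-reporter writes test-report.xml to the project root (adjust the paths to your configuration):

caches:
  - node
  - docker
clone:
  depth: full
script:
  - npm install
  - npm run test -- --coverage        # Jest's own flag, passed through npm with the extra --
  - pipe: sonarsource/sonarcloud-scan:1.2.0
    variables:
      SONAR_TOKEN: ${SONAR_TOKEN}
      SONAR_SCANNER_OPTS: -Xmx256m
      EXTRA_ARGS: '-Dsonar.sources=src -Dsonar.tests=src -Dsonar.test.inclusions="**/*.spec.ts" -Dsonar.javascript.lcov.reportPaths=coverage/lcov.info -Dsonar.testExecutionReportPaths=test-report.xml'
  - pipe: sonarsource/sonarcloud-quality-gate:0.1.4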

Python

For Python it is important to have coverage.py export its report as XML so the scanner can pick it up:

caches:
  - pip
  - docker
clone:
  depth: full  
script:
    - pip install coverage
    - coverage run -m unittest discover -s tests/ -p "test_*.py"
    - coverage report
    - coverage xml -i
    - pipe: sonarsource/sonarcloud-scan:1.2.0
      variables:
        SONAR_TOKEN: ${SONAR_TOKEN}
        SONAR_SCANNER_OPTS: -Xmx512m
        EXTRA_ARGS: '-Dsonar.sources=src -Dsonar.tests=tests -Dsonar.test.inclusions="**/test_**" -Dsonar.python.coverage.reportPaths=coverage.xml'
    - pipe: sonarsource/sonarcloud-quality-gate:0.1.4

Conclusion

The SonarCloud integration is highly recommended, despite the possible memory pitfalls. The integration on the SonarCloud side is especially impressive, with customized wizards that link to the relevant sections of the repository you are configuring. Setting up this integration will help you gain insight into, and manage, the technical debt of your projects.