CI/CD for Domain Modelling using GitHub Actions and Jargon

Alastair Parker
Oct 10, 2023

Jargon is an opinionated platform for designing APIs and Data Models based on principles from Domain Driven Design. Many of Jargon’s opinions are informed by modern open-source software development processes, and today we’re going to talk about an important one: CI/CD or Continuous Integration / Continuous Deployment.

Jargon has an internal workflow that looks a bit like this, and is documented in-depth in our article about iterative approaches to API and Data management:

Jargon’s internal workflow for enabling iterative Domain Driven Design

Throughout this process, work is done and value is added to your Domains. When something of note has been completed, you can Release your Domain and make it available to others. If you’ve done something special but aren’t quite ready for a release, you can save a Snapshot, which is a point-in-time view of your Domain that others can use to follow along.

Whether you make a Release or take a Snapshot, Jargon can help you integrate your changes into downstream automated processes.

The high-level flow looks a little like this:

Releases and Snapshots in Jargon can trigger downstream automated processes via WebHooks

Configuring WebHooks in Jargon

Jargon currently supports two triggers for WebHooks: Releasing a Domain and creating a Snapshot. Every Domain in Jargon can configure where each of those WebHooks is sent.

In the Settings page for each Domain, you’ll see a section called WebHooks, which is where this configuration happens:

Jargon WebHooks are very light-weight, and only contain URLs to Jargon’s generated artefacts
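To make the shape of those WebHooks concrete, here is a sketch of a payload, reconstructed from the fields the sample workflow later in this article reads. The exact schema belongs to Jargon, so treat every name and URL here as illustrative rather than authoritative:

```python
# A sketch of a Jargon WebHook payload, inferred from the fields the sample
# workflow reads (action.name, action.test, domain, artefacts). All values
# here are illustrative stand-ins, not real Jargon output.
payload = {
    "action": {"name": "onRelease"},  # or "onSnapshot"; a "test" key marks test deliveries
    "domain": {"account": "jargon-sh", "name": "SwaggerPetstore"},
    "artefacts": {
        # Most artefact types are a single object with a download URL...
        "readme": {
            "fileName": "jargon_SwaggerPetstore_readme.md",
            "url": "https://jargon.sh/.../readme.md",  # placeholder URL
        },
        # ...but lifeCycles holds a list of artefacts
        "lifeCycles": [
            {
                "fileName": "jargon_SwaggerPetstore_lifecycle.json",
                "url": "https://jargon.sh/.../lifecycle.json",  # placeholder URL
            },
        ],
    },
}

# The workflow branches on exactly these two facts:
is_release = payload["action"]["name"] == "onRelease"
is_test_run = "test" in payload["action"]
```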

At the top is where you’ll need to configure which GitHub repo will receive the WebHook, as well as a Personal Access Token that has write scope for the repository’s contents. There’s a good write-up by Elio Struyf that covers the specifics of creating the right type of token.

For each Domain, you can choose to send Release and Snapshot Webhooks to the same or different repositories.

After you hit save, you’re good to go! Next time you make a Release or create a Snapshot, Jargon will construct a WebHook and POST it off to the GitHub repositories you’ve configured.
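On GitHub’s side, that delivery amounts to a POST to the repository’s `repository_dispatch` endpoint. The `event_type` and `client_payload` field names below are GitHub’s documented dispatch API; everything else (names, token, payload contents) is an illustrative stand-in, not Jargon’s actual implementation:

```python
import json
import urllib.request


def build_dispatch_request(owner, repo, token, event_type, client_payload):
    """Build (but don't send) a GitHub repository_dispatch POST request.

    In a workflow, event_type surfaces as github.event.action and
    client_payload as github.event.client_payload.
    """
    body = json.dumps({
        "event_type": event_type,
        "client_payload": client_payload,
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"https://api.github.com/repos/{owner}/{repo}/dispatches",
        data=body,
        method="POST",
        headers={
            "Accept": "application/vnd.github+json",
            "Authorization": f"Bearer {token}",  # the PAT configured above
        },
    )


# Illustrative values only -- nothing is actually sent here.
req = build_dispatch_request(
    "example-org", "example-repo", "ghp_xxx",
    "onRelease", {"action": {"name": "onRelease"}},
)
```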

A note for private Domains

Jargon can send WebHooks for private Domains as well as public ones, but for private Domains Jargon will generate an access token and append it to all the artefact URLs in the WebHook payload. As these URLs contain an access token that allows anyone holding them to download your private Domain’s artefacts, you should treat them as sensitive information. Take extra precautions if you’ve got a private Domain but a public GitHub repository, as the URLs may appear in the GitHub Action logs.
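One simple precaution is to never log the raw artefact URLs. If the access token rides along in the URL’s query string (an assumption; check your actual payloads), stripping the query before printing keeps tokens out of public action logs. A minimal sketch:

```python
from urllib.parse import urlsplit, urlunsplit


def redact_url(url):
    """Drop the query string (where an access token could live) for safe logging.

    Defensive sketch for private Domains: assumes the token is carried in the
    query string, which you should verify against your own WebHook payloads.
    """
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))


# Example with a made-up URL and token:
safe = redact_url("https://jargon.sh/artefacts/readme.md?token=SECRET")
print("Downloading: " + safe)  # log this instead of the raw URL
```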

A sample GitHub Action workflow to listen for Jargon’s WebHooks

We’ve put together a sample GitHub Action for you to take a look at if you’re going to use Jargon’s WebHooks. All the code is available in this repo https://github.com/jargon-sh/webhooks, but the relevant part is the GitHub workflow:

```yaml
name: respond-to-jargon-webhooks
on: [repository_dispatch]

env:
  ARTEFACT_DIR: jargon/
  CI_AUTHOR: jargon-sh

jobs:
  get-jargonArtefacts:
    runs-on: ubuntu-latest

    permissions:
      # Give the default GITHUB_TOKEN write permission to commit and push the
      # added or changed files to the repository.
      contents: write

    steps:
      - uses: actions/checkout@v2
      - run: mkdir -p ${{ env.ARTEFACT_DIR }}
      - uses: actions/setup-python@v2
      - uses: jannekem/run-python-script-action@v1
        with:
          script: |
            import urllib.request
            import json

            def download(url, fileName):
                try:
                    urllib.request.urlretrieve(url, '${{ env.ARTEFACT_DIR }}' + fileName)
                except Exception:
                    print('Failed to download: ' + url)

            # Parse the JSON WebHook payload from Jargon
            jsonObj = json.loads("""${{ toJSON(github.event.client_payload) }}""")

            # Determine whether this was a Release or a Snapshot
            onRelease = jsonObj['action']['name'] == 'onRelease'

            if 'test' in jsonObj['action']:
                # Do nothing for test runs
                print('Running in test mode. Exiting')
            else:
                # Download the artefacts. lifeCycles holds a list of
                # artefacts; every other type is a single object.
                artefacts = jsonObj['artefacts']
                for artefactType in artefacts:
                    if artefactType == 'lifeCycles':
                        for cycle in artefacts['lifeCycles']:
                            download(cycle['url'], cycle['fileName'])
                    else:
                        artefact = artefacts[artefactType]
                        download(artefact['url'], artefact['fileName'])

      # Commit and push all changed files.
      - name: Commit & Push
        if: ${{ !github.event.client_payload.action.test }}
        run: |
          git config --global user.name "${{ env.CI_AUTHOR }}"
          git config --global user.email "${{ env.CI_AUTHOR }}@users.noreply.github.com"
          git add ${{ env.ARTEFACT_DIR }}
          # Only commit if the artefacts actually changed
          git diff --cached --quiet || git commit -m "Update ${{ github.event.client_payload.domain.account }}/${{ github.event.client_payload.domain.name }} from Jargon"
          git push
```

There’s a bit of boilerplate configuration at the top to register this workflow to respond to WebHooks, which are conveniently named something obvious like ‘repository_dispatch’ :)
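If you only care about one of the two triggers, `repository_dispatch` also accepts a `types` filter. The variant below assumes Jargon sets the dispatch `event_type` to the action name (`onRelease` / `onSnapshot`); verify that against your own WebHook deliveries before relying on it:

```yaml
# Variant of the trigger above: only run this workflow for Release events.
# Assumes the event_type matches the action name -- check your deliveries.
on:
  repository_dispatch:
    types: [onRelease]
```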

There’s an inlined Python script that parses the JSON body of the WebHook, iterates through each of the artefacts, and downloads them to the artefact directory.

Following that is a step to add the downloaded artefacts and commit them back to the repository.

When it’s all done, your repo will have the latest copies of your artefacts (from either Releases or Snapshots), for you to use in subsequent GitHub Action steps.

Here’s an example Jargon Readme, hosted in GitHub and displaying a Domain Model diagram: https://github.com/jargon-sh/webhooks/blob/main/jargon/jargon_SwaggerPetstore_readme.md

Wrapping up:

So that’s how you can use Jargon’s internal workflows to trigger GitHub Actions and integrate your Domains into CI/CD processes.

If you have any further questions about WebHooks in Jargon, please get in touch.


Alastair Parker

Semantic data nerd, and creator of https://Jargon.sh - a collaborative platform for developing data models and vocabularies