CI/CD for Amiberry using Github Actions (pt 2)

2022-06-12

In the previous part of this series, we saw how to set up some self-hosted runners on my local devices, connect them to a Github repository and prepare to give them some jobs. I want to use this setup to automate builds of Amiberry whenever I push a new commit, as well as publish new releases whenever I choose to. I am going to use git tags to mark new releases, with the version number (e.g. v5.2).

So let’s dive right into the workflow content, shall we?

The syntax is YAML, which means indentation matters. A lot. As in, you’ll get errors if something is not indented correctly. Annoying for sure, but manageable once you are aware of it.
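To make that concrete, here is a tiny, made-up fragment showing the kind of nesting Actions expects. The keys that belong to a step all have to line up under that step’s dash:

steps:
  # "name" and "run" belong to the same step, so they sit at the same level
  - name: Say hello
    run: echo "hello"
  # If "run" were indented differently from "name", the YAML parser would
  # reject the file before the workflow even started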

Starting from the top, we’ll need to give our workflow a name and define when it should be triggered. I will go for the boring name of “C/C++ CI”, which just indicates what this workflow does. And I want this workflow triggered in two scenarios:

  • Whenever I push a new commit to the “master” branch
  • Whenever I push a new tag with a new version. The format should be vX.Y, where X/Y are numbers. Something like v5.2, for example.

Considering the above, the first part of our workflow looks like this:

name: C/C++ CI

on:
  push:
    branches: [ master ]
    tags: 
      - v[1-9]+.[0-9]

Then we have to define the jobs we want it to perform. You can specify multiple jobs, and each job will run in parallel with the others. And of course, each job consists of individual steps.

For each job, you can specify where it will be executed (e.g. a specific environment or self-hosted runner), which means I can separate out the builds I want for each device. So I can add something like this:

jobs:

  build-rpi3-dmx-32bit-rpios:
    runs-on: [self-hosted, Linux, ARM, rpios32, dmx]
    steps:
...

  build-rpi4-dmx-32bit-rpios:
    runs-on: [self-hosted, Linux, ARM, rpios32, dmx]
    steps:

Of course, I still need to specify the steps each job will take. Let’s take a look at those next.

The first step should be to check out the repository. Remember, this is running on each device separately, so they need to get the sources first, before they can compile them. We can use the “checkout” action for this step, and it’s quite simple:

- uses: actions/checkout@v3

That would be enough for most cases, but in Amiberry’s repository I also have some git submodules which I’d like to retrieve, in order to build the IPF support library (capsimg). The checkout action has an option that we can use to do just that, so the step then becomes:

    - uses: actions/checkout@v3
      with:
        submodules: 'true'

And the whole job so far looks like this:

  build-rpi3-dmx-32bit-rpios:
    runs-on: [self-hosted, Linux, ARM, rpios32, dmx]
    steps:
    - uses: actions/checkout@v3
      with:
        submodules: 'true'

And I’ll go ahead and create one job for each build I want to be triggered, to start giving the workflow some structure. Now the jobs list looks like this:

jobs:

  build-rpi3-dmx-32bit-rpios:
    runs-on: [self-hosted, Linux, ARM, rpios32, dmx]
    steps:
    - uses: actions/checkout@v3
      with:
        submodules: 'true'

  build-rpi3-sdl2-32bit-rpios:
    runs-on: [self-hosted, Linux, ARM, rpios32]
    steps:
    - uses: actions/checkout@v3
      with:
        submodules: 'true'

  build-rpi4-dmx-32bit-rpios:
    runs-on: [self-hosted, Linux, ARM, rpios32, dmx]
    steps:
    - uses: actions/checkout@v3
      with:
        submodules: 'true'

  build-rpi4-sdl2-32bit-rpios:
    runs-on: [self-hosted, Linux, ARM, rpios32]
    steps:
    - uses: actions/checkout@v3
      with:
        submodules: 'true'

  build-rpi3-dmx-64bit-rpios:
    runs-on: [self-hosted, Linux, ARM64, rpios64, dmx]
    steps:
    - uses: actions/checkout@v3
      with:
        submodules: 'true'

...

I’m not posting the full list above, as I assume you get my point.

Compiling the sources

Great, so far we have a list of jobs that will run on my self-hosted devices and check out my git repository, including submodules. We’re ready to do some compiling!

Since my devices already have all Amiberry requirements pre-installed, I can simply call the compile commands I want from the command line. There are basically two for each platform:

  • Build the capsimg library, since I want to include that in the ZIP archive
  • Build Amiberry itself, for each platform I’ll be using

It’s quite easy to do these steps:

    - name: make capsimg
      run: make capsimg
    - name: make for RPIOS RPI3-DMX 32-bit
      run: make -j4 PLATFORM=rpi3

The name line is just for logging, which is good to have in order to see the individual steps and what each one is doing. The run line executes whatever is specified there in the device’s default shell, like bash on Linux. That’s perfect for my needs!

I’ll need to add these steps to each job of course, modifying the PLATFORM=<value> part, since Amiberry supports many platforms. The make capsimg step can remain the same for all of them, however.
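For example, in one of the RPI4 jobs the same two steps would look like this (the exact PLATFORM values come from Amiberry’s Makefile, so treat rpi4 below as an illustration based on the rpi3 example above):

    - name: make capsimg
      run: make capsimg
    - name: make for RPIOS RPI4-DMX 32-bit
      run: make -j4 PLATFORM=rpi4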

Uploading build artifacts

When this step is finished, and if no errors occurred, we should have an Amiberry binary available in the current directory. The next step is to take that and all related directories and data files, and upload them somewhere that testers can find them. There is an Action that does just that, the upload-artifact one.

    - uses: actions/upload-artifact@v3
      with:
        name: amiberry-rpi3-dmx-32bit-rpios
        path: |
          amiberry
          capsimg.so
          abr/**
          conf/**
          controllers/**
          data/**
          kickstarts/**
          savestates/**
          screenshots/**
          whdboot/**

Notice that I’m not creating any ZIP Archive before uploading these. That’s because when you try to download the uploaded contents, a ZIP file will be dynamically created. If I had uploaded a ZIP file here, then we’d have a ZIP file inside a ZIP file, which I didn’t want. This is documented in the upload-artifact action “Limitations” section:

During a workflow run, files are uploaded and downloaded individually using the upload-artifact and download-artifact actions. However, when a workflow run finishes and an artifact is downloaded from either the UI or through the download api, a zip is dynamically created with all the file contents that were uploaded. There is currently no way to download artifacts after a workflow run finishes in a format other than a zip or to download artifact contents individually. One of the consequences of this limitation is that if a zip is uploaded during a workflow run and then downloaded from the UI, there will be a double zip created.

https://github.com/actions/upload-artifact

The name line is important here, as that will be the name of our artifact once it gets uploaded. I tried to keep a clear naming convention, to indicate the exact type of Amiberry version each one represents.
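To illustrate the naming convention, the corresponding step in the RPI3 SDL2 job would differ only in the artifact name (the name below simply follows the pattern described, so treat it as an example):

    - uses: actions/upload-artifact@v3
      with:
        name: amiberry-rpi3-sdl2-32bit-rpios
        path: |
          amiberry
          capsimg.so
          abr/**
          conf/**
          controllers/**
          data/**
          kickstarts/**
          savestates/**
          screenshots/**
          whdboot/**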

At this point, our workflow already covers some of our requirements:

  • It automatically triggers whenever a new commit is pushed to the repository
  • It will check out the sources and build the capsimg library and the Amiberry binary, for each platform I included
  • It will upload those binaries and all related data files/directories to a location that people can get them from

Creating archives for new Releases

That looks good already, but what about new Releases? I want the workflow to take some extra steps, but only if I pushed a git tag in the specific format defined at the top, which would mean I am generating a new Release. Let’s see how we can do that.

    - name: Get tag
      if: github.ref_type == 'tag'
      id: tag
      uses: dawidd6/action-get-tag@v1
      with:
        # Optionally strip `v` prefix
        strip_v: false

There are a few things to mention here. We can use an “if” line in a step, to specify that it should only be executed if a certain condition applies. The condition I wanted in my case was to check whether the event that triggered this job was a git tag push, not just a new commit. Remember, we set our git tag trigger at the top of this workflow, with a specific format to look for – so all we need to do here is make sure the following steps are only executed if the workflow started because I pushed a new git tag.

Then I want to get the actual tag value that triggered this, and store it in a variable. We’ll be using that variable in the next step, because I want to name my ZIP archives with the version as well. For example, if I used the tag v5.2 to trigger this workflow, I want the text “v5.2” stored somewhere so I can use it to name my archive “amiberry-v5.2-rpi4-sdl2-rpios.zip” or similar.

I found an action that helps me do the job easily, and that’s what the uses: dawidd6/action-get-tag@v1 line is about. It exposes the tag value as a step output, which we can reference later as steps.tag.outputs.tag (since I gave the step the id “tag”). It can optionally also strip out the “v” prefix, if you enable that, as you can see in the strip_v line above.

The next step is to ZIP the files I want to include in the new Release. I want to use the tag value I stored above as part of the filename. Otherwise, this is rather straightforward – I just have to make sure that zip is installed on the self-hosted runners, of course!

    - name: ZIP binaries
      if: github.ref_type == 'tag'
      run: zip -r amiberry-${{ steps.tag.outputs.tag }}-rpi3-dmx-32bit-rpios.zip amiberry capsimg.so abr conf controllers data kickstarts savestates screenshots whdboot

You can use such values with the special syntax shown above: ${{ expression }} will be replaced with the actual value at runtime. In my case, steps.tag.outputs.tag holds the value of my git tag, so it will be replaced with something like v5.2 (keeping the same example as before). That would make the whole line look like this when executed (assuming the tag was “v5.2”):

run: zip -r amiberry-v5.2-rpi3-dmx-32bit-rpios.zip
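As a side note, since Raspberry Pi OS is Debian-based, you could also let the workflow itself make sure zip is available, instead of relying on it being pre-installed on the device. A step like the following, placed just before the ZIP step, would do it, assuming the runner’s user can use sudo without a password:

    # Only needed when we are about to create the release ZIPs
    - name: Ensure zip is installed
      if: github.ref_type == 'tag'
      run: sudo apt-get install -y zip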

Creating a Changelog with each release

Next, I wanted a changelog generated with each new release, including everything that changed since the previous release was published. I wanted this generated automatically, based on the commits between the two releases, and it took me a while to find something that would work exactly the way I wanted. Eventually, I found something that came close enough:

    - name: Create Changelog
      if: github.ref_type == 'tag'
      id: changelog
      uses: loopwerk/tag-changelog@v1
      with:
        token: ${{ secrets.GITHUB_TOKEN }}
        config_file: .github/tag-changelog-config.js

Notice that I’m including the if: github.ref_type == 'tag' line here as well. I don’t want this step to trigger on new git commits, only when I’m publishing a new release. And the way I’ve designed things so far, new releases are only triggered when I push a new git tag with the specific version format I am using.

What was interesting about this specific action is that it allows you to configure the look of your changelog through a JavaScript config file. I went for the default example, as it seemed good enough for my needs, but it does require that your commit messages use a specific label format for this to work. It also means I need to add that config file to my repository, of course. The action needs to find it somewhere…

Creating a new Release

Now that we have everything in place, it’s time to add one more step to our workflow: creating a new Release and publishing all items (ZIP archives, changelog). Again, there are several actions that can do this, but the one that worked well for me is this one:

    - name: Create Release
      if: github.ref_type == 'tag'
      uses: ncipollo/release-action@v1
      with:
        allowUpdates: true
        omitBodyDuringUpdate: true
        body: ${{ steps.changelog.outputs.changes }}
        artifacts: |
          amiberry-${{ steps.tag.outputs.tag }}-rpi3-dmx-32bit-rpios.zip

I liked this one, because it allows me to create a new release and optionally upload an artifact to it. And it can also use my generated changelog, from the previous step. Perfect!

There are a few options I had to tweak to make sure it worked the way I wanted:

  • allowUpdates: true – Since I have multiple jobs running in parallel, each one will complete at a different time (when its compilation is done) and will try to add its ZIP archive to the same release. I need to allow updates to the release in order for this to succeed.
  • omitBodyDuringUpdate: true – I don’t want any changes to the Release body when it’s doing an update. The jobs will only be uploading additional ZIP archives as they complete, so the body should remain the same once it has been generated.
  • body: ${{ steps.changelog.outputs.changes }} – This is where it gets interesting. I can specify the body text of the Release, and I can use the output of my Changelog step here.
  • artifacts: – Finally, I can specify which artifact will be uploaded to the Release. Each job completes separately and uploads its own ZIP archive here, which is why it’s important to allow updates to the Release.

Of course, these steps should all be added to each job, which makes the file a bit long, but that’s because Amiberry supports so many targets (and I’m not even compiling all of them here). You can see the complete file here, if you want.
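To make the “each job updates the same Release” part concrete, here is roughly what the step looks like in another job. Only the artifact filename changes, and allowUpdates lets every job attach its own ZIP to the Release created by whichever job finished first (the rpi4 name below is illustrative, following the same convention as before):

    - name: Create Release
      if: github.ref_type == 'tag'
      uses: ncipollo/release-action@v1
      with:
        allowUpdates: true
        omitBodyDuringUpdate: true
        body: ${{ steps.changelog.outputs.changes }}
        artifacts: |
          amiberry-${{ steps.tag.outputs.tag }}-rpi4-dmx-32bit-rpios.zip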

The final output can be seen on the v5.2 Release of Amiberry, on Github.

Triggering the workflow manually

The automated job works fine, but sometimes (and especially during testing), you may want to trigger the workflow manually as well. This is trivial to do, as all we need to do is add one line to the top of the workflow:

name: C/C++ CI

on:
  workflow_dispatch:
  push:
    branches: [ master ]
    tags:
      - v[1-9]+.[0-9]

Adding workflow_dispatch: will make it possible for us to trigger this workflow from Github, with the use of a button:

(Screenshot: running the workflow manually via the “Run workflow” button)

This workflow worked well for my needs, but there’s always room for improvement. For example, compiling on each device takes quite some time, especially on slower ones. The whole process is automated of course, so I can just “fire and forget”, but it can still take more than 40 minutes before a full release is complete on Github. In the next part of this series, we’ll take a look at how we can improve that, and have everything complete in less than half that time!
