#1 2019-01-13 17:20:54

Omar007
Member
Registered: 2015-04-09
Posts: 317

Automating using GitLab CI

I'm currently looking into automating package builds using GitLab CI, and for this I want to create a Docker image equivalent of the clean chroot (mkarchroot) normally used to build packages.
While I think I'm close and it seems to be working properly, I'd like to get some feedback in case I missed something.

I've currently made the following Dockerfile:

FROM archlinux/base:latest

RUN curl -o /etc/pacman.d/mirrorlist "https://www.archlinux.org/mirrorlist/?country=NL&protocol=https&use_mirror_status=on" \
        && sed -i 's/^#//' /etc/pacman.d/mirrorlist \
        && pacman-key --refresh-keys \
        && pacman -Syu --noconfirm base-devel multilib-devel namcap

RUN useradd -d /home/builduser builduser \
        && echo "builduser ALL = NOPASSWD: /usr/bin/pacman" > /etc/sudoers.d/builduser-pacman

USER builduser

CMD ["/usr/bin/bash"]

The idea is to update the system and install a base set of packages for building: the base-devel and multilib-devel groups, plus namcap to verify packages and PKGBUILDs with.
Then, create a special build user (with passwordless sudo pacman access) and switch to it for further execution.
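
To use it, the image first has to be built and pushed to the GitLab registry, roughly like this (the registry path is just a placeholder standing in for the one referenced in the .gitlab-ci.yml below):

# Build the image from the Dockerfile above and push it to the registry
# so the CI jobs can pull it. Replace the registry path with your own.
docker build -t registry.example.com/docker-images/arch-build/arch-build:latest .
docker push registry.example.com/docker-images/arch-build/arch-build:latest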

This image is then used in GitLab CI to run the build jobs. Example .gitlab-ci.yml:

image: ${CI_REGISTRY}/docker-images/arch-build/arch-build:latest

stages:
  - check_pkgbuild
  - build_package
  - check_package

check_pkgbuild:
  stage: check_pkgbuild
  script:
    - namcap PKGBUILD | tee PKGBUILD.namcap.out
  artifacts:
    paths:
      - PKGBUILD.namcap.out

build_package:
  stage: build_package
  script:
    - makepkg --syncdeps --noconfirm --log
  artifacts:
    paths:
      - "*.log"
      - "*.pkg.tar.xz"

check_package:
  stage: check_package
  script:
    - namcap *.pkg.tar.xz | tee PKG.namcap.out
  artifacts:
    paths:
      - PKG.namcap.out

If everything works as designed, I plan to add another job to push to the AUR and/or add the package to a personal repo.
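
For the AUR part, that job's script would presumably boil down to regenerating .SRCINFO and pushing to the package's AUR git remote, roughly like this (the package name and commit message are placeholders):

# Regenerate .SRCINFO from the PKGBUILD and push both to the AUR git repo.
makepkg --printsrcinfo > .SRCINFO
git add PKGBUILD .SRCINFO
git commit -m "Update package"
git push ssh://aur@aur.archlinux.org/mypackage.git HEAD:master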

Are there any glaring issues with this setup or am I missing something crucial right now?

#2 2019-01-14 01:23:14

eschwartz
Trusted User/Bug Wrangler
Registered: 2014-08-08
Posts: 2,853

Re: Automating using GitLab CI

If your intent is to abort on namcap errors, then piping to tee will hide any nonzero return code from namcap unless you use pipefail: https://www.gnu.org/software/bash/manua … lines.html
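
A minimal sketch of what that would look like in the CI script, assuming namcap actually returned nonzero on errors:

# With pipefail, the pipeline's exit status is namcap's rather than tee's,
# so a namcap failure would abort the job instead of being masked.
set -o pipefail
namcap PKGBUILD | tee PKGBUILD.namcap.out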

That being said, namcap will never return nonzero in the first place... so you'd need to inspect that manually anyway.

A more reliable test of the package's sanity would be to make sure each PKGBUILD contains a useful check() function which implements the project's unit tests. That does depend on the package, though.
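
As a generic illustration only (the directory layout and test command depend entirely on the upstream project):

# Example check() in a PKGBUILD, running the upstream test suite.
check() {
    cd "$pkgname-$pkgver"
    make test
}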


#3 2019-01-14 02:05:45

Omar007
Member
Registered: 2015-04-09
Posts: 317

Re: Automating using GitLab CI

I did indeed notice that namcap never returns a non-zero exit code. I had initially expected it to exit non-zero on any error, but that was not the case.
For that reason I decided to store the output, knowing that this is currently always a manual check; honestly, it's mostly there as a reference if one of the later steps fails.
That said, since GitLab also stores the stdout of every executed command, tee might be a bit overkill and I may change it to a plain stdout redirect.

The check() function is more about the sanity of the software being packaged than about the package itself though, is it not?
As far as I'm aware, this function is already run by default during `makepkg`. That said, it's probably not a bad idea to explicitly pass '--check'.
While I'm at it, I might add '--install' as well for some extra certainty.

I take it the Dockerfile looked otherwise fine then?


EDIT:
Decided to split installation into its own stage. This is the current version:

image: ${CI_REGISTRY}/docker-images/arch-build/arch-build:latest

stages:
  - check_pkgbuild
  - build_package
  - install_package
  - check_package

check_pkgbuild:
  stage: check_pkgbuild
  script:
    - namcap PKGBUILD > PKGBUILD.namcap.out
  artifacts:
    paths:
      - PKGBUILD.namcap.out

build_package:
  stage: build_package
  script:
    - makepkg --syncdeps --noconfirm --log --check
  artifacts:
    paths:
      - "*.log"
      - "*.pkg.tar.xz"

install_package:
  stage: install_package
  script:
    - sudo pacman -U --noconfirm *.pkg.tar.xz

check_package:
  stage: check_package
  script:
    - namcap *.pkg.tar.xz > PKG.namcap.out
  artifacts:
    paths:
      - PKG.namcap.out

Still trying to figure out a nice way of getting these into a custom repo. I plan on simply using darkhttpd to host the files, but I'm still looking for a way to gather the different projects into a single repo database that can be hosted.
I assume that repo-add can't be run on the same repo by multiple jobs at the same time, but has to be performed sequentially if one doesn't want to corrupt the database?
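
One way to keep that sequential would be to serialize the repo-add calls behind a lock; a rough sketch, where the repo name, paths, and lock file are all placeholders:

# Copy the built package(s) into the hosted repo directory and update the
# database. flock ensures only one job touches the database at a time.
(
    flock 9
    cp -- *.pkg.tar.xz /srv/http/myrepo/x86_64/
    repo-add /srv/http/myrepo/x86_64/myrepo.db.tar.gz /srv/http/myrepo/x86_64/*.pkg.tar.xz
) 9>/srv/http/myrepo/x86_64/.repo.lock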

Last edited by Omar007 (2019-01-14 02:22:38)

#4 2019-01-14 02:27:06

eschwartz
Trusted User/Bug Wrangler
Registered: 2014-08-08
Posts: 2,853

Re: Automating using GitLab CI

I'm no great expert at anything Docker, but as long as it is doing the steps I'd do anyway, it looks fine.

The check() function is supposed to test the functionality of the software, which is usually a pretty good indicator of whether the resulting binaries are actually functional. There's no need to use --check, since unless you've modified makepkg.conf it will run by default.

Using --install is not a bad idea at all -- the official makechrootpkg script, which uses systemd-nspawn containers, also does an --install check, although it does not consider failure to install to be an error. The reason is that sometimes it is legitimately impossible to install the package(s); for example, it might be a split package whose parts are mutually exclusive.
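
A sketch of how a CI step could mimic that non-fatal behaviour (the message is arbitrary; only the || fallback matters):

# Try installing the built package(s), but don't fail the job if the
# install is legitimately impossible (e.g. mutually exclusive split packages).
sudo pacman -U --noconfirm *.pkg.tar.xz || echo "install check failed; continuing anyway"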


#5 2019-01-14 02:50:22

Omar007
Member
Registered: 2015-04-09
Posts: 317

Re: Automating using GitLab CI

The Dockerfile at the top basically just fetches a set of mirrors, updates the keys and packages, installs base-devel, multilib-devel, and namcap, and then sets up a normal user with passwordless sudo pacman. That's all.

eschwartz wrote:

for example, it might be a split package whose parts are mutually exclusive.

Good point.

Maybe it's a better idea to install them sequentially then?
Something along the lines of: `for pkg in *.pkg.tar.xz; do yes | sudo pacman -U ${pkg}; done`?

Otherwise, I guess just going with '--install' and not using a separate step might be the better option.

Last edited by Omar007 (2019-01-14 02:58:26)
