#1 2019-01-13 17:20:54

Omar007
Member
Registered: 2015-04-09
Posts: 354

Automating using GitLab CI

I'm currently looking into automating package builds using GitLab CI and for this I'm looking to create a Docker image equivalent of the clean chroot (mkarchroot) normally used to build packages.
While I think I'm close and it seems to be working properly, I'd like to get some feedback in case I missed something.

I've currently made the following Dockerfile:

FROM archlinux/base:latest

RUN curl -o /etc/pacman.d/mirrorlist "https://www.archlinux.org/mirrorlist/?country=NL&protocol=https&use_mirror_status=on" \
        && sed -i 's/^#//' /etc/pacman.d/mirrorlist \
        && pacman-key --refresh-keys \
        && pacman -Syu --noconfirm base-devel multilib-devel namcap

RUN useradd -d /home/builduser builduser \
        && echo "builduser ALL = NOPASSWD: /usr/bin/pacman" > /etc/sudoers.d/builduser-pacman

USER builduser

CMD ["/usr/bin/bash"]

The idea is to update the system and install a base set of packages for building: the base-devel and multilib-devel groups, plus namcap to verify packages and PKGBUILDs with.
Then, create a special build user (with passwordless sudo pacman access) and switch to it for further execution.

Then this image is used in GitLab CI to run the build jobs. Example .gitlab-ci.yml:

image: ${CI_REGISTRY}/docker-images/arch-build/arch-build:latest

stages:
  - check_pkgbuild
  - build_package
  - check_package

check_pkgbuild:
  stage: check_pkgbuild
  script:
    - namcap PKGBUILD | tee PKGBUILD.namcap.out
  artifacts:
    paths:
      - PKGBUILD.namcap.out

build_package:
  stage: build_package
  script:
    - makepkg --syncdeps --noconfirm --log
  artifacts:
    paths:
      - "*.log"
      - "*.pkg.tar.xz"

check_package:
  stage: check_package
  script:
    - namcap *.pkg.tar.xz | tee PKG.namcap.out
  artifacts:
    paths:
      - PKG.namcap.out

If everything works as designed, I plan to add another job to push to the AUR and/or add the package to a personal repo.

Are there any glaring issues with this setup or am I missing something crucial right now?


#2 2019-01-14 01:23:14

eschwartz
Trusted User/Bug Wrangler
Registered: 2014-08-08
Posts: 3,622

Re: Automating using GitLab CI

If your intent is to abort on namcap errors then piping to tee will hide any nonzero return codes from namcap unless you use pipefail: https://www.gnu.org/software/bash/manua … lines.html
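To illustrate the difference (a sketch, run under bash; `false` stands in for a hypothetically failing namcap):

```shell
# Without pipefail, the pipeline's exit status is tee's, so the failure
# of the left-hand command (standing in for namcap) is hidden.
false | tee /dev/null
echo "default: $?"         # prints "default: 0"

# With pipefail, the rightmost nonzero status wins.
set -o pipefail
false | tee /dev/null
echo "with pipefail: $?"   # prints "with pipefail: 1"
```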

That being said, namcap will never return nonzero in the first place... so you'd need to inspect that manually anyway.

A more reliable test of the package's sanity would be to make sure each PKGBUILD contains a useful check() function which implements the project's unit tests. That does depend on the package though. smile
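For example, a sketch (the source directory layout and test target are made up; the right invocation depends entirely on the upstream build system):

```shell
# Hypothetical PKGBUILD fragment: check() runs the upstream test suite
# against the just-built sources; makepkg runs it between build() and package().
check() {
    cd "$srcdir/$pkgname-$pkgver"
    make test
}
```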


Managing AUR repos The Right Way -- aurpublish (now a standalone tool)


#3 2019-01-14 02:05:45

Omar007
Member
Registered: 2015-04-09
Posts: 354

Re: Automating using GitLab CI

I did indeed notice namcap never returns a non-zero exit code. I had initially thought it would for any errors, but that was not the case.
For that reason I decided to store the output, knowing that this is a manual check at all times; honestly it's mostly there as supporting material if one of the later steps fails.
That said, since GitLab also stores the stdout of every executed command, tee might be a bit overkill and I may change it to a plain stdout redirect.

The check() function is more for the sanity of the software being packaged than of the package itself though, is it not?
As far as I'm aware, this function is run by default during `makepkg` as well. That said, it's probably not a bad idea to pass '--check' explicitly.
While I'm at it, I might add '--install' as well for some extra certainty.

I take it the Dockerfile looked otherwise fine then?


EDIT:
Decided to split installation to its own stage. This is the current version:

image: ${CI_REGISTRY}/docker-images/arch-build/arch-build:latest

stages:
  - check_pkgbuild
  - build_package
  - install_package
  - check_package

check_pkgbuild:
  stage: check_pkgbuild
  script:
    - namcap PKGBUILD > PKGBUILD.namcap.out
  artifacts:
    paths:
      - PKGBUILD.namcap.out

build_package:
  stage: build_package
  script:
    - makepkg --syncdeps --noconfirm --log --check
  artifacts:
    paths:
      - "*.log"
      - "*.pkg.tar.xz"

install_package:
  stage: install_package
  script:
    - sudo pacman -U *.pkg.tar.xz

check_package:
  stage: check_package
  script:
    - namcap *.pkg.tar.xz > PKG.namcap.out
  artifacts:
    paths:
      - PKG.namcap.out

Still trying to figure out a nice way of getting these into a custom repo. I plan on simply using darkhttpd to host the files, but I'm still looking for a way to gather the different projects into a repo database that can be hosted.
I assume that repo-add can't be run multiple times on the same repo at the same time, but has to be run sequentially if one doesn't want to corrupt it?
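If I understand repo-add correctly, it does create a *.lck file of its own but simply aborts when one already exists rather than waiting, so concurrent CI jobs would need external serialization anyway. A sketch using flock (the paths are made up):

```shell
# Illustrative: serialize repo-add runs with flock so a second CI job
# waits for the lock instead of failing on repo-add's own .lck check.
flock /srv/repo/.db.lock \
    repo-add /srv/repo/custom.db.tar.gz /srv/repo/*.pkg.tar.xz
```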

Last edited by Omar007 (2019-01-14 02:22:38)


#4 2019-01-14 02:27:06

eschwartz
Trusted User/Bug Wrangler
Registered: 2014-08-08
Posts: 3,622

Re: Automating using GitLab CI

I'm no great expert at anything Docker tongue but as long as it is doing the steps I'd do anyway, it looks fine.

The check() function is supposed to test the functionality of the software, which is usually a pretty good metric for whether the resulting binaries are actually functional. tongue There's no need to use --check, since unless you've modified your makepkg.conf it is enabled by default.

Using --install is not a bad idea at all -- the official makechrootpkg script, which uses systemd-nspawn containers, also does an --install check, although it does not consider failure to install to be an error. The reason is that it is sometimes legitimately impossible to install the package(s); for example, it might be a split package whose parts are mutually exclusive.




#5 2019-01-14 02:50:22

Omar007
Member
Registered: 2015-04-09
Posts: 354

Re: Automating using GitLab CI

The Dockerfile at the top basically just fetches a set of mirrors, updates the keys and packages, installs base-devel, multilib-devel and namcap, then sets up a normal user with passwordless sudo pacman. That's all.

eschwartz wrote:

for example, it might be a split package which is mutually exclusive.

Good point.

Maybe it's a better idea to install them sequentially then?
Something along the lines of: `for pkg in *.pkg.tar.xz; do yes | sudo pacman -U ${pkg}; done`?

Otherwise I guess going for '--install' and not using a separate step may be the better option.
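Roughly like this, I suppose (illustrative; '--noconfirm' replaces the `yes` pipe, and a failed install is downgraded to a note, mirroring how makechrootpkg tolerates it):

```shell
# Install each built package individually; tolerate failures that can be
# legitimate, e.g. mutually exclusive split packages.
for pkg in *.pkg.tar.xz; do
    sudo pacman -U --noconfirm "$pkg" \
        || echo "note: could not install $pkg"
done
```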

Last edited by Omar007 (2019-01-14 02:58:26)


#6 2020-06-25 01:32:30

NickCao
Member
From: Shanghai
Registered: 2018-08-16
Posts: 1

Re: Automating using GitLab CI

I've personally tried this approach earlier, and found that the toughest part of the whole process is not building the package itself, but creating a repository out of the built packages. However, during my experiments I found that it is possible to generate the pacman database in a distributed manner, and I've written a little script for that: https://gitlab.com/NickCao/experiments/ … r/gen-meta. It takes a package file together with its signature as input, then generates the corresponding files and desc entries. I hope this will be of assistance to you, and I'm really looking forward to the day when the whole Arch build process can be automated.


#7 2020-06-25 21:29:31

eschwartz
Trusted User/Bug Wrangler
Registered: 2014-08-08
Posts: 3,622

Re: Automating using GitLab CI

repo-add operates on a database and adds a new package to it without caring whether other packages are available.

What exactly are you trying to do, modify a database without actually having the database to modify? Why?

Download the database, update it, and upload it along with the new packages themselves.
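I.e. roughly this in the CI job (the repo URL and name are placeholders):

```shell
# Fetch the current database, add the freshly built packages, then push
# everything back to wherever the repo is hosted.
curl -fO https://repo.example.com/x86_64/custom.db.tar.gz
repo-add custom.db.tar.gz ./*.pkg.tar.xz
# upload custom.db.tar.gz and the *.pkg.tar.xz files back to the host;
# custom.db should point at (or be a copy of) the updated tarball.
```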




#8 2020-06-29 16:52:55

Omar007
Member
Registered: 2015-04-09
Posts: 354

Re: Automating using GitLab CI

NickCao wrote:

I've personally tried this approach earlier, and found that the toughest part of the whole process is not building the package itself, but creating a repository out of the built packages.

That was something I didn't have too much trouble with on my end. Since I'm deploying to a K8S cluster, I decided to collect each piece of software into its own container, then use those as initContainers to gather everything into a shared volume for the repo container, which runs as the Pod's main container.
Since the initContainers are executed sequentially, the deployment time obviously increases with the number of packages, but for a personal system this setup was fine.

NickCao wrote:

However, during my experiments I found that it is possible to generate the pacman database in a distributed manner, and I've written a little script for that: https://gitlab.com/NickCao/experiments/ … r/gen-meta. It takes a package file together with its signature as input, then generates the corresponding files and desc entries. I hope this will be of assistance to you, and I'm really looking forward to the day when the whole Arch build process can be automated.

This looks like a fun experiment, but I think it overcomplicates things a bit; that depends on your deployment environment though.
Potentially it would be possible to manage packages completely individually and deploy each as just another stand-alone service. Eventually you still need something to aggregate all the files and desc data and present them as part of a single database though.

Something else that would be possible with a K8S setup is to create the repo as a CRD handler and register packages as CRDs, which are then loaded into the repo database when deployed into the cluster.

There are definitely options, but the question is also: how far are you willing to go, and for what goal/benefit? For my personal repos I'm fine with how it's set up now. If they grow much larger I suppose I may revisit the current setup, but for now it works fine smile

Last edited by Omar007 (2020-06-29 16:54:41)

