I want to add multiple sources to my PKGBUILD, so that when the package is built, makepkg falls back to the second source if the first one fails to download.
But the following code does not work:
source=("$pkgname.zip::https://github.com/tboox/xmake/archive/v${pkgver}.zip" "$pkgname.zip::https://xxxxx.com/tboox/xmake/archive/v${pkgver}.zip")
md5sums=('fca9f41c64c1bb0838d399aad0ac3a2a' 'fca9f41c64c1bb0838d399aad0ac3a2a')
My PKGBUILD: https://aur.archlinux.org/cgit/aur.git/ … LD?h=xmake
Who can help me? Thanks!
Unfortunately, alternative sources are not supported. Add a second, commented source array to the PKGBUILD along with a commented explanation that if the download fails, the user can try the second source.
Why does the download sometimes fail?
Btw, you should use a stronger hash for the checksums (sha256 or sha512).
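For example, something along these lines (a minimal sketch; the mirror URL is a placeholder, and the sha256 value is the one for the v2.1.4 zip):

```shell
# Primary source:
source=("$pkgname.zip::https://github.com/tboox/xmake/archive/v${pkgver}.zip")
# If the download above fails in your region, comment it out and
# uncomment the following mirror instead (placeholder URL):
#source=("$pkgname.zip::https://some-mirror.example.com/tboox/xmake/archive/v${pkgver}.zip")
sha256sums=('02000f3af2ab060107810d4b5b8d3864850aa49af669e82bee27a25d5e3bee21')
```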
My Arch Linux Stuff • Forum Etiquette • Community Ethos - Arch is not for everyone
Access to GitHub is very unstable in our area, so I need to add a second mirror URL.
A custom download agent in makepkg.conf replacing curl with a protocol that supports multiple sources can do that.
Check the aria2 wiki page.
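For reference, such an override might look like this (a sketch based on the aria2 wiki page; the exact flags would need checking against your setup, and note this only swaps the downloader - it does not by itself let one source entry carry several URLs):

```shell
# /etc/makepkg.conf (excerpt): replace the default download agents
# with aria2c. %u is expanded to the URL, %o to the output file.
DLAGENTS=('ftp::/usr/bin/aria2c -UWget -s4 %u -o %o'
          'http::/usr/bin/aria2c -UWget -s4 %u -o %o'
          'https::/usr/bin/aria2c -UWget -s4 %u -o %o'
          'rsync::/usr/bin/rsync --no-motd -z %u %o'
          'scp::/usr/bin/scp -C %u %o')
```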
Disliking systemd intensely, but not satisfied with alternatives so focusing on taming systemd.
clean chroot building not flexible enough ?
Try clean chroot manager by graysky
Lone_Wolf wrote:A custom download agent in makepkg.conf replacing curl with a protocol that supports multiple sources can do that.
Check the aria2 wiki page.
Please don't do this. The sources would have to be moved from the sources array into the prepare or build functions. Checksumming and pre-emptive downloading would be disabled and PKGBUILD would provide incomplete metadata for parsers.
Xyne,
if I understand you correctly, using the download agent on the wiki page won't enable aria2's ability to download from multiple sources?
Seems like that entry could use a note then.
Lone_Wolf wrote:Xyne,
if I understand you correctly, using the download agent on the wiki page won't enable aria2's ability to download from multiple sources?
Seems like that entry could use a note then.
Which page are you referring to?
If you enable a download agent that supports parallel downloads, then separate source files may be downloaded in parallel (I'm not sure that makepkg actually will download them in parallel, as its internal logic is possibly sequential). Nevertheless, what the OP is asking about here is how to supply multiple URLs for the same source, so that the download will succeed even if one of the URLs fails. The sources array has no way (that I am aware of) to specify multiple URLs for the same file. The only way to do it would be to download the sources via multiple URLs in the prepare or build functions after removing them from the sources array, which is what I thought you were suggesting and which I opposed.
This whole approach sounds like a bad idea. If this PKGBUILD is for personal use, just use whatever download URL works for you.
If this is to go to the AUR, use the upstream URL. If some regions block traffic to that upstream source, you can have a commented line with an alternative source (as suggested in post #2). Having a fallback URL (uncommented and used by makepkg) seems dangerous to me - perhaps less so when the checksums are identical, but not much less so with just a checksum as opposed to a secure hash (also suggested in post #2).
With a weak checksum, or with different checksums, it would be ridiculously easy to distribute malicious code under the radar if makepkg simply fell back to a secondary URL.
"UNIX is simple and coherent" - Dennis Ritchie; "GNU's Not Unix" - Richard Stallman
Lone_Wolf wrote:Xyne,
if I understand you correctly, using the download agent on the wiki page won't enable aria2's ability to download from multiple sources?
Seems like that entry could use a note then.
Which page are you referring to?
Xyne wrote:Which page are you referring to?
I just checked the makepkg source code. The source files are downloaded one at a time in a single loop. Even with aria2c as the download agent, there is no speedup from parallel downloads. The only possible speedup is from segmented downloads from the same server, but only if the server throttles individual connections while allowing multiple simultaneous connections.
But Homebrew supports this approach and it works fine.
waruqi wrote:But Homebrew supports this approach and it works fine.
Why does that matter to us?
waruqi wrote:But Homebrew supports this approach and it works fine.
No one said that it's technically impossible. Someone just needs to code it and get it accepted upstream.
@waruqi
In my opinion a solution is not just possible, it also doesn't mess up the metadata (@Xyne see below). Here is your PKGBUILD, edited to use mirrors:
# Maintainer: <waruqi@gmail.com>
# PKGBUILD created by: lumpyzhu <lumpy.zhu@gmail.com>
pkgname='xmake'
pkgver='2.1.4'
pkgrel=1
pkgdesc='A make-like build utility based on Lua'
arch=('i686' 'x86_64')
url='https://github.com/tboox/xmake'
license=('Apache')

# You can add here as many mirrors as you want
_xmake_mirrors=(
    "https://github.com/tboox/xmake/archive/v${pkgver}.zip"
    "https://xxxxx.com/tboox/xmake/archive/v${pkgver}.zip"
)

# General function for scanning a list of mirrors
# Author: grufo <madmurphy333@gmail.com>
_get_mirror() {
    local -n url_list=$1
    local url_item
    for url_item in "${url_list[@]}"; do
        if curl --output /dev/null --silent --head --fail "${url_item}"; then
            echo -n "${url_item}"
            return 0
        fi
    done
    return 1
}

source=("${pkgname}.zip::$(_get_mirror _xmake_mirrors)")
sha256sums=('02000f3af2ab060107810d4b5b8d3864850aa49af669e82bee27a25d5e3bee21')

build() {
    cd "${srcdir}/${pkgname}-${pkgver}"
    make build
}

package() {
    cd "${srcdir}/${pkgname}-${pkgver}"
    mkdir -p "${pkgdir}/usr/share"
    cp -r "./xmake" "${pkgdir}/usr/share/"
    install -Dm755 ./core/src/demo/demo.b "${pkgdir}/usr/share/xmake/xmake"
    echo "#!/bin/bash
export XMAKE_PROGRAM_DIR=/usr/share/xmake
/usr/share/xmake/xmake \"\$@\"
" > ./xmake.sh
    install -Dm755 "./xmake.sh" "${pkgdir}/usr/bin/xmake"
}
I think this solution is flexible enough… For example, your PKGBUILD requires only one file, but in theory, within the ${source} array, a file hosted on several mirrors can coexist with other static files, as in the following code:
source=("${pkgname}.zip::$(_get_mirror _xmake_mirrors)"
        "${pkgname}.png"
        "${pkgname}.desktop")
Furthermore, imagine that two files require mirrors instead of one. With this solution all you need to do is nest two lists of mirrors into the ${source} array, et voilà:
_xmake_zip_mirrors=(
    "https://mirror1.com/${pkgname}.zip"
    "https://mirror2.com/${pkgname}.zip"
    "https://mirror3.com/${pkgname}.zip"
)
_xmake_png_mirrors=(
    "https://mirror1.com/${pkgname}.png"
    "https://mirror2.com/${pkgname}.png"
    "https://mirror3.com/${pkgname}.png"
)
source=("$(_get_mirror _xmake_zip_mirrors)"
        "$(_get_mirror _xmake_png_mirrors)"
        "${pkgname}.desktop")
@Xyne
Xyne wrote:Please don't do this. The sources would have to be moved from the sources array into the prepare or build functions. Checksumming and pre-emptive downloading would be disabled and PKGBUILD would provide incomplete metadata for parsers.
Currently I maintain a package, tor-browser (international PKGBUILD), which does not require mirrors but actually does something very similar to the example above (when searching for the local version of Tor Browser).
With this approach the metadata is generated after the list of mirrors has been scanned. Here is the .SRCINFO file generated by makepkg --printsrcinfo > .SRCINFO (using the PKGBUILD above):
pkgbase = xmake
	pkgdesc = A make-like build utility based on Lua
	pkgver = 2.1.4
	pkgrel = 1
	url = https://github.com/tboox/xmake
	arch = i686
	arch = x86_64
	license = Apache
	source = xmake.zip::https://github.com/tboox/xmake/archive/v2.1.4.zip
	sha256sums = 02000f3af2ab060107810d4b5b8d3864850aa49af669e82bee27a25d5e3bee21

pkgname = xmake
As you can see, the list of mirrors is not expanded here, and the .SRCINFO file contains only the first valid mirror found. But a single valid mirror is still a valid link, no less valid than choosing a single link manually.
--grufo
Last edited by grufo (2017-07-14 12:12:33)
@grufo Great, this is a good solution.
Thank you very much! : )
@waruqi
You are welcome.
Here is a complete PKGBUILD sample using the solution above:
# Maintainer: John Doe <john.doe@example.com>
pkgname='myapp'
pkgver='1.0.0'
pkgrel='1'
pkgdesc='My application'
arch=('i686' 'x86_64')
url="https://johndoe.github.io/myapp/"
license=('GPL')

# Package source mirrors. You can put here as many mirrors as you want.
_pkg_mirrors=("https://mirror1.com/${pkgname}.zip"
              "https://mirror2.com/${pkgname}.zip"
              "https://mirror3.com/${pkgname}.zip")

# Icon mirrors. You can put here as many mirrors as you want.
_icon_mirrors=("https://mirror1.com/${pkgname}.png"
               "https://mirror2.com/${pkgname}.png"
               "https://mirror3.com/${pkgname}.png")

# General function for scanning a list of mirrors
# Author: grufo <madmurphy333@gmail.com>
_get_mirror() {
    local -n url_list=$1
    local url_item
    for url_item in "${url_list[@]}"; do
        if curl --output /dev/null --silent --head --fail "${url_item}"; then
            echo -n "${url_item}"
            return 0
        fi
    done
    return 1
}

source=("$(_get_mirror _pkg_mirrors)"
        "$(_get_mirror _icon_mirrors)"
        "${pkgname}.desktop")
md5sums=('a757d38de4348440f6de36195586bcd6'
         '3497f57f6fafd5bfbb670232125aa40f'
         '4e163bc495d72561dea8a328e7ab4185')

prepare() {
    cd "${srcdir}/${pkgname}-${pkgver}"
    ./autogen.sh
    ./configure --prefix=/usr
}

build() {
    cd "${srcdir}/${pkgname}-${pkgver}"
    make
}

check() {
    cd "${srcdir}/${pkgname}-${pkgver}"
    make check
}

package() {
    cd "${srcdir}/${pkgname}-${pkgver}"
    make DESTDIR="${pkgdir}" install
}
Last edited by grufo (2017-07-14 15:20:14)
I find the pervasive use of md5sums somewhat depressing, but more so when dealing with sources that are potentially under the control of multiple independent entities. This makes it a lot easier to inject maliciously corrupted sources, even if the PKGBUILD maintainer did their due diligence and verified the authenticity of the source they used to generate md5sums.
Please consider using something stronger, like sha256sums, or alternatively convincing upstream to supply PGP signatures.
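For example, after verifying a freshly downloaded source, the checksum array can be regenerated with makepkg itself (makepkg -g emits the array type set by INTEGRITY_CHECK in makepkg.conf), or updated in place with updpkgsums from pacman-contrib:

```shell
# Print checksum arrays for the current PKGBUILD's sources:
makepkg -g
# Or rewrite the checksum array in the PKGBUILD in place:
updpkgsums
```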
Managing AUR repos The Right Way -- aurpublish (now a standalone tool)
@grufo Ok, I will try it. Thanks!
I also just implemented mirrors in Eclipse Reporting.