I've been thinking about possible ways to ease package maintenance and have come up with a couple of ideas; here's one:
To make keeping packages up to date easier, create a list of one or more mirrors for each package (already in the PKGBUILDs) and a script that updates a database of all packages available on those mirrors, then alerts the maintainer whenever a newer version of a package is found. To me this is a smart idea (the more automated the process, the better), and I've written a simple script using elinks that will dump the available links of any supported URL (it could be improved, but this works):
#!/bin/bash
#
# Copyright (c) 2009 Dylon Edwards <deltaecho@archlinux.us>

# Initialize some variables
#RESTRICTED=1 # Restricted recursive
DUMP=0        # Dump files
LINKS=0       # List links
CONSOLIDATE=0 # Consolidate the output to reflect
              #+ only the 1st occurrence of each file

# Create a temporary dump file
TMPFILE=$( mktemp "/tmp/lsweb.XXXXXXXX" )
# Print the help text
printUsage()
{
    printf "%-20s%s\n" "-h, --help" "print this help text."
    printf "%-20s%s\n" "-d, --dump" "dump a list of available files from the"
    printf "%20s%s\n" "" "specified URLs."
    printf "%-20s%s\n" "-l, --links" "list all the links within a URL."
    printf "%-20s%s\n" "-c, --consolidate" "print only the first occurrence of each file."
    printf "\n"
    printf "%s\n" "usage: ${0##*/} -dc http://elinks.or.cz/download/"

    # printf "%-20s%s\n" "-r, --restricted" "recursively follow all the links that"
    # printf "%20s%s\n" "" "have the same domain as the URL."
    # printf "%-20s%s\n" "-u, --unrestricted" "recursively follow all the links in"
    # printf "%20s%s\n" "" "the URL, regardless of their domain (BE CAREFUL)"
    # printf "%-20s%s\n" "--color" "display colorful links."
    # printf "%-20s%s\n" "-t, --tree" "list the links within a specified URL"
    # printf "%20s\n" "in a tree structure."
}
# Parse the input arguments
parseArgs()
{
    if [ -z "$1" ]; then
        printUsage
        exit 0
    fi

    while [ $# -gt 0 ]; do
        OPTIND=1

        # Collect leading non-option arguments as URLs
        for URL in "$@"; do
            case "${URL}" in
                -*)
                    break
                    ;;
                *)
                    URLS[urlindex++]="${URL}"
                    shift
                    ;;
            esac
        done

        while getopts ":hdlc-:" ARG ; do
            case $ARG in
                h)
                    printUsage
                    exit 0
                    ;;
                d)
                    DUMP=1
                    ;;
                l)
                    LINKS=1
                    ;;
                c)
                    CONSOLIDATE=1
                    ;;
                -)
                    case $OPTARG in
                        help)
                            printUsage
                            exit 0
                            ;;
                        dump)
                            DUMP=1
                            ;;
                        links)
                            LINKS=1
                            ;;
                        consolidate)
                            CONSOLIDATE=1
                            ;;
                        *)
                            echo "Unknown argument --${OPTARG}"
                            printUsage
                            exit 1
                            ;;
                    esac
                    ;;
                *)
                    echo "Unknown argument: -${ARG}"
                    printUsage
                    exit 1
                    ;;
            esac
        done

        shift $(( OPTIND - 1 ))
    done

    # Anything left over is also a URL
    for URL in "$@"; do
        URLS[urlindex++]="${URL}"
        shift
    done

    # Default to showing both links and files
    if (( ! LINKS && ! DUMP )); then
        LINKS=1
        DUMP=1
    fi
}
# Parse and print the output (in order of appearance)
parseAndPrint()
{
    # Print each URL in reverse order
    #+ so it can be sorted from right-to-left
    awk -F"/" '{
        for ( i = NF; i > 0; i-- )
            printf("%s ", $i)
        printf("\n")
    }' < <( echo "$@" ) | \
    # Sort the output, and keep only the
    #+ 1st occurrence of each file
    sort -k 1,1 -u | \
    # Reprint the reversed URL in its
    #+ correct order
    awk '{
        printf("%s", $NF)
        for ( i = NF-1; i >= 1; i-- )
            if ( i != ( NF - 1 ))
                printf( "/%s", $i )
            else
                printf( "//%s", $i )
        printf( "\n" )
    }' | \
    # Separate each file with a newline
    awk '{
        printf( "%-s\n", $1 )
    }'
}
# Show the available URL links
dumpLinks()
{
    parseAndPrint "$( awk '/\/$/' "${TMPFILE}" )"
}

# Show the available files
dumpFiles()
{
    parseAndPrint "$( awk '!/\/$/' "${TMPFILE}" )"
}
# Get all the links available
fetchLinks()
{
    if (( CONSOLIDATE )); then
        # Dump the content of each URL
        for URL in "${URLS[@]}"; do
            elinks -dump "${URL}" | awk '/^ +[0-9]+[.] [A-Za-z]+:\/\// {print $2}' >> "${TMPFILE}"
        done

        # Show the available links
        if (( LINKS )); then
            dumpLinks
        fi

        # Dump the available files
        if (( DUMP )); then
            dumpFiles
        fi
    else
        # Dump the contents of each URL
        for URL in "${URLS[@]}"; do
            echo "${URL} :"
            echo
            elinks -dump "${URL}" | awk '/^ +[0-9]+[.] [A-Za-z]+:\/\// {print $2}'
            echo
        done
    fi
}
# Parse the user's arguments
parseArgs "$@"

# Dump the links
fetchLinks

# Delete the temporary dump file
rm "${TMPFILE}"
The maintainer could even go so far as to create a script that would update the PKGBUILD automatically and place it in a specific directory for testing -- speeding up and easing the process even further.
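The "alert the maintainer" step could be sketched in a few lines: compare the package's pkgver against the newest version named in a list of upstream files. (All names here are hypothetical, and it assumes tarballs follow the usual name-version.tar.gz pattern, which of course not every upstream does.)

```shell
#!/bin/bash
# Sketch: report when a newer version of a package shows up in
# a list of upstream file names (e.g. the output of lsweb).
# Assumes tarballs are named like pkgname-1.2.3.tar.gz.

latest_version() {
    # $1 = pkgname; reads candidate file names on stdin and
    # prints the highest version number found
    sed -n "s/^$1-\([0-9][0-9.]*\)\.tar\.gz$/\1/p" | sort -V | tail -n 1
}

pkgname=foo
pkgver=1.2.0   # would normally be read from the PKGBUILD

newest=$(printf '%s\n' foo-1.1.0.tar.gz foo-1.2.1.tar.gz bar-2.0.tar.gz \
    | latest_version "$pkgname")

if [ -n "$newest" ] && [ "$newest" != "$pkgver" ]; then
    echo "update available: $pkgname $pkgver -> $newest"
fi
```

Versions are ordered with `sort -V` rather than plain `sort` so that 1.10 sorts after 1.9.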
What do y'all think?
(02/15/2009) Updated the script
(02/15/2009) I've just had an idea for the script, let me know what you think and I'll post the updated script shortly.
(02/15/2009 @ 15:07) Getting there; I've now upgraded the little function to a script that will sort each file in every URL you give it, and print the 1st occurrence of each one, in alphabetical order -- still kinda' rough around the edges though.
(02/15/2009 @ 18:32) Cool beans! It's now workable, but not yet complete. As you can see, I have a few more features to implement, but I think I'm going to stop here tonight...
Last edited by deltaecho (2009-02-15 23:34:30)
Dylon
Offline
This is (sort of) what Frugalware does. I bet that command you wrote doesn't work for all packages -- actually, I bet it doesn't work for most of them.
I don't like this because it's not guaranteed to work forever, and maintaining such scripts is a waste of time.
Furthermore, many packages just announce new releases on mailing lists, SourceForge, Freshmeat, etc.
Last edited by dolby (2009-02-15 18:04:49)
There shouldn't be any reason to learn more editor types than emacs or vi -- mg (1)
You learn that sarcasm does not often work well in international forums. That is why we avoid it. -- ewaller (arch linux forum moderator)
Offline
dolby wrote:This is (sort of) what Frugalware does. I bet that command you wrote doesn't work for all packages -- actually, I bet it doesn't work for most of them.
I don't like this because it's not guaranteed to work forever, and maintaining such scripts is a waste of time. Furthermore, many packages just announce new releases on mailing lists, SourceForge, Freshmeat, etc.
From the elinks man page:
...
ELinks can handle both local files and remote URLs. The main supported remote URL protocols are HTTP, HTTPS (with SSL support compiled in) and FTP. Additional protocol support exists for BitTorrent, finger, Gopher, SMB and NNTP.
...
It should work for most files listed in the source=() array of PKGBUILDs -- most of the unsupported protocols would be svn://, bzr://, git://, etc., which, due to their developmental nature, would be impractical to keep tabs on anyway.
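To feed URLs to such a script, the source=() entries can be pulled straight out of a PKGBUILD. A rough sketch (using a throwaway example PKGBUILD; note that sourcing should only ever be done on trusted files, since a PKGBUILD is executable shell):

```shell
#!/bin/bash
# Sketch: print the remote entries of a PKGBUILD's source=() array,
# so they can be fed to a link checker such as lsweb.
# A throwaway PKGBUILD is written here purely for illustration.

cat > /tmp/PKGBUILD.example <<'EOF'
pkgname=foo
pkgver=1.2.0
source=("http://example.com/releases/foo-1.2.0.tar.gz"
        "foo.install")
EOF

# Source the PKGBUILD in a subshell so its variables don't leak
( . /tmp/PKGBUILD.example
  for src in "${source[@]}"; do
      # Keep only remote URLs; local files have no "://"
      case $src in
          *://*) printf '%s\n' "$src" ;;
      esac
  done )
```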
Thanks for your input, though!
Dylon
Offline
I didn't mean elinks -- I know elinks can do that. I meant your script. There is no one script to fit them all; you've got to maintain scripts too.
There shouldn't be any reason to learn more editor types than emacs or vi -- mg (1)
You learn that sarcasm does not often work well in international forums. That is why we avoid it. -- ewaller (arch linux forum moderator)
Offline
dolby wrote:This is (sort of) what Frugalware does. I bet that command you wrote doesn't work for all packages -- actually, I bet it doesn't work for most of them.
I don't like this because it's not guaranteed to work forever, and maintaining such scripts is a waste of time. Furthermore, many packages just announce new releases on mailing lists, SourceForge, Freshmeat, etc.
On one hand, it is time-consuming to fix a script, and even more annoying to find out that it broke for some stupid reason.
On the other hand, we'd have to consider how much we'd gain from it. I'm not sure whether it's worth it, since I'm not a package maintainer, but I bet someone like Jan de Groot, who maintains almost a thousand packages, might find it handy.
The point with the mailing lists, Freshmeat, SourceForge and the like is that unfortunately it just sucks to assemble all this information along with all the crap you don't actually want. Some projects have a low-traffic, release-only mailing list, but unfortunately not all of them do.
hm,
anyway, cheers
Barde
Offline
There is a GUI package in the AUR that monitors web pages for changes. Can not remember what it was called... I used to use it but now I just let users flag my packages out of date.
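The core of what such a page monitor does can be sketched in a few lines of shell: hash the fetched page and compare with the hash stored on the previous run. Fetching is stubbed out with plain strings here; a real version would use something like `elinks -dump` or `curl -s` (my assumption, not from the post).

```shell
#!/bin/bash
# Sketch: detect changes to a web page by comparing SHA-1 hashes
# of its content between runs. The page content is passed in as
# a string for illustration instead of being fetched.

state=/tmp/pagewatch.sha1

page_hash() {
    # Hash arbitrary content; stable across runs
    printf '%s' "$1" | sha1sum | awk '{print $1}'
}

check_page() {
    # $1 = page content (stubbed; would normally be fetched)
    local new old
    new=$(page_hash "$1")
    old=$(cat "$state" 2>/dev/null)
    printf '%s\n' "$new" > "$state"
    if [ -n "$old" ] && [ "$old" != "$new" ]; then
        echo "changed"
    else
        echo "unchanged"
    fi
}

rm -f "$state"
check_page "release 1.0"   # first run: unchanged (nothing stored yet)
check_page "release 1.0"   # same content: unchanged
check_page "release 1.1"   # content changed: changed
```

Run from cron against a project's download page, the "changed" branch would be the place to mail the maintainer.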
Offline
There is a GUI package in the AUR that monitors web pages for changes. Can not remember what it was called... I used to use it but now I just let users flag my packages out of date.
aurgtk?
Offline
There is a GUI package in the AUR that monitors web pages for changes.
I, incidentally, made such a program a year or two ago. I remade it a couple of days ago and now it actually works (I didn't know what I was doing back then). I'll post it to arch contributions and the AUR tomorrow when I'm not half asleep.
KISS = "It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience." - Albert Einstein
Offline
Allan wrote:There is a GUI package in the AUR that monitors web pages for changes. Can not remember what it was called... I used to use it but now I just let users flag my packages out of date.
aurgtk?
No, a general web page monitor.
Offline
skottish wrote:Allan wrote:There is a GUI package in the AUR that monitors web pages for changes. Can not remember what it was called... I used to use it but now I just let users flag my packages out of date.
aurgtk?
No, a general web page monitor.
I updated the script again!
Dylon
Offline
skottish wrote:Allan wrote:There is a GUI package in the AUR that monitors web pages for changes. Can not remember what it was called... I used to use it but now I just let users flag my packages out of date.
aurgtk?
No, a general web page monitor.
There is an add-on for Firefox, "Update Scanner".
Offline
...
there is an addon for firefox "Update Scanner"
Good stuff, but not quite what I'm looking for.
I had actually forgotten about Specto; it may be a better solution for my original post than my proposed script. I'll check it out and see if I like it better, and if so, I'll develop some sort of system using it. For a long time, though, I've wanted to create a script that would list the contents of web pages, so I still want to complete "lsweb".
...in the meantime, check out my script and let me know what you think of it (is it actually useful, do you hate it, what could be improved, and how would you complete the ToDo features that are commented out)!
Last edited by deltaecho (2009-02-15 23:49:15)
Dylon
Offline
We have a winner! That is what I used to use to monitor for updates.
Offline