#3376 2020-05-25 01:39:30

salonkommunist
Member
From: Germany
Registered: 2020-02-18
Posts: 26

Re: Post your handy self made command line utilities

That was really helpful. Thank you guys!

Offline

#3377 2020-05-26 09:04:49

jasonwryan
Anarchist
From: .nz
Registered: 2009-05-09
Posts: 29,026
Website

Re: Post your handy self made command line utilities

Been a while...

#!/usr/bin/bash
# Update mail certs

target="$HOME/.local/certs"
PS3='Select a host: '
options=("dom1" "dom2" "dom3")

select opt in "${options[@]}"; do
  case "$opt" in
    dom1) domain="mail.dom1.com"
          name="one"
          break
          ;;
    dom2) domain="mail.dom2.com"
          name="two"
          break
          ;;
    dom3) domain="mail.dom3.com"
          name="three"
          break
          ;;
       *) printf "%s\n" "Invalid host"
          exit 1
          ;;
  esac
done

openssl s_client -connect "$domain":993 -showcerts 2>&1 < /dev/null |\
  awk '/BEGIN/,/END/{print}/END/{exit}' > "$target"/"$name".pem

# vim:set ts=2 sts=2 sw=2 et:

Arch + dwm   •   Mercurial repos  •   Surfraw

Registered Linux User #482438

Offline

#3378 2020-05-29 20:59:13

salonkommunist
Member
From: Germany
Registered: 2020-02-18
Posts: 26

Re: Post your handy self made command line utilities

#!/bin/bash
wd="~/.scripts/feedgen"
for i in $(cat $wd/urls) ; do
    title=$(curl -s -L $i 2> /dev/null | grep -m 1 title | awk -F \> '{print $2}' | awk -F \< '{print $1}')
    test -z "$title" && continue
    feed=$(echo $title | sed 's/ //g')
    if !(find $wd -type d | grep -q $feed) ; then
        mkdir -p $wd/$feed
        w3m $i > $wd/$feed/$feed.stale || exit 1
        echo "<?xml version=\"1.0\" encoding=\"UTF-8\" ?>" >> $wd/$feed/$feed.xml
        echo "<rss version=\"2.0\">" >> $wd/$feed/$feed.xml
        echo "" >> $wd/$feed/$feed.xml
        echo "<channel>" >> $wd/$feed/$feed.xml
        echo "    <title>"$title"</title>" >> $wd/$feed/$feed.xml
        echo "    <description>Generated by feedgen.sh</description>" >> $wd/$feed/$feed.xml
        echo "    <link>"$i"</link>" >> $wd/$feed/$feed.xml
        echo "    <pubDate>"$(date '+%a, %d %b %Y %H:%M:%S %z')"</pubDate>" >> $wd/$feed/$feed.xml
        echo "" >> $wd/$feed/$feed.xml
        echo "    <item>" >> $wd/$feed/$feed.xml
        echo "        <title>Now tracking "$title"</title>" >> $wd/$feed/$feed.xml
        echo "        <description>Now tracking "$title"</description>" >> $wd/$feed/$feed.xml
        echo "        <link>"$i"</link>" >> $wd/$feed/$feed.xml
        echo "        <pubDate>"$(date '+%a, %d %b %Y %H:%M:%S %z')"</pubDate>" >> $wd/$feed/$feed.xml
        echo "    </item>" >> $wd/$feed/$feed.xml
        echo "" >> $wd/$feed/$feed.xml
        echo "</channel>" >> $wd/$feed/$feed.xml
        echo "</rss>" >> $wd/$feed/$feed.xml
    else
        w3m $i > $wd/$feed/$feed.fresh || exit 1
        change=$(diff $wd/$feed/$feed.stale $wd/$feed/$feed.fresh)
        if ! test -z "$change" ; then
            head -n -2 $wd/$feed/$feed.xml > $wd/$feed/$feed.xml.tmp
            mv $wd/$feed/$feed.xml.tmp $wd/$feed/$feed.xml
            echo "    <item>" >> $wd/$feed/$feed.xml
            echo "        <title>"$title" has changed!</title>" >> $wd/$feed/$feed.xml
            echo "        <description>"$change"</description>" >> $wd/$feed/$feed.xml
            echo "        <link>"$i"</link>" >> $wd/$feed/$feed.xml
            echo "        <pubDate>"$(date '+%a, %d %b %Y %H:%M:%S %z')"</pubDate>" >> $wd/$feed/$feed.xml
            echo "    </item>" >> $wd/$feed/$feed.xml
            echo "" >> $wd/$feed/$feed.xml
            echo "</channel>" >> $wd/$feed/$feed.xml
            echo "</rss>" >> $wd/$feed/$feed.xml
        fi
        mv $wd/$feed/$feed.fresh $wd/$feed/$feed.stale
    fi
done

Offline

#3379 2020-06-01 03:38:54

eschwartz
Trusted User/Bug Wrangler
Registered: 2014-08-08
Posts: 3,536

Re: Post your handy self made command line utilities

for i in $(cat $wd/urls) ; do
    ...
done

Don't do this; for loops don't loop over lines, they loop over whitespace-separated words -- instead use:

while IFS= read -r i ; do
    ...
done < $wd/urls
wd="~/.scripts/feedgen"
for i in $(cat $wd/urls) ; do

The $wd variable contains the literal text ~/ and this will resolve to the directory name ./~/.scripts/feedgen, which I doubt you want. Don't quote home directory tilde expansions. Better yet, use $HOME, which is expanded properly even inside double quotes.
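
That is, something along these lines (a sketch, keeping the rest of the script unchanged):

wd="$HOME/.scripts/feedgen"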

    title=$(curl -s -L $i 2> /dev/null | grep -m 1 title | awk -F \> '{print $2}' | awk -F \< '{print $1}')

This is a lot of chained commands for something awk is not very good at. I would recommend sed here:

    title=$(curl -s -L $i 2> /dev/null | sed -nr '/title/{s/.*>(.*?)<.*/\1/p;q}')

But in truth, you should use a real xml parser, like xmllint or xmlstarlet.
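
For instance, a sketch of the same title extraction with xmllint (part of libxml2; --html makes it tolerant of ordinary web pages):

    title=$(curl -s -L "$i" 2> /dev/null | xmllint --html --xpath 'string(//title)' - 2> /dev/null)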


    if !(find $wd -type d | grep -q $feed) ; then

The !(...) is not needed here, simply use

    if ! find $wd -type d | grep -q $feed ; then

You don't need to run the find | grep pipeline inside an *additional* subshell, and it's not part of the grammar of the "if" keyword.

        change=$(diff $wd/$feed/$feed.stale $wd/$feed/$feed.fresh)
        if ! test -z "$change" ; then

Since you don't care about creating a patch file, but only wish to assert that the two files are different, consider using "cmp -s" instead of diff.
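
For example, a sketch of that check with cmp (assuming the diff output is no longer needed as the item description):

    # cmp -s exits 0 if the files are identical, non-zero otherwise
    if ! cmp -s "$wd/$feed/$feed.stale" "$wd/$feed/$feed.fresh" ; then
        printf '%s\n' "feed changed"   # append the new <item> here instead
    fi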

Last edited by eschwartz (2020-06-01 03:39:17)


Managing AUR repos The Right Way -- aurpublish (now a standalone tool)

Offline

#3380 2020-07-04 07:37:48

assertion9
Member
Registered: 2019-04-13
Posts: 19

Re: Post your handy self made command line utilities

This performs a full system backup with rsync, either automatically or interactively. It is invoked via a systemd timer in a non-interactive shell. I will be glad to receive any criticism, since this is one of the first shell scripts I have ever written that is longer than ten lines of code.

Style issues I have found but didn't know how to fix:
1. The script calculates the backup_samples count twice:
- at the start, to check whether any sample exists to decrypt and unpack
- in the function "remove_obsolete", after (and if) a new sample was successfully created
2. I don't know if this is common practice, but I declared the variable "latest" in the function "check_first_run" and used it in "remove_obsolete", which worked perfectly, even though in other languages such usage would be impossible since "latest" would be inaccessible to the later function "remove_obsolete" (see the sketch after this list).
3. Since the script is invoked from a non-interactive shell, in "check_first_run" I used a kind of popup terminal window to receive the user password for decrypting the latest gpg backup sample. Please let me know if I could do this in any better way.
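
A small sketch of what I mean in point 2 (as far as I understand it, sh variables are global by default unless marked local, which dash and bash support):

#!/bin/sh
# Sketch: shell variables are global by default; only variables marked
# "local" (not POSIX, but supported by dash and bash) are confined to
# the declaring function.
set_latest(){
    latest="newest.gpg"     # visible after the function returns
    local scratch="tmp"     # gone once set_latest returns
}
set_latest
echo "$latest"    # prints: newest.gpg
echo "$scratch"   # prints an empty line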

PS: Probably I should even cut out the whole part that checks for existing backup samples and skip unpacking the latest gpg file, and instead just keep a mirror of the whole system in "/var/tmp/data". That would drastically reduce backup time, since I have a lot of media files (audio/video/pictures), and moving them with rsync on every run and then compressing already-compressed video/audio formats, which does not reduce the total size of the backup sample, seems like a waste of energy.

#!/bin/sh

recepient="assertion9@gmail.com"

dir="/var/tmp/fsbackup"
archive="/var/tmp/fsbackup/backup.tar.zst"
gpg_file="${dir:?}/$(date '+%I-%M%p_%d_%b_%Y').gpg"

MAX_BACKUP_SAMPLES=2
backup_samples=$(find "$dir" -maxdepth 1 -name "*.gpg" | wc -l)

dirs_skip_list="$dir /dev/ /proc/ /sys/ /tmp/ /run/ /mnt/"

# Ask for confirmation; exits the whole script unless the answer starts with y/Y.
get_user_input(){
    message=$1
    printf "%s\n" "$message"
    read -r answer
    if [ "$answer" != "${answer#[Yy]}" ]; then
        return 1
    else
        exit
    fi
}

remove_obsolete(){
    backup_samples=$(find "$dir" -maxdepth 1 -name "*.gpg" | wc -l)
    [ "$backup_samples" -ge "$MAX_BACKUP_SAMPLES" ] && rm -f "${oldest:?}"
}

check_first_run(){
    if [ -d "$dir" ] && [ "$backup_samples" -gt 0 ]; then
        samples="$(find /var/tmp/fsbackup -maxdepth 1 -type f -printf '%T+ %p\n' | sort | awk '{printf $NF " "}' | sed 's/.$//')"
        latest=${samples##* }
        oldest=${samples%% *}
        urxvt -name popup -e dash -c "gpg --homedir=/home/assertion9/.local/share/gnupg -o $archive -d $latest"
        if [ -f "$archive" ];then
            tar --zstd -xf "$archive"
            rm "$archive"
        else
            su assertion9 -c 'dunstify -u critical -i luckybackup "Bad passphrase! Backup aborted!"'
            exit 1
        fi
    else
        mkdir "$dir"
        chmod 700 "$dir"
    fi
}

backup(){
    src="/"
    dest="/var/tmp/fsbackup/data"
    msg="The full system backup will be created at $dir as a file $gpg_file. Proceed? (y/n)"

    [ "$1" != "auto" ] && get_user_input "$msg"

    su assertion9 -c 'dunstify -u critical -i luckybackup "Starting system backup..."'

    check_first_run

    exclude="$(printf ' --exclude=%s' $dirs_skip_list)"

    rsync -aAX --delete $exclude "$src" "$dest"

    tar --zstd -cf "$archive" "$dest"

    gpg --homedir=/home/assertion9/.local/share/gnupg -o "$gpg_file" -e -r "$recepient" "$archive"

    rm -rf "$dest" "$archive"

    remove_obsolete

    su assertion9 -c 'dunstify -u critical -i luckybackup "System backup finished!"'
}

restore(){
    msg="System will restore data from '$dest' to '/'. Proceed? (y/n)"

    get_user_input "$msg"

    while :
    do
        printf "Enter absolute path of gpg file: "
        read -r gpg_file_path
        if [ -z "$gpg_file_path" ];then
            printf "(!) Empty input (!)\n"
        elif [ -d "$gpg_file_path" ];then
            printf "(!) Input is a directory, not a gpg file (!)\n"
        elif [ -f "$gpg_file_path" ];then
            case "$gpg_file_path" in
                *.gpg )
                    src="$gpg_file_path"
                    dest="/"
                    exclude="$(printf ' --exclude=%s' $dirs_skip_list)"
                    rsync -aAX --info=progress2 $exclude "$src" "$dest"
                    ;;
                *)
                    printf "(!) Input isn't a gpg file (!)\n"
                    return
                    ;;
            esac
        else
            printf "(!) Input isn't a gpg file (!)\n"
        fi
    done
}

case "$1" in
(auto)
    backup auto
    exit 0
    ;;
(backup)
    backup
    exit 0
    ;;
(restore)
    restore
    exit 0
    ;;
(*)
    echo "Usage: $0 {backup|restore|auto}"
    exit 2
    ;;
esac
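
For reference, the non-interactive run mentioned above is triggered by a timer/service pair roughly like the following (a sketch with hypothetical unit names and install path; adjust OnCalendar as needed):

# /etc/systemd/system/fsbackup.service (hypothetical name)
[Unit]
Description=Full system backup

[Service]
Type=oneshot
ExecStart=/usr/local/bin/fsbackup.sh auto   # hypothetical install path

# /etc/systemd/system/fsbackup.timer (hypothetical name)
[Unit]
Description=Run the full system backup daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target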

Last edited by assertion9 (2020-07-04 12:25:51)

Offline

#3381 2020-07-08 07:17:48

YesItsMe
Member
Registered: 2017-07-12
Posts: 37

Re: Post your handy self made command line utilities

If you have a cluttered $HOME (like I do), this automatic Git updater will at least keep your local Git stuff up to date:

#!/opt/schily/bin/bosh
# Update all Git repositories in/below $HOME.

# ---- VARIABLES

# Contains a list of (substrings of) directories to skip.
BLACKLIST="$HOME/go/ $HOME/Library"

# Contains a list of Git repository directories.
# Currently, this is collected automatically, but it can be a
# manual list as well.
GITDIRS=`find $HOME -maxdepth 3 -name '.?*' -prune -o -type d -call '[ -e "$1/.git" ]' {} \; -prune -print`

# ---- CODE

for DIR in ${GITDIRS}; do
	# Blacklist check:
	allowed=1
	for BAN in ${BLACKLIST}; do
		substrings=`expr "$DIR" : "$BAN"`
		if [ $substrings -ne 0 ] ; then
			# This directory is in/below a blacklisted one.
			allowed=0
			break
		fi
	done

	if [ $allowed -eq 0 ]; then
		continue
	fi

	# This directory is promised to be not blacklisted.
	# Process it.
	cd $DIR

	# Check the remote state:
	git fetch

	UPSTREAM=${1:-'@{u}'}
	LOCAL=`git rev-parse @`
	REMOTE=`git rev-parse "$UPSTREAM"`
	BASE=`git merge-base @ "$UPSTREAM"`

	if [ $LOCAL = $REMOTE ]; then
		# This repository is up-to-date.
		# Do nothing with it.
		continue
	elif [ $LOCAL = $BASE ]; then
		# The remote version is newer.
		echo "Updating Git repository: $DIR"
		git pull
	else
		# Something is different.
		echo "The Git repository in $DIR needs to be merged."
		echo "This cannot be done automatically."
	fi
done

Note that you’ll need to change the GITDIRS line if you’re not using the Schily tools; the -call test isn’t available in other find implementations.
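
With GNU find, for example, the same collection should work roughly like this (a sketch; GNU find expands {} even when it is embedded in an -exec argument):

GITDIRS=`find "$HOME" -maxdepth 3 -name '.?*' -prune -o -type d -exec test -e '{}/.git' \; -prune -print`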

Last edited by YesItsMe (2020-07-08 07:18:09)

Offline

#3382 2020-07-30 09:47:58

Ferdinand
Member
Registered: 2020-01-02
Posts: 8

Re: Post your handy self made command line utilities

Here's one I use to move my camera files out of a folder and into parallel folders named from the file dates.

The workflow being something like this:

  1. Copy files from the camera into a folder "Newstuff"

  2. cd to "Newstuff" and run the script

  3. The new files are then moved to date-named folders, parallel to "Newstuff"

It has lots of improvement potential; I don't think it handles files with spaces well, and it could use a nicer confirmation dialogue to avoid running it in the wrong folder (which can cause a real mess, yikes).

#!/bin/bash

group_by_date () {
    for file in `ls`; do
        filedate=`stat -c %y $file | cut -d ' ' -f 1`
        if [ -d ../$filedate ]; then
            mv $file ../$filedate
        else
            mkdir ../$filedate
            mv $file ../$filedate
        fi
    done
}

# Make sure I do this from the correct directory
echo "Do you want to group by date the files in $PWD?"
select yn in "Yes" "No"; do
    case $yn in
        Yes ) group_by_date; break;;
        No ) exit;;
    esac
done

Offline

#3383 2020-07-30 12:46:59

teckk
Member
Registered: 2013-02-21
Posts: 380

Re: Post your handy self made command line utilities

Another way without parsing ls, or using cut.

for file in *; do
    filedate=$(date +%F -r "$file")
    echo "$filedate"
done

Offline

#3384 2020-07-30 13:41:00

Trilby
Inspector Parrot
Registered: 2011-11-29
Posts: 23,953
Website

Re: Post your handy self made command line utilities

This will do everything in the group_by_date function:

stat -c "%y %n" * | while read date time offset file; do
   mkdir -p ../$date
   mv "$file" ../$date
done

No need to parse ls, and only one subshell and one stat process total, rather than one subshell per file and a stat process per file.

EDIT: changed $date to ../$date as this is intended to be a relative directory.

Last edited by Trilby (2020-07-30 13:45:57)


"UNIX is simple and coherent..." - Dennis Ritchie, "GNU's Not UNIX" -  Richard Stallman

Online

#3385 2020-08-10 05:35:13

monodromy
Member
Registered: 2014-02-08
Posts: 60

Re: Post your handy self made command line utilities

exiftool can also do this; for example:

exiftool '-Directory<DateTimeOriginal' -d %Y/%m/%d dir

Offline
