
#1651 2011-12-20 18:42:02

egan
Member
From: Mountain View, CA
Registered: 2009-08-17
Posts: 273

Re: Post your handy self made command line utilities

Two frontends to gs to concatenate pdfs and excise pages therefrom, respectively.

#!/bin/bash

##
# catpdf        -- concatenate pdfs together
#
# usage         -- catpdf INFILES OUTFILE
#
# notes         -- requires ghostscript and userbool.sh
#
# written       -- 6 June, 2011 by Egan McComb
#
# revised       -- 19 December, 2011 by author
##

# error exit codes
ERR_NARGS=2
ERR_VARGS=3

usage()
{
        echo "Usage: $(basename "$0") INFILES OUTFILE" >&2
}

writepdf()
{
        command gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite "$@"
}

chkargs()
{
        if (( $# < 3 ))
        then
                echo "Error: Too few arguments" >&2
                usage
                exit $ERR_NARGS
        elif [[ -f "${!#}" ]]
        then
                echo "Warning: Output file exists" >&2
                echo -n "Continue? [y/N] " >&2
                read response
                if ! userbool.sh $response
                then
                        echo "Aborting..." >&2
                        exit 1
                else
                        echo "Continuing..." >&2
                fi
        fi
        for file in "${@:1:$(($#-1))}"
        do
                if [[ ! -e "$file" ]]
                then
                        echo "Error: Invalid file '$file'" >&2
                        exit $ERR_VARGS
                fi
        done
}

##----MAIN----##
chkargs "$@"
writepdf  -sOutputFile="${!#}" "${@:1:$(($#-1))}" || { echo "Errors occurred!" >&2; exit 1; }
exit 0
#!/bin/bash

##
# excpdf	-- remove pages from pdfs with ghostscript
#
# usage		-- excpdf PAGERANGE INFILE OUTFILE
#			-PAGERANGE is given with ranges
#			 e.g. 3-5:7:9-15 keeps those pages
#			-Pages must be in numerical order
#
# notes		-- requires catpdf
#
# written	-- 19 December, 2011 by Egan McComb
#
# revised	--
##

# error exit codes
ERR_NARGS=2
ERR_VARGS=3

usage()
{
	echo "Usage: $(basename "$0") PAGERANGE INFILE OUTFILE" >&2
	echo -e "\t-PAGERANGE is given with ranges" >&2
	echo -e "\t e.g. 3-5:7:9-15 keeps those pages" >&2
	echo -e "\t-Pages must be in numerical order" >&2
}

trim()
{
	tr ":\-," "\n"
}

writepdf()
{
	gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -dFirstPage=${subrange[0]} -dLastPage=${subrange[-1]} "$@"
}

chkargs()
{
	if (( $# != 3 ))
	then
		echo "Error: Wrong number of arguments" >&2
		usage
		exit $ERR_NARGS
	fi
	chkfile "$2" && in="$2"
	out="$3"
	if grep -q '[^[:digit:]:-]' <<< "$1"
	then
		echo "Error: Invalid page range syntax" >&2
		usage
		exit $ERR_VARGS
	elif ! trim <<< $1 | sort -nC || [[ ! -z "$(trim <<< $1 | uniq -d)" ]]
	then
		echo "Error: Invalid page range collation" >&2
		usage
		exit $ERR_VARGS
	fi
}

chkfile()
{
	if [[ ! -f "$1" ]] || ! grep -q "PDF" <<< $(file "$1")
	then
		echo "Error: Invalid input file '$1'" >&2
		exit $ERR_VARGS
	fi
}

range()
{
	IFS=":"
	ranges=($@)
	tfiles=()
	for range in ${ranges[@]}
	do
		if (( $(awk 'BEGIN { FS="-"} ; { print NF }' <<< $range) > 2 ))
		then
			echo "Error: Invalid subrange '$range'" >&2
			usage
			exit $ERR_VARGS
		fi
		IFS="-"
		subrange=(${range[@]})
		tfiles[${#tfiles[@]}]=$(mktemp)
		IFS=" "
		writepdf -sOutputFile="${tfiles[-1]}" "$in" || { echo "Errors occurred!" >&2; exit 1; }
	done
	catpdf ${tfiles[@]} "$out"
	rm ${tfiles[@]}
}

##----MAIN----##
chkargs "$@"
range $1

exit 0
#!/bin/bash

##
# userbool.sh	-- parse boolean user input
#
# usage		-- userbool INPUT
#
# notes		-- designed for use in other scripts
#
# todo		-- add safeguards
#
# written	-- 29 December, 2011 by Egan McComb
#
# revised	--
##

##----MAIN----##
if [[ $1 = "Y" ]] || [[ $1 = "y" ]]
then
	exit 0
else
	exit 1
fi
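
For reference, typical invocations look like this (the file names are just examples):

catpdf chapter1.pdf chapter2.pdf book.pdf
excpdf 3-5:7:9-15 input.pdf trimmed.pdf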

Last edited by egan (2012-01-05 23:45:02)


#1652 2011-12-20 20:18:13

anstmich
Member
Registered: 2009-06-20
Posts: 29

Re: Post your handy self made command line utilities

Hey all,

Recently, I worked on a project that required me to convert Microsoft Word files (OpenXML .docx) to another usable format (e.g. HTML).  Although OpenOffice can do this quite well, it did not work well in my particular situation.  Instead, I decided to write my own script to take care of the conversion.  I thought I would share it in case anyone else ever needs it!

Written in Python, the script takes two arguments: (1) the input path and (2) the output path.  The script depends on the lxml and zipfile modules and is partially inspired by the docx module.

Currently, the script dumps text to HTML, maintaining only vertical spacing (i.e. line breaks).  I may add additional features later.

Here it is:

import zipfile
from lxml import etree
import sys

if(len(sys.argv) < 3):
	print("Usage: python [INPUT PATH.docx] [OUTPUT PATH.html]")

else: 	
	fp_in  = sys.argv[1]
	fp_out = sys.argv[2]
	
	# A .docx file is really just a zip file -- load an unpack it
	docx = zipfile.ZipFile(fp_in)
	xml  = docx.read('word/document.xml')
	
	# pass the raw xml content to lxml for parsing
	# (ZipFile.read() returns bytes, which etree.fromstring() accepts directly)
	document = etree.fromstring(xml)
	

	html_out = '<HTML>\n<HEAD><TITLE></TITLE></HEAD>\n<BODY>\n'
	tag = ''
	
	# dump the document text to an html file, preserving basing formatting
	for element in document.iter():
		
		# grab text and linebreaks
		tag = element.tag
		if(tag[ tag.find('}')+1 : len(tag) ] == 't'):
			# text nodes can be empty, so guard against None
			html_out += element.text or ''
		elif(tag[ tag.find('}')+1 : len(tag) ] == 'br'):
			html_out += '<br>\n'
	
	html_out += '</BODY>\n</HTML>'
	
	fout = open(fp_out, 'w')
	fout.write(html_out)
	fout.close()

I wrote this pretty quickly, so only minimal error checking is done.  Any feedback or modifications are appreciated!
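
If you want to try it, the invocation is simply this (assuming you saved the script as docx2html.py; file names are just examples):

python docx2html.py report.docx report.html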


#1653 2011-12-20 20:52:50

ninian
Member
From: United Kingdom
Registered: 2008-02-24
Posts: 726
Website

Re: Post your handy self made command line utilities

egan wrote:

Two frontends to gs to concatenate pdfs and excise pages therefrom, respectively.

Very useful, thanks
smile


#1654 2011-12-21 01:08:02

fsckd
Forum Fellow
Registered: 2009-06-15
Posts: 4,173

Re: Post your handy self made command line utilities

Nice tool. Do you want me to

  1. merge into Post your handy self made command line utilities

  2. move to Community Contributions

?

If you go with the latter, you may want to make a PKGBUILD and upload it to the AUR.


aur S & M :: forum rules :: Community Ethos
Resources for Women, POC, LGBT*, and allies


#1655 2011-12-21 06:54:34

anstmich
Member
Registered: 2009-06-20
Posts: 29

Re: Post your handy self made command line utilities

fsckd wrote:

Nice tool. Do you want me to

  1. merge into Post your handy self made command line utilities

  2. move to Community Contributions

?

If you go with the latter, you may want to make a PKGBUILD and upload it to the AUR.

Oh, I did not know about the command line utilities thread!  That would be the most logical thread with which to merge it, I think.


#1656 2011-12-23 02:37:11

ewaller
Administrator
From: Pasadena, CA
Registered: 2009-07-13
Posts: 19,739

Re: Post your handy self made command line utilities

anstmich wrote:

[...] That would be the most logical thread with which to merge it, I think.

Done


Nothing is too wonderful to be true, if it be consistent with the laws of nature -- Michael Faraday
Sometimes it is the people no one can imagine anything of who do the things no one can imagine. -- Alan Turing
---
How to Ask Questions the Smart Way


#1657 2011-12-27 14:52:29

ellis
Member
From: Birmingham, UK
Registered: 2011-10-20
Posts: 21

Re: Post your handy self made command line utilities

Here's (yet another) script for rsync-centric snapshot backups. It started life as a simple way of keeping track of my 3.5TB+ of data; it's now used in various configurations on all my computers and on my father's work servers, and it's 100% Mac compatible.

#!/bin/bash

### ~/bash/Arch-backup.sh - Backup script inspired by Apple TimeMachine.
### 19:48 Dec 4, 2011
### Chris Cummins

# -----------------------------------------------------------------------
# CONFIG
# -----------------------------------------------------------------------
# LOCATION: Location to back up. [/filepath]
# BACKUP: Backup directory. [/filepath]
# SNAPSHOTS_LIMIT: Number of snapshots to store.
# EXCLUDE_LOC: Location of rsync excludes file.
# USER: Your username.
# FILE_MGR: File manager used to open the snapshot directory.
#
LOCATION="/"
BACKUP="/mnt/data/backups"
SNAPSHOTS_LIMIT=4
EXCLUDE_LOC="/mnt/data/backups/Exclude List"
USER=ellis
FILE_MGR=thunar


## rsync performance tuning.
# VERBOSE: Set to "yes" for verbose output.
# PROGRESS: Set to "yes" for rsync progress.
# LOW_PRIORITY: Run the process with low priority [yes/no].
#
VERBOSE="no"
PROGRESS="no"
LOW_PRIORITY="no"

## File handling.
# ID_CODE: Sets the snapshot ID as found in 'by-id/'
# DATE_CODE: Sets the snapshot date name as found in 'by-date/'
#
ID_CODE="%Y-%m-%d_%H%M%S"
DATE_CODE="%a %d %b, %T"

## Exit codes.
# EXIT_NOROOT: No root privileges.
# EXIT_NODIR: Unable to create required directory.
# EXIT_NOEXEC: Specified excludes list missing.
# EXIT_RSYNC: rsync transfer failed.
# EXIT_EXISTING: Backup with identical tag already exists.
#
EXIT_NOROOT=87
EXIT_NODIR=88
EXIT_NOEXEC=89
EXIT_EXISTING=90
EXIT_RSYNC=5

# -----------------------------------------------------------------------
# STAGE 1
# -----------------------------------------------------------------------
# Performs program admin, sets up directories, sets variables.
#
if [ "$UID" != 0 ]
    then
    echo "Arch-backup: [Stage 1] Must be ran as root!"
    exit $EXIT_NOROOT
fi

if [ ! -f "$EXCLUDE_LOC" ]
then
    echo "Arch-backup: [Stage 1] Excludes list missing!"
    echo "                       '$EXCLUDE_LOC'"
    exit $EXIT_NOEXEC
else
    echo "Arch-backup: [Stage 1] Using exclude list '$EXCLUDE_LOC'"
    RSYNC_EXC="--exclude-from=$EXCLUDE_LOC"
fi

if [ ! -d $BACKUP ]
then
    echo "Arch-backup: [Stage 1] Creating directory:"
    echo "                       '$BACKUP'"
    mkdir -p $BACKUP
    if (( $? ))
    then
        echo "Arch-backup: [Stage 1] Unable to make required directory!"
        exit $EXIT_NODIR
    fi
fi

if [ ! -d "$BACKUP/by-id" ]
then
    echo "Arch-backup: [Stage 1] Creating directory:"
    echo "                       '$BACKUP/by-id'"
    mkdir $BACKUP/by-id
    if (( $? ))
    then
        echo "Arch-backup: [Stage 1] Unable to make required directory!"
        exit $EXIT_NODIR
    fi
fi

if [ ! -d "$BACKUP/by-date" ]
then
    echo "Arch-backup: [Stage 1] Creating directory:"
    echo "                       '$BACKUP/by-date'"
    mkdir $BACKUP/by-date
    if (( $? ))
    then
        echo "Arch-backup: [Stage 1] Unable to make required directory!"
        exit $EXIT_NODIR
    fi
fi

if [ -f "$BACKUP/by-id/.DS_Store" ]
then
    echo "Arch-backup: [Stage 1] Removing Desktop Services Store..."
    rm "$BACKUP/by-id/.DS-Store"
fi


# BY_ID: Snapshot directory for by-id/
# BY_DATE: Snapshot directory for by-date/
# NO_OF_SNAPSHOTS: Current number of snapshots in by-id/
#                  Based on item count of by-id/
# OLDEST_SNAPSHOT: Oldest item in by-id/ by Modified time.
# NEWEST_SNAPSHOT: Newest item in by-id/ by Modified time.
#
BY_ID=$(date +"$ID_CODE")
BY_DATE=$(date +"$DATE_CODE")
NO_OF_SNAPSHOTS=$(ls -1 "$BACKUP/by-id" | wc -l)
OLDEST_SNAPSHOT=$(ls -t "$BACKUP/by-id" | tail -1)
NEWEST_SNAPSHOT=$(ls -t1 "$BACKUP/by-id" | head -n1)

echo "Arch-backup: Number of backups [ $NO_OF_SNAPSHOTS / $SNAPSHOTS_LIMIT ]"

if [ -d "$BACKUP/by-id/$BY_ID" ]
then
    echo "Arch-backup: [Stage 1] Directory with ID already exists!"
    echo "                       '$BACKUP/by-id/$BY_ID'"
    exit $EXIT_EXISTING
fi

if [ $NO_OF_SNAPSHOTS -ge $SNAPSHOTS_LIMIT ]
then
    echo "Arch-backup: [Stage 1] Snapshot Limit ($SNAPSHOTS_LIMIT) reached, removing:"
    echo "                       '$OLDEST_SNAPSHOT'"
    rm -rf "$BACKUP/by-id/$OLDEST_SNAPSHOT"
    echo "Arch-backup: [Stage 1] Removing broken symlinks..."
    find -L "$BACKUP/by-date" -type l -exec rm {} + 2>/dev/null
fi

if [ -d "$BACKUP/by-id/$NEWEST_SNAPSHOT" ]
then
    echo "Arch-backup: [Stage 1] Using link destination:"
    echo "                       '$NEWEST_SNAPSHOT'"
    RSYNC_LINK="--link-dest=$BACKUP/by-id/$NEWEST_SNAPSHOT" 
fi

echo "Arch-backup: Stage 1 complete, moving onto Stage 2..."

# -----------------------------------------------------------------------
# STAGE 2
# -----------------------------------------------------------------------
# rsync of location with newest snapshot.
#
if [ $VERBOSE == "yes" ]
then
    echo "Arch-backup: [Stage 2] Setting rsync '-v' flag..."
    RSYNC_V="-v"
fi

if [ $PROGRESS == "yes" ]
then
    echo "Arch-backup: [Stage 2] Setting rsync '--progress' flag..."
    RSYNC_P="--progress"
fi

if [ $LOW_PRIORITY == "yes" ]
then
    echo "Arch-backup: [Stage 2] Setting low program priority..."
    ionice -c 3 -p $$
    renice +12  -p $$
fi

echo "Arch-backup: [Stage 2] Beginning rsync of '$LOCATION'..."
time rsync \
    --delete \
    --delete-excluded \
    --archive \
    --human-readable \
    $RSYNC_V \
    $RSYNC_P \
    "$RSYNC_EXC" \
    "$RSYNC_LINK" \
	"$LOCATION/" "$BACKUP/In Progress..."
if (( $? ))
then
    echo "Arch-backup: [Stage 2] rsync failed!"
    #Cleanup failed attempt?
    exit $EXIT_RSYNC
fi

echo "Arch-backup: Stage 2 complete, moving onto Stage 3..."

# -----------------------------------------------------------------------
# STAGE 3
# -----------------------------------------------------------------------
# Clean up new snapshot.
#
echo "Arch-backup: [Stage 3] Assigning backup ID..."
mv "$BACKUP/In Progress..." "$BACKUP/by-id/$BY_ID"

echo "Arch-backup: [Stage 3] Touching snapshot..."
touch "$BACKUP/by-id/$BY_ID"

echo "Arch-backup: [Stage 3] Creating date symlink..."
ln -s "$BACKUP/by-id/$BY_ID" "$BACKUP/by-date/$BY_DATE"

echo "Arch-backup: [Stage 3] Creating 'Most Recent' symlink..."
rm "$BACKUP/Most Recent Backup"
ln -s "$BACKUP/by-id/$BY_ID" "$BACKUP/Most Recent Backup"

echo "Arch-backup: Backup complete: '$BACKUP/by-id/$BY_ID'"
cd "$BACKUP/by-date/$BY_DATE"
su $USER -c "$FILE_MGR" &
exit 0

Typical excludes file for a / backup:

### /mnt/data/backups/Exclude List - rsync exclude list for filesystem backups.
### 19:46 Dec 26, 2011
### Chris Cummins

# -----------------------------------------------------------------------
# INCLUDES
# -----------------------------------------------------------------------
#
+ /dev/console
+ /dev/initctl
+ /dev/null
+ /dev/zero

# -----------------------------------------------------------------------
# EXCLUDES
# -----------------------------------------------------------------------
# Files and directories to exclude from backups.
#

# Backup point.
- /mnt/data/*

# System directories.
- /dev/*
- /proc/*
- /sys/*
- /tmp/*
- lost+found/
- /var/lib/pacman/sync/*

# Removable devices.
- /media/*

# Config files, virtual filesystems, caches etc.
- /home/*/.gvfs
- /home/*/.mozilla
- /home/*/.netbeans

# User files.
- /home/*/Desktop
- /home/*/Downloads
- /home/*/Dropbox
- /home/*/Music
- /home/*/Pictures
- /home/*/Video

Sample output:

Arch-backup: [Stage 1] Using exclude list '/mnt/data/backups/Exclude List'
Arch-backup: Number of backups [ 4 / 4 ]
Arch-backup: [Stage 1] Snapshot Limit (4) reached, removing:
                       '2011-12-27_144804'
Arch-backup: [Stage 1] Removing broken symlinks...
Arch-backup: [Stage 1] Using link destination:
                       '2011-12-27_144849'
Arch-backup: Stage 1 complete, moving onto Stage 2...
Arch-backup: [Stage 2] Beginning rsync of '/home/ellis/backup'...

real    0m0.056s
user    0m0.007s
sys     0m0.003s
Arch-backup: Stage 2 complete, moving onto Stage 3...
Arch-backup: [Stage 3] Assigning backup ID...
Arch-backup: [Stage 3] Touching snapshot...
Arch-backup: [Stage 3] Creating date symlink...
Arch-backup: [Stage 3] Creating 'Most Recent' symlink...
Arch-backup: Backup complete: '/mnt/data/backups/by-id/2011-12-27_144850'
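
If you want it to run unattended, a root crontab entry along these lines works (the path and schedule are only examples):

0 3 * * * /root/bash/Arch-backup.sh >> /var/log/arch-backup.log 2>&1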

Regards


"Paradoxically, learning stuff is information-society enemy number one"


#1658 2011-12-28 21:49:57

Trilby
Inspector Parrot
Registered: 2011-11-29
Posts: 29,442
Website

Re: Post your handy self made command line utilities

Most of my life is spent in terminal apps (mutt, vim, R, and bash of course).  I was a bit tired of having a dozen identical and unhelpful icons in my app-switcher and tint2 panel.  I'm sure there are other ways of achieving what I did below, but the simplicity and flexibility of this handful of bashrc lines have worked wonderfully for me.  Note that this requires xseticon from the AUR, and as written assumes the relevant image files are in /usr/share/pixmaps.  Mine were just downloaded from a Google image search.

##bashrc excerpts

# set arch icon as default for terminal
xseticon -id "$WINDOWID" /usr/share/pixmaps/arch.png

# update the window title to $PWD while under bash
PROMPT_COMMAND='echo -e "\033]0;$PWD\007"'

# window naming function
wname() { echo -en "\033]0;$@\007"; }

# aliases to set icons - the real fun:
alias mutt='xseticon -id "$WINDOWID" /usr/share/pixmaps/mutt.png; wname mutt; mutt; xseticon -id "$WINDOWID" /usr/share/pixmaps/arch.png'
alias r='xseticon -id "$WINDOWID" /usr/share/pixmaps/r.png; wname R; R --quiet; xseticon -id "$WINDOWID" /usr/share/pixmaps/arch.png'
icon_vim() {
    xseticon -id "$WINDOWID" /usr/share/pixmaps/vim.png
    wname "vim $@"; vim "$@"
    xseticon -id "$WINDOWID" /usr/share/pixmaps/arch.png
}
alias vim='icon_vim '

Thanks to the writers/maintainers of xseticon.  It's quite handy.

Last edited by Trilby (2011-12-28 21:52:02)


"UNIX is simple and coherent..." - Dennis Ritchie, "GNU's Not UNIX" -  Richard Stallman


#1659 2012-01-02 21:23:15

xr4y
Member
Registered: 2011-05-06
Posts: 33

Re: Post your handy self made command line utilities

Here is the first bash script that I wrote and one that I still use daily; I call it "actiontime". I did not know about "cron" or the "at" command at the time I wrote it.

#!/bin/bash

while [ "$(date +%R)" != "$1" ]; do
    sleep 1
done

usage: actiontime HH:MM && next_script (&)
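
For comparison, "at" can do the same job if atd is running (the time and command here are just examples):

echo "next_script" | at 14:30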

Last edited by xr4y (2012-01-02 21:25:04)


#1660 2012-01-05 21:44:28

trontonic
Member
Registered: 2008-07-21
Posts: 80

Re: Post your handy self made command line utilities

I use this little shellscript to update md5sums and sha256sums in PKGBUILDS (requires setconf):

#!/bin/sh
setconf PKGBUILD `makepkg -g 2>/dev/null | head -1 | cut -d"=" -f1` "`makepkg -g 2>/dev/null | cut -d"=" -f2`" ')'

I'm sure it can be improved with awk or something, to just run "makepkg -g" once.
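
A minimal variant that stores the makepkg -g output in a variable so it only runs once (the setconf call itself is unchanged, just refactored):

#!/bin/sh
sums=$(makepkg -g 2>/dev/null)
setconf PKGBUILD "$(printf '%s\n' "$sums" | head -1 | cut -d= -f1)" "$(printf '%s\n' "$sums" | cut -d= -f2)" ')'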


#1661 2012-01-05 21:54:25

karol
Archivist
Registered: 2009-05-06
Posts: 25,440

Re: Post your handy self made command line utilities

trontonic wrote:

I use this little shellscript to update md5sums and sha256sums in PKGBUILDS (requires setconf):

#!/bin/sh
setconf PKGBUILD `makepkg -g 2>/dev/null | head -1 | cut -d"=" -f1` "`makepkg -g 2>/dev/null | cut -d"=" -f2`" ')'

I'm sure it can be improved with awk or something, to just run "makepkg -g" once.

https://bbs.archlinux.org/viewtopic.php … 7#p1026767


#1662 2012-01-07 01:57:32

trontonic
Member
Registered: 2008-07-21
Posts: 80

Re: Post your handy self made command line utilities

karol, thanks for pointing to the nice awk script by falconindy. The advantage of the line above is that it's only one line, and it doesn't create a temporary file, but the disadvantage is that setconf might not be installed already, while awk usually ("always") is. smile


#1663 2012-01-07 03:34:22

karol
Archivist
Registered: 2009-05-06
Posts: 25,440

Re: Post your handy self made command line utilities

You may want to add your script to the wiki: https://wiki.archlinux.org/index.php/Ma … omatically


setconf PKGBUILD $(makepkg -g 2>/dev/null | pee "head -1 | cut -d= -f1" "cut -d= -f2") ')'

seems to give the same output with just one makepkg run, but:
1. It adds a requirement for 'moreutils'
2. It doesn't work with multiple checksums as setconf expects them surrounded in quotes :-(


#1664 2012-01-10 02:46:32

rsking84
Member
Registered: 2011-05-09
Posts: 17

Re: Post your handy self made command line utilities

I'm still relatively new to bash scripting, but I've come up with this handy little script to take care of all my system upgrade tasks. It uses pacaur to upgrade packages via the repos and the AUR, then searches for any .pacsave or .pacnew files and displays them in meld. After editing, the .pac* files are deleted. I aliased this to "update" in my .bashrc file so now system upgrades are a snap.

I think the meld piece of this came from someone else on these forums, but I can't find where. Thank you, anonymous Arch'er!

#!/bin/bash
#System update script
#uses pacaur to perform pacman and AUR system upgrades
#then searches for .pac* files and opens them with meld

#system upgrade
pacaur -Syu

# search for *.pac* files in /etc
echo -n ":: Searching for *.pacnew and *.pacsave files..."
countnew=$(sudo find /etc -type f -name "*.pacnew" | wc -l )
countsave=$(sudo find /etc -type f -name "*.pacsave" | wc -l )
count=$((countnew+countsave ))
echo "$count file(s) found."

# if files are found, merge *.pacnew and *.pacsave files with original configurations using meld
if [ $count -gt 0 ] ; then
	pacnew=$(sudo find /etc -type f -name "*.pac*")
	echo ":: Merging $countnew *.pacnew and $countsave *.pacsave file(s)..."
		for config in $pacnew; do
		  # Merge with meld
		  gksudo meld ${config%\.*} $config >/dev/null 2>&1 &
		  wait
		done
	#interactively delete *.pacnew and *.pacsave files
	echo ":: Removing files ... "
	sudo rm -i $pacnew
fi

echo ":: System upgrade complete."

Last edited by rsking84 (2012-01-10 02:48:06)


#1665 2012-01-11 17:37:33

trontonic
Member
Registered: 2008-07-21
Posts: 80

Re: Post your handy self made command line utilities

karol, added the script to the wiki, thanks for pointing out that page. Hopefully multiple arrays with checksums will work with setconf in the future (not that it's that common).

While I'm at it, here's a python2 script for finding libraries that have multiple definitions in header files by searching through /usr/include with ctags (takes some time to run).
Don't know if it's useful or not yet, but here goes:

#!/usr/bin/python2
# -*- coding: utf-8 -*-

import os

# filename -> package name
packagecache = {}

# package name + definition -> counter
definitioncount = {}

# package name -> (filename, list of duplicate definitions)
dupedefs = {}

def pkg(filename):
    if not filename in packagecache:
        print("Examining " + filename + "...")
        packagecache[filename] = os.popen3("pacman -Qo -q " + filename)[1].read().strip()
    return packagecache[filename]

def main():
    # Gather all function definitions in /usr/include
    data = os.popen3("ctags --sort=foldcase -o- -R /usr/include | grep -P '\tf\t'")[1].read()
    # Find the definitions and count duplicate ones
    for line in data.split("\n")[:-1]:
        fields = line.split("\t")
        name, filename = fields[:2]
        definition = line.split("/^")[1].split("$/;\"")[0].strip()
        if definition.endswith(","):
            # Skip the partial definitions
            continue
        id = pkg(filename) + ";" + definition
        if id not in definitioncount:
            definitioncount[id] = 1
        else:
            definitioncount[id] += 1
    # Gather all the duplicate definitions
    for id, count in definitioncount.items():
        pkgname, definition = id.split(";", 1)
        if count > 1:
            if not pkgname in dupedefs:
                dupedefs[pkgname] = [definition]
            else:
                dupedefs[pkgname] += [definition]
    # Output the duplicate definitions per package
    for pkgname, deflist in dupedefs.items():
        print("Duplicate definitions in %s:" % (pkgname))
        for definition in deflist:
            print("\t" + definition)

main()

I recommend piping the output to a file, as it's quite abundant.
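
Something like this (the script name is just whatever you saved it as):

python2 dupedefs.py > dupedefs.txt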

Last edited by trontonic (2012-01-11 17:38:27)


#1666 2012-01-14 20:18:31

evil
Member
From: Indianapolis, IN
Registered: 2010-03-06
Posts: 41
Website

Re: Post your handy self made command line utilities

Here is a script I wrote that connects to btjunkie.org, extracts the latest video uploads, from most seeded to least, and then outputs a readable list with a tinyurl to each torrent. The script also saves the results to btjunkie.txt.

#!/usr/bin/python

# Script Name: btjunkie.py
# Script Author: Lucian Adamson
# Script License: GPL v3
# Web Site: http://www.lucianadamson.com / http://evillittleshit.wordpress.com
# Written in: Python 3.2.2

# 2012-01-04, Lucian Adamson
#	Version 1.0: Finished, tho there are prob. a few bugs. Will update when
#			bugs are discovered and/or fixed. Also, I could probably implement
#			better error control and possibly add command line input.
#	CRUDE BEGINNING: Write a script to connect to btjunkie.org and return
#				the latest movie additions

import re, urllib.request, urllib.parse
WRITE_FILE="btjunkie.txt"

def shrink_url(url):
	tinyurl = "http://tinyurl.com/api-create.php?%s"
	encoder = urllib.parse.urlencode({'url':url})
	try:
		short = urllib.request.urlopen(tinyurl % encoder).read().decode('utf-8')
	except:
		return "N/A"
	return short

def extract_names():

	print("Retrieving data from btjunkie. This part shouldn't take very long.")
	f=urllib.request.urlopen('http://btjunkie.org/browse/Video/page1/?o=72&t=0&s=1')
	text = f.read().decode('utf-8')
	
	matcher=''
	for x in range(32, 126): matcher+=chr(x)
	
	matcher = "([" + matcher + "]+)"
	tuples = re.findall(r'<a href="'+matcher+'" class="BlckUnd">'+matcher+'</a></th>', text)
	
	return tuples

def write_data(data):

	f = open(WRITE_FILE, 'w')
	f.write(data)
	f.close()
	
def extract_formatted_info():
	newHold={}
	names=extract_names()
	print("Parsing data and creating short urls. Depending on TinyURL, speeds will vary.")
	msg="New torrents on BTJUNKIE:\n\n"
	count=1
	for x in names:
		(tmp1, tmp2) = x
		msg+=str(count) + ": " + str(tmp2) + "\n		D/L:" + shrink_url("http://dl.btjunkie.org" + tmp1 + "/download.torrent") + "\n"
		count+=1
	
	write_data(msg)
	return msg

def main():
	print(extract_formatted_info())
	print("Data was saved to \"" + WRITE_FILE + "\" in current working directory")

if __name__ == '__main__':
	main()

Site | Blog | Freenode Nick: i686


#1667 2012-01-14 23:49:19

berz_
Member
Registered: 2011-06-12
Posts: 26

Re: Post your handy self made command line utilities

rsking84 wrote:

I'm still relatively new to bash scripting, but I've come up with this handy little script to take care of all my system upgrade tasks. It uses pacaur to upgrade packages via the repos and the AUR, then searches for any .pacsave or .pacnew files and displays them in meld. After editing, the .pac* files are deleted. I aliased this to "update" in my .bashrc file so now system upgrades are a snap.

I like it. I replaced my "alias upgrade='sudo pacman -Syu && pacaur -u'" with it and modified it a bit. My version is included below. What I changed: (1) use the same formatting as the color versions of pacman and pacaur to display messages, (2) run find only once instead of three times, and (3) only use pacaur to update the AUR, calling pacman directly where possible.

#!/bin/bash
#System update script
#uses pacman and pacaur to upgrade all packages then searches 
#for .pacnew and pacsave files and opens them with meld

#define colors
reset="\e[0m"
colorB="\e[1;34m"
colorW="\e[1;39m"

#custom echo function
status () { echo -e "${colorB}:: ${colorW}$1${reset}"; }

#system upgrade
status "Starting repository upgrade"
sudo pacman-color -Syu

#AUR upgrade
pacaur -u

# search for *.pac* files in /etc
status "Searching for *.pacnew and *.pacsave files in /etc..."
files=$(sudo find /etc -iname "*.pacnew" -or -iname "*.pacsave")
count=$(echo $files | wc -w)

# if files are found, merge them with the original configurations using meld
if [ $count -gt 0 ] ; then
	status "Merging $count configuration file(s)..."
		for config in $files; do
			# Merge with meld
			gksudo meld ${config%\.*} $config >/dev/null 2>&1 &
			wait
		done
	#interactively delete *.pacnew and *.pacsave files
	status "Removing files... "
	sudo rm -i $files
else
	status "No configuration files found"
fi

status "System upgrade complete."


#1668 2012-01-17 14:50:35

evil
Member
From: Indianapolis, IN
Registered: 2010-03-06
Posts: 41
Website

Re: Post your handy self made command line utilities

This is a script I wrote to convert mp3 files to m4r and transfer the converted files over to my jailbroken iPhone.

#!/bin/bash 
# Script Name: iTone.sh
# Script Author: Lucian Adamson
# Script License: None, do as you please
# Website: http://www.lucianadamson.com
# Blog: http://blog.lucianadamson.com

# Description: A script that will convert 1 or more mp3 files to m4r format.
# 		Additionally, this script will also transfer the new m4r files to your
#		jailbroken iPhone if you so wish.
[[ ! $(which faac 2> /dev/null) ]] && echo "$(basename $0): Requires package \"faac\" installed" && exit 1
[[ ! $(which mplayer 2> /dev/null) ]] && echo "$(basename $0): Requires package \"mplayer\" installed" && exit 1
REMOVE=0
VERBOSE=0
TRANSFER=0
HOSTNAME=''
PORT=22
USAGE="\nUsage: $(basename $0) [OPTION...] [FILE...]\n\n   -r   \
Remove old files after conversion\n   -v   Enable verbosity on \
conversion\n   -p   Set the port for SCP (default 22)\n   -t   Set the hostname to auto-transfer m4r files\n\nPlease note, using the -r flag will utilize rm and will \
permanently delete your old files after conversion. Only use this if \
you are 100% sure you no longer need them.\n"

while getopts ":rvhp:t:" FLAG;
	do
		case $FLAG in
			r)
				REMOVE=1 ;;
			v)
				VERBOSE=1 ;;
			h)
				echo -e "$USAGE"
				exit 0 ;;
			p)
				PORT=$OPTARG ;;
			t)
				[[ ! $(which scp 2> /dev/null) ]] && echo "$(basename $0): Requires \"scp\" from OpenSSH package." && exit 1
				TRANSFER=1 
				HOSTNAME=$OPTARG ;;
			\?)
				echo "Invalid options: -$OPTARG" >& 2 ;;
		esac
	done

shift $((OPTIND-1))

for each in "$@"; 
	do 

		filename=$(basename $each)
		extension=${filename##*.}
		filename=${filename%.*}
		if [ $VERBOSE == 1 ]; then
			mplayer -vo null -vc null -ao pcm:fast:file=$filename.wav $each
			faac -b 128 -c 44100 -w $filename.wav
		else
			mplayer -vo null -vc null -ao pcm:fast:file=$filename.wav $each &> /dev/null
			faac -b 128 -c 44100 -w $filename.wav &> /dev/null
		fi
		mv -i $filename.m4a $filename.m4r
		[[ $REMOVE == 1 ]] && rm -rf "$each" && rm -rf "$filename.wav"
		
	done
	
if [ $TRANSFER == 1 ]; then
	scp -P $PORT *.m4r $HOSTNAME:/Library/Ringtones/
fi
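
A typical run, converting a couple of files and pushing them straight to the phone, looks something like this (the host and file names are just examples):

./iTone.sh -r -t root@192.168.1.20 song1.mp3 song2.mp3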

Site | Blog | Freenode Nick: i686


#1669 2012-01-17 15:36:35

steve___
Member
Registered: 2008-02-24
Posts: 452

Re: Post your handy self made command line utilities

[[ ! $(which faac 2> /dev/null) ]] && echo "..

could be changed to:

command -v faac &>/dev/null || echo "...

Also, be sure to quote your vars, especially filenames.
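
For example, in the conversion loop that would be:

filename=$(basename "$each")
mplayer -vo null -vc null -ao pcm:fast:file="$filename.wav" "$each"
mv -i "$filename.m4a" "$filename.m4r"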


#1670 2012-01-18 05:23:38

bohoomil
Member
Registered: 2010-09-04
Posts: 2,376
Website

Re: Post your handy self made command line utilities

A trivial script that returns external IP and/or country code:

#!/bin/sh

myip=$(dig myip.opendns.com @resolver1.opendns.com +short)
loc=$(geoiplookup $myip | awk -F' ' '{print $4}' | sed '$s/,$//')

echo -en "IP: $myip\n"; 

echo -en "country:\e[01;35m $loc\e[0m\n";

#echo -en "$loc\n"

Last edited by bohoomil (2012-01-18 05:29:20)


:: Registered Linux User No. 223384

:: github
:: infinality-bundle+fonts: good looking fonts made easy


#1671 2012-01-20 18:57:55

jordi
Member
Registered: 2006-12-16
Posts: 103
Website

Re: Post your handy self made command line utilities

bohoomil wrote:

A trivial script that returns external IP and/or country code:

#!/bin/sh

myip=$(dig myip.opendns.com @resolver1.opendns.com +short)
loc=$(geoiplookup $myip | awk -F' ' '{print $4}' | sed '$s/,$//')

echo -en "IP: $myip\n"; 

echo -en "country:\e[01;35m $loc\e[0m\n";

#echo -en "$loc\n"

# curl ifconfig.me

does the same.

Last edited by jordi (2012-01-20 18:58:10)


#1672 2012-01-20 19:03:36

karol
Archivist
Registered: 2009-05-06
Posts: 25,440

Re: Post your handy self made command line utilities

Well, you don't get a pretty colored country code.


#1673 2012-01-20 19:25:47

bohoomil
Member
Registered: 2010-09-04
Posts: 2,376
Website

Re: Post your handy self made command line utilities

jordi wrote:

# curl ifconfig.me

does the same.

So?...


:: Registered Linux User No. 223384

:: github
:: infinality-bundle+fonts: good looking fonts made easy


#1674 2012-01-21 19:58:40

yaffare
Member
Registered: 2011-12-29
Posts: 71

Re: Post your handy self made command line utilities

Just in case you didn't know this site:

http://www.commandlinefu.com/

A nice one is this (Wikipedia over DNS):

function whats()
{
 dig +short txt $1.wp.dg.cx
}

Example:

whats archlinux

Answer:

"Arch Linux (or Arch) is a Linux distribution intended to be lightweight and simple. The design approach of the development team focuses on "simplicity", elegance, code correctness and minimalism. "Simplicity", according to Arch, is defined as "...without " "unnecessary additions, modifications, or complications.." and is defined from a developer standpoint, rather than a user..."

systemd is like pacman. enjoys eating up stuff.


#1675 2012-01-21 20:44:00

falconindy
Developer
From: New York, USA
Registered: 2009-10-22
Posts: 4,111
Website

Re: Post your handy self made command line utilities

Apropos of the new pacman release, find out how much of a repo is signed:

#!/bin/bash
#
# query a repo by name to see number of packages signed
#

IFS=$'\n' read -rd '' -a pkgs < <(pacman -Sql "$1")

(( ${#pkgs[*]} )) || exit

expac -S '%g' "${pkgs[@]/#/$1/}" |
  awk '
BEGIN { yes = no = 0 }

{ $1 == "(null)" ? no++ : yes++ }

END {
  printf "Yes: %s [%.f%%]\n", yes, (yes / (yes + no) * 100)
  printf "No: %s\n", no
}'
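
Call it with a repo name (the script name here is arbitrary):

./signed.sh core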

Last edited by falconindy (2012-01-21 21:22:37)
