Commands using printf (206)

  • As already described in the other two versions, this one uses ASCII characters in a retro-game style to display the elapsed time.


    0
    export I=$(date +%s); watch -t -n 1 'T=$(date +%s);E=$(($T-$I));hours=$((E / 3600)) ; seconds=$((E % 3600)) ; minutes=$((seconds / 60)) ; seconds=$((seconds % 60)) ; echo $(printf "%02d:%02d:%02d" $hours $minutes $seconds) | toilet -f shadow'
    m33600 · 2009-10-23 07:56:30 10
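
    A stripped-down variant of the same counter, without the toilet rendering (a sketch for clarity; assumes the same exported I):
    export I=$(date +%s); watch -t -n 1 'E=$(( $(date +%s) - I )); printf "%02d:%02d:%02d\n" $((E/3600)) $((E%3600/60)) $((E%60))'
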
  • In most modern shells, printf is a builtin command.


    0
    printf "%s\n" .*
    cfajohnson · 2009-11-23 18:07:18 3
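
    A quick way to check which printf your shell will use (output varies by shell and system):
    type -a printf    # typically reports both the shell builtin and /usr/bin/printf
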

  • 0
    Split() { SENT=${*} ; sentarry=( ${SENT} ) ; while [[ ${#sentarry[@]} -gt 0 ]] ; do printf "%s\n" "${sentarry[0]}" ; sentarry=( ${sentarry[@]:1} ) ; done ; }
    unixhome · 2010-02-09 20:32:56 3
  • I created this command to give me a quick overview of how many file types a directory, and all its subdirectories, contains. It works based on file extension rather than file(1)'s magic output, because that ended up being more accurate and less confusing. Files that don't have an extension (README) are generally not important for me to count, but you're free to customize this to fit your needs.


    0
    printf "\n%25s%10sTOTAL\n" 'FILE TYPE' ' '; for ext in $(find . -iname \*.* | egrep -o '\.[^[:space:].]+$' | egrep -v '\.svn*' | sort -f | uniq -i); do count=$(find . -iname \*$ext | wc -l); printf "%25s%10s%d\n" $ext ' ' $count; done
    rkulla · 2010-04-16 21:12:11 3
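
    For comparison, a rougher sketch of the same idea (assumes GNU userland; unlike the command above it neither folds case nor formats the output):
    find . -type f -name '*.*' | sed 's/.*\.//' | sort | uniq -c | sort -rn
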
  • First off, if you just want a random UUID, here's the actual command to use: uuidgen. Your chances of finding a duplicate after running this nonstop for a year are about the same as being hit by a meteorite before finishing this sentence. The reason for the command I have is that it's more provably unique than the one that uuidgen creates. uuidgen creates a random one by default, or an unencrypted one based on time and network address if you give it the -t option. Mine uses the MAC address of the ethernet interface, the process id of the caller, and the system time down to nanosecond resolution, which is provably unique over all computers past, present, and future, subject to collisions in the cryptographic hash used and the uniqueness of your MAC address. Warning: feel free to experiment, but be warned that the stdin of the hash is binary data at that point, which may mess up your terminal if you don't pipe it into something. If it does mess up, just type reset.


    0
    printf $(( echo "obase=16;$(echo $$$(date +%s%N))"|bc; ip link show|sed -n '/eth/ {N; p}'|grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}'|head -c 17 )|tr -d [:space:][:punct:] |sed 's/[[:xdigit:]]\{2\}/\\x&/g')|sha1sum|head -c 32; echo
    camocrazed · 2010-07-14 14:04:53 10
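
    The raw ingredients, shown before they are hex-encoded and hashed (both pieces are lifted from the command above; assumes an eth* interface as the original does):
    echo $$ $(date +%s%N)    # caller PID followed by the nanosecond timestamp
    ip link show | sed -n '/eth/ {N; p}' | grep -o -E '([[:xdigit:]]{1,2}:){5}[[:xdigit:]]{1,2}' | head -c 17    # MAC address
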
  • Prints movie length in H:MM:SS format with appropriate leading zeros.


    0
    mplayer -vo null -ao null -frames 0 -identify movie.avi | awk '{FS="="}; /ID_LENGTH/{ H=int($2/3600); M=int(($2-H*3600)/60); S=int($2%60); printf "%d:%02d:%02d\n",H,M,S}'
    PNuts · 2010-10-13 14:51:41 4
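
    If ffprobe is available, a similar result can be had without mplayer (a sketch, not the author's method):
    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 movie.avi | awk '{s=int($1); printf "%d:%02d:%02d\n", s/3600, (s%3600)/60, s%60}'
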
  • A function for .bash_aliases that prints a line of the character of your choice, in the color of your choice, across the terminal. The default character is "=", the default color is white.


    0
    println() { echo -n -e "\e[038;05;${2:-255}m"; printf "%$(tput cols)s" | sed "s/ /${1:-=}/g"; }
    joedhon · 2011-01-09 18:08:18 3
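
    Example invocation (assumes a 256-colour terminal; 196 is the palette index for bright red):
    println "-" 196    # a full-width line of dashes in bright red; the colour stays set afterwards, tput sgr0 resets it
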

  • 0
    MIN=10 && for i in $(seq $(($MIN*60)) -1 1); do printf "\r%02d:%02d:%02d" $((i/3600)) $(( (i/60)%60)) $((i%60)); sleep 1; done
    bugmenot · 2011-02-20 10:24:22 3
  • In the TSV file exported by Google AdWords, the first five columns are text that really belongs in one multi-line cell, but since there is no reliable way to represent a line break inside a cell in the .tsv format, Google splits it across five columns. The problem is that with five columns of text there is hardly any room left for additional fields while keeping the output printable. This script collapses the first five columns of each row into a single multi-line text cell. As the newline character it uses the Line Separator character (Unicode U+2028), which gnumeric respects. It outputs a new .tsv file that opens in gnumeric.


    0
    awk -F $'\t' '{printf $1 LS $2 LS $3 LS $4 LS $5; for (i = 7; i < NF; i++) printf $i "\t"; printf "\n";}' LS=`env printf '\u2028'` 'Ad report.tsv'
    zhangweiwu · 2011-02-28 10:48:46 3
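
    To inspect the separator the command embeds (needs a printf that understands \u, e.g. GNU coreutils; shown in a UTF-8 locale):
    env printf '\u2028' | od -An -tx1    # e2 80 a8, the UTF-8 encoding of U+2028
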
  • Among other things, this allows the sorting of comment descriptions and command lines retrieved as text from CommandLineFu.com.


    0
    gawk 'BEGIN {RS="\n\n"; if (ARGV[1]=="-i"){IGNORECASE=1; ARGC=1}};{Text[NR]=$0};END {asort(Text);for (i=1;i<=NR;i++) printf "%s\n\n",Text[i] }' -i<Zip.txt
    IF_Rock · 2011-05-10 19:08:27 5
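
    A tiny demonstration with two made-up blank-line-separated records:
    printf 'zebra\nsecond line\n\napple\nfirst line\n\n' | gawk 'BEGIN {RS="\n\n"};{Text[NR]=$0};END {asort(Text);for (i=1;i<=NR;i++) printf "%s\n\n",Text[i] }'    # prints the apple record before the zebra record
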
  • Reverses the lines of $FILE in place using ed; the same idiom also works in vim.


    0
    printf "g/^/m0\nw\nq"|ed $FILE
    till · 2011-05-27 08:57:31 4
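
    The vim counterpart the description refers to (run from command-line mode, then save):
    :g/^/m0
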
  • WIDTHL=10 and WIDTHR=60 set the widths of the left and right column/bar. BAR="12345678" etc. is used to create an 80-character-long string of "="s; I didn't know any shorter way. If you want to pipe results into it, wrap the whole thing in ( ... ). I know that printing bar graphs can be done rather easily by other means; here, I was looking for a Bash-only variant.


    0
    SCALE=3; WIDTHL=10; WIDTHR=60; BAR="12345678"; BAR="${BAR//?/==========}"; while read LEFT RIGHT rest ; do RIGHT=$((RIGHT/SCALE)); printf "%${WIDTHL}s: %-${WIDTHR}s\n" "${LEFT:0:$WIDTHL}" "|${BAR:0:$RIGHT}*"; done < dataset.dat
    andreasS · 2011-08-22 19:35:21 7
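
    dataset.dat is expected to hold one label and one number per line, e.g. (made-up values):
    printf 'alpha 90\nbeta 150\ngamma 30\n' > dataset.dat
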

  • 0
    i=60;while [ $i -gt 0 ];do if [ $i -gt 9 ];then printf "\b\b$i";else printf "\b\b $i";fi;sleep 1;i=`expr $i - 1`;done
    barnesmjsa · 2011-09-08 18:14:48 3
  • I find this useful when I want to add another crontab entry and need to specify the appropriate PATH. I give ''whichpath'' a list of programs that I use inside my script and it gives me the PATH I need to use for that script. ''whichpath'' uses an associative array, so you need Bash v4 to run it.


    0
    whichpath() { local -A path; local c p; for c; do p=$(type -P "$c"); p=${p%/*}; path[${p:-/}]=1; done; local IFS=:; printf '%s\n' "${!path[*]}"; }
    RanyAlbeg · 2011-09-16 15:55:15 14
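
    Example usage (the programs are placeholders; the printed PATH depends on where they live on your system):
    whichpath rsync gzip mysqldump    # might print e.g. /usr/bin or /bin:/usr/bin
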

  • 0
    find . -type f -exec awk '/linux/ { printf "%s %s: %s\n",FILENAME,NR,$0; }' {} \;
    Neo23x0 · 2011-11-29 12:32:06 8
  • Better awk example, using only mplayer, grep, cut, and awk.


    0
    mplayer -endpos 0.1 -vo null -ao null -identify *.avi 2>&1 |grep ID_LENGTH |cut -d = -f 2|awk '{SUM += $1} END { printf "%d:%d:%d\n",SUM/3600,SUM%3600/60,SUM%60}'
    Coderjoe · 2011-12-12 15:49:07 3
  • Here's my version. It's a bit lengthy but I prefer it since it's all Bash.


    0
    genRandomText() { x=({a..z}); for(( i=0; i<$1; i++ )); do printf ${x[$((RANDOM%26))]}; done; printf "\n"; }
    uxseven · 2012-01-26 08:19:33 3
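
    Example (output is random, so yours will differ):
    genRandomText 8    # e.g. qhzkcmra
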
  • Improvement on Coderjoe's solution. Gets rid of grep and cut (implementing them in awk) and specifies some different mplayer options that speed things up a bit.


    0
    find /path/to/dir -iname "*.ext" -print0 | xargs -0 mplayer -really-quiet -cache 64 -vo dummy -ao dummy -identify 2>/dev/null | awk '/ID_LENGTH/{gsub(/ID_LENGTH=/,"")}{SUM += $1}END{ printf "%02d:%02d:%02d\n",SUM/3600,SUM%3600/60,SUM%60}'
    DarkSniper · 2012-03-11 12:28:48 3
  • Proc lister usage: p
    Proc killer usage: p patt [signal]
    Uses only ps, grep, sed, printf and kill; no need for pgrep/pkill (not part of early UNIX).

    _p(){ ps ax \
      |grep $1 \
      |sed '/grep.'"$1"'/d' \
      |while read a;do printf ${a%% *}' '; printf "${a#* }" >&2; printf '\n'; done; }

    p(){ case $# in
      0) ps ax |grep .|less -iE; ;;
      1) _p $1; ;;
      [23]) _p $1 2>/dev/null \
        |sed '/'"$2"'/!d; s,.*,kill -'"${3-15}"' &,'|sh -v ;;
      esac; }

    Alas, can't get this under 255 chars. flatcap?


    0
    _p(){ ps ax |grep $1 |sed '/grep.'"$1"'/d' |while read a;do printf ${a%% *}' ';printf "${a#* }" >&2;printf '\n';done;}
    argv · 2012-04-01 19:45:17 3
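
    Example for the one-argument form (the process name is a placeholder):
    _p bash    # PIDs of matching processes go to stdout, their full command lines to stderr
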

  • 0
    yes "$(seq 232 255;seq 254 -1 233)" | while read i; do printf "\x1b[48;5;${i}m\n"; sleep .01; done
    hogofogo · 2012-04-03 06:41:43 4
  • Sometimes you want to see all of the sysctls for a given $thing. I happened to need an easy way to look at all of the vm sysctls on two boxes and compare them. This is what I came up with.


    0
    find /proc/sys/vm -maxdepth 1 -type f | while read i ; do printf "%-35s\t%s\n" "$i" "$(<$i)" ; done | sort -t/ -k4
    SEJeff · 2012-05-25 16:34:16 7
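
    To compare the same view across two machines (hostA and hostB are placeholders; assumes bash on the remote side so that $(<$i) works):
    diff <(ssh hostA 'find /proc/sys/vm -maxdepth 1 -type f | while read i ; do printf "%-35s\t%s\n" "$i" "$(<$i)" ; done | sort -t/ -k4') <(ssh hostB 'find /proc/sys/vm -maxdepth 1 -type f | while read i ; do printf "%-35s\t%s\n" "$i" "$(<$i)" ; done | sort -t/ -k4')
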
  • printf repeats the format for as long as it has arguments. The idea is then to make cut retain as many fields as there are elements in the array. As usual with this kind of join/split string manipulation, you have to make sure your separator doesn't conflict with your array content.


    0
    printf "%s," "${LIST[@]}" | cut -d "," -f 1-${#LIST[@]}
    Valise · 2012-06-04 14:56:12 3
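
    A worked example with a hypothetical array:
    LIST=(red green blue); printf "%s," "${LIST[@]}" | cut -d "," -f 1-${#LIST[@]}    # prints red,green,blue
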
  • Usage: alarmclock TIME. TIME is a sleep(1) parameter which tells the function how long to wait before raising the alarm.


    0
    alarmclock() { [ "$1" ] || { echo "Parameter TIME is missing." 1>&2 ; return 1 ; } ; ( sleep "$1" ; for x in 9 8 7 6 5 4 3 2 1 ; do for y in `seq 0 $[ 10 - $x ]` ; do printf "\a"; sleep 0.$x ; done ; done ) & }
    lkj · 2012-08-16 15:35:15 3
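
    Example (GNU sleep accepts suffixes, so durations such as 10m or 2h work too):
    alarmclock 10m    # after ten minutes, beep with increasing urgency in the background
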
  • Could easily be used for lowercase --> ((i=97;i<123;i++))


    0
    for ((i=65;i<91;i++)); do printf "\\$(printf '%03o' $i) "; done
    slappy · 2012-08-24 13:24:36 4
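
    The lowercase variant alluded to in the description (a-z are ASCII 97-122):
    for ((i=97;i<123;i++)); do printf "\\$(printf '%03o' $i) "; done
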