Commands using printf (206)

  • Converts a number of bytes, given as the first argument, into a human-readable size.


    2
    human_filesize() { awk -v sum="$1" ' BEGIN {hum[1024^3]="Gb"; hum[1024^2]="Mb"; hum[1024]="Kb"; for (x=1024^3; x>=1024; x/=1024) { if (sum>=x) { printf "%.2f %s\n",sum/x,hum[x]; break; } } if (sum<1024) print "1kb"; } '; }
    ArtBIT · 2011-12-02 18:21:20 3
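
    For example, with the function above loaded in bash (the output shown is what the awk formula produces):
      human_filesize 1048576     # prints "1.00 Mb"
      human_filesize 123456789   # prints "117.74 Mb"
      human_filesize 500         # prints "1kb"
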
  • Unlike other alternatives, this command relies only on bash builtins and should also work on Windows platforms that have a bash executable. The sparseness corresponds to the number 128 and can be adjusted. To print all decimal digits instead of only 0 and 1, replace RANDOM%2 with RANDOM%10, or with RANDOM%16 to add the letters A-F.


    2
    while true; do printf "\e[32m%X\e[0m" $((RANDOM%2)); for ((i=0; i<$((RANDOM%128)); i++)) do printf " "; done; done
    seb1245 · 2012-11-27 10:40:42 11

  • 2
    today() { printf '%(%Y-%m-%d)T\n' -1; } ## bash-4
    cfajohnson · 2013-01-27 06:17:25 6
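
    A quick illustration (the value printed depends on the day you run it):
      today    # prints the current date in ISO format, e.g. 2013-01-27
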
  • The opposite of https://www.commandlinefu.com/commands/view/10014/urldecoding-with-one-pure-bash-builtin ;-)


    2
    function URLEncode { local dataLength="${#1}"; local index; for ((index = 0;index < dataLength;index++)); do local char="${1:index:1}"; case $char in [a-zA-Z0-9.~_-]) printf "$char"; ;; *) printf "%%%02X" "'$char"; ;; esac; done; }
    emphazer · 2018-09-14 12:08:10 360
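
    An illustrative call, assuming the function above is defined in the current shell:
      URLEncode 'hello world/?x=1'   # prints hello%20world%2F%3Fx%3D1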

  • 2
    printf '*%.s' {1..40}; echo
    metropolis · 2019-07-01 07:41:18 48
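
    The format string is reused once per argument: '%.s' is a string conversion with precision 0, so each of the 40 words from {1..40} contributes nothing, while the literal '*' is printed 40 times; echo adds the final newline. The same idea works for any width (a sketch, assuming bash):
      printf '=%.s' {1..20}; echo              # a line of 20 '=' characters
      n=60; printf -- '-%.s' $(seq $n); echo   # variable length, 60 '-' characters
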
  • No need to use perl, awk, or /usr/bin/date -- bash's "printf" builtin will do it.


    2
    printf '%(%FT%T)T\n' 1606752450
    Mozai · 2021-06-20 05:11:20 176
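
    The argument after the format is a Unix epoch; -1 means "now" and -2 means the time the shell started. The rendered clock time depends on your timezone; on a UTC system the example prints:
      printf '%(%FT%T)T\n' 1606752450   # 2020-11-30T16:07:30 (UTC)
      printf '%(%FT%T)T\n' -1           # the current date and time
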
  • Sometimes, in a shell script, you need a random number bigger than the range of $RANDOM. This prints a random number built from four bytes read from /dev/urandom, covering the full 32-bit range.


    1
    printf %d 0x`dd if=/dev/urandom bs=1 count=4 2>/dev/null | od -x | awk 'NR==1 {print $2$3}'`
    introp · 2009-02-18 16:23:09 6
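    A sketch of how this might be wrapped for use in a script (the variable name is mine; each run yields a different value between 0 and 4294967295):
      BIGRAND=$(printf %d 0x$(dd if=/dev/urandom bs=1 count=4 2>/dev/null | od -x | awk 'NR==1 {print $2$3}'))
      echo "$BIGRAND"   # e.g. 2885473467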

  • 1
    echo '123/7' |bc -l |xargs printf "%.3f\n"
    mrttlemonde · 2009-03-18 14:20:32 5
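    With these inputs the result is deterministic:
      echo '123/7' | bc -l | xargs printf "%.3f\n"   # prints 17.571
      echo '2/3' | bc -l | xargs printf "%.3f\n"     # prints 0.667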

  • 1
    printf "%d\n" "'A" "'B"
    twfcc · 2009-10-17 09:50:44 4
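
    This prints the character codes 65 and 66, one per line; other bases work the same way:
      printf "%x %o\n" "'A" "'A"   # prints "41 101" (hex and octal)
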
  • printf treats the first character after a single quote (') as its numeric character code.


    1
    ord() { printf "%d\n" "'$1"; }
    zude · 2009-10-17 22:02:52 3
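    Usage, plus a matching helper going the other way (the name chr is my own choice, not part of the original command):
      ord A     # prints 65
      chr() { printf "\\$(printf '%03o' "$1")\n"; }
      chr 65    # prints A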

  • 1
    printf "%s\n" .*
    cfajohnson · 2009-11-20 21:41:02 3
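
    Lists the dot files (including . and ..) in the current directory, one per line, using only a builtin; the output depends on the directory, e.g.:
      printf "%s\n" .*   # .  ..  .bash_history  .bashrc  .profile  (each on its own line)
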
  • Prompts the user for a username and password, which are then exported to http_proxy for use by wget, yum, etc. Defaults for the user, web proxy and port are used. Using this function keeps the cleartext username and password out of your bash_history and off the screen.


    1
    set-proxy () { P=webproxy:1234; DU="fred"; read -p "username[$DU]:" USER; UN=${USER:-$DU}; read -s -p "password:" PASS; printf "%b" "\n"; export http_proxy="http://${UN}:${PASS}@$P/"; export ftp_proxy="http://${UN}:${PASS}@$P/"; }
    shadycraig · 2010-02-04 13:12:59 5
  • This one uses hex conversion to do the converting and is shell/sed only (you should probably still use the Python/Perl version).


    1
    uri_escape(){ echo -E "$@" | sed 's/\\/\\\\/g;s/./&\n/g' | while IFS= read -r i; do echo "$i" | grep -q '[a-zA-Z0-9/.:?&=]' && echo -n "$i" || printf %%%x \'"$i"; done; }
    infinull · 2010-02-13 01:39:51 41
  • underline() prints $1 followed by a series of '=' characters the width of $1. An optional second argument replaces '=' with a given character. This function is useful for breaking up lots of data emitted in a for loop into sections that are easier to parse visually. Say 'xxxx' is a very common pattern occurring in a group of CSV files. Running 'grep xxxx *.csv' would print the name of each CSV file before each matching line, but the output would be hard to parse visually. Running 'for i in *.csv; do printf "\n"; underline $i; grep "xxxx" $i; done' breaks the output into sections separated by the name of each file, underlined.


    1
    underline() { echo $1; for (( i=0; $i<${#1}; i=$i+1)); do printf "${2:-=}"; done; printf "\n"; }
    bartonski · 2010-02-26 05:46:49 8
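
    Illustrative output, assuming the function above is loaded:
      underline "Results"      # prints "Results" over "======="
      underline "Results" "~"  # prints "Results" over "~~~~~~~"
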
  • The function 'box' takes either one or two arguments. The first argument is a line of text to be boxed; the second argument (optional) is a character to use to draw the box. By default, the drawing character is '='. The function n() is a helper used to draw the upper and lower lines of the box; its arguments are a length and a character to print. (I used 'n' because 'line', 'ln' and 'l' are all commonly used.)


    1
    box() { l=${#1}+4;x=${2:-=};n $l $x; echo "$x $1 $x"; n $l $x; }; n() { for (( i=0; $i<$1; i=$i+1)); do printf $2; done; printf "\n"; }
    bartonski · 2010-02-26 06:56:59 3
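    Illustrative output, assuming both functions above are loaded:
      box hello
      =========
      = hello =
      =========
    Passing a second argument changes the drawing character, e.g. box hello "~" draws the same box with tildes.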

  • 1
    pmap $(pgrep [ProcessName] -n) | gawk '/total/ { a=strtonum($2); b=int(a/1024); printf "%d\n", b }'
    lv4tech · 2010-04-28 08:16:28 3
  • This one-liner combines all sequentially numbered files, in this example IMG_0001.png to IMG_1121.png, by generating a shell script, making it executable, and then running it to merge the 1121 PNGs into a single file named _final.png. Tested on Mac OS X 10.6.3 with ImageMagick 6.5.8-0 2009-11-22 Q16 http://www.imagemagick.org


    1
    echo -n "convert " > itcombino.sh; printf "IMG_%00004u.png " {1..1121} >> itcombino.sh; echo -n "-layers merge _final.png" >> itcombino.sh; chmod +x itcombino.sh && ./itcombino.sh
    IsraelTorres · 2010-05-22 03:56:30 5
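
    The generated itcombino.sh ends up as a single ImageMagick command line along these lines (the middle arguments are elided here for brevity):
      convert IMG_0001.png IMG_0002.png ... IMG_1121.png -layers merge _final.png
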
  • The first argument is the interpreter for your script, the second argument is the name of the script to create.


    1
    shebang() { if i=$(which $1); then printf '#!%s\n\n' $i > $2 && vim + $2 && chmod 755 $2; else echo "'which' could not find $1, is it in your \$PATH?"; fi; }
    bartonski · 2011-03-09 14:47:32 25
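
    Typical usage (the script names are made up; the shebang path depends on where 'which' finds the interpreter on your system):
      shebang bash deploy.sh    # writes e.g. "#!/bin/bash" to deploy.sh, opens it in vim, then chmod 755 on exit
      shebang python report.py  # same idea for a Python script
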
  • A shorter version


    1
    while cat /proc/net/dev; do sleep 1; done | awk '/eth0/ {o1=n1; o2=n2; n1=$2; n2=$10; printf "in: %9.2f\t\tout: %9.2f\r", (n1-o1)/1024, (n2-o2)/1024}'
    quadcore · 2011-03-26 02:52:14 3
  • Watch the temperatures of your CPU cores in real time at the command line. Press CONTROL+C to end. GORY DETAILS: Your computer needs to support sensors (many laptops, for example, do not). You'll need to install the lm-sensors package if it isn't already installed. And it helps to run the `sensors-detect` command to set up your sensor kernel modules first. At the very end of the sensors-detect interactive shell prompt, answer YES to add the new lines to the list of kernel modules loaded at boot.


    1
    while :; do sensors|grep ^Core|while read x; do printf '% .23s\n' "$x"; done; sleep 1 && clear; done;
    linuxrawkstar · 2011-04-20 06:41:57 7
  • Replace "service --status-all 2>&1" with "service --status-all 2>/dev/null" to hide all services with the status [ ? ].


    1
    services() { printf "$(service --status-all 2>&1|sed -e 's/\[ + \]/\\E\[42m\[ + \]\\E\[0m/g' -e 's/\[ - \]/\\E\[41m\[ - \]\\E\[0m/g' -e 's/\[ ? \]/\\E\[43m\[ ? \]\\E\[0m/g')\n";}
    stanix · 2011-04-23 12:38:09 4

  • 1
    arp-scan -I eth0 -l | perl -ne '/((\d{1,3}\.){3}\d{1,3})/ and $ip=$1 and $_=`nmblookup -A $ip` and /([[:alnum:]-]+)\s+<00>[^<]+<ACTIVE>/m and printf "%15s %s\n",$ip,$1'
    bandie91 · 2011-07-08 07:41:41 3
  • Check the API. You shouldn't need sed. The print-newline at the end is to prevent zsh from inserting a % after the end-of-output. Also works with http://v.gd


    1
    isgd () { curl 'http://is.gd/create.php?format=simple&url='"$1" ; printf "\n"; }
    dbbolton · 2011-08-14 23:31:39 3
  • This one-line Perl script will display file sizes, from smallest to largest, in all directories on a server.


    1
    du -k | sort -n | perl -ne 'if ( /^(\d+)\s+(.*$)/){$l=log($1+.1);$m=int($l/log(1024)); printf ("%6.1f\t%s\t%25s %s\n",($1/(2**(10*$m))),(("K","M","G","T","P")[$m]),"*"x (1.5*$l),$2);}' | more
    Q_Element · 2012-02-07 15:49:19 10
  • Counts the files present in the different directories recursively. Change maxdepth to get further insight into the directory hierarchy. Found at unix.stackexchange.com: http://unix.stackexchange.com/questions/4105/how-do-i-count-all-the-files-recursively-through-directories


    1
    find -maxdepth 3 -type d | while read -r dir; do printf "%s:\t" "$dir"; find "$dir" | wc -l; done
    brainstorm · 2012-10-15 15:00:09 7