
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged bash - 722 results
for code in {0..255}; do echo -e "\e[38;05;${code}m $code: Test"; done
2010-06-19 02:14:42
User: scribe
Functions: echo
Tags: bash color colors
42

Same as http://www.commandlinefu.com/commands/view/5876, but for bash.

This will show a numerical value for each of the 256 colors in bash. Everything in the command is a bash builtin, so it should run on any platform where bash is installed. It prints one color per line; if you want the output formatted differently, a columnar variant is sketched below.
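A minimal sketch of one such alternative (my own, not the original poster's): the same 256 colors, printed eight per row, with the attributes reset at the end of each row.

for code in {0..255}; do printf '\e[38;05;%dm %3d ' "$code" "$code"; (( (code+1) % 8 == 0 )) && printf '\e[0m\n'; done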

L(){ l=`builtin printf %${2:-$COLUMNS}s` && echo -e "${l// /${1:-=}}"; }
2

One of the first functions programmers learn is how to print a line. This is my 100% bash-builtin function to do it, which makes it as optimal as a function can be. The COLUMNS environment variable is also set by bash (and bash resets its value when you resize your terminal), so it's very efficient. I like pretty output in my shells and have experimented with several ways to output a line the width of the screen using a minimal amount of code. This is like version 9,000, lol.

This function is what I use, though when using colors or other terminal features I create separate functions that call this one, since this is the lowest-level type of function. It might be better named printl(), but since I use it so much it's more optimal to have the name contain fewer characters (both for my programming and for the internal workings).

If you do use terminal escapes this will reset to default.

tput sgr0

For implementation ideas, check my

http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html
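Usage follows from the function's two optional arguments, the fill character and the width; for example:

# a full-width line of =
L

# a line of 40 dashes
L - 40

# a line of 20 tildes
L '~' 20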

!:1-3
!:n
2010-06-12 02:48:27
User: dbbolton
Tags: history bash zsh
8

'n' is a non-negative integer. Using 0 will expand to the name of the previous command.
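A quick illustration (hypothetical session): suppose the previous command was

echo alpha beta gamma

then

cp !:1 !:3

expands to

cp alpha gamma

and, on the line after that, !:0 would expand to cp, the name of the previous command.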

statt(){ C=c;stat --h|sed '/Th/,/NO/!d;/%/!d'|while read l;do p=${l/% */};[ $p == %Z ]&&C=fc&&echo ^FS:^;echo "`stat -$C $p \"$1\"` ^$p^${l#%* }";done|column -ts^; }
2010-06-11 23:31:03
User: AskApache
Functions: column read sed
3

This shows every bit of information that stat can get for any file, directory, fifo, etc. It's great because it also shows each format option alongside its explanation.

If you just want stat help, create this handy alias 'stath' to display all format options with explanations.

alias stath="stat --h|sed '/Th/,/NO/!d;/%/!d'"

To display on 2 lines:

( F=/etc/screenrc N=c IFS=$'\n'; for L in $(sed 's/%Z./%Z\n/'<<<`stat --h|sed -n '/^ *%/s/^ *%\(.\).*$/\1:%\1/p'`); do G=$(echo "stat -$N '$L' \"$F\""); eval $G; N=fc;done; )

For a similarly powerful stat-like function optimized for pretty output (which can also sort by any field), check out the "lll" function:

http://www.commandlinefu.com/commands/view/5815/advanced-ls-output-using-find-for-formattedsortable-file-stat-info

From my .bash_profile ->

http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'
7

I love this function because it tells me everything I want to know about files, more than stat, more than ls. It's very useful and infinitely expandable.

find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%

00761 drwxrw---x askapache:askapache 777:666 [06/10/10 | 06/10/10 | 06/10/10] [d] /web/cg/tmp

The key is:

# -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'

which, believe it or not, took hundreds of tweaks before I was happy with the output.

You can easily use this within a function to do whatever you want. This simple function works recursively if you call it with -r as an argument, and sorts by file permissions.

lsl(){ O="-maxdepth 1";sed -n '/-r/!Q1'<<<$@ &&O=;find $PWD $O -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'|sort -rgbS 50%; }

Personally I'm using this function, since it lets me pick the sort field:

lll () { local a KS="1 -r -g"; sed -n '/-sort=/!Q1' <<< $@ && KS=`sed 's/.*-sort=\(.*\)/\1/g'<<<$@`; find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'|sort -k$KS -bS 50%; }

# I can sort by user

lll -sort=3

# or sort by group reversed

lll -sort=4 -r

# and sort by modification time

lll -sort=6

If anyone wants to help me make this function handle multiple dirs/files like ls, go for it and I would appreciate it. Something very minimal would be awesome, maybe like:

for a; do lll $a; done
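As a rough starting point, here is a minimal sketch of such a wrapper (untested, and the name llld is just a placeholder): it runs lll against each argument from a subshell, so the caller's working directory is untouched, and defaults to the current directory when no arguments are given.

llld(){ local d; for d in "${@:-.}"; do echo "== $d =="; ( cd "$d" && lll ); done; }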

Note: this uses the latest version of GNU find, built from source (easy to build from the GNU FTP tarball). Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

alias sorth='sort --help|sed -n "/^ *-[^-]/s/^ *\(-[^ ]* -[^ ]*\) *\(.*\)/\1:\2/p"|column -ts":"'
3

Once you get into advanced/optimized scripts, functions, or CLI usage, you will use the sort command a lot. The options are difficult to master/memorize, however, and when you use sort commands as much as I do (some examples below), it's useful to have the help available with a simple alias. I love this alias as I never seem to remember all the options for sort, and I use sort like crazy (much better than uniq, for example).

# Sorts by file permissions

find . -maxdepth 1 -printf '%.5m %10M %p\n' | sort -k1 -r -g -bS 20%

00761 drwxrw---x ./tmp

00755 drwxr-xr-x .

00701 drwx-----x ./askapache-m

00644 -rw-r--r-- ./.htaccess

# Shows uniq history fast

history 1000 | sed 's/^[0-9 ]*//' | sort -fubdS 50%

exec bash -lxv

export TERM=putty-256color

Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

acpi -V
IFS=`echo -en "\n\b"`; for i in $(curl http://feeds.digg.com/digg/container/technology/popular.rss | grep '<title>' | sed -e 's#<[^>]*>##g' | tail -n10); do echo $i; echo $i | sed 's/^/Did you hear about /g' | say; sleep 30; done
2010-06-07 22:16:19
User: echosedawk
Functions: echo grep sed sleep tail
Tags: bash sed curl osx
-2

Instead of having someone else read you the Digg headlines, have OS X do it. Requires curl, sed and say. This could probably be easily modified to use espeak on Linux.
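A hedged sketch of that Linux variant (untested; it simply swaps say for espeak, which also reads its text from standard input):

IFS=`echo -en "\n\b"`; for i in $(curl http://feeds.digg.com/digg/container/technology/popular.rss | grep '<title>' | sed -e 's#<[^>]*>##g' | tail -n10); do echo $i; echo $i | sed 's/^/Did you hear about /g' | espeak; sleep 30; done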

cowsay `fortune` | toilet --metal -f term
2010-06-03 21:48:54
-6

Get colorful fortunes dictated by an ASCII cow. For full enjoyment you'll need color support enabled in your terminal.

declare -F | cut -d ' ' -f 3
while [ 1 -lt 2 ]; do i=0; COL=$((RANDOM%$(tput cols)));ROW=$((RANDOM%$(tput cols)));while [ $i -lt $COL ]; do tput cup $i $ROW;echo -e "\033[1;34m" $(cat /dev/urandom | head -1 | cut -c1-1) 2>/dev/null ; i=$(expr $i + 1); done; done
2010-05-28 16:07:56
User: dave1010
Functions: cat cut expr head tput
1

Same as original, but works in bash

for d in $(find . -maxdepth 1 -type d -name '[^.]*'); do cd "$d"; svn up; cd ..; done
2010-05-28 10:09:19
User: udog
Functions: cd find
-2

If you have a directory with many working copies of various subversion projects and you want to update them all at once, this one may be for you.

fbemailscraper YourFBEmail Password
grep -A 5 -e podcast-feed rhythmdb.xml | grep -e "<location>" | sed 's: *</*[a-t]*>::g' > PodFeeds.txt
2010-05-22 05:50:45
User: dogflap
Functions: grep sed
Tags: bash
1

The first grep matches any line containing podcast-feed, plus the following five lines.

The second grep throws out any line not containing <location>.

sed then removes the leading spaces and the <location> and </location> tags.

Using a colon as sed's separating character avoids having to escape the /.

Works ok with Mythbuntu 9.04 (used mostly as a three line bash script).
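To see what the sed expression does, here is a made-up line of the kind rhythmdb.xml contains, run through the same substitution:

echo '    <location>http://example.com/podcast.rss</location>' | sed 's: *</*[a-t]*>::g'

which prints just http://example.com/podcast.rss.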

sortwc () { local L;while read -r L;do builtin printf "${#L}@%s\n" "$L";done|sort -n|sed -u 's/^[^@]*@//'; }
2010-05-20 20:13:52
User: AskApache
Functions: printf read sed sort
2

This provides a way to sort output based on the length of the line, so that shorter lines appear before longer lines. It's an add-on to sort that I've wanted for years; sometimes it's very useful. Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html
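A quick example of the ordering:

printf '%s\n' ccc a bb | sortwc

prints a, bb, ccc (shortest line first).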

command ps wwo pid,user,group,vsize:8,size:8,sz:6,rss:6,pmem:7,pcpu:7,time:7,wchan,sched=,stat,flags,comm,args k -vsz -A|sed -u '/^ *PID/d;10q'
2

I've wanted this for a long time; I finally just sat down and came up with it. This shows you the sorted output of ps in a pretty format, perfect for cron or startup scripts. You can change the sort key from k -vsz to, for example, k -pmem to sort by memory instead.

If you want a function, here's one from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

aa_top_ps(){ local T N=${1:-10};T=${2:-vsz}; ps wwo pid,user,group,vsize:8,size:8,sz:6,rss:6,pmem:7,pcpu:7,time:7,wchan,sched=,stat,flags,comm,args k -${T} -A|sed -u "/^ *PID/d;${N}q"; }
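With that function loaded, the first argument is how many lines to show and the second is the ps sort key; for example:

# sort by memory instead, showing the first 5 lines

aa_top_ps 5 pmem

# sort by CPU usage, showing the first 20 lines

aa_top_ps 20 pcpu
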
for i in $(find ~/.config/chromium/*/Extensions -name 'manifest.json'); do n=$(grep -hIr name $i| cut -f4 -d '"'| sort);u="https://chrome.google.com/extensions/detail/";ue=$(basename $(dirname $(dirname $i))); echo -e "$n:\n$u$ue\n" ; done
2010-05-18 15:16:36
User: new_user
Functions: cut find grep
2

Gives you a list of all installed Chrome (Chromium) extensions, with the URL to each extension's page.

With this you can easily add a new bookmark folder called "extensions" and add every URL to that folder, so it will be synced and you can access the names from every computer you are logged in on.

------------------------------------------------------------------------------------------------------------------

Only tested with Chromium; for Chrome you may have to change the path given to find.

!:-
2010-05-15 15:12:47
User: new_user
Tags: bash edit
62
If the previous command was:

/usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.google.com/

then

!:- http://www.commandlinefu.com/

is the same as

/usr/sbin/ab2 -f TLS1 -S -n 1000 -c 100 -t 2 http://www.commandlinefu.com/
tail -n2000 /var/www/domains/*/*/logs/access_log | awk '{print $1}' | sort | uniq -c | sort -n | awk '{ if ($1 > 20)print $1,$2}'
(for i in `find . -maxdepth 2 -name .svn | sed 's/.svn$//'`; do echo $i; svn info $i; done ) | egrep '^.\/|^URL'
2010-05-09 11:54:37
User: jespere
Functions: echo egrep info sed
0

If you have lots of subversion working copies in one directory and want to see in which repositories they are stored, this will do the trick. Can be convenient if you need to move to a new subversion server.

changing_assets = `s3cmd sync --dry-run -P -M --exclude=*.php --delete-removed #{preprod_release_dir}/web/ #{s3_bucket} | grep -E 'delete:|upload:' | awk '{print $2}' | sed s_#{preprod_release_dir}/web__`
2010-05-07 16:03:42
User: trivoallan
Functions: awk grep sed sync
2

Can be useful for granularly flushing files from a CDN after they've been changed in the S3 bucket.

s3cmd ls s3://bucket.example.com | s3cmd del `awk '{print $4}'`
shopt -s extglob; for f in *.ttf *.TTF; do g=$(showttf "$f" 2>/dev/null | grep -A1 "language=0.*FullName" | tail -1 | rev | cut -f1 | rev); g=${g##+( )}; mv -i "$f" "$g".ttf; done
2

Just a quick hack to give reasonable filenames to TrueType and OpenType fonts.

I'd accumulated a big bunch of bizarrely and inconsistently named font files in my ~/.fonts directory. I wanted to copy some, but not all, of them over to my new machine, but I had no idea what many of them were. This script renames .ttf files based on the name embedded inside the font. It will also work for .otf files, but make sure you change the mv part so it gives them the proper extension.

REQUIREMENTS: Bash (for extended pattern globbing), showttf (Debian has it in the fontforge-extras package), GNU grep (for context), and rev (because it's hilarious).

BUGS: Well, like I said, this is a quick hack. It grew piece by piece on the command line. I only needed to do this once and spent hardly any time on it, so it's a bit goofy. For example, I find 'rev | cut -f1 | rev' pleasantly amusing --- it seems so clearly wrong, and yet it works to print the last argument. I think flexibility in expressiveness like this is part of the beauty of Unix shell scripting. One-off tasks can be written quickly, built up as a person is "thinking aloud" at the command line. That's why Unix is such a huge boost to productivity: it allows each person to think their own way instead of enforcing some "right way".
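For the curious, a tiny demonstration of that trick grabbing the last tab-separated field:

printf 'one\ttwo\tthree\n' | rev | cut -f1 | rev

prints three.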

On a tangent: One of the things I wish commandlinefu would show is the command line HISTORY of the person as they developed the script. I think it's that conversation between programmer and computer, as the pipeline is built piece-by-piece, that is the more valuable lesson than any canned script.