What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.

Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands tagged awk - 303 results
for i in `ip addr show dev eth1 | grep inet | awk '{print $2}' | cut -d/ -f1`; do echo -n $i; echo -en '\t'; host $i | awk '{print $5}'; done
for fil in *.JPG; do datepath="$(identify -verbose $fil | grep DateTimeOri | awk '{print $2"_"$3 }' | sed s%:%_%g)"; mv -v $fil $datepath.jpg; done
2013-08-02 01:42:04
Functions: mv

Requires ImageMagick.

Extracts date taken from image and renames it properly.

Based on StackOverflow answer.
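
The renaming hinges on the grep/awk/sed munging of the `identify -verbose` output. A minimal sketch of just that step, against a canned metadata line (the exact `exif:DateTimeOriginal` field name is an assumption about your ImageMagick version):

```shell
# Canned line in the shape `identify -verbose` prints for EXIF data
line='    exif:DateTimeOriginal: 2013:08:02 01:42:04'

# Same munging as the one-liner: join date and time with "_",
# then turn the remaining ":" separators into "_"
datepath=$(echo "$line" | grep DateTimeOri | awk '{print $2"_"$3}' | sed 's/:/_/g')
echo "$datepath"    # -> 2013_08_02_01_42_04
```

Note that quoting `"$fil"` in the original loop would keep it from breaking on filenames containing spaces.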

ifconfig | egrep [0-9A-Za-z]{2}\(:[0-9A-Za-z]{2}\){5} | awk '{print $1 ":\t" $5}'
2013-07-30 17:02:07
User: jaimeanrm
Functions: awk egrep ifconfig

This is the better option on an openSUSE box.
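
The unquoted regex only works because the shell happens to pass `\(`/`\)` through to egrep; quoting the pattern is sturdier. A sketch against a canned line in the old `ifconfig` output format (interface name and MAC are invented):

```shell
line='eth0      Link encap:Ethernet  HWaddr 00:11:22:33:44:55'

# Quoted ERE: a pair of hex digits followed by five ":XX" groups is a MAC
result=$(echo "$line" | grep -E '[0-9A-Fa-f]{2}(:[0-9A-Fa-f]{2}){5}' | awk '{print $1 ":\t" $5}')
echo "$result"
```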

history | awk '{$1="";print substr($0,2)}'
2013-07-07 08:00:26
User: Fagood
Functions: awk
Tags: history awk

alias h="history | awk '{\$1=\"\";print substr(\$0,2)}'"

# h

[ 07/07/2013 10:04:53 ] alias h="history | awk '{\$1=\"\";print substr(\$0,2)}'"
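
The trick here is that assigning to `$1` makes awk rebuild `$0` with single-space separators, leaving one leading space that `substr(...,2)` drops. A sketch with a canned history line:

```shell
# A history line: leading index, then the command itself
line='  123  ls -la /tmp'

# Blank $1, let awk rebuild the record, then drop the leftover first space
trimmed=$(echo "$line" | awk '{$1="";print substr($0,2)}')
echo "$trimmed"    # -> ls -la /tmp
```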

svn info | grep ^URL | awk -F\/ '{print $NF}'
awk '!($0 in array) { array[$0]; print }' temp
awk '{print $1}' ~/.bash_history | sort | uniq -c | sort -rn | head -n 10
eval `ls -1d * | awk '{print "zip -r "$1".zip "$1";"}'`
apt-get -s upgrade | awk '/Inst.+/ {print $2}'
2013-03-25 21:23:11
User: lpanebr
Functions: apt awk
Tags: awk apt-get

Useful if you only want to see the package names, or if you want to use them in a script.
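
A sketch of the awk step against one canned line in the shape `apt-get -s upgrade` prints for a pending upgrade (package name and versions are invented):

```shell
line='Inst libexample1 [1.0-1] (1.0-2 Debian:stable [amd64])'

# "Inst" lines list pending upgrades; $2 is the package name
pkg=$(echo "$line" | awk '/Inst.+/ {print $2}')
echo "$pkg"    # -> libexample1
```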

find /some/path -type f | gawk -F/ '{print $NF}' | gawk -F. '/\./{print $NF}' | sort | uniq -c | sort -rn
2013-03-18 14:40:26
User: skkzsh
Functions: find gawk sort uniq

If you have GNU findutils, you can get only the file name with

find /some/path -type f -printf '%f\n'

instead of

find /some/path -type f | gawk -F/ '{print $NF}'
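
The counting part works with plain awk too; a sketch with canned paths standing in for the `find` output:

```shell
paths='/a/b/notes.txt
/a/c/photo.jpg
/a/d/todo.txt'

# Last "/" field is the basename, last "." field the extension; count and rank
counts=$(printf '%s\n' "$paths" | awk -F/ '{print $NF}' | awk -F. '/\./{print $NF}' | sort | uniq -c | sort -rn)
echo "$counts"
```
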
svn revert .
df | awk '{if ($2!=dspace) print "different"; dspace=$2;}'
ps auxw | grep sbin/apache | awk '{print"-p " $2}' | xargs strace -f
2013-02-19 19:14:57
User: msealand
Functions: awk grep ps strace xargs

This version also attaches to new processes forked by the parent apache process. That way you can trace all current and *future* apache processes.
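
Running strace itself needs root and a live apache, so here is just the argument-building half of the pipeline, with canned `ps auxw` output (PIDs invented):

```shell
ps_out='www-data  1234  0.0  0.1 apache2 -k start
www-data  5678  0.0  0.1 apache2 -k start'

# Turn each PID into a "-p PID" pair; xargs hands them all to one strace
args=$(echo "$ps_out" | awk '{print "-p " $2}' | xargs echo)
echo "$args"    # -> -p 1234 -p 5678
```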

sudo ifconfig wlan0 | grep inet | awk 'NR==1 {print $2}' | cut -c 6-
2013-02-18 14:10:07
User: mouths
Functions: awk cut grep ifconfig sudo

On wired connections set 'eth0' instead of 'wlan0'
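
This relies on the old `ifconfig` format where the address is printed as `addr:192.168.1.5`; newer versions print `inet 192.168.1.5` and would need a different cut. A sketch with a canned line (address invented):

```shell
line='          inet addr:192.168.1.5  Bcast:192.168.1.255  Mask:255.255.255.0'

# $2 is "addr:192.168.1.5"; cut -c 6- slices off the 5-character "addr:" prefix
ip=$(echo "$line" | awk 'NR==1 {print $2}' | cut -c 6-)
echo "$ip"    # -> 192.168.1.5
```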

awk '{for (i=9;i<=NF;i++) {printf "%s",$i; printf "%s", " ";}; printf "\n"}'
2013-02-12 13:57:43
User: adimania
Functions: awk printf
Tags: awk

It'll print the file names, preserving the spaces in their names and adding a newline after every filename.

I wrote this to quickly find out how many files in a directory are owned by a particular user. This can be extended with pipes and grep to do much more.
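
A sketch against one canned `ls -l` line whose filename contains a space (all values invented); note the loop leaves one trailing space after the last field:

```shell
line='-rw-r--r-- 1 alice users 0 Feb 12 13:57 my file.txt'

# Fields 9..NF of an `ls -l` line are the (possibly space-containing) filename
name=$(echo "$line" | awk '{for (i=9;i<=NF;i++) {printf "%s",$i; printf "%s", " ";}; printf "\n"}')
echo "$name"
```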

find . -type f -size +100M
load=`uptime|awk -F',' '{print $3}'|awk '{print $3}'`; if [[ $(echo "if ($load > 1.0) 1 else 0" | bc) -eq 1 ]]; then notify-send "Load $load";fi
2013-02-06 08:30:24
User: adimania
Functions: awk echo

I run this via crontab every minute on my machine to see if a process is eating up my system's resources.
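
The `awk -F','` field positions shift when `uptime` includes a days count, and the `if/else` form fed to bc is a GNU extension. A sketch of a sturdier extraction, plus an awk-only threshold test, against a canned uptime line:

```shell
line=' 10:04:53 up 2 days,  2:03,  3 users,  load average: 0.52, 0.58, 0.59'

# Splitting on the label itself sidesteps the shifting comma positions
load=$(echo "$line" | awk -F'load average: ' '{print $2}' | cut -d, -f1)
echo "$load"    # -> 0.52

# awk does the float comparison without bc's GNU-only if/else
if awk -v l="$load" 'BEGIN{exit !(l > 1.0)}'; then
    echo "Load $load is high"
fi
```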

curl http://en.wikipedia.org/wiki/List_of_programming_languages | grep "<li>" | awk -F"title=" '{ print $2 }' | awk -F\" '{ print $2 }'
2013-01-09 21:40:11
User: sxiii
Functions: awk grep

Requirements: curl, grep, awk, internet connection with access to wikipedia

Loaded page: http://en.wikipedia.org/wiki/List_of_programming_languages

If you can make a shorter version of this list-getter, you are welcome to paste it here :)
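
A sketch of the two awk passes against one canned list-item line in the shape the Wikipedia HTML uses (simplified):

```shell
line='<li><a href="/wiki/AWK" title="AWK">AWK</a></li>'

# First split on 'title=', then take what sits between the quotes
name=$(echo "$line" | awk -F'title=' '{ print $2 }' | awk -F'"' '{ print $2 }')
echo "$name"    # -> AWK
```

Per the author's invitation, `grep -o 'title="[^"]*"'` piped to `cut -d'"' -f2` is one shorter route to the same field.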

find . -name "*.pdf" -exec pdftk {} dump_data output \; | grep NumberOfPages | awk '{s+=$2} END {print s}'
exipick -zi | xargs --max-args=1000 exim -Mrm
2012-12-12 20:46:22
User: jasen
Functions: xargs
Tags: bash awk exim

do 1000 at a time so that if your doodoo is deep you can avoid the "command-line too big" error
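
The batching itself is plain `xargs --max-args`; a sketch with `seq` standing in for exipick's message-ID list, using batches of 2 instead of 1000:

```shell
# Each echo invocation receives at most 2 arguments -> three lines of output
batches=$(seq 1 5 | xargs --max-args=2 echo)
echo "$batches"
```

`--max-args` (short form `-n`) caps how many arguments each invocation receives, which is what keeps the exim command line under the kernel's length limit.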

awk '/^md/ {printf "%s: ", $1}; /blocks/ {print $NF}' </proc/mdstat
find . -type f -print | awk -F'.' '{print $NF}' | sort | uniq -c
awk '{print NR "\t" $0}'
netstat -tn | awk '($4 ~ /:22\s*/) && ($6 ~ /^EST/) {print substr($5, 0, index($5,":"))}'
for file in `svn st | awk '{print $2}'`; do svn revert $file; done