What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign-in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Commands using head
Terminal - Commands using head - 253 results
gem install `ruby ./isuckat_ruby.rb 2>&1 | sed -e 's/.*find gem .//g' -e 's/ .*//g' | head -n 1`
2016-08-03 19:41:27
User: operat0r
Functions: head install sed

For when bundle install sucks... This runs isuckat_ruby.rb and, when stderr matches "find gem '...'", extracts the missing gem's name and gem installs it.

links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1`
2016-07-26 12:54:53
User: mogoh
Functions: grep head sort uniq

sort -R randomizes the list; head -n1 takes the first entry.
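
For example, to just print the randomly chosen URL instead of opening it in links, drop the outer command substitution:

lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1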

du -a /var | sort -n -r | head -n 10
head -n1 | xargs -I {} aws sts get-session-token --serial-number $MFA_ID --duration-seconds 900 --token-code {} --output text --query [Credentials.AccessKeyId,Credentials.SecretAccessKey,Credentials.SessionToken]
2016-04-12 10:57:00
User: keymon
Functions: head xargs

You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or CLI tools, you need to pass credentials generated with that MFA token.

This command asks you for the MFA code and retrieves temporary credentials using the AWS CLI. To print them as export statements, pipe the output through:

`awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'`

You must adapt the command line to include:

* $MFA_ID is the ARN of the virtual MFA device, or the serial number of the physical one

* the TTL (--duration-seconds) for the credentials
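
Putting the pieces together, a minimal sketch (the 900-second TTL is just an example; eval the printed exports to load them into your shell):

read -p "MFA code: " code
aws sts get-session-token \
  --serial-number "$MFA_ID" \
  --duration-seconds 900 \
  --token-code "$code" \
  --output text \
  --query '[Credentials.AccessKeyId,Credentials.SecretAccessKey,Credentials.SessionToken]' \
  | awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'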

find . -type f -printf '%T@ %TY-%Tm-%Td %TH:%TM:%.2TS %p\n' | sort -nr | head -n 5 | cut -f2- -d" "
2016-03-23 11:56:39
User: paulera
Functions: cut find head sort

The output format is given by the -printf parameter:

%T@ = modification time in seconds since Jan 1, 1970, 00:00 GMT, with fractional part. Mandatory; it is trimmed off at the end.

%TY-%Tm-%Td %TH:%TM:%.2TS = modification time as YYYY-MM-DD HH:MM:SS. Optional.

%p = file path

Refer to http://linux.die.net/man/1/find for more about -printf formatting.


sort -nr = sort numerically in reverse (higher values, i.e. the most recent timestamps, first)

head -n 5 = keep only the first 5 lines (change 5 to whatever you want)

cut -f2- -d" " = trim first field (timestamp, used only for sorting)


Very useful for building scripts that detect malicious file uploads and malware injections.
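
To make this reusable, here is a sketch wrapping it in a function; the name recent_files and its optional arguments are inventions for illustration:

recent_files() {
  # Usage: recent_files [directory] [count]
  # Lists the <count> most recently modified files under <directory>.
  local dir="${1:-.}" count="${2:-5}"
  find "$dir" -type f -printf '%T@ %TY-%Tm-%Td %TH:%TM:%.2TS %p\n' \
    | sort -nr | head -n "$count" | cut -f2- -d' '
}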

curl -sL http://goo.gl/3sA3iW | head -16 | tail -14
head -1 file.csv | tr ',' '\n' | tr -d " " | awk '{print NR,$0}'
2015-08-26 05:46:15
User: neomefistox
Functions: awk head tr

Useful for identifying field numbers in big CSV files with a large number of fields. The printed index is the field reference to use with commands like 'cut' or 'awk'.
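
A hypothetical usage example (data.csv and the field number are made up): first list the indexed header, then extract the column you found with cut:

head -1 data.csv | tr ',' '\n' | tr -d " " | awk '{print NR,$0}'
cut -d',' -f7 data.csv    # pull out the 7th field identified above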

ps -auxf | sort -nr -k 4 | head -10
mosth() { history | awk '{CMD[$2]++;count++;}END { for (a in CMD)print CMD[a] " " CMD[a]/count*100 "% " a;}' | grep -v "./" | column -c3 -s " " -t | sort -nr | nl | head -n10; }
2015-05-11 17:41:55
User: nnsense
Functions: awk column grep head nl sort

I copied this (let's be honest) from somewhere on the internet and just wrapped it in a function, ready to be used like an alias. It shows the 10 most used commands from history. This may seem like just another "most used commands from history", but hey.. this is a function!!! :D

git rev-list --all|tail -n1|xargs git show|grep -v diff|head -n1|cut -f1-3 -d' '
ls -l | head -n 65535 | awk '{if (NR > 1) total += $5} END {print total/(1024*1024*1024)}'
rsync -v --ignore-existing `ls | head -n 40` [email protected]:/location
strings /dev/urandom | tr -cd '[:alnum:]' | fold -w 30 | head -n 1
2014-12-11 06:21:51
User: atoponce
Functions: fold head strings tr

This command is similar to the alternative, except that with head(1) you can generate as many passwords as you wish by changing the number of lines you keep.
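
For example, to generate five 30-character passwords at once instead of one, raise the head count:

strings /dev/urandom | tr -cd '[:alnum:]' | fold -w 30 | head -n 5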

while sleep 1; do if [ $(echo "$(cat /proc/loadavg | cut -d' ' -f1) > .8 " | bc) -gt 0 ]; then echo -e "\n\a"$(date)" \e[5m"$(cat /proc/loadavg)"\e[0m"; ps aux --sort=-%cpu|head -n 5; fi; done
2014-12-08 15:44:40
User: tyzbit
Functions: cat echo head ps sleep

This checks the system load every second and, if it's over a certain threshold (.8 in this example), spits out the date, the load averages, and the top 4 processes sorted by CPU.

Additionally, the \a in the first echo creates an audible bell.
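
A sketch of the same loop with the threshold pulled out into a variable for easier tuning (THRESHOLD is an invented name):

THRESHOLD=0.8
while sleep 1; do
  load=$(cut -d' ' -f1 /proc/loadavg)
  if [ "$(echo "$load > $THRESHOLD" | bc)" -gt 0 ]; then
    # \a sounds the bell; \e[5m makes the load averages blink
    echo -e "\n\a$(date) \e[5m$(cat /proc/loadavg)\e[0m"
    ps aux --sort=-%cpu | head -n 5
  fi
done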

history|awk '{print $2}'|sort|uniq -c|sort -rn|head -30|awk '!max{max=$1;}{r="";i=s=100*$1/max;while(i-->0)r=r"#";printf "%50s %5d %s %s",$2,$1,r,"\n";}'
2014-09-29 12:40:43
User: injez
Functions: awk head printf sort uniq

Shows the top 30 commands from your history, with a histogram display.

ifconfig | head -n 2 | tr -d '\n' | sed -n 's/.*\(00:[^ ]*\).*\(adr:[^ ]*\).*/mac:\1 - \2/p'
while true; do ps aux | sort -rk 3,3 | head -n 11 | cut -c -120 | netcat -l -p 8888 2>&1 >/dev/null; done &
2014-08-29 07:10:57
User: manumiu
Functions: cut head ps sort

If you want to see your top ten CPU-consuming processes in the browser (e.g. when you don't want to ssh into your server all the time just to check system load), run this command and browse to the machine's IP on port 8888.
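
For example, from another machine (192.0.2.10 stands in for the server's IP):

curl --http0.9 http://192.0.2.10:8888

The --http0.9 flag is needed with recent versions of curl because netcat serves the raw text without HTTP headers; a browser pointed at the same address will usually render it anyway.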

dd if=/dev/random count=1 bs=2 2>/dev/null | od -i | awk '{print $2}' | head -1
tr -dc '\x15-\x7e' < /dev/urandom| head -c 16 | paste
awk '/text to grep/{print \$1}' logs... | sort -n | uniq -c | sort -rn | head -n 100
2014-07-10 20:36:02
User: impinball
Functions: awk head sort uniq
Tags: Linux sh

Accepts multiple files via logs.... Substitute your search string for "text to grep".

If you want to wrap this for reuse, note that an alias can't take arguments, so a shell function works better. A sketch, keeping the original name:

parse-logs() { awk "/$1/{print \$1}" "${@:2}" | sort -n | uniq -c | sort -rn | head -n 100; }
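
Usage example (the log file names are hypothetical):

parse-logs 'Failed password' /var/log/auth.log /var/log/auth.log.1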
awk '/text to grep/{print $1}' "log" | sort -n | uniq -c | sort -rn | head -n 100
2014-07-09 08:48:06
User: kln0thing
Functions: awk head sort uniq

The original command was: cat "log" | grep "text to grep" | awk '{print $1}' | sort -n | uniq -c | sort -rn | head -n 100

That wastes a cat and a grep, especially when awk is already being used; awk can match the pattern itself.

cat "log" | grep "text to grep" | awk '{print $1}' | sort -n | uniq -c | sort -rn | head -n 100
git verify-pack -v .git/objects/pack/pack-*.idx | grep blob | sort -k3nr | head | while read s x b x; do git rev-list --all --objects | grep $s | awk '{print "'"$b"'",$0;}'; done
mco ping | head -n -4 | awk '{print $1}' | sort