What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - All commands - 12,232 results
du -x --max-depth=1|sort -rn|awk -F / -v c=$COLUMNS 'NR==1{t=$1} NR>1{r=int($1/t*c+.5); b="\033[1;31m"; for (i=0; i<r; i++) b=b"#"; printf " %5.2f%% %s\033[0m %s\n", $1/t*100, b, $2}'|tac
2015-09-12 10:36:49
Functions: awk du printf sort

A more efficient way, with the order reversed to put the focus on the big ones.
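For readability, here is the same pipeline laid out over several lines with comments; it is otherwise unchanged, though note that $COLUMNS is normally only set in interactive shells, so the bars shrink to nothing if it is empty.

du -x --max-depth=1 | sort -rn | awk -F / -v c=$COLUMNS '
  NR==1 {t=$1}                        # the largest entry (the directory total) sets the scale
  NR>1  {
    r=int($1/t*c+.5)                  # bar length as a share of the terminal width
    b="\033[1;31m"                    # start bold red
    for (i=0; i<r; i++) b=b"#"        # build the bar
    printf " %5.2f%% %s\033[0m %s\n", $1/t*100, b, $2
  }' | tac                            # flip the list so the biggest entries end up at the bottom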

tr -d "\"" < infile.csv > noquotes.csv
2015-09-11 23:41:48
User: UnklAdM
Functions: tr
Tags: CSV quotes

I always forget this one and find all kinds of complex solutions on google. Also works great while piping data. ex. 'cat data | process-data | tr -d "\"" > processed-data-without-quotes'
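A quick illustration with made-up data:

printf '"name","qty"\n"apples",5\n' | tr -d "\""

prints

name,qty
apples,5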

ffgif() { p="fps=10,scale=${4:-320}:-1:flags=lanczos"; ffmpeg -y -ss ${2:-0} -t ${3:-0} -i "$1" -vf ${p},palettegen .p.png && ffmpeg -ss ${2:-0} -t ${3:-0} -i "$1" -i .p.png -filter_complex "${p}[x];[x][1:v]paletteuse" "${1%.*}".gif && rm .p.png; }
2015-09-08 21:13:17
User: snipertyler
Functions: rm

I had to compress it a bit to meet the 255 limit. See sample for full command (274)


ffgif foo.ext

Supports 3 optional arguments after the filename:

ffgif filename seek_time time_duration scale

ffgif foo.ext 10 5 320 will seek 10 seconds in, convert for 5 seconds and scale the width to 320.

By default the whole video is converted to a gif at a scale of 320.

Inspiration - http://superuser.com/questions/556029/how-do-i-convert-a-video-to-gif-using-ffmpeg-with-reasonable-quality/556031#556031
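For reference, the same function laid out over several lines, with comments added (whitespace aside, the commands are identical to the one-liner above):

ffgif() {
  # filter chain: 10 fps, width from the 4th argument (default 320), height follows the aspect ratio
  p="fps=10,scale=${4:-320}:-1:flags=lanczos"
  # pass 1: generate a colour palette for the selected part of the video
  ffmpeg -y -ss ${2:-0} -t ${3:-0} -i "$1" -vf ${p},palettegen .p.png &&
  # pass 2: encode the gif with that palette, named after the input file
  ffmpeg -ss ${2:-0} -t ${3:-0} -i "$1" -i .p.png -filter_complex "${p}[x];[x][1:v]paletteuse" "${1%.*}".gif &&
  # remove the temporary palette
  rm .p.png
}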

perl -ne '@a=split(/,/); $b=0; foreach $r (1..$#a){ $b+=$a[$r] } print "$a[0],$b\n"' file.csv
2015-09-04 21:05:56
User: miniker84
Functions: perl

For all lines, sum the columns following the first one, and then print the first column plus the sum of all the other columns.
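As a made-up illustration, given a file.csv containing

a,1,2,3
b,4,5,6

the command prints

a,6
b,15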


curl ifconfig.co/all.json
curl ifconfig.co
nohup bash example.sh 2>&1 | tee -i i-like-log-files.log &
nohup exemplo.sh &
sudo sh -c 'printf "[SeatDefaults]\nallow-guest=false\n" >/usr/share/lightdm/lightdm.conf.d/50-no-guest.conf'; sudo sh -c 'printf "[SeatDefaults]\nallow-guest=false\n" >/usr/share/lightdm/lightdm.conf.d/50-guest-wrapper.conf'
2015-08-31 18:12:21
User: andregyn62
Functions: sh sudo

This command disables the guest logon; the guest user needs no password to log in to the system.

sed -e "s/[^/]*\/\/\([^@]*@\)\?\([^:/]*\).*/\2/"
: </dev/tcp/<host>/<port>
2015-08-28 19:07:27
User: zlemini

For times when netcat isn't available.

Will throw a Connection refused message if a port is closed.


(: </dev/tcp/<host>/<port>) &>/dev/null && echo "OPEN" || echo "CLOSED"
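For example, to probe SSH on the local machine (host and port here are only placeholders to adapt):

(: </dev/tcp/127.0.0.1/22) &>/dev/null && echo "OPEN" || echo "CLOSED"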
sed 's/,/\n/g;q' file.csv | nl
2015-08-26 11:38:56
User: flatcap
Functions: sed
Tags: sed nl

Take the header line from a comma-delimited CSV file and enumerate the fields.


First, sed replaces all commas with newlines.

Then sed quits (q) after the first line.

Finally, nl numbers all the lines.
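For a hypothetical file.csv whose header line is name,age,city, the output looks roughly like:

     1  name
     2  age
     3  city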

awk -F, '{gsub(/ /,"");for(f=1;f<=NF;f++) print f,$f;exit}' file.csv
2015-08-26 09:30:43
User: sesom42
Functions: awk


Use , as the field separator.

Delete all spaces.

Loop over all input fields and print their index and value.

Exit after the first line.

head -1 file.csv | tr ',' '\n' | tr -d " " | awk '{print NR,$0}'
2015-08-26 05:46:15
User: neomefistox
Functions: awk head tr

Useful for identifying field numbers in big CSV files with a large number of fields. The printed index is the reference to use with commands like 'cut' or 'awk'.

stat -c'%s %n' **/* | sort -n
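Note that in bash the ** pattern only recurses when the globstar option is enabled (bash 4 or later). A minimal sketch:

shopt -s globstar                # enable ** recursive matching
stat -c'%s %n' **/* | sort -n    # GNU stat: %s = size in bytes, %n = file name; smallest first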
python -c "import requests; from bs4 import BeautifulSoup; print '\n'.join([cmd.text for cmd in BeautifulSoup(requests.get('http://www.commandlinefu.com/commands/by/${USER}').content, 'html.parser').find_all('div','command')])"
2015-08-22 21:32:36
User: funky
Functions: python

This utilizes the Requests and BeautifulSoup libraries in Python to retrieve a user page on commandlinefu, parse it (error-tolerant) and extract all the lines of the following format:

gzip *

To print them, a list comprehension is used to iterate over the values, and join() is called on a newline character.

sudo apt-get install ufraw
2015-08-20 20:37:00
User: dnlcorrea
Functions: install sudo

Convert RAW files (e.g. .CR2) to JPEGs, PNGs and whatnot.
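Once installed, one way to batch-convert is via ufraw-batch; the exact output-type values may vary by version, so check ufraw-batch --help:

ufraw-batch --out-type=jpeg *.CR2    # convert every Canon raw file in the directory to JPEG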

2015-08-19 20:57:09
User: adeverteuil
Tags: exit mc

For those who like to hit Ctrl-D instead of typing "exit" to leave the shell and find it annoying that it doesn't work in Midnight Commander, just press Ctrl-O to switch to the subshell and now you can leave with Ctrl-D.

curl - https://graph.facebook.com/fql?q=SELECT%20like_count,%20total_count,%20share_count,%20click_count,%20comment_count%20FROM%20link_stat%20WHERE%20url%20=%20%27<URL>%27 | awk -F\" '{ print $7 }' | awk -F":" '{ print $2 }' | awk -F"," '{ print $1 }'
2015-08-19 20:01:15
User: sxiii
Functions: awk

Replace the <URL> with your URL, for example http://rublacklist.net/12348/, and it will show the number of likes.

find /proc/*/fd -xtype f -printf "%l\n" | grep -P '^/(?!dev|proc|sys)' | sort | uniq -c | sort -n
2015-08-18 17:58:21
User: flatcap
Functions: find grep sort uniq
Tags: sort uniq find grep

List all open files of all processes.


find /proc/*/fd

Look through the /proc file descriptors


-xtype f

list only symlinks that point to regular files


-printf "%l\n"

print the symlink target


grep -P '^/(?!dev|proc|sys)'

ignore files from /dev, /proc or /sys


sort | uniq -c | sort -n

count the results


Many processes will create and immediately delete temporary files.

These can be filtered out by adding:

... | grep -v " (deleted)$" | ...
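Putting it together, the filtered version of the full pipeline reads:

find /proc/*/fd -xtype f -printf "%l\n" | grep -v " (deleted)$" | grep -P '^/(?!dev|proc|sys)' | sort | uniq -c | sort -n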
lsof -a -d 1-99 -Fn / | grep ^n | cut -b2- | sort | uniq -c | sort -n
sudo lsof | egrep 'w.+REG' | awk '{print $10}' | sort | uniq -c | sort -n
2015-08-18 14:09:02
User: kennethjor
Functions: awk egrep sort sudo uniq

This command ran fine on my Ubuntu machine, but on Red Hat I had to change the awk command to `awk '{print $10}'`.

echo "quit" | openssl s_client -connect facebook.com:443 | openssl x509 -noout -text | grep "DNS:" | perl -pe "s/(, )?DNS:/\n/g"
wget http://rendezvousavecmrx.free.fr/audio/mr_x_{1997..2015}_{01..12}_{01..31}.mp3
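The URL list comes from bash brace expansion, which generates every year/month/day combination before wget runs (dates that don't exist simply fail to download). A tiny demonstration of the idea:

echo mr_x_{1997..1998}_{01..02}.mp3

prints

mr_x_1997_01.mp3 mr_x_1997_02.mp3 mr_x_1998_01.mp3 mr_x_1998_02.mp3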
git log -i --grep='needle'
2015-08-11 23:07:55
User: sudopeople
Tags: git grep git-log

Normally, searching git log comments is case sensitive. The -i luckily applies to the --grep switch.