What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - All commands - 12,272 results
followers() { curl -s https://twitter.com/$1 | grep -o '[0-9,]* Followers'; }
2015-09-19 07:07:36
Functions: grep
Tags: CLFUContest

See how many people are following you (or anyone) on Twitter.

followers cadejscroggins
last|grep `whoami`|grep -v logged|cut -c61-71|sed -e 's/[()]//g'|awk '{ sub("\\+", ":");split($1,a,":");if(a[3]){print a[1]*60*60+a[2]*60+a[3]} else {print a[1]*60+a[2] }; }'|paste -s -d+ -|bc|awk '{printf "%dh:%dm:%ds\n",$1/(60*60),$1%(60*60)/60,$1%60}'
2015-09-19 03:02:43
User: donjuanica
Functions: awk cut grep last paste sed

Add -n to last command to restrict to last num logins, otherwise it will pull all available history.
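The final awk stage of the pipeline above turns the summed seconds back into hours, minutes, and seconds. In isolation it behaves like this (3725 is an arbitrary example value):

```shell
# Format a seconds total (hypothetical 3725) as h:m:s, using the same
# printf arithmetic as the last stage of the pipeline above.
echo 3725 | awk '{printf "%dh:%dm:%ds\n", $1/(60*60), $1%(60*60)/60, $1%60}'
# 1h:2m:5s
```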

btc() { echo "1 BTC = $(curl -s https://api.coindesk.com/v1/bpi/currentprice/$1.json | jq .bpi.\"$1\".rate | tr -d \"\"\") $1"; }
2015-09-19 02:49:30
User: benjabean1
Functions: echo

The only prerequisite is jq (and curl, obviously).

The other version used grep, but jq is much more suited to JSON parsing than that.

sudo lsof -nP | awk '/deleted/ { sum+=$8 } END { print sum }'
2015-09-19 00:45:23
Functions: awk sudo sum

A potential source of a full filesystem is large files that have been deleted but are still held open. On Linux, a file may be deleted (removed/unlinked) while a process has it open. When this happens, the file is essentially invisible to other processes, but it still takes up physical space on the drive, and tools like du will not see it.
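A minimal demonstration of the effect (hypothetical temp file; assumes Linux with /proc mounted):

```shell
# Create a file, keep it open on fd 3, then unlink it.
tmp=$(mktemp)
exec 3<> "$tmp"
echo "still here" >&3
rm "$tmp"              # the name is gone, so du no longer counts it...
head -n1 /proc/$$/fd/3 # ...but the data still occupies disk space
exec 3>&-              # closing the last fd finally frees the space
```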

sudo apt-get remove --purge $(dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d')
dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
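These two one-liners have no description above: the first purges installed linux-* kernel packages other than the running kernel, and the second merely lists what would be removed. The sed filter can be exercised against a synthetic dpkg -l listing; the package names and the 4.4.0-38 "running kernel" version below are hypothetical:

```shell
# Keep installed (ii) lines, drop the running kernel's version (hypothetically
# 4.4.0-38), isolate the package name, and finally require a digit so
# meta-packages like "linux-generic" are not listed for removal.
printf 'ii  linux-image-4.4.0-21-generic  amd64\nii  linux-image-4.4.0-38-generic  amd64\nii  linux-generic  amd64\n' |
  sed '/^ii/!d;/4\.4\.0-38/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
# linux-image-4.4.0-21-generic
```

In the real command, the inner `uname -r | sed ...` computes that version filter from the running kernel instead of hard-coding it.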
watch "awk '/Rss/{sum += \$2; } END{print sum, \"kB\"}' < /proc/$(pidof firefox)/smaps"
2015-09-19 00:36:34
User: gumnos
Functions: watch

Sometimes top/htop don't give the fine-grained detail on memory usage you might need. Sum up exactly the memory types you want.
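The same idea applied to the reading process itself (Linux only; anchoring the pattern to ^Rss: avoids matching other smaps fields):

```shell
# Sum the Rss lines of this awk process's own smaps (values are in kB).
awk '/^Rss:/ {sum += $2} END {print sum " kB"}' /proc/self/smaps
```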

echo "1 BTC = $(curl -s https://api.coindesk.com/v1/bpi/currentprice/usd.json | grep -o 'rate":"[^"]*' | cut -d\" -f3) USD"
wget https://medium.com/@commandlinefu/holy-fsck-a-contest-cd320952726b
2015-09-18 23:57:23
User: jonhendren
Functions: wget
Tags: CLFUContest

Here's the idea: Submit a one-liner that returns a value or string usable for monitoring something. The more interesting/important, the better.

Tag your one-liners with CLFUContest to enter. Whether you're participating or not, be sure to vote on the other submissions. The top 5 contest entries by vote count will receive a $10 Amazon gift certificate. On top of that, we'll select our 3 favorite entries to receive $25 Amazon gift certificates. The prizes might even overlap! Feel free to enter as many times as you like. Check out the URL above for the fine print.


wget -q -O - ifconfig.co
du -h --max-depth=1 /home/ | sort -n
-A INPUT -p tcp --dport 22 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT
2015-09-17 14:51:47
User: erez83

-A INPUT -p udp -m udp --dport 10000:66000 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT

-A INPUT -p udp -m udp --dport 5060 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT

-A INPUT -p tcp --dport 22 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT

curl $1 | grep -E "http.*\.mp3" | sed "s/.*\(http.*\.mp3\).*/\1/" | xargs wget
2015-09-17 13:19:53
User: theodric
Functions: grep sed xargs

The difference between the original version and this one is that this one works, rather than exiting with a wget error.

du -sh * | sort -h
du -x --max-depth=1|sort -rn|awk -F / -v c=$COLUMNS 'NR==1{t=$1} NR>1{r=int($1/t*c+.5); b="\033[1;31m"; for (i=0; i<r; i++) b=b"#"; printf " %5.2f%% %s\033[0m %s\n", $1/t*100, b, $2}'|tac
2015-09-12 10:36:49
Functions: awk du printf sort

A more efficient way, with reversed order to put the focus on the big ones.

tr -d "\"" < infile.csv > noquotes.csv
2015-09-11 23:41:48
User: UnklAdM
Functions: tr
Tags: CSV quotes

I always forget this one and find all kinds of complex solutions on Google. Also works great when piping data, e.g. 'cat data | process-data | tr -d "\"" > processed-data-without-quotes'
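For instance, with a hypothetical quoted CSV line:

```shell
# Strip every double quote from the stream; commas and content are untouched.
printf '"name","note, with comma"\n' | tr -d '"'
# name,note, with comma
```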

ffgif() { p="fps=10,scale=${4:-320}:-1:flags=lanczos"; ffmpeg -y -ss ${2:-0} -t ${3:-0} -i "$1" -vf ${p},palettegen .p.png && ffmpeg -ss ${2:-0} -t ${3:-0} -i "$1" -i .p.png -filter_complex "${p}[x];[x][1:v]paletteuse" "${1%.*}".gif && rm .p.png; }
2015-09-08 21:13:17
User: snipertyler
Functions: rm

I had to compress it a bit to meet the 255-character limit. See sample for the full command (274 characters).


ffgif foo.ext

Supports 3 arguments (optional)

ffgif filename seek_time time_duration scale

ffgif foo.ext 10 5 320 will seek 10 seconds in, convert for 5 seconds at a 320 scale.

Default will convert whole video to gif at 320 scale.

Inspiration - http://superuser.com/questions/556029/how-do-i-convert-a-video-to-gif-using-ffmpeg-with-reasonable-quality/556031#556031

perl -ne '@a=split(/,/); $b=0; foreach $r (1..$#a){ $b+=$a[$r] } print "$a[0],$b\n"' -f file.csv
2015-09-04 21:05:56
User: miniker84
Functions: perl

For all lines, sum the columns following the first one, and then print the first column plus the sum of all the other columns.

curl ifconfig.co/all.json
curl ifconfig.co
nohup bash example.sh 2>&1 | tee -i i-like-log-files.log &
nohup exemplo.sh &
sudo sh -c 'printf "[SeatDefaults]\nallow-guest=false\n" >/usr/share/lightdm/lightdm.conf.d/50-no-guest.conf'; sudo sh -c 'printf "[SeatDefaults]\nallow-guest=false\n" >/usr/share/lightdm/lightdm.conf.d/50-guest-wrapper.conf'
2015-08-31 18:12:21
User: andregyn62
Functions: sh sudo

This command disables the guest login; the guest user has no password and can otherwise log in to the system without one.

sed -e "s/[^/]*\/\/\([^@]*@\)\?\([^:/]*\).*/\2/"
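This sed expression has no description above: it extracts the host from a URL read on stdin, tolerating an optional user:pass@ part and a port (the URL below is a hypothetical example; the \? quantifier assumes GNU sed):

```shell
# Strip the scheme, optional credentials, port, and path, leaving the host.
echo 'https://user:pass@example.com:8080/path' |
  sed -e "s/[^/]*\/\/\([^@]*@\)\?\([^:/]*\).*/\2/"
# example.com
```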
: </dev/tcp/
2015-08-28 19:07:27
User: zlemini

For times when netcat isn't available.

Will throw a Connection refused message if a port is closed.


(: </dev/tcp/) &>/dev/null && echo "OPEN" || echo "CLOSED"
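With a concrete (hypothetical) host and port filled in, the full check looks like this; /dev/tcp is a bash feature rather than a real device, so run it under bash:

```shell
# Attempt a TCP connect to 127.0.0.1:22 (hypothetical target);
# bash interprets /dev/tcp/HOST/PORT itself -- no netcat needed.
(: </dev/tcp/127.0.0.1/22) 2>/dev/null && echo "OPEN" || echo "CLOSED"
```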