
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - All commands - 12,338 results
find . -iname '*png' -exec pngcrush -ow -brute {} {}.crush \;
2015-09-22 11:10:16
User: miniker84
Functions: find
Tags: GNU find
1

Find all PNGs in the directory structure and pngcrush them, non-destructively. You can just remove the "{}.crush" part if you want it destructive.

echo apt-get\ {update,-y\ upgrade}\ \&\& true | sudo bash
2015-09-22 00:48:26
User: alecthegeek
Functions: echo sudo true
1

It's nice to be able to use the command `ls program.{h,c,cpp}`. This expands to `ls program.h program.c program.cpp`. Note: this is a text expansion, not a shell wildcard expansion that looks at matching file names to calculate the expansion. More details at http://www.linuxjournal.com/content/bash-brace-expansion

I often run multiple commands (like apt-get) one after the other with different subcommands. Just for fun this wraps the whole thing into a single line that uses brace expansion.
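The text-expansion claim is easy to sanity-check (bash is assumed; no files named program.* need to exist):

```shell
# Brace expansion happens in the shell before the command runs.
bash -c 'echo program.{h,c,cpp}'
# prints: program.h program.c program.cpp

# The entry's pattern expands into a chained apt-get command line:
bash -c 'echo apt-get\ {update,-y\ upgrade}\ \&\& true'
# prints: apt-get update && apt-get -y upgrade && true
```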

while true; do nc -z localhost 3333 >|/dev/null || (ssh -NfL 3333:REMOTE_HOST:5432 user@REMOTE_HOST); sleep 15; done
2015-09-21 02:25:49
User: rxw
Functions: sleep ssh
-1

Check if SSH tunnel is open and open it, if it isn't.

NB: In this example, 3333 is your local port and 5432 the remote port (PostgreSQL's default), and of course you should replace REMOTE_HOST with any valid IP or hostname. The example above lets you work on remote PostgreSQL databases from your local shell, like this:

psql -E -h localhost -p 3333
F=bigdata.xz; lsof -o0 -o -Fo $F | awk -Ft -v s=$(stat -c %s $F) '/^o/{printf("%d%%\n", 100*$2/s)}'
2015-09-19 22:22:43
User: flatcap
Functions: awk stat
7

Imagine you've started a long-running process that involves piping data, but you forgot to add the progress-bar option to a command, e.g.

xz -dc bigdata.xz | complicated-processing-program > summary

This command uses lsof to see how much data xz has read from the file:

lsof -o0 -o -Fo FILENAME

Display offsets (-o), in decimal (-o0), in parseable form (-Fo). This will output something like:

p12607
f3
o0t45187072

Process id (p), File Descriptor (f), Offset (o).

We stat the file to get its size:

stat -c %s FILENAME

Then we plug the values into awk:

Split the line at the letter t: -Ft
Define a variable for the file's size: -v s=$(stat ...)
Only work on the offset line: /^o/

Note: this command was tested using the Linux version of lsof. Because it uses lsof's batch option (-F) it may be portable.

Thanks to @unhammer for the brilliant idea.
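The awk arithmetic can be replayed offline using the sample record above (the 450000000-byte file size is made up for illustration):

```shell
# Simulate the lsof -o0 -o -Fo output and compute the percentage read.
printf 'p12607\nf3\no0t45187072\n' |
  awk -Ft -v s=450000000 '/^o/{printf("%d%%\n", 100*$2/s)}'
# prints: 10%
```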

f=bigdata.xz; calc "round($(lsof -o0 -o "$f"|awk '{o=substr($7,3)}END{print o}')/$(stat -c %s "$f")*100)"
2015-09-19 18:27:12
User: unhammer
2

Say you started "xzcat bigdata.xz | complicated-processing-program >summary" an hour ago, and you of course forgot to enable progress output (you could've just put "awk 'NR%1000==0{print NR>"/dev/stderr"}{print}'" in the pipeline, but it's too late for that now). But you really want some idea of how far along your program is. Run the above command to see what percentage of the file xzcat has read so far.

Note that this is for the GNU/Linux version of lsof; the one found on e.g. Darwin has slightly different output so the awk part may need some tweaks.

curl --write-out %{http_code} --connect-timeout 10 --max-time 20 -s -I -L http://google.com --output /dev/null
2015-09-19 17:29:32
User: Guyverix
0

Basic check to see if a website (or even an API) is responding as expected. Put inside a case statement, this is flexible enough to monitor just about anything that returns an HTTP response code, and curl's exit codes additionally cover the network path to the site. Since this is meant for monitoring, timeouts have been added to make sure that iteration overruns do not occur (curl sometimes "sticks" longer than expected).
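A sketch of the case-statement wrapper the description hints at; the classify helper and its verdict labels are made up here, not part of the original entry:

```shell
# Map an HTTP status code to a monitoring verdict.
classify() {
  case "$1" in
    2??) echo "OK" ;;
    3??) echo "REDIRECT" ;;   # only seen if -L is dropped
    000) echo "CRITICAL" ;;   # curl prints 000 when nothing answered
    *)   echo "WARNING" ;;
  esac
}
# Real usage (needs network):
#   classify "$(curl --write-out %{http_code} --connect-timeout 10 --max-time 20 -s -I -L http://google.com --output /dev/null)"
classify 200   # prints: OK
classify 000   # prints: CRITICAL
```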

while cat energy_now; do sleep 1; done |awk -v F=$(cat energy_full) -v C=60 'NR==1{P=B=$1;p=100/F} {d=$1-P; if(d!=0&&d*D<=0){D=d;n=1;A[0]=B=P}; if(n>0){r=g=($1-B)/n;if(n>C){r=($1-A[n%C])/C}}; A[n++%C]=P=$1; printf "%3d %+09.5f %+09.5f\n", p*$1, p*g, p*r}'
2015-09-19 15:45:40
User: sqweek
Functions: awk cat printf sleep
-1

Needs to be run in a battery sysfs dir, e.g. /sys/class/power_supply/BAT0 on my system.

Displays the battery's current charge and the rate per-second at which energy is {dis,}charging. All values are displayed as percentages of "full" charge.

The first column is the current charge. The second is the rate of change averaged over the entire lifetime of the command (or since the AC cable was {un,}plugged), and the third column is the rate of change averaged over the last minute (controlled by the C=60 variable passed to awk).

The sample output captures a scenario where I ran 'yes' in another terminal to max out a CPU. My battery was at 76% charge and you can see the energy drain starts to rise above 0.01% per-second as the cpu starts working and the fan kicks in etc. While idle it was more like 0.005% per-second.

I tried to use this to estimate the remaining battery life/time until fully charged, but found it to be pretty useless... As my battery gets more charged it starts to charge slower, which meant the estimate was always wrong. Not sure if that's common for batteries or not.

[ $(date +"%H") -lt 7 ] && echo you should probably be sleeping...
weather() { curl -s "http://www.wunderground.com/q/zmw:$1.1.99999" | grep "og:title" | cut -d\" -f4 | sed 's/&deg;/ degrees F/'; }
pidstat -t | sed 's/,/./4' | awk -v seuil='10.0' '{if (NR>3 && $8>seuil) print }'
while true; do date; ps auxf | awk '{if($8=="D") print $0;}'; sleep 1; done
curl -sL http://goo.gl/3sA3iW | head -16 | tail -14
followers() { curl -s https://twitter.com/$1 | grep -o '[0-9,]* Followers'; }
2015-09-19 07:07:36
Functions: grep
Tags: CLFUContest
0

See how many people are following you (or anyone) on Twitter.

followers cadejscroggins
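The grep -o extraction can be tried on a canned line; the HTML snippet below is illustrative only, and since Twitter's markup changes often, the live scrape itself may need tweaking:

```shell
# Pull "N Followers" out of a chunk of page markup.
echo '<li class="count">1,234 Followers</li>' | grep -o '[0-9,]* Followers'
# prints: 1,234 Followers
```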
last|grep `whoami`|grep -v logged|cut -c61-71|sed -e 's/[()]//g'|awk '{ sub("\\+", ":");split($1,a,":");if(a[3]){print a[1]*60*60+a[2]*60+a[3]} else {print a[1]*60+a[2] }; }'|paste -s -d+ -|bc|awk '{printf "%dh:%dm:%ds\n",$1/(60*60),$1%(60*60)/60,$1%60}'
2015-09-19 03:02:43
User: donjuanica
Functions: awk cut grep last paste sed
-1

Add -n <num> to the last invocation to restrict it to the last <num> logins; otherwise it will pull all available history.
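The final awk stage, which formats a total number of seconds as h:m:s, can be exercised on its own (3750 is an arbitrary test value):

```shell
# 3750 seconds = 1 hour, 2 minutes, 30 seconds.
echo 3750 | awk '{printf "%dh:%dm:%ds\n",$1/(60*60),$1%(60*60)/60,$1%60}'
# prints: 1h:2m:30s
```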

btc() { echo "1 BTC = $(curl -s https://api.coindesk.com/v1/bpi/currentprice/$1.json | jq .bpi.\"$1\".rate | tr -d \"\"\") $1"; }
2015-09-19 02:49:30
User: benjabean1
Functions: echo
-1

The only pre-requisite is jq (and curl, obviously).

The other version used grep, but jq is much better suited to parsing JSON.

sudo lsof -nP | awk '/deleted/ { sum+=$8 } END { print sum }'
2015-09-19 00:45:23
Functions: awk sudo sum
0

A potential source of a full filesystem is large files that are held open but have been deleted. On Linux, a file can be deleted (removed/unlinked) while a process still has it open. When this happens, the file is essentially invisible to other processes, but it still takes up physical space on the drive. Tools like du will not see it.
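The summation can be isolated and checked against canned lsof-style lines; sum_deleted is a hypothetical helper, and note that SIZE/OFF is column 7 in default Linux lsof output (the entry above sums column 8, which matches some builds and option sets; adjust to taste):

```shell
# Add up the size column for lines mentioning '(deleted)'.
sum_deleted() { awk '/deleted/ { sum += $7 } END { print sum+0 }'; }
# Real usage: sudo lsof -nP | sum_deleted
# Demonstration on canned lsof-style lines:
printf 'java 9 u 4u REG 8,1 1048576 2 /tmp/a (deleted)\njava 9 u 5u REG 8,1 2048 3 /tmp/b (deleted)\njava 9 u 6u REG 8,1 512 4 /tmp/c\n' | sum_deleted
# prints: 1050624
```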

sudo apt-get remove --purge $(dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d')
dpkg -l 'linux-*' | sed '/^ii/!d;/'"$(uname -r | sed "s/\(.*\)-\([^0-9]\+\)/\1/")"'/d;s/^[^ ]* [^ ]* \([^ ]*\).*/\1/;/[0-9]/!d'
watch "awk '/Rss/{sum += \$2; } END{print sum, \"kB\"}' < /proc/$(pidof firefox)/smaps"
2015-09-19 00:36:34
User: gumnos
Functions: watch
2

Sometimes top/htop don't give the fine-grained detail on memory usage you might need. Sum up the exact memory types you want
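The Rss-summing awk can be verified on a snippet of smaps-style input (the kB values are made up; note /Rss/ does not match the Pss line):

```shell
printf 'Rss:     120 kB\nPss:      80 kB\nRss:      40 kB\n' |
  awk '/Rss/{sum += $2} END{print sum, "kB"}'
# prints: 160 kB
```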

echo "1 BTC = $(curl -s https://api.coindesk.com/v1/bpi/currentprice/usd.json | grep -o 'rate":"[^"]*' | cut -d\" -f3) USD"
wget https:[email protected]/holy-fsck-a-contest-cd320952726b
2015-09-18 23:57:23
User: jonhendren
Functions: wget
Tags: CLFUContest
0

Here's the idea: Submit a one-liner that returns a value or string usable for monitoring something. The more interesting/important, the better.

Tag your one-liners with CLFUContest to enter. Whether you're participating or not, be sure to vote on the other submissions. The top 5 contest entries by vote count will receive a $10 Amazon gift certificate. On top of that, we'll select our 3 favorite entries to receive $25 Amazon gift certificates. The prizes might even overlap! Feel free to enter as many times as you like. Check out the URL above for the fine print.

-Admin

wget -q -O - ifconfig.co
du -h --max-depth=1 /home/ | sort -n
-A INPUT -p tcp --dport 22 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT
2015-09-17 14:51:47
User: erez83
0

-A INPUT -p udp -m udp --dport 10000:66000 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT

-A INPUT -p udp -m udp --dport 5060 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT

-A INPUT -p tcp --dport 22 -m mac --mac-source 3E:D7:88:A6:66:8E -j ACCEPT

curl $1 | grep -E "http.*\.mp3" | sed "s/.*\(http.*\.mp3\).*/\1/" | xargs wget
2015-09-17 13:19:53
User: theodric
Functions: grep sed xargs
3

The difference between the original version provided and this one is that this one works, rather than outputting a wget error.