
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - All commands - 12,034 results
sed -e '/4.2.2.2/ s/^;//' -i test.txt
2014-10-13 13:37:53
User: suyashjain
Functions: sed
Tags: sed
0

This sed command searches every line of test.txt for 4.2.2.2 and removes the comment symbol ";" from the start of matching lines, editing the file in place (-i). You can adapt it for other purposes.
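A self-contained run of the same idea, using a made-up test.txt with resolv.conf-style lines (only the line containing 4.2.2.2 gets uncommented):

```shell
# Build a sample file: two commented-out lines
printf '%s\n' ';nameserver 4.2.2.2' ';nameserver 8.8.8.8' > test.txt

# Strip the leading ';' only on lines matching 4.2.2.2, in place
sed -e '/4.2.2.2/ s/^;//' -i test.txt

cat test.txt
# nameserver 4.2.2.2
# ;nameserver 8.8.8.8
```

Note that the dots in 4.2.2.2 are regex wildcards here; escape them (4\.2\.2\.2) if that could cause false matches.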

psql -U quassel quassel -c "SELECT message FROM backlog ORDER BY time DESC LIMIT 1000;" | grep my-query
2014-10-12 19:53:06
User: Tatsh
Functions: grep
0

Replace the credentials to psql if necessary, and the my-query part with your query.

curl -s http://pages.cs.wisc.edu/~ballard/bofh/bofhserver.pl |grep 'is:' |awk 'BEGIN { FS=">"; } { print $10; }'
2014-10-10 21:17:33
User: toj
Functions: awk grep
Tags: curl BOFH
0

Sure, it's dirty, but it's quick, it only displays the excuse, and it works.

ip addr show enp3s0 | awk '/inet[^6]/{print $2}' | awk -F'/' '{print $1}'
for f in */*.ape; do avconv -i "$f" "${f%.ape}.flac"; done
2014-10-10 12:33:00
User: qdrizh
0

Converts all Monkey's Audio files below the current directory to FLAC.

For only current directory, use `for f in *.ape; do avconv -i "$f" "${f%.ape}.flac"; done`

To remove APE files afterward, use `rm */*.ape`
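The filename juggling relies on the `${f%.ape}` expansion, which strips the .ape suffix while keeping the directory part. A quick check of just that expansion (the path is made up):

```shell
f='album/track.ape'
echo "${f%.ape}.flac"
# album/track.flac
```

If you want cleanup safer than a blanket `rm */*.ape`, chaining `avconv -i "$f" "${f%.ape}.flac" && rm -- "$f"` deletes each source file only after its conversion succeeds.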

mtr www.google.com
firefox $(grep -i ^url=* file.url | cut -b 5-)
2014-10-08 05:56:27
User: nachos117
Functions: cut grep
0

This command uses grep to read the shortcut file (file.url in the example above) and filter out all but the one important line, which contains the website URL plus some extra characters that need to be removed (for example, URL=http://example.com). The cut command then strips the URL= from the beginning. The output is piped into Firefox, which should interpret it as a URL to open. Of course, you can replace Firefox with any other browser. Tested in bash and sh.
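For reference, a minimal end-to-end run with a made-up shortcut file. Plain `^url=` is enough as the pattern; the `=*` in the command above matches zero or more '=' characters, which works but is probably not what was intended:

```shell
# A hypothetical .url shortcut file
printf '%s\n' '[InternetShortcut]' 'URL=http://example.com' > file.url

# Extract just the address; firefox "$(...)" would then open it
grep -i '^url=' file.url | cut -b 5-
# http://example.com
```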

egrep -wi --color 'warning|error|critical'
alias lp="echo -n \"some text to copy\" | pbcopy; sleep 120 && echo -n \"done\" | pbcopy &"
2014-10-05 19:43:49
User: wsams
Functions: alias
Tags: alias pbcopy
0

This alias is useful if you need to use some text often. Executing the alias copies the text to your clipboard, then overwrites the clipboard with "done" after 120 seconds (adjust the sleep to taste).

eog someimg.jpg
url=`curl http://proxybay.info/ | awk -F'href="|" |">|</' '{for(i=2;i<=NF;i=i+4) print $i,$(i+2)}' | grep follow|sed 's/^.\{19\}//'|shuf -n 1` && firefox $url
2014-10-04 19:08:13
User: dunryc
Functions: awk grep sed
-1

Polls the Pirate Bay mirror list, chooses a random mirror site, and opens it for you in Firefox.

git reflog --date=local | grep "Oct 2 .* checkout: moving from .* to" | grep -o "[a-zA-Z0-9\-]*$" | sort | uniq
2014-10-03 15:12:22
User: Trindaz
Functions: grep sort
0

Replace "Oct 2" in the first grep pattern with the date whose branch activity you want to view.
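The extraction can be checked without a repository by feeding the pipeline some fake reflog lines (hashes and branch names are made up):

```shell
printf '%s\n' \
  'abc1234 HEAD@{Thu Oct 2 10:00:00 2014}: checkout: moving from master to feature-x' \
  'def5678 HEAD@{Thu Oct 2 11:00:00 2014}: checkout: moving from feature-x to master' |
  grep "Oct 2 .* checkout: moving from .* to" |
  grep -o "[a-zA-Z0-9-]*$" | sort | uniq
# feature-x
# master
```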

for i in `cat hosts_list`; do RES=`ssh myusername@${i} "ps -ef " |awk '/[p]rocessname/ {print $2}'`; test "x${RES}" = "x" && echo $i; done
2014-10-03 14:57:54
User: arlequin
Functions: awk echo test
Tags: ssh awk test ps
0

Given a hosts list, ssh one by one and echo its name only if 'processname' is not running.
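The `[p]rocessname` bracket trick is what keeps the pipeline from matching itself: the regex still matches "processname" in ps output, but the awk process's own command line contains the literal string "[p]rocessname", which that regex does not match. Demonstrated on canned ps-style output (cron stands in for the real process name):

```shell
printf '%s\n' \
  'root 123 /usr/sbin/crond' \
  'user 456 awk /[c]ron/' |
  awk '/[c]ron/ {print $2}'
# 123
```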

FILE=file_name; CHUNK=$((64*1024*1024)); SIZE=$(stat -c "%s" $FILE); for ((i=0; i < $SIZE; i+=$CHUNK)); do losetup --find --show --offset=$i --sizelimit=$CHUNK $FILE; done
2014-10-03 13:18:19
User: flatcap
Functions: losetup stat
5

It's common to want to split up large files, and the usual method is to use split(1). If you have a 10GiB file, you'll need 10GiB of free space, and the OS then has to read 10GiB and write 10GiB (usually on the same filesystem). This takes AGES.

This command instead uses a set of loop block devices to create fake chunks, without making any changes to the file, so the splitting is nearly instantaneous. The example creates a 1GiB file, then splits it into 16 x 64MiB chunks (/dev/loop0 .. loop15).

Note: This isn't a drop-in replacement for using split. The results are block devices, and tar and zip won't do what you expect when given block devices. These commands will work:

hexdump /dev/loop4

gzip -9 < /dev/loop6 > part6.gz

cat /dev/loop10 > /media/usb/part10.bin
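Since losetup needs root, a quick way to sanity-check the chunk arithmetic is a dry run that prints the invocations instead of executing them (big.img and the 256MiB size are made-up values; requires bash for the arithmetic for-loop):

```shell
# Dry run: echo the losetup commands rather than running them
FILE=big.img; CHUNK=$((64*1024*1024)); SIZE=$((256*1024*1024))
for ((i=0; i < SIZE; i+=CHUNK)); do
  echo losetup --find --show --offset=$i --sizelimit=$CHUNK "$FILE"
done
# four lines, with offsets 0, 67108864, 134217728 and 201326592
```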
sudo tee /path/to/file < /dev/null
tee < file.org file.copy1 file.copy2 [file.copyn] > /dev/null
2014-10-02 16:41:36
User: dacoman
Functions: tee
1

Copies file.org to file.copy1 ... file.copyn
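A quick verification that tee produces byte-identical copies from a single read of the source (file names are made up):

```shell
printf 'hello\n' > file.org
# One read of file.org, two simultaneous writes
tee < file.org file.copy1 file.copy2 > /dev/null
cmp file.org file.copy1 && cmp file.org file.copy2 && echo identical
# identical
```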

sudo bash -c "> /var/log/httpd/access_log"
host `hostname` | rev | cut -d' ' -f1 | rev
2014-10-01 18:55:05
Functions: cut host rev
0

Gives the DNS listed IP for the host you're on... or replace `hostname` with any other host
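The `rev | cut | rev` sandwich is a general idiom for grabbing the last space-separated field when the field count varies. On a canned line of `host`-style output:

```shell
echo 'example.com has address 93.184.216.34' | rev | cut -d' ' -f1 | rev
# 93.184.216.34
```

`awk '{print $NF}'` achieves the same thing in one step.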

(ps -U nms -o pid,nlwp,cmd:500 | sort -n -k2) && (ps -U nms -o nlwp | tail -n +2 | paste -sd+ | bc)
2014-09-30 18:25:56
User: cmullican
Functions: paste ps sort tail
0

I occasionally need to see if a machine is hitting ulimit for threads, and what process is responsible. This gives me the total number, sorted low to high so the worst offender is at the end, then gives me the total number of threads, for convenience.

finger $(whoami) | perl -ne '/Name: ([a-zA-Z0-9 ]{1,})/ && print "$1\n"'
pdftk fill_me_in.pdf output no_thanks.pdf flatten
2014-09-30 09:59:46
User: qdrizh
0

Some PDF viewers don't manage form fields correctly when printing. Instead of treating them as transparent, they print as black shapes.

catmandu convert JSON --multiline 1 to YAML < file.json > file.yaml
2014-09-29 16:45:24
Tags: perl json yaml
0

This is based on __unixmonkey73469__'s answer. You will need to supply the `--multiline 1` option to the JSON importer if your .json is multiline (i.e. it was prettified).

And you still need catmandu installed via `cpanm Catmandu`

/bin/ls -lF "$@" | sed -r ': top; s/. ([0-9]+)([0-9]{3}[,0-9]* \w{3} )/ \1,\2/ ; t top'
2014-09-29 14:33:23
User: hackerb9
Functions: sed
2

This modifies the output of ls so that the file size has commas every three digits. It makes room for the commas by destructively eating any characters to the left of the size, which is probably okay since that's just the "group".

Note that I did not write this, I merely cleaned it up and shortened it with extended regular expressions. The original shell script, entitled "sl", came with this description:

 : '

 : For tired eyes (sigh), do an ls -lF plus whatever other flags you give

 : but expand the file size with commas every 3 digits. Really helps me

 : distinguish megabytes from hundreds of kbytes...

 :

 : Corey Satten, [email protected], 11/8/89

 : '

Of course, some may suggest that fancy new "human friendly" options, like "ls -Shrl", have made Corey's script obsolete. They are probably right. Yet, at times, still I find it handy. The new-fangled "human-readable" numbers can be annoying when I have to glance at the letter at the end to figure out what order of magnitude is even being talked about. (There's a big difference between 386M and 386P!). But with this nifty script, the number itself acts like a histogram, a quick visual indicator of "bigness" for tired eyes. :-)
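The comma-inserting loop at the heart of the command works on any run of digits: it repeatedly inserts a comma before the last ungrouped run of three digits until no substitution succeeds, which is what the `t top` branch tests. This is a simplified variant of the substitution above (it doesn't eat a character to the left to preserve column alignment):

```shell
echo 1234567890 | sed -r ':top; s/([0-9])([0-9]{3})($|[^0-9])/\1,\2\3/; t top'
# 1,234,567,890
```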