
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - All commands - 12,177 results
smbpasswd -r <domain-server> -U <user name>
2009-08-12 07:46:48
User: greppo
7

If you use Linux in a Windows domain and your password is N days from expiry, this is how you can change it without resorting to a Windows machine.

aria2c -s 4 http://my/url
2009-08-11 22:34:00
User: jrk
8

`aria2c` (from the aria2 project) downloads a file over several connections at once. Change -s 4 to an arbitrary number of segments to control the number of concurrent connections. It is also possible to provide multiple URLs for the same content (potentially over multiple protocols) to download the file concurrently from multiple hosts.

img test.jpg
2009-08-11 22:09:21
User: Inouire
Tags: bash java picture
2

This command allows you to see a preview of a picture via the terminal. It can be useful when you are ssh'd into your server without X forwarding.

For an example of the output you can get with this command, see http://www.vimeo.com/3721117

Download at http://inouire.net/image-couleur.html

Sources here: http://inouire.net/archives/image-couleur_source.tar.gz

dmidecode | grep "Maximum Capacity"; dmidecode -t 17 | grep Size
system_profiler | mail -s "$HOSTNAME System Profiler Report" user@domain.com
2009-08-11 20:16:37
User: monkeymac
Functions: mail
4

Replace "user@domain.com" with the target e-mail address. Thanks to alediaz for "$HOSTNAME" which is very useful when running the command with Apple Remote Desktop to multiple machines simultaneously.

find $MAILDIR/ -type f -printf '%T@ %p\n' | sort --reverse | sed -e '{ 1,100d; s/[0-9]*\.[0-9]* \(.*\)/\1/g }' | xargs -i sh -c "cat {}&&rm -f {}" | gzip -c >>ARCHIVE.gz
ps -A
2009-08-11 16:52:59
User: xraj
Functions: ps
-13

List all running processes.
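A quick sketch of how this is typically combined with a filter; `sshd` is just an example process name, and `pgrep` is a common alternative that avoids matching the grep process itself:

```shell
# List every process, then narrow the output to a name of interest
# (the [s] bracket trick stops grep from matching its own command line):
ps -A | grep '[s]shd'

# pgrep performs the same lookup directly, printing PID and name:
pgrep -l sshd
```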

perl -F',' -ane '$a += $F[3]; END { print $a }' test.csv
2009-08-11 15:08:58
Functions: perl
Tags: awk column CSV sum
1

More of the same but with more elaborate perl-fu :-)
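A minimal end-to-end sketch; the sample test.csv below is just an assumption to show the one-liner running against real data:

```shell
# Hypothetical sample file; any comma-separated data with numbers
# in the fourth column works the same way.
printf '%s\n' 'a,b,c,1' 'd,e,f,2' 'g,h,i,3' > test.csv

# $F[3] is the fourth field (perl's @F array is zero-indexed); prints 6 here.
perl -F',' -ane '$a += $F[3]; END { print $a }' test.csv
```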

php -i
php -r "phpinfo();"
whois domainnametocheck.com | grep match
2009-08-11 13:33:25
User: Timothee
Functions: grep whois
Tags: whois
6

Returns nothing if the domain exists and 'No match for domain.com' otherwise.

watch "ps auxw | grep 'defunct' | grep -v 'grep' | grep -v 'watch'"
2009-08-11 12:22:13
Functions: watch
5

Shows all defunct (zombie) processes; useful when building some massively forking script that could lead to zombies when you don't have your waitpid()'s done just right.

awk -F ',' '{ x = x + $4 } END { print x }' test.csv
curl icanhazip.com
php -r 'phpinfo();'
2009-08-11 07:00:51
User: amaymon
Tags: PHP
2

Runs the function phpinfo() from the shell.

echo "<?php phpinfo(); ?>" >> /srv/www/htdocs/test.php
exiv2 rename *.jpg
find . -name '*.tiff' -exec bash -c "mogrify -format jpg -quality 85 -resize 75% {} && rm {}" \;
2009-08-10 18:27:10
Functions: bash find
4

Simple command to convert a large number of images to JPEG format. Deletes the originals after conversion.

find . -name \*.pdf -exec pdfinfo {} \; | grep Pages | sed -e "s/Pages:\s*//g" | awk '{ sum += $1;} END { print sum; }'
ipconfig getifaddr <Interface>
du -hd 1
2009-08-10 13:11:22
User: Tuirgin
Functions: du
Tags: disk usage osx
3

OSX's BSD version of the du command uses the -d argument instead of --max-depth.
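For comparison, a sketch showing both spellings side by side on a throwaway directory (the `demo` tree is just an assumption for illustration); modern GNU coreutils accepts both forms:

```shell
# Build a small tree to inspect (illustrative only).
mkdir -p demo/sub && echo data > demo/sub/file.txt

# BSD/OSX spelling:
du -hd 1 demo

# GNU long-option spelling, same output:
du -h --max-depth=1 demo
```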

curl http://www.commandlinefu.com/commands/by/<your username>/rss | gzip - > commandlinefu-contribs-backup-$(date +%Y-%m-%d-%H.%M.%S).rss.gz
2009-08-10 12:43:33
Functions: date gzip
10

Use `zless` to read the content of your *rss.gz file:

zless commandlinefu-contribs-backup-2009-08-10-07.40.39.rss.gz
jhead -n%Y%m%d-%H%M%S *.jpg
2009-08-10 03:49:30
4

jhead is a very nice tool to do all sorts of things with photographs, in a batch-oriented way. It has a specific function to rename files based on dates, and the format I used above was just an example.

ls -1 *.jpg | while read fn; do export pa=`exiv2 "$fn" | grep timestamp | awk '{ print $4 " " $5 ".jpg"}' | tr ":" "-"`; mv "$fn" "$pa"; done
2009-08-10 00:52:22
User: axanc
Functions: awk export grep ls mv read tr
0

Renames all jpg files to their timestamps, with a ".jpg" extension.

old='apt-get'; new="su-${old}"; command="sudo ${old}"; alias "${new}=${command}"; $( complete | sed -n "s/${old}$/${new}/p" ); alias ${new}; complete -p ${new}
2009-08-10 00:15:05
User: Josay
Functions: alias sed
4

In Bash, when defining an alias, one usually loses the completion related to the command used in that alias (that completion is usually defined in /etc/bash_completion using the complete builtin).

It's easy to reuse the work done for that completion in order to have smart completion for our alias.

That's what is done by this command line (that's only an example but it may be very easy to reuse).

Note 1: You can use the given command line in a loop ("for old in apt-get apt-cache") if you want to define aliases like that for many commands.

Note 2: You can put the output of the command directly in your .bashrc file (after the ". /etc/bash_completion" line) to always have the alias and its completion.
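The loop from Note 1 can be sketched as follows; the `su-` prefix and the apt commands are just the naming scheme from this entry, and the completion-copying step only has an effect once /etc/bash_completion has been sourced:

```shell
# Define a sudo-prefixed alias for each command in the list.
for old in apt-get apt-cache; do
  new="su-${old}"
  command="sudo ${old}"
  alias "${new}=${command}"
  # Reuse the existing completion spec, if one is defined for $old
  # (produces nothing, and so does nothing, when none is loaded):
  $( complete | sed -n "s/${old}$/${new}/p" )
done

# Show the resulting definitions:
alias su-apt-get su-apt-cache
```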