What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - All commands - 11,604 results
find ~user/ -name "*~" -exec rm {} \;
2009-05-09 12:55:47
Functions: find rm
4

I use this simple command to remove all the backup files usually generated by editors like Vim and Emacs.

iptables -D fail2ban-SSH -s <ip_address_to_be_set_free> -j DROP
2009-05-08 19:22:15
User: mheadd
Functions: iptables
3

Removes an iptables rule created by fail2ban. This example shows how to remove a rule for an IP from the fail2ban-SSH chain. Can be used for any service monitored by fail2ban.

For more on fail2ban, see http://www.fail2ban.org/wiki/index.php/Main_Page

tar -cf - folder/ | gpg -c > folder.tpg
2009-05-08 19:20:08
User: copremesis
Functions: gpg tar
-1

gpg's compression is comparable to gzip's; the difference is that your backups are now also encrypted.

to extract use:

gpg < folder.tpg | tar -xf -
shred -n33 -zx file; rm file
2009-05-08 19:15:41
User: copremesis
Functions: rm shred
1

Safely remove a file containing sensitive data: shred overwrites it 33 times with random data (-n33) and then a final pass of zeros (-z); -x avoids rounding the file size up to the next full block.

aptitude remove $(deborphan)
for a in path/*; do ccencrypt -K <password> "$a"; done
2009-05-08 18:33:23
User: P17
Tags: Encryption
4

To decrypt the files, replace "ccencrypt" with "ccdecrypt".

ccrypt(1) must be installed. It uses the AES (Rijndael) block cipher.

To make it handier create an alias.
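
The suggested alias works better as a small shell function, since a function can take the directory as an argument. A minimal sketch, not from the original post: the encdir name and the CCRYPT_PASS variable are inventions here, and -E is ccrypt's documented option for reading the key from an environment variable (which keeps it out of ps output, unlike -K):

```shell
# Hypothetical wrapper: encrypt every regular file in a directory.
# Assumes ccrypt is installed and the key is exported in $CCRYPT_PASS.
encdir() {
    for f in "$1"/*; do
        [ -f "$f" ] || continue          # skip subdirectories
        ccencrypt -E CCRYPT_PASS "$f"
    done
}
```

A matching decdir would call ccdecrypt the same way.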

vim `which <scriptname>`
2009-05-08 17:21:47
User: bunedoggle
Functions: vim
Tags: vim which script
1

Often I need to edit a bash or perl script I've written. I know it's in my path but I don't feel like typing the whole path (or I don't remember the path).

rm_cache() { rm -f $HOME/.mozilla/firefox/<profile>/Cache/*; }; alias rmcache='rm_cache'
cd !$
2009-05-08 09:48:14
Functions: cd
-3

During this operation:

# mv Joomla_1.5.10-Stable-Full_Package.zip /var/www/joomla/

the last command argument is /var/www/joomla/. To change to that directory I can use

# cd !$

which leaves me at

hob:/var/www/joomla#

echo 00{1..9} 0{10..99} 100
find /var/www/html/ -type f -mtime +30 -exec basename {} \;
cat /etc/*issue
!219 ; !229 ; !221
2009-05-07 20:51:36
8

Assuming that 219, 229 and 221 are entries in history, I recall them on a single line to execute multiple commands.

219 ifdown wlan0

...

221 ifup wlan0

...

229 iwconfig wlan0 mode Managed

so the result is execution of # ifdown wlan0 ; iwconfig wlan0 mode Managed ; ifup wlan0

tailf file.log
2009-05-07 20:13:41
Functions: tailf
5

tailf, like tail -f, follows a log file, printing new lines to stdout in real time.

for f in directory_name/*; do sed 's/search/replaced/g' "$f" > "$f.new" && mv "$f.new" "$f"; done
2009-05-07 20:13:07
User: bassu
Functions: ls mv sed
-3

Yeah, there are many ways to do that.

Doing it with sed in a for loop is my favourite, because both are basic tools in any *nix environment. By default sed cannot save its output back to the file it is reading, so we use mv to put each result in place.
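
As an aside, with GNU sed the temporary file and the mv can be skipped entirely, because -i edits files in place (GNU-specific; BSD/macOS sed wants -i ''), and a glob avoids looping over the output of ls:

```shell
# In-place batch search-and-replace with GNU sed.
sed -i 's/search/replaced/g' directory_name/*
```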

lsb_release -d
alias s='ssh -l root'
2009-05-07 15:57:12
User: GouNiNi
Functions: alias
-20

When you have to manage a lot of servers, it's boring to type ssh root@myhost for each connection. Now you can just type "s something" and you are connected.

You can also add a bash completion script to tab-complete the names of your servers. That will be my next tip ;)
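
A minimal sketch of such a completion (bash-specific; the _s_known_hosts and _s_complete names are inventions here, and it assumes an unhashed ~/.ssh/known_hosts):

```shell
# Print host names from a known_hosts file: first field, first
# comma-separated entry. Useless if HashKnownHosts is enabled.
_s_known_hosts() {
    cut -d ' ' -f1 "${1:-$HOME/.ssh/known_hosts}" 2>/dev/null |
        cut -d ',' -f1 | sort -u
}

# Offer those hosts as tab completions for the "s" alias.
_s_complete() {
    COMPREPLY=( $(compgen -W "$(_s_known_hosts)" -- "${COMP_WORDS[COMP_CWORD]}") )
}
complete -F _s_complete s
```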

./my-really-long-job.sh && notify-send "Job finished"
2009-05-07 15:50:27
User: root
Tags: notify-send
24

You will need libnotify-bin for this to work:

sudo aptitude install libnotify-bin
pwsafe -qa "gpg keys"."$(finger `whoami` | grep Name | awk '{ print $4" "$5 }')"
2009-05-07 14:49:56
User: denzuko
0

From time to time one forgets their gpg key or other passphrases. This can be very problematic in most cases. But luckily there's this script. It's based on pwsafe, a Unix command-line program that manages encrypted password databases. For more info on pwsafe visit http://nsd.dyndns.org/pwsafe/.

What this script does is store all your passphrases for later and let you copy one to your clipboard so you can just paste it in, all protected by a single master password. Pretty neat, no?

You can find future releases of this and many more scripts at The Teachings of Master Denzuko - denzuko.wordpress.com.

expanded_script=$(eval "echo \"$(cat ${sed_script_file})\"") && sed -e "${expanded_script}" your_input_file
2009-05-07 14:21:14
Functions: eval sed
-1

With this command you can use shell variables inside sed scripts.

This is useful if the script MUST remain in an external file, otherwise you can simply use an inline -e argument to sed.

curl -s http://bash.org/?random1|grep -oE "<p class=\"quote\">.*</p>.*</p>"|grep -oE "<p class=\"qt.*?</p>"|sed -e 's/<\/p>/\n/g' -e 's/<p class=\"qt\">//g' -e 's/<p class=\"qt\">//g'|perl -ne 'use HTML::Entities;print decode_entities($_),"\n"'|head -1
2009-05-07 13:13:21
User: Iftah
Functions: grep head perl sed
6

bash.org is a collection of funny quotes from IRC.

WARNING: some of the quotes contain "adult" jokes... may be embarrassing if your boss sees them...

Thanks to Chen for the idea and initial version!

This script downloads a page of random quotes, filters the HTML to keep just the one-liner quotes, and outputs the first one.

Just barely under the required 255 chars :)

Improvement:

You can replace the head -1 at the end with:

awk 'length($0)>0 {printf( $0 "\n%%\n" )}' > bash_quotes.txt

which will separate the quotes with a "%" and save them to the file.

and then:

strfile bash_quotes.txt

which will make the file ready for the fortune command

and then you can:

fortune bash_quotes.txt

which will give you a random quote from those in the downloaded file.

I download the file periodically and then call fortune in .bashrc, so I see a funny quote every time I open a terminal.
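
The periodic setup described above might look like this; the paths, the fetch_quotes.sh script name, and the schedule are assumptions, not from the original post:

```shell
# crontab entry (edit with `crontab -e`): refresh the quotes nightly.
# fetch_quotes.sh would run the curl pipeline above into
# ~/bash_quotes.txt and re-run strfile on it.
#   0 3 * * * ~/bin/fetch_quotes.sh

# In ~/.bashrc: print a random quote when a terminal opens, but only
# if the quotes file exists and fortune is installed.
if [ -r "$HOME/bash_quotes.txt" ] && command -v fortune >/dev/null; then
    fortune "$HOME/bash_quotes.txt"
fi
```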

fc-list | cut -d ':' -f 1 | sort -u
Boot up the destination machine with a Knoppix live CD and run:

nc -l -p 9000 | dd of=/dev/sda

Then on the master:

dd if=/dev/sda | nc <dest-ip> 9000

You can monitor bandwidth usage to see progress:

nload eth0 -u M
2009-05-07 05:26:58
User: lv4tech
Functions: dd
-1

This is a bit-for-bit copy, so a 500 GB hard disk will take a long time even if you have Gigabit Ethernet.

screen -d -m nautilus --no-desktop `pwd`
2009-05-07 00:49:07
User: windsurfer
Functions: screen
-10

This opens up nautilus in the current directory, which is useful for some quick file management that isn't efficiently done from a terminal.

for i in *jpg; do jpeginfo -c "$i" | grep -E "WARNING|ERROR" | cut -d " " -f 1 | xargs -I '{}' find /mnt/sourcerep -name {} -type f -print0 | xargs -0 -I '{}' cp -f {} ./ ; done
2009-05-07 00:30:36
User: vincentp
Functions: cp cut find grep xargs
0

Find all corrupted jpegs in the current directory, look for a file with the same name in a source directory hierarchy, and copy it over the corrupted file.

Convenient to run on a large batch of jpeg files copied from an unreliable medium.

Needs the jpeginfo tool, found in the jpeginfo package (on Debian at least).