
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using rm - 248 results
for f in *.html; do head -n -1 "$f" > temp; cat temp > "$f"; rm temp; done
2009-10-12 12:49:18
User: Sunng
Functions: cat head rm
-1

Some malicious programs append an iframe or script tag as the last line of your web pages on a server; use this command to clean them in batch (it strips the final line of every .html file).

( trap '' 1; ( nice -n 19 sleep 2h && command rm -v -rf /garbage/ &>/dev/null && trap 1 ) & )
2

Check out the usage of 'trap' - you may not have seen this one much. This command provides a way to schedule commands to run at a certain time by executing them after sleep finishes sleeping. In the example, 'sleep 2h' sleeps for 2 hours. What is cool about this command is that it uses the 'trap' bash builtin to remove the SIGHUP trap that normally exits all processes started by the shell upon logout. The 'trap 1' command then restores the normal SIGHUP behaviour.

It also uses 'nice -n 19', which runs the sleep process at minimal CPU priority.

Further, it runs all the commands within the second set of parentheses in the background, so you can fire off as many of these as you want. Very helpful for shell scripts.
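Since the explanation above is easier to follow with a runnable version, here is a shortened sandbox sketch of the same pattern (the 2-hour sleep is cut to 1 second and /garbage/ becomes a throwaway file; both are illustrative):

```shell
# Create a throwaway file, then schedule its removal from a background
# subshell that ignores SIGHUP (so it would survive a logout).
touch /tmp/demo_garbage
( trap '' HUP; ( nice -n 19 sleep 1 && command rm -f /tmp/demo_garbage ) & )
```

After roughly a second, /tmp/demo_garbage is gone even if the launching shell has exited.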

rm ~/.bash_history; ln -s /dev/null ~/.bash_history
2009-10-08 17:40:48
Functions: ln rm
-5

Remove your BASH history and then link it to /dev/null

rm ~/.bash_history && kill -9 $$
2009-10-08 12:25:47
User: Velenux
Functions: kill rm
-5

The best way I know to get rid of .bash_history and prevent bash from saving the current session's history on exit.

Edit: added ~/ before .bash_history, just in case... ;)

dhclient -r && rm -f /var/lib/dhcp3/dhclient* && sed "s=$(hostname)=REPLACEME=g" -i /etc/hosts && hostname "$(echo $RANDOM | md5sum | cut -c 1-7 | tr a-z A-Z)" && sed "s=REPLACEME=$(hostname)=g" -i /etc/hosts && macchanger -e eth0 && dhclient
2009-09-28 22:07:31
User: syssyphus
Functions: hostname rm sed
Tags: privacy
7

This string of commands will release your DHCP lease, change your MAC address, generate a new random hostname, and then get a new DHCP lease.

gate() { mkfifo /tmp/sock1 /tmp/sock2 &> /dev/null && nc -p $1 -l < /tmp/sock1 | tee /tmp/sock2 & PID=$! && nc $2 $3 < /tmp/sock2 | tee /tmp/sock1; kill -KILL $PID; rm -f /tmp/sock1 /tmp/sock2 ; }
2009-09-25 08:10:23
User: true
Functions: kill mkfifo rm tee
1

USAGE: gate listening_port host port

Creates a listening socket and connects to the remote device at host:port, using named pipes to join the two connections. Traffic passing through the pipes is also written to stdout. I use it for debugging network scripts.

rm -rf [a-bd-zA-Z0-9]* c[b-zA-Z0-9]*
2009-09-15 14:22:56
User: arcege
Functions: rm
Tags: shell rm
1

Remove everything in the current directory except files starting with "ca".
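A sandboxed way to watch the character classes at work (the directory and file names below are made up for illustration):

```shell
# "cat" and "car" survive; everything not starting with "ca" is removed.
# The first class excludes a lowercase 'c' in position one; the second
# excludes an 'a' in position two.
mkdir -p /tmp/globrm
( cd /tmp/globrm && touch cat car dog bird && rm -rf [a-bd-zA-Z0-9]* c[b-zA-Z0-9]* )
```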

cat /var/lib/dpkg/info/*.list > /tmp/listin ; ls /proc/*/exe |xargs -l readlink | grep -xvFf /tmp/listin; rm /tmp/listin
2009-09-09 18:09:14
User: kamathln
Functions: cat grep ls readlink rm xargs
Tags: Debian find dpkg
11

This helped me find a botnet that had made it into my system. Of course, this is not a foolproof or guaranteed way to find all of them, or even most of them. But it helped me find it.

find /backup/directory -name "FILENAME_*" -mtime +15 -exec rm -vf {} \;
rm -vf /backup/directory/**/FILENAME_*(m+15)
find /backup/directory -name "FILENAME_*" -mtime +15 | xargs rm -vf
for i in `grep "unable to stat" /var/log/syslog | cut -d "/" -f 3 | sort | uniq`; do find /var/qmail/queue -name $i -type f -exec rm -v {} \; ; done
wget http://checkip.dyndns.org && clear && echo && echo My IP && egrep -o '([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}' index.html && echo && rm index.html
rm -d **/*(/^F)
2009-08-06 21:41:19
User: claytron
Functions: rm
Tags: find zsh glob
4

This command uses the recursive glob and glob qualifiers from zsh. This will remove all the empty directories from the current directory down.

The **/* recurses down through all the files and directories

The glob qualifiers are added into the parenthesis. The / means only directories. The F means 'full' directories, and the ^ reverses that to mean non-full directories. For more info on these qualifiers see the zsh docs: http://zsh.dotsrc.org/Doc/Release/Expansion.html#SEC87
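For shells without zsh's glob qualifiers, roughly the same sweep can be sketched with find (this assumes find's -empty test, present in GNU and BSD find; the paths are throwaway examples):

```shell
# Remove empty directories recursively. -depth walks children first, so
# directories emptied earlier in the same sweep are also caught.
mkdir -p /tmp/globdemo/empty /tmp/globdemo/full
touch /tmp/globdemo/full/file
find /tmp/globdemo -depth -mindepth 1 -type d -empty -exec rmdir {} \;
```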

buffer () { tty -s && return; tmp=$(mktemp); cat > "${tmp}"; if [ -n "$1" ] && ( ( [ -f "$1" ] && [ -w "$1" ] ) || ( ! [ -a "$1" ] && [ -w "$(dirname "$1")" ] ) ); then mv -f "${tmp}" "$1"; else echo "Can't write in \"$1\""; rm -f "${tmp}"; fi }
2009-07-27 20:21:15
User: Josay
Functions: cat echo mv rm tty
Tags: redirection
2

A common mistake in Bash is to write a command line where a command reads a file and has its output redirected to that same file.

This can often be avoided thanks to:

1) warnings like "-bash: file.txt: cannot overwrite existing file"

2) options (often "-i") that let the command modify the file in place

but I like having this small function, which does the trick by waiting for the first command to finish before writing into the file.

Lots of things could probably be done in a better way; if you know one...
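The core idea can be illustrated without the full function (the file name is hypothetical; the inline block plays the role of buffer):

```shell
# `sort file > file` would truncate the input before sort reads it;
# collecting the whole output in a temp file first avoids that.
printf 'b\na\n' > /tmp/buf_demo.txt
sort /tmp/buf_demo.txt | { tmp=$(mktemp); cat > "$tmp"; mv -f "$tmp" /tmp/buf_demo.txt; }
```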

unzip -lt foo.zip | grep testing | awk '{print $2}' | xargs rm -r
tar -tf <file.tar.gz> | xargs rm -r
for i in $(tar -tf <file.tar.gz>); do rm $i; done;
2009-07-06 19:57:23
User: din7
Functions: rm tar
-4

Remove annoying, improperly packaged files that untar into the incorrect directory.

For example, when you untar an archive and it extracts hundreds of files into the current directory... bleh.
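A sandboxed walk-through of the cleanup (all paths are throwaway examples; note that xargs splits on whitespace, so this assumes tame filenames):

```shell
# Build an archive whose members land straight in the extraction directory,
# then use its listing to remove exactly the files it spilled.
mkdir -p /tmp/untar
touch /tmp/untar/one /tmp/untar/two
tar -C /tmp/untar -czf /tmp/mess.tar.gz one two
tar -tf /tmp/mess.tar.gz | ( cd /tmp/untar && xargs rm -r )
```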

rm strangedirs -rf
2009-06-30 15:10:31
User: ioggstream
Functions: rm
Tags: rm safe
-3

Keep rm from being recursive until you have finished typing the command: put the -rf at the end!

unrar e file.part1.rar; if [ $? -eq 0 ]; then rm file.part*.rar; fi
2009-06-13 11:11:43
User: mrttlemonde
Functions: rm
4

It's also possible to delay the extraction (echo "unrar e ... fi" | at now+20 minutes), which is really convenient!
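The `if [ $? -eq 0 ]` pattern is not specific to unrar; here is a self-contained sketch with gzip standing in as the archiver (file names are made up):

```shell
# Extract, and delete the archive only if extraction succeeded.
printf 'data\n' > /tmp/arch_demo
gzip -f /tmp/arch_demo                         # leaves only /tmp/arch_demo.gz
gunzip -c /tmp/arch_demo.gz > /tmp/arch_demo
if [ $? -eq 0 ]; then rm /tmp/arch_demo.gz; fi
```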

function my_irc { tmp=`mktemp`; cat > $tmp; { echo -e "USER $username x x :$ircname\nNICK $nick\nJOIN $target"; while read line; do echo -e "PRIVMSG $target :$line"; done < $tmp; } | nc $server > /dev/null ; rm $tmp; }
2009-06-11 22:14:48
User: Josay
Functions: cat echo read rm
Tags: netcat irc nc
1
command | my_irc

Pipe whatever you want to this function, it will, if everything goes well, be redirected to a channel or a user on an IRC server.

Please note that :

- I am not responsible for any flooding you might cause.

- The function does not reply to PINGs from the server. That's why I first write to a temporary file: I don't want to block waiting for input while connected to the server. However, depending on the configuration of the server and the length of your file, you may time out before finishing.

- Concerning the server, the variable's content must be of the form "irc.server.org 6667" (or any other port). If you want to run some tests, you can also create a fake IRC server on "localhost 55555" by using

netcat -l -p 55555

- Concerning the target, you can choose a channel (beginning with a '#' like "#chan") or a user (like "user")

- The other variables have obvious names.
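To make the variable requirements concrete, here is an illustrative setup (every value below is an assumption, not a default):

```shell
# Settings consumed by my_irc; adjust to your network.
username="demo"; ircname="demo user"; nick="demobot"
target="#test"               # a channel ('#chan') or a user ('user')
server="localhost 55555"     # or e.g. "irc.server.org 6667"
# In another terminal, fake a server first:  nc -l -p 55555
# Then:  echo "hello world" | my_irc
```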

find . -type f -print0|xargs -0 md5sum|sort|perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)'|xargs -0 rm -v --
2009-06-07 03:14:06
Functions: find perl rm xargs
18

This one-liner will *delete*, without any further confirmation, all 100% duplicates but one, based on their md5 hash, in the current directory tree (i.e. including files in its subdirectories).

Good for cleaning up collections of mp3 files or pictures of your dog|cat|kids|wife present in a gazillion incarnations on your hard drive.

md5sum can be substituted with sha1sum without problems.

The actual filename is not taken into account; just the hash is used.

Whatever sort thinks is the first filename is kept.

It is assumed that the filename does not contain 0x00.

As per the good suggestion in the first comment, this one does a hard link instead:

find . -xdev -type f -print0 | xargs -0 md5sum | sort | perl -ne 'chomp; $ph=$h; ($h,$f)=split(/\s+/,$_,2); if ($h ne $ph) { $k = $f; } else { unlink($f); link($k, $f); }'
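A throwaway-directory dry run of the delete variant (assumes GNU coreutils md5sum; do not point anything like this at real data):

```shell
# Two identical files and one distinct one; the duplicate that sorts
# second (b.txt) is the only casualty.
mkdir -p /tmp/dedup
printf 'same\n'  > /tmp/dedup/a.txt
printf 'same\n'  > /tmp/dedup/b.txt
printf 'other\n' > /tmp/dedup/c.txt
find /tmp/dedup -type f -print0 | xargs -0 md5sum | sort \
  | perl -ne 'chomp;$ph=$h;($h,$f)=split(/\s+/,$_,2);print "$f"."\x00" if ($h eq $ph)' \
  | xargs -0 rm -v --
```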
find ~/Desktop/ \( -regex '.*/\..*' \) -print -exec rm -Rf {} \;
tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5` 2> /dev/null | xargs rm -fr ;
2009-05-26 17:15:52
User: angleto
Functions: rm tar xargs
Tags: backup
7

Create an archive of files with access times older than 5 days, and remove the original files.

rm -rf `find -maxdepth 1 -mindepth 1 -mtime +7`
2009-05-22 11:46:57
User: tatwright
Functions: rm
-4

This is useful for command-line 'recycle bins' and the like.
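A sandboxed illustration of the sweep (the -d backdating assumes GNU touch; paths are throwaway examples, and the backticks assume filenames without spaces, as in the original):

```shell
# Age one file 10 days into the past, then sweep anything older than 7 days.
mkdir -p /tmp/trash
touch /tmp/trash/old /tmp/trash/new
touch -d '10 days ago' /tmp/trash/old
rm -rf `find /tmp/trash -maxdepth 1 -mindepth 1 -mtime +7`
```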