What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,611 results
tokill=`ps -fea|grep process|awk '{ printf $2" "}'`; kill -9 $tokill;
wget -c -v -S -T 100 --tries=0 `curl -s http://ms1.espectador.com/podcast/espectador/la_venganza_sera_terrible.xml | grep -v xml | grep link | sed 's/<[^>]*>//g'`
2009-03-04 13:12:28
User: fmdlc
Functions: grep link sed wget
-3

This downloads a complete audio podcast.

alias backup_dir='mkdir -p .backup && cp * .backup'
2009-04-06 14:43:21
User: k00pa
Functions: alias cp
-3

Add this to .bashrc; then you can quickly create backups of the files in the current directory. Note that it only backs up files in the current directory, not subdirectories.

Useful when changing config files, coding something, or just trying something stupid.
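A hypothetical variant of the alias above (the timestamped directory name is my own illustration, not part of the original): sending each backup to a fresh .backup-&lt;timestamp&gt; directory means repeated runs don't overwrite each other.

```shell
# Sketch of a timestamped variant; the .backup-<timestamp> naming is an
# assumption for illustration, not part of the original alias.
backup_dir() {
  dest=".backup-$(date +%Y%m%d-%H%M%S)"
  mkdir -p "$dest" && cp -- * "$dest"/
}
```

Quoting "$dest" and using cp -- also keeps filenames that start with a dash safe.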

find . -type f | sed 's,.*,stat "&" | egrep "File|Modify" | tr "\\n" " " ; echo ,' | sh | sed 's,[^/]*/\(.*\). Modify: \(....-..-.. ..:..:..\).*,\2 \1,' | sort
for file in $(find -type f -iname "*wav"); do mv $file "$file"_orig.WAV; mplayer -ao pcm "$file"_orig.WAV -ao pcm:file=$file; done
alias clear='( for ((i=1;i<$LINES;i++)) ; do echo "" ; done ) ; clear'
2009-10-27 14:38:31
User: Marcio
Functions: alias echo
-3

If you receive a lot of compiler errors, type 'clear', then re-edit your code and press SHIFT+PGUP.

sed 's/[sub_str]/[sub_str]\n/g' [text_file] | wc -l
for i in $(find . -mtime +30); do mv $i old/; done
2014-02-05 01:24:45
User: valferon
Functions: find mv
Tags: bash file
-3

This moves every file in the current folder older than 30 days into the "old" folder.

Replace "mv $i old/" with another command, such as rm or echo, to do something different.
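One caveat: the unquoted $(find ...) loop breaks on filenames containing spaces. A more robust sketch (assuming GNU find, mv, and touch; run in a scratch directory so it doesn't touch real files):

```shell
# Create a scratch directory with one backdated and one fresh file.
demo=$(mktemp -d)
cd "$demo"
touch -d '40 days ago' 'old file.txt'   # GNU touch: fake an old mtime
touch new.txt
mkdir -p old
# -exec hands each filename to mv intact, so spaces survive;
# -maxdepth 1 keeps the search out of the old/ folder itself.
find . -maxdepth 1 -type f -mtime +30 -exec mv -t old/ {} +
```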

du | sort -n | tail -11 | head
2009-03-04 16:06:34
User: phage
Functions: du sort tail
-3

The pipe to head removes the listing of . as the largest directory.
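An equivalent sketch that states the intent more directly: sort descending, drop the first line (the "." total), and take the top ten, largest first. As a side note, this also drops "." when there are fewer than eleven entries, a case where the tail/head combination above no longer does.

```shell
# Sort largest-first, strip the "." grand-total line, show the top ten.
du | sort -rn | sed 1d | head
```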

for files in $(ls -A directory_name); do sed 's/search/replaced/g' $files > $files.new && mv $files.new $files; done;
2009-05-07 20:13:07
User: bassu
Functions: ls mv sed
-3

Yeah, there are many ways to do that.

Doing it with sed in a for loop is my favourite, because these are two basic tools in any *nix environment. By default sed does not let you save the output to the same file, so we use mv to do that in batch alongside sed.
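With GNU sed the mv step can be dropped entirely, since -i edits files in place (on BSD/macOS sed the flag takes a suffix argument: sed -i ''). A minimal sketch in a scratch directory:

```shell
demo=$(mktemp -d)
printf 'search me\n' > "$demo/file.txt"
# -i rewrites each file in place; no temporary .new files or mv needed.
sed -i 's/search/replaced/g' "$demo"/*
```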

TIMEUNIT=$( cat a | grep -n "timescale" | awk -F ":" '{ print $1 } ' )
sudo ping -f -s 56500 192.168.1.100
2010-01-27 17:42:33
User: alamati
Functions: ping sudo
-3

A ping flood is a simple DoS attack in which the attacker overwhelms the victim with ICMP Echo Request (ping) packets. It only succeeds if the attacker has more bandwidth than the victim (for instance, an attacker on a DSL line and a victim on a dial-up modem).

In this command, replace 192.168.1.100 with the victim's IP address.

chown -R webuser:webgroup /var/www/vhosts/domain.com/httpdocs
gconftool-2 --set /apps/metacity/global_keybindings/panel_main_menu --type string "Super_L"
synclient TouchPadOff=1
synclient TouchPadOff=0
grep 'HOME.*' data.txt | awk '{print $2}' | awk '{FS="/"}{print $NF}' OR USE ALTERNATE WAY awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'
2009-03-05 07:28:26
User: rommelsharma
Functions: awk grep
-3

grep 'HOME.*' data.txt | awk '{print $2}' | awk '{FS="/"}{print $NF}'

OR

awk '/HOME/ {print $2}' data.txt | awk -F'/' '{print $NF}'

In this example, we have a text file with several entries like:

---

c1 c2 c3 c4

this is some data

HOME /dir1/dir2/.../dirN/somefile1.xml

HOME /dir1/dir2/somefile2.xml

some more data

---

For lines starting with HOME, we extract the second field, which is a file path including the file name, and from that we need only the filename, ignoring the slash-delimited path.

The output would be:

somefile1.xml

somefile2.xml

(In case you give a -ive, please give the reasons as well and enlighten the souls :-) )
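The two pipelines above can also be collapsed into a single awk call (a sketch against sample data in the shape described; anchoring the pattern as /^HOME/ is a slight tightening of the original /HOME/):

```shell
# Sample data file in the shape described above.
demo=$(mktemp -d)
cat > "$demo/data.txt" <<'EOF'
c1 c2 c3 c4
this is some data
HOME /dir1/dir2/somefile1.xml
HOME /dir1/dir2/somefile2.xml
some more data
EOF
# Split on '/' so $NF is the last path component; print it for HOME lines.
awk -F'/' '/^HOME/ {print $NF}' "$demo/data.txt"
```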

cd !$
2009-05-08 09:48:14
Functions: cd
-3

During this operation:

# mv Joomla_1.5.10-Stable-Full_Package.zip /var/www/joomla/

I used /var/www/joomla/ as the last command argument. To change into that directory I can use

# cd !$

So I end up at

hob:/var/www/joomla#
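Note that !$ is history expansion, which only works in interactive shells. In scripts, bash's $_ (the last argument of the previous command) gives a similar effect; a small sketch assuming bash, with a hypothetical path for illustration:

```shell
mkdir -p /tmp/demo_joomla   # hypothetical path for illustration
cd "$_"                     # $_ expands to the last argument: /tmp/demo_joomla
```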

gzip *
2010-03-29 10:58:40
User: funky
Functions: gzip
Tags: gzip
-3

Should do exactly the same - compress every file in the current directory. You can even use it recursively:

gzip -r .
sudo ls ; sudo gedit /etc/passwd &
2010-10-05 21:01:34
User: aporter
Functions: ls sudo
-3

Take advantage of sudo keeping you authenticated for ~15 minutes.

The command is a little longer, but it does not require X (it can run on a headless server).

list the naming contexts of a directory server (no need to search in config files)
pon dsl-provider
grep -r "mystring" . |uniq | cut -d: -f1 | xargs sed -i "s/mystring//"
2009-04-09 12:49:01
Functions: cut grep sed uniq xargs
-3

Linux: this one-liner lets you edit multiple files and remove an exact phrase from all of them.
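An equivalent but more robust sketch (assuming GNU grep, xargs, and sed): grep -l already prints each matching file only once, so uniq and cut are unnecessary, and -Z with xargs -0 keeps filenames containing spaces intact. Shown against a scratch directory:

```shell
demo=$(mktemp -d)
printf 'keep mystring here\n' > "$demo/a file.txt"
# -r recurse, -l list each matching file once, -Z NUL-terminate for xargs -0.
grep -rlZ "mystring" "$demo" | xargs -0 sed -i 's/mystring//g'
```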

ls .[!.]*
2009-09-29 13:50:13
User: danam
Functions: ls
-3

Although rm is protected against it, many commands would wreak havoc if you entered the obvious ".*" to address dot-files. This sweet little expression excludes the directories "." and ".." that cause the problems.
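One caveat worth knowing (an edge case, not from the original post): .[!.]* misses names that begin with two dots, such as "..data". Adding the pattern ..?* covers those while still excluding "." and "..":

```shell
demo=$(mktemp -d)
cd "$demo"
touch .config ..data visible
# .[!.]* matches .config; ..?* matches ..data; neither matches . or ..
ls -d .[!.]* ..?* 2>/dev/null
```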

for ((i=0;i<5;i++)) ; do xpenguins & done