
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,491 results
for i in $(seq 1 11) 13 14 15 16; do man iso-8859-$i; done
2009-03-31 19:40:15
User: penpen
Functions: man seq
Tags: Linux unix
-2

Depending on the installation, only some of these man pages are installed. 12 is left out on purpose because ISO/IEC 8859-12 does not exist. To also access the man pages that are not installed, use Opera (or any other browser that supports all the character sets involved) to display the online versions hosted at kernel.org:

for i in $(seq 1 11) 13 14 15 16; do opera http://www.kernel.org/doc/man-pages/online/pages/man7/iso_8859-$i.7.html; done
perror NUMBER
2009-03-31 19:19:44
User: alperyilmaz
1

perror should be installed if the mysql-server package is installed
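For example (exact wording varies by MySQL version):

perror 13

prints something like "OS error code 13: Permission denied".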

wget --server-response --spider http://www.example.com/
2009-03-31 18:49:14
User: penpen
Functions: wget
4

I suggest using wget to obtain only the HTTP header as a last resort, because it generates considerable textual overhead. The first ellipsis of the sample output stands for

Spider mode enabled. Check if remote file exists.

--2009-03-31 20:42:46-- http://www.example.com/

Resolving www.example.com... 208.77.188.166

Connecting to www.example.com|208.77.188.166|:80... connected.

HTTP request sent, awaiting response...

and the second one stands for

Length: 438 [text/html]

Remote file exists and could contain further links,

but recursion is disabled -- not retrieving.

lynx -dump -head http://www.example.com/
2009-03-31 18:41:36
User: penpen
-1

Without the -dump option the header is displayed inside lynx. You can also use w3m; the command is then

w3m -dump_head http://www.example.com/
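curl is another lightweight option if it is installed; -I sends a HEAD request and prints only the response headers:

curl -sI http://www.example.com/
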
ssh -vN hostname 2>&1 | grep "remote software version"
2009-03-31 18:28:41
User: sud0er
Functions: grep hostname ssh
Tags: ssh
1

I used this to confirm an upgrade to an SSH daemon was successful
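If all you need is the server's version banner, you can also read it straight off port 22 without a full SSH handshake (a sketch; exact behaviour varies a little between netcat implementations):

nc -w 3 hostname 22 < /dev/null | head -n 1

which prints something like SSH-2.0-OpenSSH_5.1.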

date +%m/%d/%y%X | tr -d '\n' >> datemp.log && sensors | grep +5V | cut -d "(" -f1 | tr -d '\n' >> datemp.log && sensors | grep Temp | cut -d "(" -f1 | tr -d '\n' >> datemp.log
2009-03-31 18:13:23
User: f241vc15
Functions: cut date grep sensors tr
0
cat datemp.log

04/01/0902:11:42

Sys Temp: +11.0°C

CPU Temp: +35.5°C

AUX Temp: +3.0°C
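The same information can be appended as one record per run with a slightly restructured pipeline (a sketch, assuming the same sensors output as above):

{ date '+%m/%d/%y %X'; sensors | grep '+5V' | cut -d '(' -f1; sensors | grep Temp | cut -d '(' -f1; } | tr '\n' ' ' >> datemp.log; echo >> datemp.log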

find . -type f -exec grep StringToFind \{\} --with-filename \;|sed -e '/svn/d'|sed -e '/~/d'
2009-03-31 18:09:31
User: f241vc15
Functions: find grep sed
-3

Look for a string in your source code, excluding svn files and backup files ending in ~. This is useful when you are searching for a particular string in source code inside a directory that is under version control (e.g. svn), while keeping the annoying ~ backup files out of the results. You can also change the command after -exec to delete (rm) or view (cat) the files found by find, for example.
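If GNU grep is available, a recursive grep with exclusions is a shorter alternative to the find/sed pipeline (a sketch; adjust the pattern and excludes as needed):

grep -rIn --exclude-dir=.svn --exclude='*~' StringToFind .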

curlftpfs ftp://YourUsername:YourPassword@YourFTPServerURL /tmp/remote-website/ && rsync -av /tmp/remote-website/* /usr/local/data_latest && umount /tmp/remote-website
2009-03-31 18:01:00
User: nadavkav
Functions: rsync umount
7

Connect to a remote server over FTP using a FUSE filesystem, rsync the remote folder to a local one, then unmount the remote FTP server (FUSE FS).

It can be divided into 3 separate commands (shown below), and you need curlftpfs and rsync installed.
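Broken into its 3 separate commands (same placeholders as above):

curlftpfs ftp://YourUsername:YourPassword@YourFTPServerURL /tmp/remote-website/
rsync -av /tmp/remote-website/* /usr/local/data_latest
umount /tmp/remote-website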

wget --http-user=YourUsername --http-password=YourPassword http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-`date +"%-m-%d-%Y"`.tar.gz
2009-03-31 17:50:41
User: nadavkav
Functions: wget
4

This will connect to your hosted website service through the cPanel interface and use its backup tool to back up and download the entire website locally.

(Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)

nmap -PN -T4 -p139,445 -n -v --script=smb-check-vulns --script-args safe=1 192.168.0.1-254
2009-03-31 15:15:17
User: cowholio4
6

This was posted on reddit. Replace 192.168.0.1-254 with the IPs you want to check.

awk '{print > $3".txt"}' FILENAME
2009-03-31 15:14:13
User: alperyilmaz
Functions: awk
2

This command will split the contents of FILENAME into individual .txt files, using the 3rd column as the key and the file name. If FILENAME contents are as follows:

foo foo A foo

bar bar B bar

lorem ipsum A lorem

Then two files called A.txt and B.txt will be created and their contents will be:

A.txt

foo foo A foo

lorem ipsum A lorem

and B.txt will be

bar bar B bar
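Note that if column 3 has many distinct values, awk can hit the per-process limit on open files; a variant that closes each output file after writing (at the cost of some speed) is:

awk '{print >> ($3".txt"); close($3".txt")}' FILENAME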

rev <<< 'lorem ipsum' | tee /dev/stderr | rev
2009-03-31 13:12:09
User: penpen
Functions: rev tee
Tags: Linux unix
2

In the above example 'muspi merol' (the output of the first rev command) is sent to stderr and 'lorem ipsum' (the output of the second rev command) is sent to stdout. rev reverses each line of a file or files. This use of tee lets you test whether a program correctly handles its input without using files to hold the data.
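To capture the two streams separately and verify the behaviour, something like this works (stdout.txt and stderr.txt are arbitrary file names):

{ rev <<< 'lorem ipsum' | tee /dev/stderr | rev; } > stdout.txt 2> stderr.txt

Afterwards stdout.txt holds 'lorem ipsum' and stderr.txt holds 'muspi merol'.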

ssh user@host "(cd /path/to/remote/top/dir ; tar cvf - ./*)" | tar xvf -
2009-03-31 13:08:45
User: dopeman
Functions: ssh tar
Tags: copy files
1

This command will copy files and directories from a remote machine to the local one.

Ensure you are in the local directory you want to populate with the remote files before running the command.

To copy a directory and its contents, you could:

ssh user@host "(cd /path/to/a/directory ; tar cvf - ./targetdir)" | tar xvf -

This is especially useful on *nixes that don't have scp installed by default.
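The same trick works in the other direction to push local files to a remote machine, for example:

tar cvf - ./* | ssh user@host "(cd /path/to/remote/top/dir ; tar xvf -)"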

ps aux | sort +2n | tail -20
2009-03-31 12:03:34
User: dopeman
Functions: ps sort tail
3

This command will show the 20 processes using the most CPU time (hungriest at the bottom).

You can see the 20 most memory intensive processes (hungriest at the bottom) by running:

ps aux | sort +3n | tail -20

Or, run both:

echo "CPU:" && ps aux | sort +2n | tail -20 && echo "Memory:" && ps aux | sort +3n | tail -20
man ascii
[[ $(COLUMNS=200 ps faux | awk '/grep/ {next} /ssh -N -R 4444/ {i++} END {print i}') ]] || nohup ssh -N -R 4444:localhost:22 user@relay &
2009-03-31 09:39:59
User: j0rn
Functions: awk nohup ps ssh
Tags: ssh cronjob
4

I find it ugly and sexy at the same time, don't you?
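If pgrep is available, an arguably cleaner way to express the same check is (a sketch, matching on the same command line):

pgrep -f 'ssh -N -R 4444' > /dev/null || nohup ssh -N -R 4444:localhost:22 user@relay &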

rm -rf ~/.local/share/Trash/files/*
date -d @$(echo $((2 ** 31 - 1)))
2009-03-30 19:42:20
User: jnash
Functions: date echo
1

http://en.wikipedia.org/wiki/Year_2038_problem

Some other notable dates that have passed:

date -d@1234567890
date -d@1000000000
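For reference, running these with -u (assuming GNU date) gives the UTC values; local output differs by timezone:

date -u -d @2147483647   # Tue Jan 19 03:14:07 UTC 2038
date -u -d @1234567890   # Fri Feb 13 23:31:30 UTC 2009
date -u -d @1000000000   # Sun Sep  9 01:46:40 UTC 2001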
alias rot13="tr '[A-Za-z]' '[N-ZA-Mn-za-m]'"
2009-03-30 19:08:49
User: penpen
Functions: alias
Tags: Linux unix
8

rot13 maps a..m (A..M) to n..z (N..Z) and n..z (N..Z) back to a..m (A..M), and so does this alias.
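For example, once the alias is defined in the current shell:

echo 'Uryyb, jbeyq!' | rot13

prints 'Hello, world!', and applying rot13 twice gives back the original text.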

mkdir() { /bin/mkdir $@ && eval cd "\$$#"; }
alias l='ls -CFlash'
2009-03-30 17:11:31
Functions: alias
2

Create a short alias for 'ls' with multi-column output (-C), file type indicators such as slashes after directories and @ for symlinks (-F), long format (-l), all entries including hidden ones like ./, ../ and .svn (-a), file-system blocks actually in use (-s), and human-readable file sizes (-h).

cat /var/log/secure | grep smtp | awk '{print $9}' | cut -f2 -d= | sort | uniq -c | sort -n | tail
2009-03-30 15:49:54
User: empulse
Functions: awk cat cut grep sort uniq
-2

Searches /var/log/secure for smtp connections, then lists the connecting hosts by the number of connections made.

cat /var/log/secure | grep sshd | grep Failed | sed 's/invalid//' | sed 's/user//' | awk '{print $11}' | sort | uniq -c | sort -n
2009-03-30 15:48:24
User: empulse
Functions: awk cat grep sed sort sshd uniq
8

Searches the /var/log/secure log file for failed and/or invalid-user login attempts.

whiptail --checklist "Simple checkbox menu" 11 35 5 tag item status repeat tags 1
2009-03-30 12:21:48
Tags: ncurses
6

Not so much handy by itself, but very nice in shell scripts.

This makes you a handy ncurses based checklist. Much like terminal installers, just use the arrow keys and hit 'Space' to adjust the selections. Returns all selected tags as strings, with no newline at the end. So, your output will be something like:

"one" "two" "three" "four" "etc"

For those who prefer bash expansion over gratuitous typing:

whiptail --checklist "Simple checkbox menu" 12 35 3 $(echo {one,two,three,four}" '' 0")

Things to note:

The height must include the outer border and padding: add 7 to however many items you want to show at the same time.

If the status is 1, it will be selected by default; anything else leaves it deselected.
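whiptail prints the chosen tags on stderr, so a common idiom for capturing them in a script is to swap the file descriptors (a sketch with made-up tags and items):

choices=$(whiptail --checklist "Simple checkbox menu" 11 35 4 one "Item one" 0 two "Item two" 1 three "Item three" 0 four "Item four" 0 3>&1 1>&2 2>&3)

After this, $choices holds something like "two".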

du -xk | sort -n | tail -20
2009-03-30 11:37:43
User: dopeman
Functions: du sort tail
7

This command will show the 20 biggest directories under your working directory, skipping directories on other filesystems. Useful for resolving disk space issues.
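With GNU coreutils you can get the same list with human-readable sizes (sort -h is GNU-specific):

du -xh | sort -h | tail -20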