
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - All commands - 12,422 results
ssh somemachine "cd somedir; tar zcpf - somedirname" | tar zxpf -
alias counts='sort | uniq -c | sort -nr'
mplayer -ao pcm -vo null -vc dummy -dumpaudio -dumpfile <output-file> <input-file>
diff <(sort file1) <(sort file2)
2009-02-04 22:20:13
User: systemj
Functions: diff sort
90

bash/ksh process substitution: the sorted contents of each file are presented to diff as file descriptors, so you can compare the sorted versions without creating temporary files.
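
A minimal sketch with hypothetical file names:

printf 'b\na\nc\n' > list1
printf 'a\nc\nd\n' > list2
diff <(sort list1) <(sort list2)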

echo "rm -rf /unwanted-but-large/folder" | batch
2009-02-04 19:07:52
User: root
Functions: echo
41

Good for one-off jobs that you want to run at a quiet time. The default threshold is a load average of 0.8, but this can be set using atrun.
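
To see what is queued, and to cancel a job if you change your mind (the job number comes from atq and is hypothetical here):

atq
atrm 4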

realpath examplefile.txt
2009-02-04 15:41:44
User: root
5

Useful in scripts when the file is passed in as an argument. Eg.

filepath=$(realpath "$1")
ssh user@host cat /path/to/remotefile | diff /path/to/localfile -
2009-02-04 11:33:19
User: root
Functions: cat diff ssh
162

Useful for checking if there are differences between local and remote files.
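
If your shell supports process substitution (as in the diff <(sort ...) entry above), a sketch of an equivalent form that gives a unified diff:

diff -u /path/to/localfile <(ssh user@host cat /path/to/remotefile)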

zcat /usr/share/man/man1/grep.1.gz | grep "color"
2009-02-04 09:38:45
User: root
Functions: grep zcat
-7

This decompresses the file and sends the output to STDOUT so it can be grepped. A good one to put in loops for searching directories of gzipped files, such as man pages.
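
A sketch of that loop idea, assuming the standard man page location:

for f in /usr/share/man/man1/*.gz; do zcat "$f" | grep -q "color" && echo "$f"; done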

bind -p | grep -F "\C"
2009-02-03 16:22:14
User: root
Functions: grep
5

Useful for getting to know the available keyboard shortcuts.
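
For example, to see what Ctrl-r is bound to (assuming bash's readline notation):

bind -p | grep -F '"\C-r"'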

function sshdel { perl -i -n -e "print unless (\$. == $1)" ~/.ssh/known_hosts; }
2009-02-03 16:20:50
User: xsawyerx
Functions: perl
-1

Sometimes you get conflicts when using SSH (a host changes IP, or an IP now belongs to a different machine) and you need to edit known_hosts and remove the offending line. This function makes that much easier.
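
For example, if ssh warns about an offending key on line 42 of known_hosts (line number hypothetical):

sshdel 42

On OpenSSH systems, ssh-keygen -R hostname achieves the same thing by host name rather than line number.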

(> errors.log) && tail -f !^
2009-02-03 16:08:19
User: root
Functions: tail
2

This is useful for keeping an eye on an error log while developing. The !^ pulls the first arg from the previous command (which needs to be run in a sub-shell for this shortcut to work).
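
A sketch of the same effect without relying on history expansion, using the explicit file name:

: > errors.log && tail -f errors.log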

cat /etc/issue
iptables -A INPUT -s 222.35.138.25/32 -j DROP
2009-02-02 12:42:04
User: root
Functions: iptables
10

This appends (-A) a new rule to the INPUT chain, which specifies to drop all packets from a source (-s) IP address.
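
To review the rule later, or to remove it again (assuming the same chain and source):

iptables -L INPUT -n --line-numbers
iptables -D INPUT -s 222.35.138.25/32 -j DROP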

wget -O - http://www.commandlinefu.com/commands/browse/rss 2>/dev/null | awk '/\s*<title/ {z=match($0, /CDATA\[([^\]]*)\]/, b);print b[1]} /\s*<description/ {c=match($0, /code>(.*)<\/code>/, d);print d[1]"\n"} '
curl --basic --user "user:pass" --data-ascii "status=tweeting%20from%20the%20linux%20command%20line" http://twitter.com/statuses/update.json
2009-01-30 18:08:35
User: g__j
2

Great for posting tweets from cron jobs and batch scripts.

echo -e "[mysql]\npager=less -niSFX" >> ~/.my.cnf
2009-01-29 11:45:52
User: boombastic
Functions: echo
5

Sets 'less' as the pager for the standard mysql command-line client.

In other words, it makes query results much easier to read and scroll through.
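
To try the same pager for a single session without touching ~/.my.cnf, it can also be set from inside an interactive mysql client:

pager less -niSFX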

tar czv file1 file2 folder1 | ssh user@host tar zxv -C /destination
2009-01-29 10:38:26
User: xsawyerx
Functions: ssh tar
13

It compresses the files and folders to stdout, securely copies that stream to the server's stdin, and runs tar there to extract the input to whatever destination is given with -C. If you omit "-C /destination", it will extract into the home folder of the user, much like `scp file user@host:`.

The "v" in the tar commands can be dropped if you don't want the verbose file listing.

svn status |grep '\?' |awk '{print $2}'| xargs svn add
2009-01-29 10:33:22
User: xsawyerx
Functions: xargs
13

Checks which files are not under version control, extracts their names and runs them through "svn add". WARNING: does not work with filenames containing whitespace.
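
A sketch of a whitespace-tolerant variant (handles names with internal spaces):

svn status | grep '^?' | sed 's/^?[[:space:]]*//' | while IFS= read -r f; do svn add "$f"; done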

ping google.com | tee ping-output.txt
2009-01-29 10:26:59
User: root
Functions: ping tee
2

The tee (as in "T" junction) command is very useful for redirecting output to two places.
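
If you want to keep earlier results rather than overwrite the file, tee's -a flag appends:

ping google.com | tee -a ping-output.txt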

perl -pi.bk -e's/foo/bar/g' file1 file2 fileN
2009-01-29 09:51:11
User: xsawyerx
Functions: perl
10

Adding ".bk" to the regular "pie" idiom makes Perl create a backup of each file with the extension ".bk", in case the substitution borks something and you want the original back.

perl -pi -e's/foo/bar/g' file1 file2 fileN
2009-01-29 09:47:01
User: xsawyerx
Functions: perl
0

The "g" at the end is for global, meaning replace all occurrences and not just the first one.

reset
2009-01-28 22:22:01
User: root
Functions: reset
298

If you bork your terminal by sending binary data to STDOUT or similar, you can get your terminal back using this command rather than killing and restarting the session. Note that you often won't be able to see the characters as you type them.

git log master | awk '/commit/ {id=$2} /\s+\w+/ {print id, $0}'
2009-01-28 13:32:08
User: root
Functions: awk
3

Useful when quickly looking for a commit id from a branch to use with git cherry-pick.
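
For example (the grep pattern and the resulting commit id are hypothetical), filter the list and feed the id to cherry-pick:

git log master | awk '/commit/ {id=$2} /\s+\w+/ {print id, $0}' | grep -i 'typo'
git cherry-pick <commit-id>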

URL=www.example.com && wget -rq --spider --force-html "http://$URL" && find $URL -type d > url-list.txt && rm -rf $URL
2009-01-27 17:59:08
User: root
Functions: find rm wget
1

This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run and you may want to throttle the spidering so as to play nicely.
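
To throttle it, GNU wget's --wait option inserts a pause (in seconds) between requests, e.g.:

wget -rq --wait=1 --spider --force-html "http://$URL"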

wget -r -l1 --no-parent -nH -nd -P/tmp -A".gif,.jpg" http://example.com/images
2009-01-27 17:31:22
User: root
Functions: wget
35

This recursively downloads all images from the given page to your /tmp directory. The -nH and -nd switches stop wget from recreating the remote host and directory structure locally.
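
A hedged variation, assuming GNU wget: -A takes a comma-separated list, so you can accept more extensions, and --wait spaces out the requests:

wget -r -l1 --no-parent -nH -nd -P/tmp -A".gif,.jpg,.jpeg,.png" --wait=1 http://example.com/images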