All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Check These Out

Install pip with Proxy
Installs pip packages while going through a proxy server
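The one-liner itself isn't reproduced here; a minimal sketch using pip's --proxy option (the proxy address, credentials and package name are placeholders) might look like:

  pip install --proxy http://user:password@proxyserver:8080 somepackage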

Mark packages installed with build-dep for autoremove (on Debian/Ubuntu)
Replace PACKAGE with the desired package name. Found here: http://mikebeach.org/2011/04/undo-apt-get-build-dep/
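The posted command isn't shown here; one possible sketch pulls the Build-Depends list with apt-cache and flags those packages as automatically installed with apt-mark (PACKAGE is a placeholder, and the sed chain strips version constraints, architecture qualifiers and alternatives):

  sudo apt-mark auto $(apt-cache showsrc PACKAGE | sed -n 's/^Build-Depends: //p' | sed -e 's/([^)]*)//g' -e 's/\[[^]]*\]//g' -e 's/|[^,]*//g' -e 's/,/ /g')

Once marked auto, apt-get autoremove can remove them when nothing else depends on them.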

quickly backup or copy a file with bash
Fewer symbols to type, and tab completion works. Putting export SIMPLE_BACKUP_SUFFIX="_`date +%F`" in your .bashrc lets you easily timestamp your backup files.
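The posted command isn't shown above; a minimal sketch of the idea uses GNU cp's -b (backup) option, which honours SIMPLE_BACKUP_SUFFIX (the file names here are placeholders):

  export SIMPLE_BACKUP_SUFFIX="_`date +%F`"   # e.g. in ~/.bashrc
  cp -bv new.conf /etc/app/app.conf           # the existing app.conf is kept, renamed with the dated suffix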

move messages directly from one IMAP inbox to another
This one-liner was useful in helping someone I know to get off of MS Exchange. `mailutil` proved to be a much better alternative than `fetchmail` or `getmail` in this case. It quickly moved all mails to the destination server (a simple Dovecot/Maildir setup), with no need to convert back and forth between mbox/maildir on the user's own system.
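The one-liner isn't reproduced above; a rough sketch, assuming mailutil's transfer command and c-client style "{host/user=name}mailbox" names (the host names and user are placeholders):

  mailutil transfer '{imap.oldhost.example/ssl/user=jdoe}INBOX' '{imap.newhost.example/ssl/user=jdoe}INBOX'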

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
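For example, with lsof (not necessarily the exact command posted):

  sudo lsof -iTCP:80 -sTCP:LISTEN      # processes listening on TCP port 80
  sudo lsof -i :http                   # service names work too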

Show changed files, ignoring permission, date and whitespace changes
Only shows files whose text actually changed (excluding whitespace). Useful if you've messed up permissions or transferred files in from Windows or something like that, so that you can get a list of the genuinely changed files and clean up the rest.
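The posted command isn't reproduced here; assuming the files live in a git working tree, one sketch with the same effect combines -w with --numstat, so files whose only differences are whitespace (or just a mode bit) report 0 added / 0 deleted lines and get filtered out:

  git diff -w --numstat | awk '$1 != "0" || $2 != "0" {print $3}'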

Testing ftp server status
I have to monitor a couple of FTP servers every morning, without using a port scanner. Instead of FTPing to 100 FTP servers manually to test their status, I use this loop. It might be adaptable to other services, though it may require a different 'logout' string instead of 'quit'. The file ftps.txt contains the full list of FTP servers to monitor.
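The loop itself isn't shown above; a sketch using netcat, checking each host in ftps.txt for the FTP "220" greeting (the 5-second timeout is arbitrary):

  while read -r host; do echo quit | nc -w 5 "$host" 21 | grep -q '^220' && echo "$host up" || echo "$host DOWN"; done < ftps.txt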

ssh tunnel with auto reconnect ability
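The posted command isn't shown here; common approaches are autossh or a plain retry loop (host, ports and intervals below are placeholders):

  autossh -M 0 -N -o 'ServerAliveInterval 30' -o 'ServerAliveCountMax 3' -L 8080:localhost:80 user@remotehost
  while true; do ssh -N -L 8080:localhost:80 user@remotehost; sleep 5; done   # poor man's alternative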

Insert a line for each n lines
Especially useful for SQL scripts with INSERT/UPDATE statements, to add a COMMIT command after every n statements.
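A possible sketch with awk (not necessarily the posted command; the modulus and inserted text are just examples), adding COMMIT; after every 100 lines:

  awk '{print} NR % 100 == 0 {print "COMMIT;"}' inserts.sql > inserts_with_commits.sql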

Insert commas to make reading numbers easier in the output of ls
This modifies the output of ls so that the file size has commas every three digits. It makes room for the commas by destructively eating any characters to the left of the size, which is probably okay since that's just the "group". Note that I did not write this; I merely cleaned it up and shortened it with extended regular expressions. The original shell script, entitled "sl", came with this description:

: '
: For tired eyes (sigh), do an ls -lF plus whatever other flags you give
: but expand the file size with commas every 3 digits. Really helps me
: distinguish megabytes from hundreds of kbytes...
:
: Corey Satten, corey@cac.washington.edu, 11/8/89
: '

Of course, some may suggest that fancy new "human friendly" options, like "ls -Shrl", have made Corey's script obsolete. They are probably right. Yet, at times, I still find it handy. The new-fangled "human-readable" numbers can be annoying when I have to glance at the letter at the end to figure out what order of magnitude is even being talked about (there's a big difference between 386M and 386P!). But with this nifty script, the number itself acts like a histogram, a quick visual indicator of "bigness" for tired eyes. :-)
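The original sed-based script isn't reproduced above; a rough awk sketch of the same idea, grouping the digits of the size column (it sacrifices column alignment rather than eating the group field):

  sl() {
      ls -lF "$@" | awk '
          function commafy(n, out) {
              while (length(n) > 3) {
                  out = "," substr(n, length(n)-2) out
                  n = substr(n, 1, length(n)-3)
              }
              return n out
          }
          $5 ~ /^[0-9]+$/ { $5 = commafy($5) }
          { print }'
  }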


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).