All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

phpdoc shortcut
A shortcut to generate documentation with phpdoc. It defaults to HTML output, or to PDF if a third argument is given, and stores the documentation in the current working directory under ./docs/. I always forget the syntax of the output (-o) option, so this is easier.
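The shortcut itself isn't reproduced here; a rough sketch of what it might look like as a bash function, assuming the classic phpDocumentor 1.x CLI (the argument layout and converter names are guesses, not the original command):

$ phpdocs() { local fmt="HTML:frames:default"; [ -n "$3" ] && fmt="PDF:default:default"; phpdoc -d "${1:-.}" -t ./docs/ -o "$fmt"; }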

add the result of a command into vi
':r!ls -l' runs ls -l and pastes its output (a listing of the files in the current directory) into the current vi buffer.

Generate a random strong password from the shell
Generates a random, strong password from the command line.
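The entry's command isn't shown here; one common way to do it, as a sketch:

$ tr -dc 'A-Za-z0-9_@#%' < /dev/urandom | head -c 16; echo

or, more simply:

$ openssl rand -base64 16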

Find the default gateway (works properly with PPP connections too)
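The command isn't shown here; a sketch of one common approach with the iproute2 tools (on a PPP link the gateway field can look different, which is presumably what the original command handles):

$ ip route show default | awk '{print $3}'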

Empty a file
The downside of output redirection is that you need write permission on the file, so something like $ > file won't play nicely with sudo. You'd need to do something like $ sudo bash -c '> file'. Instead, you can go with $ sudo truncate -s0 file

Android PNG screenshot
Works with *rooted* Android devices. 400x800 are the screen dimensions of a typical handheld smartphone.
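The command isn't reproduced here; a sketch of the usual framebuffer approach on a rooted device (the pixel format and the 400x800 size depend on the handset, so adjust -pix_fmt and -s to match yours):

$ adb pull /dev/graphics/fb0 fb0.raw && ffmpeg -f rawvideo -pix_fmt rgba -s 400x800 -i fb0.raw -vframes 1 screenshot.png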

Fastest segmented parallel sync of a remote directory over ssh
Mirror a remote directory using some tricks to maximize network speed.
lftp: the coolest file transfer tool ever
-u: username and password ('pwd' is merely a placeholder if you have ~/.ssh/id_rsa)
-e: execute internal lftp commands
set sftp:connect-program: use a specific command instead of plain ssh
ssh options: -a -x -T disable useless features, -c arcfour uses the most efficient cipher, -o Compression=no disables compression to save CPU
mirror: copy the remote dir subtree to a local dir
-v: be verbose (cool progress bar and speed meter, one for each file in parallel)
-c: continue interrupted file transfers if possible
--loop: repeat mirror until no differences are found
--use-pget-n=3: transfer each file with 3 independent parallel TCP connections
-P 2: transfer 2 files in parallel (6 TCP connections in total)
sftp://remotehost:22: use the sftp protocol on port 22 (any other port can be given if appropriate)
You can play with the values of --use-pget-n and/or -P to achieve maximum speed on a particular network. If the files are compressible, removing "-o Compression=no" can be beneficial. It's best to create an alias for the command.
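Putting the pieces described above together, the full command presumably looks something like this (user, pwd, the two paths and remotehost are placeholders):

$ lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir /local/dir; quit" sftp://remotehost:22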

kill all processes using a directory/file/etc
This command will kill all processes using a directory. It's quick and dirty. One may also use a -9 with kill in case regular kill doesn't work. This is useful if one needs to umount a directory.
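The command itself isn't shown here; a quick-and-dirty sketch with lsof (the path is a placeholder, and you can swap kill for kill -9 if the regular kill doesn't work):

$ kill $(lsof -t +D /path/to/dir)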

Create the four oauth keys required for a Twitter stream feed
Twitter stream feeds now require authentication. This command is the FIRST in a set of five commands you'll need to get Twitter authorization for your final Twitter command.

*** IMPORTANT *** Before you start, you have to get some authorization info for your "app" from Twitter. Carefully follow the instructions below: go to dev.twitter.com/apps and choose "Create a new application". Fill in the form; you can pick any name for your app. After submitting, click on "Create my access token". Keep the resulting page open, as you'll need information from it below. If you closed the page, or want to get back to it in the future, just go to dev.twitter.com/apps.

Now customize FIVE THINGS on the command line as follows:
1. Replace the string "Consumer key" by copying & pasting your custom consumer key from the Twitter apps page.
2. Replace the string "Consumer secret" by copying & pasting your consumer secret from the Twitter apps page.
3. Replace the string "Access token" by copying & pasting your access token from the Twitter apps page.
4. Replace the string "Access token secret" by copying & pasting your own token secret from the Twitter apps page.
5. Replace the string 19258798 with the Twitter UserID NUMBER (this is **NOT** the normal Twitter NAME) of the user you want the tweet feed from. If you don't know the UserID number, head over to www.idfromuser.com and type in the user's regular Twitter name; the site will return their Twitter UserID number to you. 19258798 is the Twitter UserID for commandlinefu, so if you don't change it, you'll receive commandlinefu tweets, uhm... on the command line :)

Congratulations! You're done creating all the keys! Environment variables k1, k2, k3 and k4 now hold the four Twitter keys you will need for your next step. The variables should really have been named better, e.g. "Consumer_key", but in later commands the 256-character limit forced me to use short, unclear names here. Just remember that k stands for "key". Again, remember that you can always review your requested Twitter keys at dev.twitter.com/apps.

The command line also creates four additional environment variables that are needed in the oauth process: "once", "ts", "hmac" and "id". "once" is a random number used only once that is part of the oauth procedure. "hmac" is the actual key that will be used later for signing the base string. "ts" is a timestamp in the Posix time format. The last variable, "id", is the user id number of the Twitter user you want to get feeds from. Note that id is ***NOT*** the Twitter name; if you didn't know that, see www.idfromuser.com. If you want to learn more about oauth authentication, visit oauth.net and/or go to dev.twitter.com/apps, click on any of your apps and then click on "OAuth tool".

Now go look at my next command, i.e. step 2, to see what happens next to these eight variables.
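The one-liner itself isn't reproduced here; a plausible sketch of the variables it sets, reconstructed from the description (the quoted key strings are the placeholders you replace in steps 1-4, 19258798 is the UserID from step 5, and the exact way "once" and "hmac" are built is an assumption based on how OAuth 1.0 nonces and signing keys are usually formed):

$ k1="Consumer key"; k2="Consumer secret"; k3="Access token"; k4="Access token secret"; id=19258798; ts=$(date +%s); once=$(date +%s%N | md5sum | head -c 32); hmac="$k2&$k4"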

Instantly load bash history of one shell into another running shell
By default, a shell's bash history is appended to the history file only after that shell exits (appending is the default on Ubuntu: look for 'shopt -s histappend' in ~/.bashrc). Even after the history has been written to the file, other running shells do *not* pick it up - only newly launched shells do. This pair of commands alleviates that.
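The pair isn't shown here; a likely reconstruction using the bash history builtin - run the first in the shell whose history you want to share and the second in the shell that should pick it up:

$ history -a   # shell A: append its new history lines to the history file
$ history -n   # shell B: read any history-file lines it hasn't seen yet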


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

Subscribe to the feed for: