Useful for getting to know the available keyboard shortcuts.
Sometimes you get SSH host-key conflicts (a host changed its IP, or the IP now belongs to a different machine) and you need to remove the offending line from known_hosts. This does it much more easily than editing the file by hand.
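The original command isn't reproduced here, but ssh-keygen's -R option does exactly this job, removing all keys for a host from known_hosts (the hostname is just a placeholder):
ssh-keygen -R badhost.example.com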
This is useful for keeping an eye on an error log while developing. The !^ pulls the first arg from the previous command (which needs to be run in a sub-shell for this shortcut to work).
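As a rough illustration of the !^ word designator (the log path is only an example):
less /var/log/apache2/error.log
tail -f !^
Here !^ expands to /var/log/apache2/error.log, the first argument of the previous command.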
This appends (-A) a new rule to the INPUT chain that drops all packets from the given source (-s) IP address.
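A typical instance of such a rule, with 192.168.1.100 standing in for the offending address:
iptables -A INPUT -s 192.168.1.100 -j DROP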
Self-referential use of wget.
Great for sending tweets from cron jobs and batch scripts.
Pipes the standard mysql client output through 'less'. In other words, it makes the query results of the mysql command-line client much easier to read.
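One way to get this behaviour, assuming the goal is the client's pager setting, is from within the mysql prompt:
mysql> pager less -SFX
or persistently via ~/.my.cnf:
[mysql]
pager = less -SFX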
It compresses the files and folders to stdout, secure-copies them to the server's stdin, and runs tar there to extract the input to whatever destination is given with -C. If you omit "-C /destination", it extracts to the user's home folder, much like `scp file user@server:`. The "v" in the tar command can be removed for no verbosity.
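A sketch of the idea, with the folder, host and destination made up for illustration:
tar czf - ./myfolder | ssh user@server "tar xzf - -C /destination"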
Checks which files are not under version control, fetches the names and runs them through "svn add". WARNING: doesn't work with file names containing whitespace.
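A common form of this pipeline (the awk column assumes svn status's usual output format):
svn status | grep '^?' | awk '{print $2}' | xargs svn add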
The tee (as in "T" junction) command is very useful for redirecting output to two places.
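For example, to watch output on the terminal while also saving it to a file (the filename is arbitrary):
ls -l | tee listing.txt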
The addition of ".bk" to the regular "pie" idiom makes perl create a backup of every file with the extension ".bk", in case it b0rks something and you want the original back.
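A minimal sketch of the idiom, with the substitution pattern invented for illustration:
perl -pi.bk -e 's/foo/bar/g' *.txt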
The "g" at the end is for global, meaning replace all occurrences and not just the first one.
Useful for quickly looking up a commit id from a branch to use with git cherry-pick.
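For instance, to list short commit ids from another branch (the branch name is illustrative):
git log --oneline somebranch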
This spiders the given site without saving the HTML content. The resulting output is then parsed into a list of URLs in url-list.txt. Note that this can take a long time to run and you may want to throttle the spidering so as to play nicely.
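The original command isn't shown here; one way to get a similar URL list is to parse wget's own log output (the URL and depth are placeholders):
wget --spider --force-html -r -l2 http://example.com 2>&1 | grep '^--' | awk '{print $3}' > url-list.txt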
This recursively downloads all images from a given website to your /tmp directory. The -nH and -nd switches stop wget from recreating the site's directory structure locally.
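One plausible form of the command (the URL and extension list are placeholders):
wget -r -nH -nd -P /tmp -A jpg,jpeg,png,gif http://example.com/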
This runs the uptime command every 30 seconds to avoid an SSH connection dropping due to inactivity. Granted there are better ways of solving this problem but this is sometimes the right tool for the job.
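Two simple ways to do this; which one the original uses isn't shown here:
watch -n 30 uptime
while true; do uptime; sleep 30; done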
The --force option bypasses the warning if files are already in SVN.
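Typically used to add everything in the working copy in one go:
svn add * --force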
This counts the number of httpd processes running.
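One common way to count them (the bracket trick keeps grep from matching its own process):
ps aux | grep '[h]ttpd' | wc -l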
Simple use of find and grep to recursively search a directory for files that contain a certain term.
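A sketch of the pattern, with the search term and file glob as placeholders:
find . -type f -name '*.php' -exec grep -l 'searchterm' {} +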
You will be prompted for a password unless you have your public keys set up.
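Setting up key-based logins so the password prompt goes away is usually just (assuming a key pair already exists under ~/.ssh):
ssh-copy-id user@remotehost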
Really useful for when you have a typo in a previous command. Also, the replacement defaults to empty, so if you accidentally run:
echo "no typozs"
you can correct it with
^z
Uses the shell's brace expansion to create a backup called file.txt.bak.
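That is, something along these lines:
cp file.txt{,.bak}
which the shell expands to cp file.txt file.txt.bak.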
This grabs all lines that make an instantiation or static call, then filters out the cruft and displays a summary of each class called and the frequency.
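A rough approximation of such a pipeline for PHP sources (the pattern and path are guesses, not the original command):
grep -rhoE 'new [A-Z][A-Za-z0-9_]*|[A-Z][A-Za-z0-9_]*::' src/ | sort | uniq -c | sort -rn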
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: