This appends (-A) a new rule to the INPUT chain, which specifies to drop all packets from a source (-s) IP address.
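A minimal sketch, using a placeholder address from the TEST-NET-3 documentation range; modifying the firewall requires root, so the commands are shown rather than run:

```shell
# Drop everything arriving from 203.0.113.7 (placeholder address):
#   iptables -A INPUT -s 203.0.113.7 -j DROP
# Delete the same rule later by swapping -A (append) for -D (delete):
#   iptables -D INPUT -s 203.0.113.7 -j DROP
```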
Self-referential use of wget.
The tee (as in "T" junction) command is very useful for redirecting output to two places.
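A minimal sketch: the same text lands on the terminal and in a file (tee_demo.txt is an arbitrary name):

```shell
# Write to stdout and to tee_demo.txt at the same time.
echo "hello" | tee tee_demo.txt
# Use `tee -a` instead to append rather than overwrite.
cat tee_demo.txt   # the file holds the same line
```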
Useful when quickly looking for a commit id from a branch to use with git cherry-pick.
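For example, `git log --oneline <branch>` prints abbreviated commit ids ready for cherry-picking; the throwaway repository below only exists to make the sketch self-contained:

```shell
# Build a scratch repository with one commit.
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "fix: typo"
# One line per commit: "<short-id> <subject>". Copy the id into
# `git cherry-pick <short-id>` on the target branch.
git -C "$repo" log --oneline
```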
This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run, and you may want to throttle the spidering so as to play nicely.
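One way to sketch the idea (shown rather than run, since it is network-bound); this variant parses wget's own log instead of the downloaded directory tree, the URL is a placeholder, and --wait is the throttling knob:

```shell
# Spider recursively without saving page bodies, logging every URL hit.
#   wget --spider -r --wait=1 -o spider.log http://example.com/
# Pull the visited URLs out of the log into url-list.txt.
#   grep -o 'http://[^ ]*' spider.log | sort -u > url-list.txt
```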
This recursively downloads all images from a given website to your /tmp directory. The -nH and -nd switches disable downloading of the directory structure.
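A sketch with a placeholder URL (network-bound, so shown rather than run):

```shell
# Recursively fetch only image files into /tmp, flattening the layout.
#   wget -r -nH -nd -A jpg,jpeg,png,gif -P /tmp http://example.com/
# -r recurses, -A keeps only the listed extensions, -P picks the
# destination, and -nH/-nd stop wget recreating host/directory paths.
```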
This runs the uptime command every 30 seconds to avoid an SSH connection dropping due to inactivity. Granted, there are better ways of solving this problem, but this is sometimes the right tool for the job.
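The simplest form uses `watch` (shown rather than run, since it loops until interrupted); SSH's own ServerAliveInterval option is usually the cleaner fix:

```shell
# Re-run uptime every 30 seconds to generate traffic on the session.
#   watch -n 30 uptime
# The client-side alternative, in ~/.ssh/config:
#   Host *
#       ServerAliveInterval 30
```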
The --force option bypasses the warning if files are already in SVN.
This counts the number of httpd processes running.
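A sketch of the counting idiom; the `[h]` bracket trick stops grep from matching its own entry in the process list (on a machine without Apache this simply prints 0):

```shell
# Count httpd processes without counting the grep itself.
ps aux | grep '[h]ttpd' | wc -l
```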
Simple use of find and grep to recursively search a directory for files that contain a certain term.
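A minimal sketch (the directory and search term are placeholders):

```shell
# Make a scratch tree with one matching file.
dir=$(mktemp -d)
echo "needle" > "$dir/notes.txt"
echo "nothing here" > "$dir/other.txt"
# -l prints only the names of files that contain the term.
find "$dir" -type f -exec grep -l "needle" {} \;
```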
You will be prompted for a password unless you have your public keys set up.
Really useful for when you have a typo in a previous command. The replacement text defaults to empty, so if you accidentally run:
echo "no typozs"
you can correct it with
^z
Uses brace expansion to create a backup called file.txt.bak.
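A minimal sketch (brace expansion needs bash or zsh, not plain sh):

```shell
# file.txt{,.bak} expands to: file.txt file.txt.bak
touch file.txt
cp file.txt{,.bak}
ls file.txt.bak   # the backup now exists
```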
This grabs all lines that make an instantiation or static call, then filters out the cruft and displays a summary of each class called and the frequency.
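A sketch of the idea against a fabricated PHP file; the exact pattern and cleanup steps will differ from the original command:

```shell
cat > demo.php <<'EOF'
<?php
$a = new Foo();
$b = new Foo();
$c = new Bar();
EOF
# Pull out the class name after each `new` and tally the frequencies.
grep -oh 'new [A-Za-z_]*' demo.php | awk '{print $2}' | sort | uniq -c | sort -rn
```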
Sed stops parsing at the match and so is much more efficient than piping head into tail or similar. Grab a line range using
sed '999995,1000005!d' < my_massive_file
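A scaled-down sketch using a 100-line file; appending the end address with `q` makes sed quit at the end of the range instead of reading the rest of the file:

```shell
seq 1 100 > lines.txt
# Print only lines 40-45 (delete everything outside the range).
sed '40,45!d' lines.txt
# Same result, but sed exits at line 45 rather than scanning to EOF.
sed -n '40,45p;45q' lines.txt
```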
Using xargs is better than:
find /path/to/dir -type f -exec rm -f {} \;
as the -exec switch spawns a separate process for each remove. xargs splits the streamed files into more manageable subsets so fewer processes are required.
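A minimal sketch in a scratch directory:

```shell
dir=$(mktemp -d)
touch "$dir/a" "$dir/b" "$dir/c"
# -print0 / -0 keep filenames with spaces or newlines intact;
# xargs batches many names into each rm invocation.
find "$dir" -type f -print0 | xargs -0 rm -f
ls "$dir"   # now empty
```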
This greps all PHP files for a given classname and displays both the file and the usage.
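A sketch against a fabricated file; --include restricts the recursive search to PHP sources (GNU grep), and the class name is a placeholder:

```shell
mkdir -p src
cat > src/page.php <<'EOF'
<?php
$db = new DatabaseConnection($dsn);
EOF
# -r recurses, -n adds line numbers, so each hit reads file:line:usage.
grep -rn --include='*.php' 'DatabaseConnection' src
```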
Useful when you forget to use sudo for a command. "!!" grabs the last run command.
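History expansion only works at an interactive prompt, so this is shown as a transcript:

```shell
# $ apt-get update          # fails: permission denied
# $ sudo !!                 # expands to: sudo apt-get update
```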
This can be useful when a large remove operation is taking place.
This is an alternative to cron which allows a one-off task to be scheduled for a certain time.
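A sketch (shown rather than run, since it needs the atd daemon); the job and the time are placeholders:

```shell
# Queue a one-off job for 10pm today; at also accepts forms like
# "now + 1 hour" and "4pm tomorrow". List pending jobs with atq,
# remove one with atrm <job-id>.
#   echo 'tar czf /tmp/home.tgz /home' | at 10pm
```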
This uses awk to grab the IP address from each request and then sorts and summarises the top 10.
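A sketch against a fabricated access log; in the common log format the client IP is the first field ($1):

```shell
cat > access.log <<'EOF'
198.51.100.4 - - [01/Jan/2024] "GET / HTTP/1.1" 200
203.0.113.9 - - [01/Jan/2024] "GET /a HTTP/1.1" 200
198.51.100.4 - - [01/Jan/2024] "GET /b HTTP/1.1" 200
EOF
# Count requests per IP and show the busiest first.
awk '{print $1}' access.log | sort | uniq -c | sort -rn | head -n 10
```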
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: