commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and of 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot.
This command should be copy-pasted in Windows, but a very similar one will work on Linux.
It uses wget and sed.
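The original pipeline isn't shown here, but a minimal sketch of a wget-plus-sed combination might look like this; the URL and the sed pattern are placeholders:

# hypothetical example: fetch a page and extract its <title> (URL and pattern are placeholders)
wget -qO- http://example.com/ | sed -n 's|.*<title>\(.*\)</title>.*|\1|p'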
You can use the sshpass command to provide a password for ssh-based login. sshpass is a utility designed for running ssh using the mode referred to as "keyboard-interactive" password authentication, but in non-interactive mode.
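A minimal sketch of an sshpass invocation; the password, user, and host are placeholders:

# hypothetical values: replace the password, user, and host with your own
sshpass -p 'secret' ssh user@remote-host 'hostname'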
For this example, all files in the current directory that end in '.xml.skippy' will have the '.skippy' removed from their names.
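A minimal sketch of that rename, using bash parameter expansion to strip the suffix:

# strips the trailing '.skippy' from each matching filename
for f in *.xml.skippy; do mv -- "$f" "${f%.skippy}"; done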
Prints out the version of Exim.
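A minimal sketch, assuming Exim is on the PATH:

# -bV prints Exim's version number and build information
exim -bV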
Reads n lines from stdin and puts the contents in a variable. Yes, I know the read command and its options, but I find this more logical, even for one line.
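A minimal sketch of the idea; the line count and input file are placeholders:

# capture the first 3 lines of the input in a variable
lines=$(head -n 3 < /etc/passwd)
echo "$lines"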
Try it this way...
The "-k" flag will tell wget to convert links for local browsing; it works with mirroring (ie with "-r") or single-file downloads.
I used this to mass install a lot of perl stuff. Threw it together because I was feeling *especially* lazy. The 'perl' and the 'module' can be replaced with whatever you like.
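A minimal sketch of that loop; the module names are placeholders:

# installs each listed module via the CPAN shell
for module in JSON DBI LWP::Simple; do
  perl -MCPAN -e "install $module"
done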
Removes files with an access time older than a given date.
If you want to remove files based on modification time, replace %A@ with %T@. Use %C@ for the status-change (ctime) time.
The time is expressed in epoch seconds, but it is easy to use any other format.
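A minimal sketch of the idea, assuming the cutoff is supplied as an epoch timestamp (the value below is a placeholder):

# delete regular files whose access time is older than the cutoff
cutoff=1609459200
find . -type f -printf '%A@ %p\n' |
while read -r atime file; do
  # %A@ prints fractional seconds, so strip the fraction before comparing
  [ "${atime%.*}" -lt "$cutoff" ] && rm -- "$file"
done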
This may seem like a long command, but it is great for making sure all file permissions are kept intact. What it does is stream the files through a sub-shell pipeline and untar them in the target directory. Please note that the -z option should not be used for local files: there is no performance gain, and the compression overhead (CPU) will only slow the copy down.
You can also keep it simple with plain cp, though you won't get the progress info:
cp -rpf /some/directory /other/path
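A minimal sketch of the tar pipe described above; both paths are placeholders:

# stream the source tree to stdout and unpack it in the target directory,
# preserving permissions (-p); -v prints each file as rough progress info
tar cf - -C /some/directory . | (cd /other/path && tar xpvf -)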
Removes the dashes, and also validates that the input is a well-formed UUID (in contrast to a simple string replacement).
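A minimal sketch: grep validates the UUID format and tr strips the dashes; the UUID below is a placeholder:

# prints the UUID without dashes only if the input is well-formed
echo "110e8400-e29b-41d4-a716-446655440000" |
grep -Ei '^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$' | tr -d '-'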
Deletes files accessed more than n minutes ago. Note that the plus sign before the n is important: it means "greater than n". This is more precise than -atime, since -atime is specified in units of days. Note that you can use -amin/-atime, -mmin/-mtime, and -cmin/-ctime for access, modification, and change times, respectively. Also, using -delete is faster than piping to xargs, since no piping is needed.
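A minimal sketch; the path and the 30-minute threshold are placeholders:

# delete regular files last accessed more than 30 minutes ago
find /tmp -type f -amin +30 -delete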
If your site is struck with the white screen of death, you can find the syntax error quickly with php lint.
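A minimal sketch that lints every PHP file under the current directory and surfaces only the ones with errors; the path is a placeholder:

# php -l prints 'No syntax errors detected' for clean files, so filter those out
find . -name '*.php' -print0 | xargs -0 -n1 php -l 2>&1 | grep -v 'No syntax errors'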
You want bash to keep running the command until it is successful (until the exit code is 0). Give a dummy command that sets the exit code to 1, then keep running your command until it exits cleanly.
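A minimal sketch of that retry pattern; some_command is a placeholder:

# 'false' seeds $? with a non-zero exit code so the loop body runs at least once
false
while [ $? -ne 0 ]; do
  some_command
done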