commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
This may seem like a long command, but it is great for making sure all file permissions are kept intact. What it does is stream the files in a sub-shell and then untar them in the target directory. Note that the -z option should not be used for local copies: there is no performance gain, and the extra CPU overhead will actually slow the copy down.
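The command itself isn't reproduced in this excerpt; a minimal sketch of the technique described (the paths are placeholders, and the v flag on the first tar gives rough per-file progress):
(cd /some/directory && tar cvf - .) | (cd /other/path && tar xpf -)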
You can also keep it simple with the following, but then you don't get the progress info:
cp -rpf /some/directory /other/path
Set the file/dirname in the transfer variable, and set the destination path at the end point. This command uses pipe viewer (pv) to show progress, compresses the output on the fly, and changes the ssh cipher. It supports dirnames with spaces.
Merges ideas and comments from http://www.commandlinefu.com/commands/view/4379/copy-working-directory-and-compress-it-on-the-fly-while-showing-progress and http://www.commandlinefu.com/commands/view/3177/move-a-lot-of-files-over-ssh
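The command itself isn't shown here; a sketch along the lines described (DIR, user@remotehost and /destination/path are placeholders, and the aes128-ctr cipher is only an assumption, pick whatever your ssh supports):
DIR="some directory"; tar cf - "$DIR" | pv -s $(du -sb "$DIR" | awk '{print $1}') | gzip | ssh -c aes128-ctr user@remotehost "tar xzf - -C /destination/path"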
This is a handy way to circumvent the "Maximum line length of 2048 exceeded" grep error.
Once you have run the above command (or put it in your .bashrc), files can be searched using:
lgrep search-string /file/to/search
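The lgrep definition isn't included in this excerpt; one possible sketch is a shell function that truncates long lines before grepping (the 2000-character cutoff is an arbitrary assumption):
lgrep() { cut -c1-2000 "$2" | grep "$1"; }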
Just refining the previous proposal for this check, and showing awk's power to do more complex math (2^20 instead of /1024/1024). We don't need to declare a variable before running lsof, because $(command) substitutes its output directly. awk can also filter by regexp instead of calling grep. I changed the messy 0.0000xxxx output to a more readable form, dropping all fractional numbers and files smaller than 1 MB.
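A sketch of the kind of command being described (assuming the default lsof column layout, where field 7 is the size in bytes and field 9 the file name; not the original):
lsof | awk '$7 > 2^20 && $9 ~ /^\// {printf "%d MB %s\n", $7/2^20, $9}' | sort -nur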
Although Exim will purge frozen (undeliverable) messages over time, the command "exim -Mrm #id#", where #id# is a particular message ID, will purge a message immediately. Being lazy, I don't want to type the command for each frozen message, so I wrote this one-liner to do it for me.
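A sketch of such a one-liner (assuming the standard exim -bp queue listing, where the message ID is the third field on frozen-message lines):
exim -bp | grep -i frozen | awk '{print $3}' | xargs exim -Mrm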
Prints the unique IP Addresses as they arrive from an Apache `access.log` file.
The '-W interactive' tells awk to start writing to stdout immediately and not buffer the output.
This command builds on the uniq lines without sorting command (http://www.commandlinefu.com/commands/view/4389/remove-duplicate-entries-in-a-file-without-sorting).
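Reconstructed from the description (a sketch assuming mawk, whose -W interactive option is mentioned above, and a log format with the client IP as the first field):
tail -f access.log | awk -W interactive '!seen[$1]++ {print $1}'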
My old Solaris server does not have lsof, so I have to use pfiles.
Use awk to extract every nth line.
Generic is:
awk '{if (NR % LINE == POSITION) print $0}' foo
where "last" position is always 0 (zero).
Change the number 50 to whatever number of characters you want. Change the character inside the double quotes to whatever you want printed.
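The command itself isn't shown in this excerpt; one way to get the same effect in bash (a sketch, with 50 and "#" as the values to change):
printf "#%.0s" $(seq 50); echo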
You can use lsof, but since it's not part of the base OS, it's not always available.
Command line to get which PID is opening a socket on IP and PORT.
Only useful under Solaris.
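A sketch of how this might look (assuming Solaris proc tools and port 80 as the port of interest; pfiles prints a PID header per process followed by socket lines ending in "port: N"):
pfiles /proc/* 2>/dev/null | egrep '^[0-9]+:|port: 80$'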
Searches all log files (including archived bzip2 files) for invalid-user and PAM authentication errors, both of which are indicative of brute-force attempts to log into the computer. A list of all unique IP addresses and domain names is appended to hosts.deny. The command (and the grep error messages) will work on Mac OS X 10.6; small adjustments may be needed for other OSs.
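A rough sketch of the approach (assuming sshd logs under /var/log, the "Invalid user" / "authentication error" wordings, and hosts.deny entries of the form "ALL: host"; the original command isn't reproduced here and the paths will differ per OS):
sudo sh -c 'bzgrep -Eh "Invalid user|authentication error" /var/log/secure* | grep -oE "from [^ ]+" | cut -d" " -f2 | sort -u | sed "s/^/ALL: /" >> /etc/hosts.deny'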
If you gzip an empty file it becomes 20 bytes. Some backup checks I do test whether the file is greater than zero size (the -s flag), but that is no good here. I'm sure someone has a better check than this? There is also no check that the file exists before checking its size.
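One way to write such a check (a sketch assuming GNU stat and a placeholder file name backup.gz; 20 bytes is the empty-gzip size mentioned above):
[ "$(stat -c%s backup.gz 2>/dev/null || echo 0)" -gt 20 ] && echo "backup looks OK" || echo "backup empty or missing"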
Exclude 400 client hosts with NFS auto-mounted home directories.
Easily modified for inclusion in your scripts.
Using awk, find duplicates in a file without sorting, since sorting reorders the contents. awk will not reorder the lines, yet still finds and removes duplicates, and you can then redirect the result into another file.
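The usual awk idiom for this (file and file.uniq are placeholder names):
awk '!seen[$0]++' file > file.uniq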
What happens here is we tell tar to create "-c" an archive of all files in the current dir "." (recursively) and output the data to stdout "-f -". Next we specify the size "-s" to pv of all files in the current dir. The "du -sb . | awk '{print $1}'" returns the number of bytes in the current dir, and it gets fed as the "-s" parameter to pv. Next we gzip the whole content and output the result to the out.tgz file. This way "pv" knows how much data is still left to be processed and shows us that it will take yet another 4 mins 49 secs to finish.
Credit: Peteris Krumins http://www.catonmat.net/blog/unix-utilities-pipe-viewer/
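Put together from the description above (a sketch; out.tgz is the example output name used there):
tar -cf - . | pv -s $(du -sb . | awk '{print $1}') | gzip > out.tgz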