commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
This command will:
- Archive each *.dmp file in the current directory individually (one file per archive).
- Delete the original file after it has been compressed.
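A minimal sketch of such a loop, assuming gzip-compressed tar archives (the naming scheme is my assumption, not the original command):
for f in *.dmp; do tar -czf "$f.tar.gz" "$f" && rm -f "$f"; done
The && ensures the original is only deleted if the archive was created successfully.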
This works more reliably for me ("cut -c 8-" left one space too many, so it did not work).
The original suggestion did not work for me when operating on folders located on an external mount (i.e. other than the root device) in Ubuntu. A variation using xargs does the trick.
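Since the original command is not shown here, this is only a hypothetical shape of such a find/xargs pairing (the path and the action are placeholders):
find /media/external -maxdepth 1 -type d -print0 | xargs -0 du -sh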
Back up your entire system as a tarball.
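A commonly used shape for this, assuming the usual pseudo-filesystems are excluded (the exact excludes in the original may differ):
sudo tar -czpf /backup.tar.gz --exclude=/backup.tar.gz --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run --exclude=/tmp /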
the -a flag causes tar to automatically pick the right compressor to filter the archive through, based on the file extension. e.g.
"tar -xaf archive.tar.xz" is equivalent to "tar -xJf archive.tar.xz"
"tar -xaf archive.tar.gz" is equivalent to "tar -xzf archive.tar.gz"
No need to remember -z is gzip, -j is bzip2, -Z is .Z, -J is xz, and so on :)
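The same shortcut works when creating archives, e.g. (filename illustrative):
tar -caf backup.tar.xz mydir/
which picks the xz compressor purely from the .tar.xz extension.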
Simple compressed backup of /etc.
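One plausible form of that command (the date-stamped filename is my addition):
sudo tar -czf /root/etc-$(date +%Y%m%d).tar.gz /etc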
Sometimes you need two copies of data that lives in a tar archive. You could unpack it and then copy, but if I/O is slow, you can reduce it by writing the archive twice (or more times) in a single pass.
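A sketch of that single-pass idea, assuming tee is the mechanism (filenames are illustrative):
tar -cf - mydir | tee copy1.tar > copy2.tar
Each additional file argument to tee yields one more copy for the same read cost.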
The result of this command is a tar containing all files that have been modified or added between revision 1792 and HEAD. This command is super useful for incremental releases.
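One way to produce such an archive from a Subversion working copy (the awk filter and exact flags are my guesses at the original's approach):
svn diff --summarize -r 1792:HEAD | awk '/^[AM]/ {print $2}' | xargs tar -czf incremental.tar.gz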
Useful when you have multiple files or binary files that you need to transfer to a different host and scp or the like is unavailable.
To unpack on the destination host, copy and paste the output, using the pipeline in the opposite order:
openssl enc -d -base64 | gunzip | tar -x
Terminate openssl's input using ^D.
Note: gzip is outside of tar because using -z in tar produces lots of extra padding.
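The packing side on the source host is simply the reverse pipeline (directory name illustrative):
tar -c mydir | gzip | openssl enc -base64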
Create an encrypted tar.gz file from a directory on the fly. The encryption is done by GPG with a public key, and the resulting filename is tagged with the date of creation. Very useful for encrypted snapshots of folders.
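A minimal sketch, assuming the recipient's public key is already imported (recipient and naming scheme are illustrative):
tar -czf - mydir | gpg --encrypt --recipient you@example.com --output mydir-$(date +%F).tar.gz.gpg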
`tar xfzO` extracts to STDOUT, which is redirected directly to mysql. Really helpful when your hard drive can't fit two copies of a non-compressed database :)
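For example, a hypothetical restore of a compressed dump straight into a database:
tar xfzO dump.sql.tar.gz | mysql -u user -p mydb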
Backs up $DIR_TO_BACKUP to tape, creating an MD5SUM file of the backup on the fly.
It then rewinds one record on the tape and checks that the backup was written correctly.
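One possible shape, written against the auto-rewinding tape device /dev/st0 to sidestep explicit mt positioning (device and filenames are illustrative, and the original's mt incantation may differ):
tar -cf - "$DIR_TO_BACKUP" | tee /dev/st0 | md5sum > MD5SUM
dd if=/dev/st0 | md5sum
The second checksum should match the one stored in MD5SUM.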
tar does not have a -mtime option the way find does; instead, find selects the files by modification time and tar appends them all to an existing tar file.
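A sketch of that find/tar pairing (the time window and archive name are illustrative):
find . -type f -mtime -1 -print0 | xargs -0 tar -rf backup.tar
Note that -r (append) only works on uncompressed archives.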
This will uncompress the file while it is being downloaded, which makes the whole process much faster.
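For example, streaming a download straight into tar (URL illustrative):
wget -qO- http://example.com/archive.tar.gz | tar -xz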
A quick find command that identifies all TAR files in a given path, extracts the list of files contained in each, then searches that list for a given string. Returns each TAR file found, followed by any matching files inside that archive. TAR can easily be swapped for JAR if required.
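A sketch of that pattern (the path and search string are placeholders):
find /path -name '*.tar' | while read -r t; do m=$(tar -tf "$t" | grep 'needle'); [ -n "$m" ] && printf '%s\n%s\n' "$t" "$m"; done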
In fact, I want to know how to get only the modified files.
If you want to extract files from an archive into the current directory while stripping all directory paths, use the --transform option to remove the path information. Unfortunately, the --strip-components option only works when the target files all sit at the same, constant folder depth.
The idea was taken from http://www.unix.com/solaris/145941-how-extract-files-tar-file-without-creating-directories.html
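A sketch with GNU tar (archive name illustrative; the sed-style expression deletes everything up to the last slash in each member name):
tar -xf archive.tar --transform='s#.*/##'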