commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way, others can gain from your CLI wisdom and you from theirs. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive a minimum of 3 or 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:
This command compresses the "tmp" directory into an initrd file.
This command is for UNIX OSes that have plain vanilla System V UNIX commands instead of their more functional GNU counterparts, such as IBM AIX.
A useful command to back up an SD card; the card's total size is passed to pv so it can display a progress bar.
This one works on the user crontab.
Use this command to gzip a file, writing the compressed data to stdout, then redirect stdout to another file.
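A minimal sketch of that gzip-to-stdout redirection (the file names here are illustrative):

```shell
# Create a sample file, compress it to stdout, and redirect the
# compressed stream into a separate file.
printf 'hello world\n' > notes.txt
gzip -c notes.txt > notes.txt.gz   # -c writes to stdout; notes.txt is kept
gunzip -c notes.txt.gz             # prints: hello world
```

Without `-c`, gzip would replace `notes.txt` with `notes.txt.gz` in place.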
Archive all .sh files in a directory into a gzip archive.
The command uses ssh(1) to get to a remote host, uses tar(1) to archive a remote directory, prints the result to STDOUT, which is piped to gzip(1) to compress to a local file. In other words, we are archiving and compressing a remote directory to our local box.
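Assuming a reachable host (shown here as the hypothetical `user@remote`), the pipeline has this shape; the demo below exercises the same tar-to-gzip plumbing locally, with the ssh hop kept as a comment:

```shell
# Remote form (hypothetical host, not executed here):
#   ssh user@remote 'tar cf - /path/to/dir' | gzip > dir.tar.gz
# Local demo of the same tar | gzip plumbing:
mkdir -p demo_dir && printf 'data\n' > demo_dir/file.txt
tar cf - demo_dir | gzip > demo_dir.tar.gz
tar tzf demo_dir.tar.gz            # lists demo_dir/ and demo_dir/file.txt
```

The key point is that tar writes the archive to STDOUT (`-f -` is implied by `cf -`), so the compression happens on the local side of the pipe.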
just better bash
Similar, but uses a tarball instead of a zip file.
Sometimes you need two copies of data that is in a tar archive. You could unpack and then copy, but if I/O is slow, you can reduce it by writing the archive out twice (or more times) in a single pass.
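One way to sketch the single-pass duplication (an assumption on my part, using `tee` rather than necessarily the exact command shown on the site):

```shell
# Stream the archive once; tee duplicates the stream so two
# identical copies are written without reading the source twice.
mkdir -p src && printf 'payload\n' > src/a.txt
tar cf - src | tee copy1.tar > copy2.tar
cmp copy1.tar copy2.tar && echo identical   # prints: identical
```

With bash process substitution, each copy of the stream could instead be fed straight into its own `tar xf - -C destdir` to unpack into two directories at once.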
This version compresses the data for transport.
Useful when you have multiple files or binary files that you need to transfer to a different host and scp or the like is unavailable.
To unpack on the destination host, paste the output and run the commands in the opposite order:
openssl enc -d -base64 | gunzip | tar -x
Terminate openssl's input with ^D.
Note: gzip is outside of tar because using -z in tar produces lots of extra padding.
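The full round trip can be sketched like this; here both ends run on the same machine, and `transfer.txt` stands in for the text you would copy and paste between hosts:

```shell
# Sender side: archive, compress, then base64-encode for copy/paste.
mkdir -p payload && printf 'secret\n' > payload/msg.txt
tar cf - payload | gzip | openssl enc -base64 > transfer.txt

# Receiver side: decode, decompress, unpack - the opposite order.
mkdir -p dest
openssl enc -d -base64 < transfer.txt | gunzip | tar xf - -C dest
cat dest/payload/msg.txt           # prints: secret
```

Note the explicit `tar xf -` on the receiving end: on some tar implementations plain `tar -x` does not read from stdin by default.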
This is the safest variation of the "sitepass" function; it adds a SALT and runs the sha512sum hash over a long loop.
An easy way to unzip a file and copy it to a remote machine, leaving no unzipped file on the local hard drive.
Opens a snapshot of a live UFS2 filesystem, runs dump to generate a full filesystem backup which is run through gzip. The filesystem must support snapshots and have a .snap directory in the filesystem root.
To restore the backup, one can do
zcat /path/to/adXsYz.dump.gz | restore -rf -
In this example we convert a .tar.bz2 file to a .tar.gz file.
If you don't have Pipe Viewer, you'll have to download it via apt-get install pv, etc.
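Under the hood the conversion is just a decompress/recompress pipe; the names below are illustrative, and the pv variant is shown as a comment in case Pipe Viewer isn't installed:

```shell
# Create a sample .tar.bz2 to convert:
mkdir -p stuff && printf 'x\n' > stuff/f.txt
tar cjf stuff.tar.bz2 stuff

# Recompress: decompress the bzip2 stream, re-compress with gzip.
# With Pipe Viewer installed, insert pv for a progress bar:
#   pv stuff.tar.bz2 | bzcat | gzip > stuff.tar.gz
bzcat stuff.tar.bz2 | gzip > stuff.tar.gz
tar tzf stuff.tar.gz               # lists stuff/ and stuff/f.txt
```

The tar payload itself is never touched, so the archive contents are byte-identical before and after.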
Tars a directory and compresses it, showing progress and limiting disk I/O. Pipe Viewer shows the progress of the task and can also limit disk I/O, which is especially useful on busy servers.
Should do exactly the same - compress every file in the current directory. You can even use it recursively:
gzip -r .
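A quick demonstration of the recursive form (directory and file names are illustrative):

```shell
# gzip replaces each file in place with a .gz version;
# -r makes it descend into subdirectories.
mkdir -p gzdemo/sub
printf 'a\n' > gzdemo/one.txt
printf 'b\n' > gzdemo/sub/two.txt
( cd gzdemo && gzip -r . )
ls gzdemo gzdemo/sub               # one.txt.gz and sub/two.txt.gz
```

Unlike tar, this produces one `.gz` file per input file rather than a single archive.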
Dumps a compressed svn backup to a file and emails it, with any messages as the body of the email.
It grabs all the database names granted to $MYSQLUSER and gzips them to a remote host via SSH.