commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot.
This command can be added to crontab to run a nightly backup of directories and keep only the 10 most recent backup files.
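The command itself isn't reproduced here; a minimal crontab sketch of the idea, assuming a tar-based archive and made-up paths (/backups and /home/user are hypothetical). Note that % is special in crontab and must be escaped as \%:

    # run at 02:00 every night, then delete everything beyond the 10 newest archives
    0 2 * * * tar czf /backups/home-$(date +\%F).tar.gz /home/user && ls -t /backups/home-*.tar.gz | tail -n +11 | xargs -r rm --

ls -t sorts newest first, so tail -n +11 selects only the older archives for removal; the dated filenames contain no whitespace, so the ls-to-xargs pipe is safe here.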
Without the bashisms and the unnecessary sed dependency. Substitutions are quoted so that filenames with whitespace are handled correctly.
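The command under discussion isn't shown here; as a hedged illustration of both points, a POSIX ${f%...} parameter expansion stands in for sed and the quoted "$f" keeps whitespace-containing names intact (the .txt/.bak suffixes are invented for the example):

    for f in *.txt; do mv -- "$f" "${f%.txt}.bak"; done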
Helpful when you see something like this:
zsh: argument list too long: cp
In this example I am returning all the files in /usr/bin that weren't put there by pacman, so that they can be moved to /usr/local/bin, where they (most likely) belong.
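The original one-liner isn't reproduced here; one hedged way to build the same list on an Arch system is to diff the directory contents against pacman's file database (pacman -Qlq prints every path owned by any installed package; the /tmp filenames are placeholders):

    find /usr/bin -maxdepth 1 -type f | sort > /tmp/present
    pacman -Qlq | grep '^/usr/bin/' | sort > /tmp/owned
    comm -23 /tmp/present /tmp/owned

comm -23 prints only the lines unique to the first file, i.e. the files pacman doesn't know about.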
Nice trick with the :>! This is a variant that does a bunch of files (e.g. *.log) in one go.
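The poster's variant isn't shown here; a minimal sketch of the same idea, truncating every *.log in the current directory to zero bytes with the : no-op builtin:

    for f in *.log; do : > "$f"; done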
Strips the audio track from a webm video. Use this in combination with clive or youtube-dl.
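The actual command isn't shown here; assuming the goal is to pull the audio stream out of the container unchanged, a common ffmpeg sketch would be (input.webm/output.oga are placeholder names, and the output extension must match the source codec, e.g. .oga for Vorbis or .opus for Opus):

    ffmpeg -i input.webm -vn -acodec copy output.oga

-vn drops the video stream and -acodec copy avoids re-encoding the audio.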
Useful for transferring large files over a network during operational hours.
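The command itself isn't shown; the usual way to keep such a transfer polite is to cap its bandwidth, for example with rsync's --bwlimit (the host, path and 1000 KiB/s cap below are all made up):

    rsync --partial --progress --bwlimit=1000 bigfile.iso user@host:/dest/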
Really helpful when playing with files that have spaces or other awkward names. It makes it easy to store and access names and paths in a single field while saving them to a file.
This format (URL) is directly supported by nautilus and firefox (and other browsers).
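The command isn't reproduced here; a hedged sketch of the idea, printing each file's absolute path as a file:// URL (GNU readlink -f assumed; strictly speaking spaces should be percent-encoded, though nautilus and firefox are forgiving):

    find . -type f -exec sh -c 'printf "file://%s\n" "$(readlink -f "$1")"' _ {} \;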
For some reason split will not let you add an extension to the files it creates. Just put this in a .sh script and run it with bash or sh; it will split your text file into 12000-line chunks and then add a .sql extension to each file name.
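The script isn't shown here; a minimal sketch of that approach, with dump.sql and the part_ prefix as placeholders:

    split -l 12000 dump.sql part_
    for f in part_*; do mv -- "$f" "$f.sql"; done

Recent GNU coreutils also offer split --additional-suffix=.sql, which makes the rename loop unnecessary.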
It can be used to create an index of a backup directory or to find a particular file.
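The command the comment refers to isn't shown; a hedged example of the indexing idea with plain find (the paths and search term are hypothetical):

    find /mnt/backup -type f | sort > backup-index.txt
    grep -i 'invoice' backup-index.txt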
Created to deal with an overzealous batch rename on our server that gave every file a .jpg extension.
To ignore aspect ratio, run:
for file in *; do convert "$file" -resize '800x600!' "resized-$file"; done
and all images will be exactly 800x600.
Use your shell of choice. This was done in Bash.
You can implement a FOR loop to act on one or more files matched by the IN clause. We originally found this in order to GPG-decrypt files using wildcards, where you don't know the entire file name, e.g. Test_File_??????.txt, where ?????? is the generation time in HHMMSS format. Since we won't know when the file was generated, we need wildcards, and since GPG doesn't handle wildcards itself, this is the perfect solution. Thought I would share this revelation. :-)
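A hedged sketch of that decryption loop (the file pattern comes from the comment; the .out suffix and --batch flag are my additions):

    for f in Test_File_??????.txt; do
        gpg --batch --output "$f.out" --decrypt "$f"
    done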
Let the shell handle the repetition instead of find :)
You can simply run "largest" to list the top 10 files/directories in ./, or you can pass two parameters: the first being the directory, the second being the limit of files to display.
Best off putting this in your .bashrc or .bash_profile file.
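The function itself isn't reproduced here; a minimal sketch matching that behaviour, assuming GNU du and sort (the name largest and the defaults come from the comment):

    largest() { du -ah "${1:-.}" 2>/dev/null | sort -rh | head -n "${2:-10}"; }

du -ah reports both files and directories with human-readable sizes, sort -rh orders them largest first, and head applies the optional limit.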