commandlinefu.com is the place to record those command-line gems that you return to again and again.
This command can be added to crontab to run a nightly backup of your directories, keeping only the 10 most recent backup files.
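A minimal sketch of such a rotation, assuming a GNU userland; SRC, DEST, and the backup naming scheme are placeholders for whatever the original command uses:

```shell
#!/bin/sh
# Hypothetical paths; adjust for your setup.
SRC="/var/www"
DEST="/backups"

# Create a date-stamped archive of the source directory.
tar -czf "$DEST/backup-$(date +%Y%m%d).tar.gz" \
    -C "$(dirname "$SRC")" "$(basename "$SRC")"

# Keep only the 10 newest backups: ls -1t sorts newest first,
# tail -n +11 prints everything from the 11th entry on, and
# xargs -r rm deletes those older archives (GNU xargs -r).
ls -1t "$DEST"/backup-*.tar.gz | tail -n +11 | xargs -r rm --
```

A crontab entry such as `0 2 * * * /path/to/nightly-backup.sh` would then run it every night at 02:00.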
Without the bashisms and the unnecessary sed dependency. Substitutions are quoted so that filenames containing whitespace are handled correctly.
Helpful when you see something like this:
zsh: argument list too long: cp
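That error means the expanded glob exceeded the kernel's argument-size limit. One common workaround (a sketch; the pattern and /dest/dir are placeholders, and cp -t is GNU-specific) is to let find feed the files to cp in safe-sized batches:

```shell
# cp fails because the shell expands the glob into more arguments
# than the kernel allows; find -exec ... {} + invokes cp with
# batches that stay under the limit. /dest/dir is a placeholder.
find . -maxdepth 1 -name '*.jpg' -exec cp -t /dest/dir {} +
```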
In this example I am returning all the files in /usr/bin that weren't put there by pacman, so that they can be moved to /usr/local/bin where they (most likely) belong.
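A sketch of that approach (Arch-specific; the temp-file paths are placeholders): list the files actually present, list every file owned by any installed package, and keep only what appears in the first list:

```shell
# Files in /usr/bin that no installed pacman package claims.
# pacman -Qlq prints every path owned by any installed package;
# comm -23 keeps lines present only in the first (sorted) input.
find /usr/bin -maxdepth 1 -type f | sort > /tmp/present
pacman -Qlq | sort > /tmp/owned
comm -23 /tmp/present /tmp/owned
```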
Nice trick with the :>! Here is a variant that does a bunch of files (e.g. *.log) in one go.
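For context, `: > file` truncates a file to zero length (zsh's :>! forces this past the noclobber option); a loop version covering many files might look like:

```shell
# Truncate every .log file in the current directory to zero bytes
# without deleting them (so open file handles keep working).
for f in *.log; do : > "$f"; done
```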
Strips the audio track from a webm video. Use this in combination with clive or youtube-dl.
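The original command isn't shown, but a sketch of the idea with ffmpeg might look like this (filenames are placeholders; WebM audio is typically Vorbis or Opus, both of which fit an Ogg container):

```shell
# Pull the audio stream out of a downloaded WebM file without
# re-encoding: -vn drops the video, -acodec copy keeps the
# audio bitstream as-is. input.webm/output.oga are placeholders.
ffmpeg -i input.webm -vn -acodec copy output.oga
```

To do the opposite, keeping the video and dropping the audio, swap `-vn -acodec copy` for `-an -vcodec copy`.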
Useful for transferring a large file over a network during operational hours.
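One common way to do this (a sketch; the file, host, and path are placeholders) is rsync's bandwidth cap, which keeps the link usable for everyone else:

```shell
# Throttle the transfer to roughly 5 MB/s so the network stays
# responsive during business hours. --bwlimit takes KiB/s.
# bigfile.iso, user, backuphost, and the path are placeholders.
rsync --bwlimit=5000 -avz bigfile.iso user@backuphost:/srv/backups/
```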
Really helpful when working with files whose names contain spaces or other awkward characters. It makes it easy to store and retrieve names and paths in a single field when saving them to a file.
This URL format is directly supported by Nautilus and Firefox (and other browsers).
For some reason, split will not let you add an extension to the files it creates. Just put this in a .sh script and run it with bash or sh: it will split your text file into 12000-line chunks and then add a .sql extension to each file name.
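A sketch of that two-step approach (dump.sql and the part_ prefix are placeholder names):

```shell
#!/bin/sh
# Split dump.sql into 12000-line pieces named part_aa, part_ab, ...
split -l 12000 dump.sql part_

# split offers no way to set an extension here, so rename each
# piece afterwards to give it a .sql suffix.
for f in part_*; do mv "$f" "$f.sql"; done
```

Note that newer GNU coreutils (8.16+) can do this in one step with `split -l 12000 --additional-suffix=.sql dump.sql part_`.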
It can be used to create an index of a backup directory, or to find a particular file.
Created to deal with an overzealous batch rename on our server that renamed all files to .jpg files.
To ignore aspect ratio, run:
for file in *; do convert "$file" -resize '800x600!' "resized-$file"; done
and all images will be exactly 800x600.
Use your shell of choice; this was done in Bash.
You can use a for loop to act on one or more files returned from the in clause. We originally found this in order to GPG-decrypt a file using wildcards, where you don't know the entire file name in advance (e.g. Test_File_??????.txt, where ?????? is the time the file was generated, in HHMMSS format). Since we won't know that time, we need wildcards, and since GPG doesn't handle wildcards itself, letting the shell expand them is the perfect solution. Thought I would share this revelation. :-)
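A minimal sketch of such a loop (the output naming is illustrative, and key/passphrase handling is omitted; adjust the gpg options for your setup):

```shell
# The shell glob resolves the unknown HHMMSS portion of the name
# before gpg runs, so gpg itself never sees a wildcard.
# The .decrypted output naming is illustrative only.
for f in Test_File_??????.txt; do
    gpg --decrypt --output "${f%.txt}.decrypted" "$f"
done
```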
Let the shell handle the repetition instead of find :)
You can simply run "largest" to list the top 10 files/directories in ./, or pass two parameters: the first is the directory, the second is the number of entries to display.
You're best off putting this in your .bashrc or .bash_profile file.
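The original function isn't shown, but a sketch of what such a helper might look like (du -sk and sort -rn are used here for portability; the name and defaults follow the description above):

```shell
# List the biggest entries under a directory, largest first.
# Usage: largest [dir] [count]   (defaults: current dir, top 10)
largest() {
    du -sk "${1:-.}"/* 2>/dev/null | sort -rn | head -n "${2:-10}"
}
```

Defined in .bashrc, it is then available in every interactive shell, e.g. `largest /var/log 5`.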