commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
newly downloaded videos
This command assumes you've already downloaded some YouTube .mp4 or .flv video files via other means. Requires 'shuf', or your own stdin shuffler.
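One possible sketch of the idea, assuming GNU shuf and stand-in filenames in a scratch directory (pipe the result to mplayer, vlc, or whatever you use):

```shell
cd "$(mktemp -d)"                     # scratch directory with stand-in downloads
touch clip1.mp4 clip2.flv
ls *.mp4 *.flv 2>/dev/null | shuf -n 1   # pick one at random; pipe to your player
```

Filenames containing newlines would still break this; for those, shuf's -z/-0 handling via find -print0 is the safer route.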
Change open-command and type to suit your needs. One example would be to open the last .jpg file with Eye Of Gnome:
eog $(ls -rt *.jpg | tail -n 1)
The wherepath function will search all the directories in your PATH and print a unique list of locations in the order they are first found in the PATH. (PATH often has redundant entries.) It will automatically use your 'ls' alias if you have one, or you can hardcode your favorite 'ls' options in the function, for example to get a long listing or color output.
'whereis' only searches certain fixed locations.
'which -a' searches all the directories in your path but prints duplicates.
'locate' is great but isn't installed everywhere (and it's often too verbose).
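The function itself isn't reproduced here, but a minimal sketch of the idea, deduplicating PATH entries with awk before checking each one, might look like:

```shell
# Sketch: like 'which -a', but duplicate PATH entries are dropped first,
# so each location prints only once, in first-seen order.
wherepath() {
  printf '%s\n' "$PATH" | tr ':' '\n' | awk '!seen[$0]++' |
  while read -r dir; do
    if [ -x "$dir/$1" ]; then
      ls "$dir/$1"      # swap in your favorite 'ls' options (or alias) here
    fi
  done
}
wherepath ls
```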
This is a slight variation of an existing submission, but it uses a regular expression to match files instead. This makes it vastly more versatile, and you can easily verify which files will be kept by first running ls | egrep "[REGULAR EXPRESSION]"
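For instance, with a hypothetical pattern that keeps images, in a scratch directory (note this breaks on filenames containing spaces):

```shell
cd "$(mktemp -d)"                            # scratch directory with demo files
touch a.jpg b.png notes.txt
ls | egrep '\.(jpg|png)$'                    # preview: the files that will be KEPT
ls | egrep -v '\.(jpg|png)$' | xargs -r rm   # delete everything else
```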
Compresses each file individually, creating $filename.tar.gz, and removes the uncompressed version. Useful if you have lots of files and don't want one huge archive containing them all. You could replace ls with ls *.pdf to perform the action only on PDFs, for example.
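A sketch of that idea as a glob loop (safer than parsing ls output), with hypothetical demo files in a scratch directory:

```shell
cd "$(mktemp -d)"                        # scratch directory with demo files
echo one > a.txt; echo two > b.txt
for f in *; do                           # or: for f in *.pdf
  [ -f "$f" ] || continue                # skip directories and the like
  tar czf "$f.tar.gz" "$f" && rm "$f"    # one archive per file, then remove original
done
```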
Especially useful for SQL scripts with INSERT / UPDATE statements, to add a COMMIT command after every n statements executed.
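One way to do this with awk, assuming one statement per line and a hypothetical demo input (here a COMMIT goes in after every 100 statements):

```shell
cd "$(mktemp -d)"                                    # scratch directory
seq 1 250 | sed 's/.*/INSERT INTO t VALUES (&);/' > inserts.sql   # demo input
# Print every line, and emit a COMMIT; after each block of 100
awk '{ print } NR % 100 == 0 { print "COMMIT;" }' inserts.sql > with_commits.sql
```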
When your wtmp files are being logrotated, here's an easy way to unpack them all on the fly to see more than a week into the past. The rm is a primitive way to prevent a symlink prediction attack.
You WILL have problems if the files have the same name.
Use cases: consolidate music library and unify photos (especially if your camera separates images by dates).
After running the command and verifying that there were no naming collisions, you can use
ls -d */ | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs rm -r
to remove now empty subdirectories.
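To spot name collisions before flattening, something like this (GNU find assumed; the demo tree is hypothetical) lists every basename that occurs more than once:

```shell
cd "$(mktemp -d)"                                # scratch directory with a demo tree
mkdir -p a b
touch a/x.jpg b/x.jpg a/y.jpg
# Print bare filenames, sort, and show only the duplicated ones
find . -type f -printf '%f\n' | sort | uniq -d
```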
No need for rpm, and no need to pipe to another command. No real fu here either, but it's free of unnecessary complexity and distro-specific commands.
Displays the output of ls -l without the rest of the crud. Pretty simple, but useful.
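The command itself isn't reproduced here; if the goal is just the filenames out of a long listing, one crude sketch (which breaks on names containing spaces) would be:

```shell
cd "$(mktemp -d)"                      # scratch directory with demo files
touch alpha beta
ls -l | awk 'NR > 1 { print $NF }'     # skip the 'total' line, keep the last field
```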
xargs deals badly with special characters (such as space, ' and "). To see the problem try this:
touch 'not important_file'
ls not* | xargs rm
Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
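If GNU Parallel isn't at hand, the usual workaround is NUL-delimited names via find -print0 and xargs -0 (demo in a scratch directory):

```shell
cd "$(mktemp -d)"                        # scratch directory
touch 'not important_file' important_file
# NUL separators survive spaces and quotes in filenames
find . -maxdepth 1 -name 'not*' -print0 | xargs -0 rm
```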
ls -Q will show the filenames in quotes. xargs -p rm will print all the filenames piped from ls -Q and ask for confirmation before deleting the files.
Without the -Q switch, files with spaces in their names won't be deleted.
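For example, since xargs honours double quotes by default, a name with spaces survives the round trip (scratch-directory demo, shown without the interactive -p):

```shell
cd "$(mktemp -d)"                  # scratch directory
touch 'file with spaces.txt'
ls -Q *.txt | xargs rm             # -Q quotes each name, so xargs keeps it whole
```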
No loop, only one call to grep, and the output is scrollable ("less is more", more or less...).