Remove all empty directories below the current directory. If directories become empty as a result, remove those too.
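A minimal sketch of the idea with GNU find, demonstrated on a throwaway tree (the real invocation is just the single find line):

```shell
# Build a throwaway tree: c/ is empty, and b/ becomes empty once c/ goes;
# a/ stays because it holds keep.txt
demo=$(mktemp -d)
mkdir -p "$demo/a/b/c"
touch "$demo/a/keep.txt"

# -depth visits children before parents, so directories that only
# become empty after their empty children are removed go too
find "$demo" -depth -type d -empty -delete

ls "$demo/a"    # keep.txt survives; b/ and c/ are gone
rm -rf "$demo"
```

GNU find's -delete already implies -depth, but spelling it out makes the "children first" behaviour explicit.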
Using xargs avoids having to remember the "{} \;" syntax (definitely a useful thing to know; unfortunately I always forget it). The xargs version also runs about 2x faster in my tests, FWIW. Edit: fixed to handle spaces in filenames correctly.
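A sketch of both forms, assuming the task is deleting *.bak files (the actual command isn't shown above); -print0/-0 is what makes the xargs version safe for names with spaces:

```shell
# Classic -exec form: one rm per file, nothing extra to remember wrong
#   find . -type f -name '*.bak' -exec rm {} \;

# xargs form: batches many names into one rm invocation; NUL delimiters
# (-print0 paired with -0) keep spaces and newlines in names intact
demo=$(mktemp -d)
touch "$demo/old one.bak" "$demo/keep.txt"
find "$demo" -type f -name '*.bak' -print0 | xargs -0 rm -f
ls "$demo"    # only keep.txt remains
rm -rf "$demo"
```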
I used this to mass-install a lot of Perl stuff. Threw it together because I was feeling *especially* lazy. The 'perl' and the 'module' can be replaced with whatever you like.
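A sketch of the lazy loop; the module names here are hypothetical stand-ins, and echo makes it a dry run (drop the echo to actually install):

```shell
# Hypothetical module names; swap in whatever you like
for module in JSON::PP LWP::Simple; do
    echo cpan -i "$module"    # dry run; remove 'echo' to really install
done
```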
I have a problem where gnome-panel does not load completely, leaving me without the actual GUI. This command kills the gnome-panel process; it should then be relaunched automatically.
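The kill itself is presumably just pkill gnome-panel (or killall gnome-panel); sketched here against a throwaway sleep process so it can be tried safely outside a GNOME session:

```shell
# Stand-in for the hung panel; for the real case:  pkill gnome-panel
sleep 300 &
pid=$!
pkill -f 'sleep 300'
wait "$pid" 2>/dev/null || true   # reap; exit status reflects the signal
kill -0 "$pid" 2>/dev/null && echo "still running" || echo "killed"
```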
Calculates the rough time from Twitter, now with leading zeroes.
This command lists all the directories in SEARCHPATH by size, displaying their sizes in a human-readable format.
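A likely shape for this, assuming GNU coreutils: du -sh gives one human-readable total per directory, and sort -h knows how to order those human-readable sizes (demonstrated on a throwaway tree standing in for SEARCHPATH):

```shell
# Demo tree so the listing has something to show
demo=$(mktemp -d)
mkdir -p "$demo/small" "$demo/big"
head -c 200000 /dev/zero > "$demo/big/data"

# -s: one total per directory, -h: human-readable sizes;
# sort -h orders them correctly (e.g. 512K before 1.1M), largest last
du -sh "$demo"/*/ | sort -h
rm -rf "$demo"
```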
This is a more concise answer to http://blog.commandlinekungfu.com/2011/09/episode-158-old-switcheroo.html in my opinion.
Be careful where you execute this from. Do a 'sudo ls' beforehand to prime sudo so it won't ask for your password partway through.
Defunct processes (zombies) usually have to be cleaned up by killing their parent processes. This command retrieves such zombies and their immediate parents and kills all of the matching processes.
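A sketch of the two steps, non-destructive by default: ps -eo exposes the state column, awk picks rows whose state starts with Z, and the commented line is the one that actually signals the parents:

```shell
# List zombies together with their parent PIDs
# (a zombie's state column begins with Z)
ps -eo pid,ppid,stat,comm | awk '$3 ~ /^Z/ {print $1, $2, $4}'

# Once happy with the list, signal the unique parents so the zombies
# get reaped (uncomment to actually send the signal):
# kill $(ps -eo ppid,stat | awk '$2 ~ /^Z/ {print $1}' | sort -u)
```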
When you download files on a Mac, Apple adds the extended attribute com.apple.quarantine. Often this prevents you from even running a ./configure. This command removes the quarantine attribute from all files in the current directory.
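On macOS this is a single xattr call (xattr ships with the system; the sketch only applies there):

```shell
# macOS only: drop the quarantine attribute from everything in the
# current directory; errors from files that never had the attribute
# are harmless, hence the 2>/dev/null
xattr -d com.apple.quarantine * 2>/dev/null
```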
executed on SLES 11.2
Using xargs is usually much quicker, as it does not have to execute chmod once for every file.
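The difference in sketch form: -exec ... \; forks one chmod per file, while xargs batches a whole list of names into a single chmod invocation:

```shell
# Slow: one chmod process per file
#   find . -type f -exec chmod 644 {} \;

# Fast: one chmod for the whole batch (NUL-safe for odd filenames)
demo=$(mktemp -d)
touch "$demo/a" "$demo/b c"
find "$demo" -type f -print0 | xargs -0 chmod 644
rm -rf "$demo"
```

GNU find's `-exec chmod 644 {} +` batches the same way without xargs, if you prefer a single command.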
First use find to locate all the images ending in jpg or JPG in the current directory and all its children, then pipe that to xargs. The -I{} makes spaces in filenames a non-issue. The "1024>" geometry takes any image wider than 1024 pixels and resizes it to 1024 wide, keeping the aspect ratio for the height. It then sets the image quality to 40. Piping through xargs avoids the argument-list length limit, so you could run this over your entire filesystem if you wanted.
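A hedged reconstruction of the pipeline described above; it assumes ImageMagick's mogrify is installed, and the exact geometry string may differ from the original:

```shell
# -iname matches .jpg and .JPG alike; -print0 plus -0 keeps spaces
# safe, and -I{} substitutes each filename into the mogrify call.
# '1024>' means: only shrink images wider than 1024px, preserving
# aspect ratio; -quality 40 sets the JPEG quality.
# NOTE: mogrify rewrites files in place - test on copies first.
find . -iname '*.jpg' -print0 |
    xargs -0 -I{} mogrify -resize '1024>' -quality 40 {}
```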
This breaks into two parts: first get the number of cores with cat /proc/cpuinfo | grep proc | wc -l and create an integer sequence of that length (xargs seq), then have GNU parallel loop that many times over the given command. Cheers!
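The core count half is easy to check on any Linux box; GNU parallel may not be installed everywhere, so this sketch uses xargs -P for the fan-out, which gives the same one-job-per-core effect:

```shell
# One line per core in /proc/cpuinfo starts with "processor"
# (nproc is the modern one-word equivalent)
cores=$(grep -c '^processor' /proc/cpuinfo)

# Fan out one worker per core; with GNU parallel this would be:
#   seq "$cores" | parallel 'echo job {}'
seq "$cores" | xargs -P "$cores" -I{} echo "job {}"
```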
Replace $CMDLINE_FILENAME with the name of the cmdline file you copied from /proc/<pid>, and $COMMAND with the command to execute with those arguments.
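/proc/<pid>/cmdline stores the arguments NUL-separated, so xargs -0 splits them back out intact; demonstrated here on a hand-built file, with echo standing in for $COMMAND:

```shell
# Build a NUL-separated file shaped like /proc/<pid>/cmdline
f=$(mktemp)
printf 'one\0two words\0three\0' > "$f"

# Replay the saved arguments through a command (echo as stand-in);
# -0 keeps "two words" as a single argument despite the space
xargs -0 echo < "$f"    # -> one two words three
rm -f "$f"
```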
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: