Commands using rm (301)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

add a backup (or any other) suffix to a file
Very helpful when you've got complex filenames and need to change just a small part of them. Renaming a file called "i-made-a-small-typo-right-here" to "i-made-a-big-typo-right-here": $ mv -vi i-made-a-{small,big}-typo-right-here You could use the same brace expansion to copy, edit, remove or otherwise process multiple files.
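The same brace-expansion trick covers the backup-suffix case in the title. A minimal sketch (the filename config.yaml is just an example):

$ cp -vi config.yaml{,.bak}    # expands to: cp -vi config.yaml config.yaml.bak
$ mv -vi config.yaml{.bak,}    # restore: mv -vi config.yaml.bak config.yaml

Here {,.bak} expands to the empty string and ".bak", so the shell produces both names before cp even runs.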

Backup all starred repositories from Github
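The command itself isn't reproduced here, but one way to do this is via the GitHub API. A sketch, assuming jq is installed and USER is your GitHub username (the API paginates, so very large star lists need a loop over ?page=N):

$ curl -s "https://api.github.com/users/USER/starred?per_page=100" | jq -r '.[].clone_url' | xargs -n 1 git clone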

delay execution of a command that needs lots of memory and CPU time until the resources are available
[ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] returns true if less than 2000 MB of RAM are available, so adjust this number to your needs. [ $(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc) -eq 1 ] returns true if the current machine load is at least equal to the number of CPUs. If either test returns true, we wait 10 seconds and check again. If both tests return false, i.e. 2 GB are available and the machine load falls below the number of CPUs, we start our command and save its output in a text file. The ( ( ... ) & ) construct lets the command keep running in the background even if we log out. See http://www.commandlinefu.com/commands/view/3115/ .
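Put together, the wait-then-run pattern described above might look like this (mycommand and output.txt are placeholders for illustration):

( ( while [ 2000 -ge "$(free -m | awk '/buffers.cache:/ {print $4}')" ] || [ $(echo "$(uptime | awk '{print $10}' | sed -e 's/,$//' -e 's/,/./') >= $(grep -c ^processor /proc/cpuinfo)" | bc) -eq 1 ]; do
    sleep 10   # resources still busy; check again in 10 seconds
done; mycommand > output.txt ) & )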

'hpc' in the box - starts a maximum of n compute commands modulo n controlled in parallel
The block of the loop is useful whenever you have huge chunks of similar jobs, e.g. converting high-res images to thumbnails, and want to make use of all the SMP power on your compute box without flooding the system. Note: c is used as a counter, and the random sleep $ r=`echo $RANDOM%5 |bc`; echo "sleep $r"; sleep $r is just used as a dummy command.
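A minimal sketch of the pattern the description refers to, with n parallel slots and the random sleep standing in for real work (the loop over *.jpg is an assumption for illustration):

n=4; c=0
for f in *.jpg; do
    ( r=`echo $RANDOM%5 | bc`; sleep $r ) &   # dummy job; replace with real work on "$f"
    c=$((c+1))
    [ $((c % n)) -eq 0 ] && wait              # every n jobs, wait for the batch to finish
done
wait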

Creates a 'path' command that always prints the full path to any file
The command creates an alias called 'path', so it's useful to add it to your .profile or .bash_profile. The path command then prints the full path of any file, directory, or list of files given. Soft links will be resolved to their true location. This is especially useful if you often use scp to copy files across systems. Rather than using pwd to get a directory and then doing a separate cut and paste to get a file's name, you can just type 'path file' and get the full path in one operation.
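The alias itself isn't shown here; one possible implementation, assuming GNU readlink is available (readlink -f resolves soft links, matching the behaviour described, and accepts a list of files):

alias path='readlink -f'

$ path some-file
/home/user/projects/some-file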

List all NPM global packages installed
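No description accompanies this one; the usual command for it is:

$ npm ls -g --depth=0

The --depth=0 flag limits the output to top-level packages rather than printing the whole dependency tree.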

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
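The command itself isn't reproduced here; a common way to get this information is lsof, which resolves named ports such as "http" via /etc/services:

$ lsof -i :80
$ lsof -i :http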

Have subversion ignore a file pattern in a directory
If you don't want to commit files to Subversion, and don't want those files to show up when doing an "svn stat", this command is what you need.
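A sketch of the usual mechanism, the svn:ignore property (the pattern *.log and the target directory . are examples); note that the property change itself has to be committed:

$ svn propset svn:ignore "*.log" .
$ svn commit -m "Ignore log files" .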

Parallel file downloading with wget
xargs can be used in this manner to download multiple files at a time; in this case it runs 10 processes at once, starting a new one whenever the number running falls below 10.
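A sketch of the pattern, assuming a file urls.txt with one URL per line:

$ cat urls.txt | xargs -n 1 -P 10 wget -q

-P 10 caps the number of concurrent wget processes at 10, and -n 1 hands exactly one URL to each.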

Empty a file
The downside of output redirection is that you need write permissions, and the redirection is performed by your own shell before sudo runs. So something like $ > file won't play nicely with sudo. You'd need to do something like $ sudo bash -c '> file' instead, or you could go with $ sudo truncate -s0 file
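The three variants side by side (file is a placeholder path):

$ > file                     # works only if your user can write to file
$ sudo bash -c '> file'      # redirection happens inside the root shell
$ sudo truncate -s0 file     # truncate runs as root; no redirection involved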


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

Subscribe to the feed for: