All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Share Your Commands


Check These Out

Yet Another Rename (bash function)
An implementation of `rename` as a bash function, for systems that don't have it.
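A minimal sketch of such a function (the name `ren` and the sed-expression interface are assumptions here, mirroring the classic Perl `rename`; the posted function may differ):

```shell
# ren 's/old/new/' file...  -- rename files by applying a sed expression
# to each name (a sketch, not necessarily the posted implementation)
ren() {
  local expr=$1 f new
  shift
  for f in "$@"; do
    new=$(printf '%s\n' "$f" | sed "$expr")
    [ "$f" = "$new" ] || mv -- "$f" "$new"
  done
}
```

For example, `ren 's/\.jpeg$/.jpg/' *.jpeg` renames every .jpeg file to .jpg.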

a function to find the fastest DNS server
http://public-dns.info lists public DNS servers that are currently online. Replace the country code in the URL ("br" here) with your own. The command takes some time, since it pings every IP in the list.
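A sketch of the idea, assuming public-dns.info serves a plain-text list at nameserver/<country>.txt with one IP per line (adjust to the site's actual layout):

```shell
# fastest_dns CC  -- ping every listed DNS server for country CC once and
# print the five lowest average round-trip times (this takes a while)
fastest_dns() {
  curl -s "http://public-dns.info/nameserver/${1:-br}.txt" |
  while read -r ip; do
    # field 5 of the "rtt min/avg/max/mdev" line, split on "/", is the average
    avg=$(ping -c 1 -W 1 "$ip" 2>/dev/null | awk -F'/' '/avg/ {print $5}')
    [ -n "$avg" ] && echo "$avg $ip"
  done | sort -n | head -n 5
}
```

Usage: `fastest_dns br`.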

Go to the Nth line of file
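A few equivalent ways to reach line 42 of a file (the site's favoured one-liner may differ):

```shell
seq 100 > file.txt       # sample file: lines 1..100
sed -n '42p' file.txt    # print only line 42  -> 42
awk 'NR==42' file.txt    # the same, with awk  -> 42
# less +42 file.txt      # open a pager positioned at line 42
# vim +42 file.txt       # open an editor at line 42
```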

skip broken piece of a loop but not exit the loop entirely
Useful for loops like: for i in $(cat list_of_servers); do ssh -q $i hostname; done. If a server is unreachable, press Ctrl+\ to skip it and carry on with the loop.
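Ctrl+\ works because it sends SIGQUIT to the foreground command only, so that command dies while the enclosing loop continues. A non-interactive demonstration of the same mechanism (the signal is delivered by hand here instead of from the keyboard):

```shell
for n in 1 2 3; do
  # stand-in for a hung ssh: start a long command, then hit it with SIGQUIT
  ( sleep 30 & kill -QUIT $!; wait $! ) 2>/dev/null
  echo "iteration $n carried on"   # the loop survives all three signals
done
```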

Create a tar archive using 7z compression
Using 7z to create archives is OK, but when you use tar, you preserve all file-specific information such as ownership, perms, etc. If that's important to you, this is a better way to do it.
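A sketch of the pipeline (it assumes the p7zip package provides a 7z binary; the file names are illustrative):

```shell
mkdir -p mydir && echo hello > mydir/file.txt          # sample data
if command -v 7z >/dev/null; then
  # tar records owners/perms; 7z just compresses the stream from stdin (-si)
  tar cf - mydir | 7z a -si mydir.tar.7z >/dev/null
  # reverse the pipeline to read it back (-so writes to stdout; tf lists, xf unpacks)
  7z x -so mydir.tar.7z | tar tf - >/dev/null
fi
```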

Check availability of Websites based on HTTP_CODE
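One plausible shape for this, using curl's --write-out variable (the function name and URL handling are illustrative, not necessarily the posted command):

```shell
# check URL...  -- print "HTTP_CODE URL" for each site; curl prints 000
# when no response arrives at all
check() {
  local url code
  for url in "$@"; do
    code=$(curl -s -o /dev/null -w '%{http_code}' --max-time 10 "$url")
    printf '%s %s\n' "$code" "$url"
  done
}
# check https://example.com https://example.org
```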

Calculate pi to an arbitrary number of decimal places
Change the scale to adjust the number of decimal places. Prefix the command with "time" to benchmark the computer (compare how long it takes to calculate 10,000 digits of pi on various computers).

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
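The exact one-liner on the site may use a different tool; lsof, ss, netstat and fuser can all answer this question:

```shell
# any of these lists listeners on port 80 (add sudo to see other users' processes)
# lsof -nP -iTCP:80 -sTCP:LISTEN
# netstat -ltnp | grep ':80 '
# fuser -v 80/tcp
if command -v ss >/dev/null; then
  ss -ltnp '( sport = :80 )'   # named ports work too: sport = :http
fi
```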

Clean up after a poorly-formed tar file
These days, most software distributed as a tar file contains a single directory at the top level, but some archives don't, and blindly running $ tar zxvf something.tar.gz can leave a mess of files in the current directory. This command helps you clean up after such a mistake. Note, however, that it has the potential to do bad things if someone has been *really* nasty with filenames.
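The cleanup pattern, demonstrated on a throwaway "tar bomb" (GNU xargs is assumed for -d; the filename caveat above is about exactly this xargs step):

```shell
touch a.txt b.txt                                 # build a demo tar bomb
tar czf something.tar.gz a.txt b.txt && rm a.txt b.txt
tar zxvf something.tar.gz >/dev/null              # oops: files land in the current directory
tar tzf something.tar.gz | xargs -d '\n' rm -rf   # list the contents, delete exactly those
```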

Mapreduce style processing
parallel can be installed on your central node and used to run a command multiple times. In this example, multiple ssh connections run the commands (-j sets the number of jobs to run at the same time). The results can then be piped into further commands for the "reduce" stage (sort, then uniq, in this example). This example assumes "keyless ssh login" has been set up between the central node and all machines in the cluster. bashreduce may also do what you want.
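A hedged sketch of the shape (the node names are placeholders and keyless ssh login is assumed, so the ssh line is shown commented; the reduce stage runs anywhere):

```shell
# map: run a command on every node, 4 ssh sessions at a time
# parallel -j 4 ssh {} hostname ::: node1 node2 node3 | sort | uniq -c
# the reduce stage on its own -- tally identical lines from the mappers:
printf 'web1\nweb2\nweb1\n' | sort | uniq -c | sort -rn
```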


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that receive a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

Subscribe to the feed for: