All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Check These Out

Get the date for the last Saturday of a given month
If your locale has Monday as the first day of the week, like mine in the UK, change the two occurrences of $7 to $6.
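The command itself isn't reproduced above; a minimal sketch of the same idea, assuming a Sunday-first cal layout (Saturday in column 7) and using June 2024 as an example month:

$ cal 06 2024 | awk '$7 { d = $7 } END { print d }'

The awk program keeps overwriting d with the seventh column whenever it is present, so after the last line of cal output, d holds the date of the final Saturday.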

Escape potential tarbombs
This anti-tarbomb function makes it easy to unpack a .tar.gz without worrying that it will "explode" all over your current directory. I used to create a temporary folder, extract the tarball there, and reorganise the files afterwards; this function does that housekeeping for you. Add it to your .zshrc / .bashrc and use it like this:

$ atb arch1.tar.gz

It creates a folder for the extracted files if they aren't already contained in a single folder. As posted it only works for .tar.gz, but it's easy to edit the function to suit your needs if you want to extract .tgz, .tar.bz2 or plain .tar. More info about tarbombs at http://www.linfo.org/tarbomb.html. Tested in zsh and bash.

UPDATE: This version works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in zsh (not in bash):

$ atb() { l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t(ar.gz||ar.bz2||gz||bz||ar)} && tar xf $1 -C ${1%.t(ar.gz||ar.bz2||gz||bz||ar)}; fi ;}

UPDATE2: From the comments, bepaald contributed a variant that works for .tar.gz, .tar.bz2, .tgz, .tbz and .tar in bash:

$ atb() { shopt -s extglob; l=$(tar tf $1); if [ $(echo "$l" | wc -l) -eq $(echo "$l" | grep $(echo "$l" | head -n1) | wc -l) ]; then tar xf $1; else mkdir ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)} && tar xf $1 -C ${1%.t@(ar.gz|ar.bz2|gz|bz|ar)}; fi; shopt -u extglob; }

HTTP Get of a web page via proxy server with login credentials
If you are behind a proxy server and have to authenticate with it before you can browse web pages, you have to pass the proxy address and port, along with your user credentials, to curl in order to GET the page. Example: "curl -U srikanth -x 167.85.103.70:8080 http://www.yahoo.com". If you don't specify the password (as in the example above), curl will prompt for it at the command line.
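If you would rather supply the password inline (bearing in mind that it then appears in your shell history and the process list), -U also accepts a user:password pair; the password below is only a placeholder:

$ curl -U srikanth:secret -x 167.85.103.70:8080 http://www.yahoo.com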

FizzBuzz in one line of Bash
The (in)famous "FizzBuzz" programming challenge, answered in a single line of Bash code. The "|column" part at the end merely formats the output a bit, so if "column" is not installed on your machine you can simply omit that part. Without "|column", the solution only uses 75 characters. The version below is expanded to multiple lines, with comments added:

for i in {1..100}   # Use i to loop from "1" to "100", inclusive.
do
  ((i % 3)) &&      # If i is not divisible by 3...
    x= ||           # ...blank out x (yes, "x= " does that). Otherwise,...
    x=Fizz          # ...set x to the string "Fizz".
  ((i % 5)) ||      # If i is not divisible by 5, skip (there's no "&&")...
    x+=Buzz         # ...Otherwise, append (not set) the string "Buzz" to x.
  echo ${x:-$i}     # Print x unless it is blanked out. Otherwise, print i.
done | column       # Wrap output into columns (not part of the test).
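Collapsed back onto a single line, the same logic looks like this (it may not match the original character for character):

$ for i in {1..100};do ((i%3))&&x=||x=Fizz;((i%5))||x+=Buzz;echo ${x:-$i};done|column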

easily find megabyte-eating files or directories
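One common way to spot the biggest space consumers in the current directory, assuming GNU du and sort (for the human-readable -h flags):

$ du -sh * | sort -h          # largest entries end up at the bottom
$ du -sh * | sort -rh | head  # or show the ten largest first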

Octal ls
Runs ls with each file's permissions also written in octal form.
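The one-liner itself isn't shown here; if all you need is the octal permissions next to each name, GNU stat offers a simpler route:

$ stat -c '%a %n' *    # e.g. "644 notes.txt"

%a prints the access rights in octal and %n the file name.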

Merges given files line by line
In the example above all files have 4 lines. In "file1" the consecutive lines are "num, 1, 2, 3", in "file2" "name, Jack, Jim, Frank", and in "file3" "scores, 1300, 1100, 980". This one-liner can save a considerable amount of time when you're processing large amounts of data. The "-d" option lets you set the series of characters to be used as separators between the data originating from the given files.
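The description matches paste and its -d option; assuming that is the command in question, a minimal sketch with the three example files and a comma as the separator:

$ paste -d ',' file1 file2 file3
num,name,scores
1,Jack,1300
2,Jim,1100
3,Frank,980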

Print number of MB of free RAM
This will show the amount of physical RAM that is left unused by the system.
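The exact command isn't shown; a minimal sketch using free and awk, assuming a procps free whose "Mem:" row carries the free column in field 4 (note that this figure excludes memory used for buffers/cache, so it can look small on a perfectly healthy system):

$ free -m | awk '/^Mem:/ {print $4}'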

geoip information
GeoIP lookup using MaxMind's database. Try to get the small utility via: apt-get install geoip
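Assuming the utility in question is MaxMind's geoiplookup (on Debian-based systems it is typically packaged as geoip-bin), a lookup is a single call; 8.8.8.8 is just an example address and the output shown is approximate:

$ geoiplookup 8.8.8.8
GeoIP Country Edition: US, United States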

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for the port you are interested in. You can use either a port number or a named port, e.g. "http".
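The command itself isn't reproduced here; lsof can answer the same question, assuming it is installed and you have sufficient privileges to see other users' processes:

$ sudo lsof -iTCP:80 -sTCP:LISTEN   # numeric port, listening sockets only
$ sudo lsof -i :http                # named port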


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as feeds for virtually every other subset (users, tags, functions,…).
