All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Get line counts for a list of files
This command gives you the number of lines of every file in the folder and its subfolders matching the search options specified in the find command. It also gives the total number of lines across these files. The combination of find's -print0 and wc's --files0-from options makes the whole command simple and efficient.
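The command itself isn't reproduced in this listing; a minimal sketch of the technique it describes, assuming GNU find and wc, with '*.c' as a stand-in for the search options:
$ find . -name '*.c' -print0 | wc -l --files0-from=-
wc prints one line count per file read from the NUL-delimited list on stdin, followed by a total line.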

Read funny developer comments in the Linux source tree
These are way better than fortune(6).
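The original command isn't shown here; a hedged guess at the general approach is to grep the kernel tree for choice comments, assuming a source tree under /usr/src/linux:
$ grep -rn --include='*.c' -iE 'fixme|hack|wtf' /usr/src/linux | shuf -n 10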

A for loop with zero-padded numbers, using seq
seq lets you format its output with the -f option. This is very useful if you want to rename your files to a uniform format so they sort easily. For example:
$ for i in `seq 1 3 10`; do touch foo$i; done
$ ls foo* | sort -n
foo1 foo10 foo4 foo7
But with a zero-padded format:
$ for i in `seq -f %02g 1 3 10`; do touch foo$i; done
$ ls foo* | sort -n
foo01 foo04 foo07 foo10
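A related shortcut (an addition here, not part of the original tip): GNU seq's -w flag pads with leading zeros to equal width automatically, without needing an explicit format string:
$ seq -w 1 3 10
01
04
07
10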

See how many more processes are allowed, awesome!
There is a limit to how many processes you can run at the same time for each user, especially with web hosts. If the maximum number of processes for your user is 200, then the following sets OPTIMUM_P to 100:
$ OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))
This is very useful in scripts because it is a fast, resource-light way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for a given user. The number of currently running processes is subtracted from the upper limit set up for the account (see limits.conf, pam, initscript).
An easy-to-understand example: this searches the current directory for shell scripts and runs up to 100 'file' commands at the same time, greatly speeding up the command:
$ find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'
I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and want to download all of them quickly with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:
$ cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'
I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jail shells with a 20-process limit, some with 200, some with 8000, so for the jailed shells my xargs -P10 would kill my shell or dump core. Using the above I can set the -P value dynamically, so xargs always works, like this:
$ cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'
If you were building a process killer (very common for cheap hosting) this would also be handy. Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.
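A minimal sketch of the same calculation wrapped in a reusable function (the function name and usage line are my own, not part of the original tip):
optimum_procs() {
    # half the head-room between the per-user process limit and the
    # processes this user is currently running
    local max cur
    max=$(ulimit -u)
    cur=$(find /proc -maxdepth 1 -user "$USER" -type d 2>/dev/null | wc -l)
    echo $(( (max - cur) / 2 ))
}
$ xargs -P "$(optimum_procs)" -n 1 curl -O < url-list.txt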

Get the /dev/disk/by-id fragment for a physical drive
Substitute for #11720. Can probably be even shorter and easier.
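The command itself isn't included in this listing; a minimal sketch of one way to do it, assuming /dev/sda is the drive of interest (GNU find's -lname test matches symlink targets):
$ find /dev/disk/by-id -lname '*/sda'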

Makes the permissions of file2 the same as file1
Also works with:
$ chgrp --reference file1 file2
$ chown --reference file1 file2
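The base command the title refers to isn't shown above; presumably it is GNU chmod's --reference option, i.e.:
$ chmod --reference=file1 file2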

check open ports without netstat or lsof
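No description or command accompanies this entry; a sketch of one common approach, using bash's built-in /dev/tcp pseudo-device instead of netstat or lsof (this is my assumption of the technique, not necessarily the original command):
$ for p in {1..1023}; do (echo >/dev/tcp/127.0.0.1/$p) 2>/dev/null && echo "port $p open"; done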

Working random fact generator
Extension to tali713's random fact generator. It takes the output and sends it to notify-osd. Display time is proportional to the length of the fact.
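A minimal sketch of the wrapping described (the fact source below is a stand-in; tali713's original command is not reproduced in this listing):
$ FACT="$(fortune -s)"; notify-send -t $(( ${#FACT} * 100 )) "Random fact" "$FACT"
notify-send's -t option takes the display time in milliseconds, so scaling it by the fact's character count is one way to express the length-proportional timeout mentioned above.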

Using pipes: execute a command, convert its output to a .png file, upload the file to imgur.com, and return the address of the .png.
$ imgur < /etc/issue
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  2360    0   635  100  1725   1027   2792 --:--:-- --:--:-- --:--:--  4058
http://i.imgur.com/bvbUD.png
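The imgur helper itself isn't defined in this listing; a rough sketch of the pipeline it describes, assuming ImageMagick's convert and an imgur API client ID of your own (some_command and the client ID are placeholders, not parts of the original):
$ some_command | convert label:@- /tmp/out.png && curl -s -H "Authorization: Client-ID <your-client-id>" -F "image=@/tmp/out.png" https://api.imgur.com/3/image
The JSON response contains the image link; the original command presumably extracted just the URL.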

Port scan a range of hosts with Netcat.
Simple one-liner for scanning a range of hosts. You can also scan a range of ports with Netcat, e.g.:
$ nc -v -n -z -w 1 192.168.0.1 21-443
Useful when Nmap is not available. :) Range declarations like {X..X}, e.g. "for i in {21..29}", only work with bash 3.0+.
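The host-range one-liner itself isn't reproduced here; a plausible form, assuming a 192.168.0.0/24 network and a single port per host:
$ for i in {1..254}; do nc -v -n -z -w 1 192.168.0.$i 22; done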


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).
