All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Check These Out

check open ports without netstat or lsof
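The command itself isn't shown above; one sketch of the idea reads the kernel's own socket table directly (assuming GNU awk for strtonum):

$ awk 'NR>1 && $4=="0A" {split($2,a,":"); print strtonum("0x" a[2])}' /proc/net/tcp

/proc/net/tcp lists every TCP socket; state 0A means LISTEN, field 2 holds the local address as hex IP:PORT, and strtonum converts the hex port number to decimal. Check /proc/net/tcp6 the same way for IPv6 listeners.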

reverse-i-search: Search through your command line history
"What it actually shows is going to be dependent on the commands you've previously entered. When you do this, bash looks for the last command that you entered that contains the substring "ls", in my case that was "lsof ...". If the command that bash finds is what you're looking for, just hit Enter to execute it. You can also edit the command to suit your current needs before executing it (use the left and right arrow keys to move through it). If you're looking for a different command, hit Ctrl+R again to find a matching command further back in the command history. You can also continue to type a longer substring to refine the search, since searching is incremental. Note that the substring you enter is searched for throughout the command, not just at the beginning of the command." - http://www.linuxjournal.com/content/using-bash-history-more-efficiently

See how many more processes are allowed, awesome!
There is a limit to how many processes each user can run at the same time, especially on web hosts. If the maximum number of processes for your user is 200, then the following sets OPTIMUM_P to 100:

$ OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

This is very useful in scripts because it is a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for a given user. The number of currently running processes is subtracted from the upper limit set up for the account (see limits.conf, pam, initscript).

An easy-to-understand example: this searches the current directory for shell scripts, running up to 100 'file' commands at the same time, which greatly speeds things up:

$ find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and want to download all of them fast with curl, you can download 100 at a time (check ps output on a separate [pt]ty for proof) like this:

$ cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jail shells with a 20-process limit, some with 200, some with 8000, so in the jailed shells a hard-coded xargs -P10 would kill my shell or dump core. With OPTIMUM_P computed dynamically as above, the same xargs invocation always works. If you were building a process killer (very common on cheap hosting) this would also be handy. Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.

live ssh network throughput test
Connects to the host via ssh and displays the live transfer speed, directing all transferred data to /dev/null. Needs pv installed. Debian: 'apt-get install pv'. Fedora: 'yum install pv' (may need the 'extras' repository enabled).
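The command itself isn't shown above; a common form of this test (a sketch, with $host standing in for your remote machine) is:

$ yes | pv | ssh $host "cat > /dev/null"

yes generates an endless stream of data, pv displays the throughput in real time as it passes through the pipe, and the remote cat discards everything into /dev/null, so what you measure is raw ssh throughput, encryption overhead included.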

Find the package that installed a command
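The command isn't shown above; sketches of the idea on the two big package families (lsof is just an example command here) look like this:

$ dpkg -S $(which lsof)     # Debian/Ubuntu
$ rpm -qf $(which lsof)     # Fedora/RHEL

which resolves the command to its full path, and the package manager reports which installed package owns that file.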

Insert a colon between every two digits
I sometimes have large files of MAC addresses stored in a file; some databases need the information stored with the colons (it makes programming a device easier), others don't. I have a barcode-to-text file scanner which usually butchers MAC addresses, so this was the fix. I initially did this in awk ;)

$ awk '{for(i=10;i>=2;i-=2)$0=substr($0,1,i)":"substr($0,i+1);print}' mac_address_list
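The awk loop works from right to left, inserting a ':' after positions 10, 8, 6, 4 and 2 of each 12-digit address, so AABBCCDDEEFF becomes AA:BB:CC:DD:EE:FF. A sed equivalent (my sketch, not from the original entry) would be:

$ sed 's/../&:/g; s/:$//' mac_address_list

which appends a colon after every pair of characters and then strips the trailing one.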

Find top 10 largest files in /var directory (subdirectories and hidden files included )
Should work even when very large files exist.
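The command itself isn't shown above; one version that stays accurate for very large files (a sketch, assuming GNU find and coreutils) is:

$ find /var -type f -printf '%s %p\n' 2>/dev/null | sort -rn | head -n 10

-printf '%s' prints the exact size in bytes, sort -rn orders numerically descending, and head keeps the ten largest. find descends into subdirectories and matches hidden files by default, and working in plain bytes avoids the rounding and overflow problems of prettier listings.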

Create an SSH connection (reverse tunnel) through your firewall.
Allows you to establish a tunnel (encapsulating packets) from your local host (Server A) to your remote server (Server B). On Server B you can then connect to port 2001, which will forward all packets (encapsulated) back to port 22 on Server A. -- www.fir3net.com --
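The command isn't shown above; run from Server A, a reverse tunnel matching that description would look like this (user and serverB are placeholders):

$ ssh -R 2001:localhost:22 user@serverB

-R 2001:localhost:22 asks serverB's sshd to listen on port 2001 and relay any connection back through the tunnel to port 22 on Server A, so 'ssh -p 2001 localhost' on Server B lands on Server A even if A is behind a firewall or NAT.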

Schedule a script or command in x num hours, silently run in the background even if logged out
Doesn't require "at"; change the "2h" to whatever you want... (default unit for sleep is seconds)
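The command isn't shown above; a form matching the description (the script path is a placeholder) would be:

$ nohup sh -c 'sleep 2h; /path/to/script.sh' >/dev/null 2>&1 &

nohup keeps the job alive after you log out, sleep 2h delays it two hours, and the redirections keep it silent. GNU sleep accepts the suffixes s, m, h and d; a bare number means seconds.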

find process associated with a port
e.g. fuser 25/tcp (see which PID is listening on SMTP)
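Adding -v makes the output self-explanatory; a sample run (the output below is illustrative, not from the original entry) might look like:

$ fuser -v 25/tcp
                     USER        PID ACCESS COMMAND
25/tcp:              root       1234 F....  master

Run it as root to see sockets owned by other users.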


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).
