What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!


Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands tagged Linux - 240 results
pv -tpreb /dev/sdc2 | dd of=/dev/sdb2 bs=64K conv=noerror,sync
2016-12-22 03:18:09
User: 4fthawaiian
Functions: dd

Uses the wonderful 'pv' command to show a progress bar when copying one partition to another. Amazing for long-running dd commands.

find . -type f -printf '%T@ %TY-%Tm-%Td %TH:%TM:%.2TS %p\n' | sort -nr | head -n 5 | cut -f2- -d" "
2016-03-23 11:56:39
User: paulera
Functions: cut find head sort

The output format is given by the -printf parameter:

%T@ = modification time in seconds since Jan. 1, 1970, 00:00 GMT, with fractional part. Mandatory; it is used only for sorting and stripped off at the end.

%TY-%Tm-%Td %TH:%TM:%.2TS = modify time as YYYY-MM-DD HH:MM:SS. Optional.

%p = file path

Refer to http://linux.die.net/man/1/find for more about -printf formatting.


sort -nr = sort numerically and reverse (higher values - most recent timestamp - first)

head -n 5 = get only 5 first lines (change 5 to whatever you want)

cut -f2- -d" " = trim first field (timestamp, used only for sorting)


Very useful for building scripts for detecting malicious files upload and malware injections.
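To see the pipeline in action without touching real data, here is a self-contained run in a scratch directory (the file names and dates are made up for the demo; touch -d assumes GNU touch):

```shell
# Demo in a throwaway directory; newest file comes first once the
# timestamp column is sorted descending and then stripped by cut
dir=$(mktemp -d)
touch -d '2020-01-01' "$dir/old.txt"
touch -d '2023-06-15' "$dir/new.txt"
find "$dir" -type f -printf '%T@ %TY-%Tm-%Td %TH:%TM:%.2TS %p\n' \
  | sort -nr | head -n 5 | cut -f2- -d' '
rm -rf "$dir"
```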

if [[ "$(sleep 1 | telnet -c <host> <port> 2>&1 | grep '^SSH')" == SSH* ]]; then <command when up>; else <command when down>; fi;
2016-02-02 13:06:51
User: paulera

This command telnets to a host and looks for a line starting with "SSH" - this works for OpenSSH, whose banner looks like "SSH-2.0-OpenSSH_6.0p1 Debian-4+deb7u3". It then triggers an action accordingly.

It can be packed as a script file to echo 0/1 indicating the SSH service availability:

if [[ "$(sleep 1 | telnet -c <host> <port> 2>&1 | grep '^SSH')" == SSH* ]]; then echo 1; else echo 0; fi;

Alternative uses:

Trigger an action when server is UP (using &&):

[[ "$(sleep 1 | telnet -c <host> <port> 2>&1 | grep '^SSH')" == SSH* ]] && <command when up>

Trigger an action when server is DOWN (using ||):

[[ "$(sleep 1 | telnet -c <host> <port> 2>&1 | grep '^SSH')" == SSH* ]] || <command when down>
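If telnet isn't installed, nc (netcat) can grab the banner the same way. A sketch, assuming a netcat that accepts -w for a connect timeout (the function name and host/port are placeholders); the grep test itself can be verified against a canned banner:

```shell
# Assumption: nc is installed and accepts -w <seconds>
ssh_up() { nc -w 2 "$1" "${2:-22}" </dev/null 2>/dev/null | head -n 1 | grep -q '^SSH'; }

# The banner test on its own, fed a canned OpenSSH banner:
printf 'SSH-2.0-OpenSSH_6.0p1 Debian-4+deb7u3\r\n' | head -n 1 | grep -q '^SSH' && echo up
```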
find / -name \*.php -exec grep -Hn preg_replace {} \;|grep /e|grep POST
ps h -o %a 21679
2015-09-27 11:00:07
User: BeniBela
Functions: ps
Tags: Linux ps

Show the command line for a PID with ps

tr '\0' ' ' </proc/21679/cmdline ; echo
xargs -0a /proc/27288/cmdline echo
2015-09-25 17:35:11
User: dennisw
Functions: xargs
Tags: Linux /proc

If you cat the file, all the parts of the command line are bunched up. If you use tr to convert the nulls to spaces, you're still left without a newline unless you add another step. This command does everything for you.
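For a quick, no-setup check (Linux only), point it at your own shell's PID via $$ - the same command, just with a PID that is guaranteed to exist:

```shell
# $$ expands to this shell's PID, so this prints the shell's own argv,
# with the NUL separators converted to spaces and a trailing newline added
xargs -0a "/proc/$$/cmdline" echo
```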

while cat energy_now; do sleep 1; done |awk -v F=$(cat energy_full) -v C=60 'NR==1{P=B=$1;p=100/F} {d=$1-P; if(d!=0&&d*D<=0){D=d;n=1;A[0]=B=P}; if(n>0){r=g=($1-B)/n;if(n>C){r=($1-A[n%C])/C}}; A[n++%C]=P=$1; printf "%3d %+09.5f %+09.5f\n", p*$1, p*g, p*r}'
2015-09-19 15:45:40
User: sqweek
Functions: awk cat printf sleep

Needs to be run in a battery sysfs dir, eg. /sys/class/power_supply/BAT0 on my system.

Displays the battery's current charge and the rate per-second at which energy is {dis,}charging. All values are displayed as percentages of "full" charge.

The first column is the current charge. The second is the rate of change averaged over the entire lifetime of the command (or since the AC cable was {un,}plugged), and the third column is the rate of change averaged over the last minute (controlled by the C=60 variable passed to awk).

The sample output captures a scenario where I ran 'yes' in another terminal to max out a CPU. My battery was at 76% charge and you can see the energy drain starts to rise above 0.01% per-second as the cpu starts working and the fan kicks in etc. While idle it was more like 0.005% per-second.

I tried to use this to estimate the remaining battery life/time until fully charged, but found it to be pretty useless... As my battery gets more charged it starts to charge slower, which meant the estimate was always wrong. Not sure if that's common for batteries or not.
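The percentage arithmetic at the core of the awk program can be sanity-checked on its own with made-up energy_now/energy_full values (the function name is just for illustration):

```shell
# pct NOW FULL -> charge as a percentage of full, one decimal place
pct() { awk -v n="$1" -v f="$2" 'BEGIN { printf "%.1f\n", 100 * n / f }'; }

pct 38000000 50000000   # a 76%-charged battery, as in the sample scenario
```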

exec 5<>/dev/tcp/time.nist.gov/13; cat <&5 & cat >&5; exec 5>&-
2015-07-30 21:12:38
User: tyzbit
Functions: cat exec
Tags: bash Linux unix

Ever needed to test firewalls but didn't have netcat, telnet or even FTP?

Enter /dev/tcp, your new best friend. /dev/tcp/(hostname)/(port) is a special filename that bash itself interprets, letting redirections open TCP connections (and /dev/udp does the same for UDP).

This one-liner opens a connection on a port to a server and lets you read and write to it from the terminal.

How it works:

First, exec sets up a redirect for /dev/tcp/$server/$port to file descriptor 5.

Then, as per some excellent feedback from @flatcap, we launch a redirect from file descriptor 5 to STDOUT and send that to the background (which is what causes the PID to be printed when the commands are run), and then redirect STDIN to file descriptor 5 with the second cat.

Finally, when the second cat dies (the connection is closed), we clean up the file descriptor with 'exec 5>&-'.

It can be used to test FTP, HTTP, NTP, or can connect to netcat listening on a port (makes for a simple chat client!)

Replace /tcp/ with /udp/ to use UDP instead.
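A related trick: because the redirection fails when the connection is refused, /dev/tcp also works as a quick port-open test. A sketch (bash-specific; the function name, host, and port are placeholders):

```shell
# The subshell's exit status tells you whether the TCP connect succeeded
port_open() { (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; }

port_open localhost 80 && echo open || echo closed
```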

xsel -bc
2015-02-26 01:11:03
User: benjabean1

Clears your clipboard if xsel is installed on your machine.

If your xsel doesn't accept the combined short options, you can also use

xsel --clear --clipboard
wget -q -O - http://www.example.com/automation/remotescript.sh | bash /dev/stdin parameter1 parameter2
2015-02-16 16:55:09
User: paulera
Functions: bash wget

Use this command to execute the contents of http://www.example.com/automation/remotescript.sh in the local environment. The parameters are optional.

Alternatives to wget:


curl -s http://www.example.com/automation/remotescript.sh | bash /dev/stdin param1 param2


w3m -dump http://www.example.com/automation/remotescript.sh | bash /dev/stdin [param1] [param2]


lynx -source http://www.example.com/automation/remotescript.sh | bash /dev/stdin [param1] [param2]
ip -o -4 a s | awk -F'[ /]+' '$2!~/lo/{print $4}'
2015-02-13 11:19:31
User: paulera
Functions: awk

To show IPv6 addresses instead, use -6 instead of -4:

ip -o -6 a s | awk -F'[ /]+' '$2!~/lo/{print $4}'

To show only the IP of a specific interface, in case you get more than one result:

ip -o -4 a s eth0 | awk -F'[ /]+' '$2!~/lo/{print $4}'

ip -o -4 a s wlan0 | awk -F'[ /]+' '$2!~/lo/{print $4}'
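The awk part runs on any line shaped like ip -o output, so you can verify the field split against a canned sample line (the addresses are made up):

```shell
# -F'[ /]+' splits on runs of spaces or slashes, so $4 is the bare address
echo '2: eth0    inet 192.168.1.10/24 brd 192.168.1.255 scope global eth0' \
  | awk -F'[ /]+' '$2!~/lo/{print $4}'
# -> 192.168.1.10
```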
curl -XGET 'localhost:9200'
2015-01-23 15:01:29
User: paulera

Replace localhost:9200 with your server location and port. This is Elasticsearch's default setup for local instances.

lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22
2014-10-17 00:29:34
User: colemar
Functions: lftp

Mirror a remote directory using some tricks to maximize network speed.

lftp: coolest file transfer tool ever

-u: username and password (pwd is merely a placeholder if you have ~/.ssh/id_rsa)

-e: execute internal lftp commands

set sftp:connect-program: use some specific command instead of plain ssh


-a -x -T: disable agent forwarding, X11 forwarding, and pseudo-tty allocation (none are needed for file transfer)

-c arcfour: use a fast cipher to cut CPU overhead (note: arcfour is considered weak and has been removed from recent OpenSSH releases)

-o Compression=no: disable compression to save CPU

mirror: copy remote dir subtree to local dir

-v: be verbose (cool progress bar and speed meter, one for each file in parallel)

-c: continue interrupted file transfers if possible

--loop: repeat mirror until no differences found

--use-pget-n=3: transfer each file with 3 independent parallel TCP connections

-P 2: transfer 2 files in parallel (totalling 6 TCP connections)

sftp://remotehost:22: use sftp protocol on port 22 (you can give any other port if appropriate)

You can play with values for --use-pget-n and/or -P to achieve maximum speed depending on the particular network.

If the files are compressible, removing "-o Compression=no" can be beneficial.

Better create an alias for the command.

mtr www.google.com
mkdir tmp ; cd tmp ; zcat ../initrd.gz | cpio -i
2014-09-24 14:06:38
User: akiuni
Functions: cd cpio mkdir zcat

This command extracts an initrd image into the "tmp" directory.

for file in $(find /var/backup -name "backup*" -type f |sort -r | tail -n +10); do rm -f $file; done ; tar czf /var/backup/backup-system-$(date "+\%Y\%m\%d\%H\%M-\%N").tgz --exclude /home/dummy /etc /home /opt 2>&- && echo "system backup ok"
2014-09-24 14:04:11
User: akiuni
Functions: date echo file find rm sort tail tar
Tags: backup Linux cron

This command can be added to crontab to run a nightly backup of directories, keeping only the most recent backup files (tail -n +10 selects everything from the tenth-newest onward for deletion, so nine remain).
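The rotation half of the command can be exercised safely in a scratch directory (the file names below are invented; xargs -r is the GNU way to skip running rm when the list is empty):

```shell
# Create 12 fake backups, then delete all but the 9 newest by name
dir=$(mktemp -d)
for i in $(seq -w 1 12); do touch "$dir/backup-2024$i.tgz"; done
find "$dir" -name 'backup*' -type f | sort -r | tail -n +10 | xargs -r rm -f
ls "$dir" | wc -l   # 9 files remain
rm -rf "$dir"
```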

curl -X POST -H "Content-Type: application/json" -d '{"jsonrpc":"2.0","method":"GUI.ShowNotification","params":{"title":"This is the title of the message","message":"This is the body of the message"},"id":1}' http://i3c.pla.lcl:8080/jsonrpc
2014-08-24 21:49:13
User: PLA

Send a text message to a Kodi (XBMC) device. Uses curl to POST a JSON request to Kodi's JSON-RPC API.

apt-get update && apt-get dist-upgrade -y --show-progress && apt-get autoremove -y && apt-get check && apt-get autoclean -y

# AllInOne: update the list of available packages, upgrade to new versions, remove packages

# that are no longer needed (some get replaced by the ones apt-get upgrade pulls in), check for broken dependencies,

# and clean out obsolete cached packages from disk (unlike apt-get clean, autoclean removes only packages that can no longer be downloaded).

# aliases (copy into ~/.bashrc file):

alias a='alias'

a ap='apt-get'

a r='ap autoremove -y'

a up='ap update'

a u='up && ap upgrade -y --show-progress && r && ap check && ap autoclean'

# && means "run the next command only if the previous one succeeded"; change it to ; to run regardless of failures.

I'm not sure whether ap check should come before or after ap upgrade -y; you can also rename the aliases.

# To expand aliases in bash use ctrl alt e or see this ow.ly/zBKHs

# For more useful aliases go to ow.ly/zBMOx

hdparm -S5 /dev/sda
hdparm -y /dev/sda
awk '/text to grep/{print $1}' logs... | sort -n | uniq -c | sort -rn | head -n 100
2014-07-10 20:36:02
User: impinball
Functions: awk head sort uniq
Tags: Linux sh

Accepts multiple files via logs.... Substitute "text to grep" for your search string.

If you want to wrap this for reuse, note that bash aliases can't take positional arguments; a shell function does the job:

parse_logs() { pat=$1; shift; awk "/$pat/{print \$1}" "$@" | sort -n | uniq -c | sort -rn | head -n 100; }
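Here is the pipeline run on a few inline sample lines (fake IPs), so you can see the count-and-rank behavior without any log files:

```shell
# Count the first field of matching lines, most frequent first
printf '1.2.3.4 GET /\n1.2.3.4 GET /x\n5.6.7.8 GET /\n' \
  | awk '/GET/{print $1}' | sort -n | uniq -c | sort -rn | head -n 100
```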
find /some/directory/* -prune -type f -name '*.log'
2014-05-02 00:14:32
User: bigstupid
Functions: find

This find syntax seems a little easier for me to remember when I have to use -prune on AIX's find. It works with GNU find, too.

Add whatever other find options after -prune
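A quick way to convince yourself -prune stops the descent, using a scratch directory with made-up names:

```shell
# top.log matches; sub/deep.log is never visited because sub is pruned.
# (-prune is a no-op on non-directories, so plain files still reach -type f.)
dir=$(mktemp -d); mkdir "$dir/sub"
touch "$dir/top.log" "$dir/sub/deep.log"
find "$dir"/* -prune -type f -name '*.log'
rm -rf "$dir"
```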

psql -X -A -t -c "SELECT version();"
2014-05-01 18:10:20
User: malathion

Without using a pipe.

-X ignores the user's .psqlrc configuration file

-A sets un-aligned table output mode

-t prints rows only (no headers or footers)

psql -h <SERVER NAME HERE> -t -c 'SELECT version();' | head -1