What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!


Maintained by Jon H.

Site originally by David Winterbottom (user root).

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using xargs - 638 results
find -type f | xargs -I{} du -sk "{}" | sort -rn | head
xargs -n 2 mv < file_with_colums_of_names
2010-12-27 18:06:15
User: Juluan
Functions: mv xargs

Maybe simpler, but again, I don't know how it will handle spaces in filenames.

find /deep/tree/ -type f -print0|xargs -0 -n1 -I{} ln -s '{}' .
2010-12-21 13:00:33
User: dinomite
Functions: find ln xargs
Tags: find xargs links

If you want to pull all of the files from a tree that has mixed files and directories containing files, this will link them all into a single directory. Beware of filesystem files-per-directory limits.

git status | grep deleted | awk '{print $3}' | xargs git rm
2010-12-17 02:08:55
Functions: awk grep xargs

Delete multiple files from the git index that have already been deleted from disk. This is pretty terrible; I'm looking for a better way.

(much better!! http://www.commandlinefu.com/commands/view/1246/git-remove-files-which-have-been-deleted)
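Since this was posted, git's own plumbing covers this case: `git ls-files --deleted` lists exactly the files removed from disk but still in the index, with no locale-fragile `git status` scraping. A sketch in a throwaway repo (the `-r`/`-z` flags assume GNU xargs and a reasonably recent git):

```shell
# Build a throwaway repo with one committed file, delete it from disk,
# then stage the deletion via git's plumbing instead of parsing status.
repo=$(mktemp -d) && cd "$repo"
git init -q . && git config user.email t@example.com && git config user.name t
echo hi > f.txt && git add f.txt && git commit -qm init
rm f.txt                                                # deleted from disk only
git ls-files --deleted -z | xargs -0 -r git rm --quiet  # now staged as deleted
```

`-z`/`-0` keeps filenames with spaces intact, and `-r` (GNU) skips running `git rm` when nothing is deleted.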

find . -name .svn -type d |xargs rm -rf
find . ! -name "." -print0 | xargs -0 -I '{}' mv -n '{}' ..; rmdir "$PWD"
2010-12-15 22:12:06
User: bartonski
Functions: find mv rmdir xargs

Robust means of moving all files up by a directory. Will handle dot files, filenames containing spaces, and filenames with almost any printable characters. Will not handle filenames containing a single-quote (but if you are moving those, it's time to go yell at whoever created them in the first place).
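For comparison, `find -exec` sidesteps xargs quoting entirely, so even single quotes in filenames survive. A sketch on a made-up throwaway tree (`mv -t` is a GNU coreutils extension):

```shell
# Create a child directory holding awkward names, then move everything
# (dotfiles included) up one level with -exec instead of xargs.
parent=$(mktemp -d) && mkdir "$parent/child"
touch "$parent/child/a b" "$parent/child/.hidden" "$parent/child/it's"
cd "$parent/child"
find . -mindepth 1 -maxdepth 1 -exec mv -n -t .. {} +
```

Because the filenames never pass through xargs's argument parsing, no quoting rules apply to them at all.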

find . -type f -size +20000k -print0 | xargs -0 du -h | awk -F"\t" '{printf "%s : %s\n", $2, $1}'
2010-12-15 17:51:09
User: depesz
Functions: awk du find xargs

Output made so that it will match initial suggestion for this task. Personally, I think that output of du -h is more readable.

find . -name .svn -print0 | xargs -0 rm -rf
curl -s http://boards.4chan.org/wg/|sed -r 's/.*href="([^"]*).*/\1\n/g'|grep images|xargs wget
2010-12-12 06:32:19
User: rodolfoap
Functions: grep sed xargs

I'm not interested in the images, but that's how I would do it.

function 4get () { curl $1 | grep -i "File<a href" | awk -F '<a href="' '{print $4}' | awk -F '" ' '{print $1}' | xargs wget; }
2010-12-11 09:01:32
User: gml
Functions: awk grep wget xargs

Useful for ripping wallpaper from 4chan.org/wg

wget -qO- "VURL" | grep -o "googleplayer.swf?videoUrl\\\x3d\(.\+\)\\\x26thumbnailUrl\\\x3dhttp" | grep -o "http.\+" | sed -e's/%\([0-9A-F][0-9A-F]\)/\\\\\x\1/g' | xargs echo -e | sed 's/.\{22\}$//g' | xargs wget -O OUTPUT_FILE
2010-12-03 17:27:08
Functions: echo grep sed wget xargs

Download google video with wget. Or, if you wish, pass the video URL to e.g. mplayer to view as a stream.

1. VURL: replace with the video URL, e.g. http://video.google.com/videoplay?docid=12312312312312313#

2. OUTPUT_FILE: optionally change to a better-suited name. This is the downloaded file, e.g. foo.flv

# Improvements greatly appreciated. (close to my first linux command after ls -A :) )

Breakdown, pipe by pipe:

1. wget: fetch the HTML from Google and pass it to stdout

2. grep: get the video URL up to thumbnailUrl (which is not needed)

3. grep: strip off everything before http://

4. sed: urldecode

5. echo: expand the hex escapes

6. sed: strip the trailing characters before thumbnailUrl

7. wget: download. Here one could use e.g. mplayer or another player instead...
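Steps 4 and 5 are the clever part: sed rewrites each %XX escape into a `\xXX` byte escape, which `echo -e` then expands. Here is that trick in isolation on a made-up string, with slightly simpler escaping (`xargs -d` and `echo -e` here are GNU extensions):

```shell
# sed turns each %XX into \xXX; xargs -d '\n' passes the line through
# without backslash mangling, and coreutils `echo -e` expands the bytes.
echo 'http%3A%2F%2Fexample.com%2Fvideo' \
    | sed 's/%\([0-9A-Fa-f][0-9A-Fa-f]\)/\\x\1/g' \
    | xargs -d '\n' echo -e
# prints: http://example.com/video
```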

tail -f /var/www/logs/domain.com.log | grep "POST /scripts/blog-post.php" | grep -v 192.168. | awk '{print $1}' | xargs -I{} iptables -I DDOS -s {} -j DROP
2010-11-30 06:22:18
User: tehusr
Functions: awk grep iptables tail xargs

Takes IPs from the web logs and pipes them to iptables; use the grep stage to whitelist addresses. Useful if a particular file is being requested by many different addresses.

Sure, the request is already down the pipe and your bandwidth may suffer, but that isn't the concern. This one-liner saved me from all the traffic hitting the server a second time; rename the target to something like blog-post-1.php so legitimate users can keep working while the botnet kills itself.

apt-cache search perl | grep module | awk '{print $1;}' | xargs sudo apt-get install -y

I used this to mass install a lot of perl stuff. Threw it together because I was feeling *especially* lazy. The 'perl' and the 'module' can be replaced with whatever you like.

dpkg -l | grep ^rc | awk '{print $2}' | xargs dpkg -P
find . -type f -print0 | xargs -0 perl -pi.save -e 'tr/A-Z/a-z/'
2010-11-25 13:55:34
User: depesz
Functions: find perl xargs
Tags: perl find regex

In this way it doesn't have problems with filenames with spaces.

tail -f file |xargs -IX printf "$(date -u)\t%s\n" X
echo "10 i 2 o $(date +"%H%M"|cut -b 1,2,3,4 --output-delimiter=' ') f"|dc|tac|xargs printf "%04d\n"|tr "01" ".*"
2010-11-24 23:49:21
User: unefunge
Functions: echo printf tr xargs

displays current time in "binary clock" format

(loosely) inspired by: http://www.thinkgeek.com/homeoffice/lights/59e0/



.... - 1st hour digit: 0

*..* - 2nd hour digit: 9 (8+1)

.*.. - 1st minutes digit: 4

*..* - 2nd minutes digit: 9 (8+1)

Prompt-command version:

PROMPT_COMMAND='echo "10 i 2 o $(date +"%H%M"|cut -b 1,2,3,4 --output-delimiter=" ") f"|dc|tac|xargs printf "%04d\n"|tr "01" ".*"'
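The same rendering can be sketched without dc, using plain shell arithmetic; `bin4` is a helper name invented for this demo, not part of the original one-liner:

```shell
# Render one decimal digit as a 4-bit ".*" row, top bit first,
# mirroring what the dc/printf/tr pipeline produces per line.
bin4() {
    row=""
    for bit in 8 4 2 1; do
        if [ $(( $1 & bit )) -ne 0 ]; then row="$row*"; else row="$row."; fi
    done
    printf '%s\n' "$row"
}
for digit in 0 9 4 9; do bin4 "$digit"; done   # 09:49, as in the walkthrough
```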

PROMPT_COMMAND='seq $COLUMNS | xargs -IX printf "%Xs\r" @'
dpkg -l | grep ^rc | cut -d' ' -f3 | xargs dpkg -P
find <src-path-to-search> -name "<folder-name>" | xargs -i cp -avfr --parent {} /<dest-path-to-copy>
2010-11-22 10:58:42
User: crxz0193
Functions: cp find xargs

This command will copy a particular folder-name, recursively found under the src-path-to-search, to the dest-path-to-copy, retaining the folder structure.

find . -type d -print0 | (cd $DESTDIR; xargs -0 mkdir)
2010-11-18 09:33:51
User: rocketraman
Functions: cd find xargs

Here is how to replicate the directory structure in the current directory to a destination directory (given by the variable DESTDIR), without copying the files.
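A variant with `mkdir -p`, which tolerates the leading "." entry and any parent ordering; SRC and DESTDIR here are throwaway demo directories, not part of the original:

```shell
# Replicate only the directory skeleton of SRC into DESTDIR.
SRC=$(mktemp -d); DESTDIR=$(mktemp -d)
mkdir -p "$SRC/a/b" "$SRC/c d"    # sample tree, one name with a space
( cd "$SRC" && find . -type d -print0 ) | ( cd "$DESTDIR" && xargs -0 mkdir -p )
```

The -print0/-0 pairing is what keeps "c d" from being split into two bogus directories.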

find /home/ -type f -exec du {} \; 2>/dev/null | sort -n | tail -n 10 | xargs -n 1 du -h 2>/dev/null
2010-11-10 07:24:17
User: mxc
Functions: du find sort tail xargs
Tags: disk usage

This combines the above two commands into one. Note that you can leave off the last two commands and simply run the command as

"find /home/ -type f -exec du {} \; 2>/dev/null | sort -n | tail -n 10"

The last two commands above just convert the output into human readable format.

find / -type f 2>/dev/null | xargs du 2>/dev/null | sort -n | tail -n 10 | cut -f 2 | xargs -n 1 du -h
2010-11-09 13:45:11
User: mxc
Functions: cut du find sort tail xargs
Tags: disk usage

Often you need to find the files that are taking up the most disk space in order to free up space asap. This script can be run on the entire filesystem as root or on a home directory to find the largest files.
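With GNU coreutils, `sort -h` understands `du -h`'s human-readable units, so a single du pass can be ranked directly; a sketch on a throwaway directory (not the author's command):

```shell
# Two files of known size, then one du pass sorted human-readably.
dir=$(mktemp -d)
head -c 300K /dev/zero > "$dir/big"
head -c 10K  /dev/zero > "$dir/small"
du -ah "$dir" | sort -rh | head -n 10
```

This avoids the second `du -h` pass over the top-10 list entirely.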

ps -u $USER -lf | grep -vE "\-bash|sshd|ps|grep|PPID" > .tmpkill; if (( $(cat .tmpkill | wc -l) > 0 )); then echo "# KILL EM ALL"; cat .tmpkill; cat .tmpkill | awk '{print $4}' | xargs kill -9; else echo "# NOTHING TO KILL"; fi; cat .tmpkill; rm .tmpkill;
2010-11-04 04:16:50
User: zsugiart
Functions: awk cat echo grep kill ps rm wc xargs

Kills all processes that belong to the user that runs it, excluding bash and sshd (so the putty/ssh session will be spared). The bit that says grep -vE "..." can be extended to include ps line patterns that you want to spare.

If no process can be found on the hitlist, it will print # NOTHING TO KILL. Otherwise, it will print # KILL EM ALL, with the cull list.

xrandr | sed -n 's/ connected.*//p' | xargs -n1 -tri xrandr --output {} --brightness 0.7 --gamma 2:3:4

[UPDATE: Now works for multiple connected outputs]

I woke up around midnight with an urge to do some late night hacking, but I didn't want a bright monitor screwing up my body's circadian rhythm. I've heard that at night blue (short wavelength) lights are particularly bad for your diurnal clock. That may be a bunch of hooey, but it is true that redder (longer wavelength) colors are easier on my eyes at night.

This command makes the screen dimmer and adjusts the gamma curves to improve contrast, particularly darkening blues and greens (Rɣ=2, Gɣ=3, Bɣ=4). To reset your screen to normal, you can run this command:

xrandr | sed -n 's/ connected.*//p' | xargs -n1 -tri xrandr --output {} --brightness 1 --gamma 1:1:1

or, more briefly,

xgamma -g 1

Note: The sed part is fragile and wrong. I'm doing it this way because of a misfeature in xrandr(1), which requires an output be specified but has no programmatic way of querying available outputs. Someone needs to patch up xrandr to be shell script friendly or at least add virtual outputs named "PRIMARY" and "ALL".
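A somewhat less fragile parse keys on the whole second field rather than a substring; the sample text below stands in for live `xrandr --query` output, so this runs without an X display:

```shell
# Field 2 must be exactly "connected"; awk does the splitting for us.
sample='eDP-1 connected primary 1920x1080+0+0 0mm x 0mm
HDMI-1 disconnected (normal left inverted right x axis y axis)'
printf '%s\n' "$sample" | awk '$2 == "connected" { print $1 }'
# prints: eDP-1
```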


Todo: Screen should dim (gradually) at sunset and brighten at sunrise. I think this could be done with a self-resubmitting at job, but I'm running into the commandlinefu 127 character limit just getting the sunrise time:

wget http://aa.usno.navy.mil/cgi-bin/aa_pap.pl --post-data=$(date "+xxy=%Y&xxm=%m&xxd=%d")"&st=WA&place=Seattle" -q -O- | sed -rn 's/\W*Sunrise\W*(.*)/\1/p'

I hope some clever hacker comes up with a command line interface to Google's "OneBox", since the correct time shows up as the first hit when googling for "sunrise:cityname".


[Thank you to @flatcap for the sed improvement, which is much better than the head|tail|cut silliness I had before. And thank you to @braunmagrin for pointing out that the "connected" output may not be on the second line.]