
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged xargs - 123 results
find . -type l | xargs file | grep broken
svn diff -r 1792:HEAD --summarize | awk '{if ($1 != "D") print $2}'| xargs -I {} tar rf incremental_release.tar {}
2011-04-05 15:00:49
User: windfold
Functions: awk diff tar xargs
Tags: bash svn awk xargs tar
0

The result of this command is a tar containing all files that have been modified or added between revision 1792 and HEAD. This is super useful for incremental releases.
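
The resulting archive can then be unpacked over a deployed checkout on the target host; a sketch (the destination path is just a placeholder):

tar xf incremental_release.tar -C /path/to/deployed/checkout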

find . -type d -name .svn -prune -o -type f -print0 | xargs -r0 ...
find . -type f ! -iwholename \*.svn\* -print0 [ | xargs -0 ]
2011-03-21 16:45:35
User: alustenberg
Functions: find xargs
1

For when find . -print | grep -v .svn | xargs doesn't cut it.
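
For example, filling in the xargs half to grep a working copy while skipping .svn directories and coping with odd filenames (the pattern TODO is only an illustration):

find . -type d -name .svn -prune -o -type f -print0 | xargs -r0 grep -l TODO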

find . -maxdepth 1 -type d | grep -Pv "^.$" | sort -rn --field-separator="-" | sed -n '3,$p' | xargs rm -rf
find . -name "*.java" -type f -perm +600 -print | xargs -I _ sh -c 'grep -q hexianmao _ && iconv -f gb2312 -t utf8 -o _ -c _ '
2011-03-08 13:02:25
User: Sunng
Functions: find iconv sh xargs
Tags: find xargs iconv
-1

One of my friends committed his code encoded as GB2312, which broke the build job. I had to find his files and convert them to UTF-8.

fdupes -r .
2011-02-19 17:02:30
User: Vilemirth
Tags: xargs parallel
15

If you have the fdupes command, you'll save a lot of typing. It can do recursive searches (-r,-R) and it allows you to interactively select which of the duplicate files found you wish to keep or delete.
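
If you trust the matches, fdupes can also delete automatically; a sketch (with -d -N it keeps the first file of each duplicate set and removes the rest without prompting, so preview the plain -r output first):

fdupes -r -d -N .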

find files -type f | xargs -n100 | while read l; do mkdir $((++f)); cp $l $f; done
2011-02-15 23:15:16
User: flatcap
Functions: cp mkdir read xargs
-2

Take a folder full of files and split it into smaller folders containing a maximum number of files. In this case, 100 files per directory.

find creates the list of files

xargs breaks up the list into groups of 100

for each group, create a directory and copy in the files

Note: This command won't work if there is whitespace in the filenames (but then again, neither do the alternative commands :-)
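
A whitespace-safe variant is possible by reading NUL-delimited names instead of piping through xargs; a sketch (100 files per directory, directories named 1, 2, 3, ...):

i=0; find files -type f -print0 | while IFS= read -r -d '' f; do d=$(( i/100 + 1 )); mkdir -p "$d"; cp "$f" "$d"; i=$((i+1)); done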

grep -l foo *cl*.log | xargs grep -lL bar
2011-01-10 20:18:30
User: dlebauer
Functions: grep xargs
Tags: xargs grep
0

same as

grep -lL "foo" $(grep -l bar *cl*.log)
grep -l bar *.log | xargs grep -l foo
2011-01-10 19:54:46
User: dlebauer
Functions: grep xargs
Tags: bash xargs grep
-1

Uses xargs to call the second grep with the first grep's results as arguments
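
Filenames containing spaces break the plain pipe; with GNU grep the -Z/-0 pair handles them (a sketch):

grep -lZ bar *.log | xargs -0 grep -l foo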

bargs() { while read i; do "$@" "$i"; done; }
find /name/of/dir/ -name '*.txt' | xargs grep 'text I am searching for'
2011-01-05 15:20:40
User: erickb
Functions: find grep xargs
Tags: find xargs grep
1

Recursively search a directory for a particular file type, then search each file for particular text.
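
The same idea with NUL separators, in case any of the .txt names contain spaces (a sketch):

find /name/of/dir/ -name '*.txt' -print0 | xargs -0 grep 'text I am searching for'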

find /deep/tree/ -type f -print0|xargs -0 -n1 -I{} ln -s '{}' .
2010-12-21 13:00:33
User: dinomite
Functions: find ln xargs
Tags: find xargs links
1

If you want to pull all of the files from a tree that has mixed files and directories containing files, this will link them all into a single directory. Beware of filesystem files-per-directory limits.

tail -f file |xargs -IX printf "$(date -u)\t%s\n" X
PROMPT_COMMAND='seq $COLUMNS | xargs -IX printf "%Xs\r" @'
find . -type f | while read line; do NEW_TS=`date -d@$((\`stat -c '%Y' $line\` + <seconds> )) '+%Y%m%d%H%M.%S'`; touch -t $NEW_TS ${line}; done
2010-11-18 14:03:32
User: angleto
Functions: find read touch
1

Increase the modification date for the files selected with the find command.
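
A sketch of the same idea that also copes with spaces in file names (3600 is just an example offset of one hour; substitute your own <seconds>):

find . -type f -print0 | while IFS= read -r -d '' f; do touch -t "$(date -d "@$(( $(stat -c %Y "$f") + 3600 ))" '+%Y%m%d%H%M.%S')" "$f"; done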

cd /proc&&ps a -opid=|xargs -I+ sh -c '[[ $PPID -ne + ]]&&echo -e "\n[+]"&&tr -s "\000" " "<+/cmdline&&echo&&tr -s "\000\033" "\nE"<+/environ|sort'
1

Grabs the cmdline used to execute the process, and the environment that the process is being run under. This is much different than the 'env' command, which only lists the environment for the shell. This is very useful (to me at least) to debug various processes on my server. For example, this lets me see the environment that my apache, mysqld, bind, and other server processes have.

Here's a function I use:

aa_ps_all () { ( cd /proc && command ps -A -opid= | xargs -I'{}' sh -c 'test $PPID -ne {}&&test -r {}/cmdline&&echo -e "\n[{}]"&&tr -s "\000" " "<{}/cmdline&&echo&&tr -s "\000\033" "\nE"<{}/environ|sort&&cat {}/limits' ); }

From my .bash_profile at http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html
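
To inspect a single process rather than all of them, the same /proc trick works directly; a sketch (apache2 is only an example process name):

tr '\0' '\n' < /proc/"$(pgrep -o apache2)"/environ | sort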

each() { (IFS=$'\n'; echo "$*") }
2010-10-02 06:51:44
User: BobbyTables
Functions: echo
Tags: xargs IFS
1

This can be useful for transforming command-line args into input for xargs (one per line). This can also be done with ls if the args are filenames, but that's getting awfully close to Useless Use of Cat territory (http://partmaps.org/era/unix/award.html).
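
A quick demonstration (the arguments are arbitrary); this prints each argument on its own line, ready to pipe into xargs or a while-read loop:

each one "two words" three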

xargsb() { while read -r cmd; do ${@//'{}'/$cmd}; done; }
2010-09-28 06:35:39
User: BobbyTables
Functions: read
3

Similar to xargs -i, but works with builtin bash commands (rather than running "bash -c ..." through xargs)
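
A possible usage sketch: feeding builtin names to the help builtin, which plain xargs cannot exec because help has no external binary:

printf '%s\n' pushd popd dirs | xargsb help {}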

find ./ ! -name 'excludepattern' | xargs -i cp --parents {} destdir
2010-09-27 21:36:50
User: starchox
Functions: cp find xargs
Tags: find xargs cp
3

Preserve the file structure when copying and exclude some file or directory patterns.
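
A NUL-delimited variant of the same idea, for file names containing whitespace (excludepattern and destdir are placeholders as above):

find ./ ! -name 'excludepattern' -print0 | xargs -0 -I{} cp --parents {} destdir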

find . -name "*noticia*" -name "*jhtm*" -name "*.tpl" -exec grep -li "id=\"col-direita\"" '{}' \; | xargs -n1 mate
for file in *.jpg; do convert "$file" -resize 800000@ -quality 80 "small.$file"; done
2010-09-13 19:06:14
User: grinob
Functions: file
Tags: xargs convert
8

Convert all JPEGs in the current directory into roughly 1024x768-pixel, ~150 KB JPEGs.
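
Since the entry is tagged xargs, an equivalent xargs form can also run several conversions in parallel; a sketch (assumes no whitespace in the names; -P4 means four jobs at a time):

ls *.jpg | xargs -P4 -I{} convert {} -resize 800000@ -quality 80 small.{}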

ls *.JPG | cut -d . -f 1 | xargs -L1 -i convert -resize 684 {}.JPG {}.jpg
grep -ZlRr -e BAD_SCRIPT_LINE * |xargs -0 sed -i 's/BAD_SCRIPT_LINE//g'
2010-08-30 22:12:57
User: homoludens
Functions: grep sed xargs
0

Recursive find and replace. The important parts are grep -Z and xargs -0, which add a zero byte after each file name so sed can work even with file names containing spaces.

xargs -n1 -P100 -I{} sh -c 'ssh {} uptime >output/{} 2>error/{}' <hostlist
2010-08-20 11:03:11
User: dooblem
Functions: sh uptime xargs
3

Do the same as pssh, just in shell syntax.

Put your hosts in hostlist, one per line.

Command outputs are gathered in output and error directories.
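
The output and error directories need to exist before the run, so a fuller sketch might be:

mkdir -p output error && xargs -n1 -P100 -I{} sh -c 'ssh {} uptime >output/{} 2>error/{}' <hostlist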