What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using xargs - 598 results
find . -iname '*jpg' -print0 | xargs -0 exiftool -warning; find . -iname '*jpg' -print0 | xargs -0 jpeginfo -c
2013-01-28 16:44:19
Functions: find xargs
0

This checks JPEG data and metadata. The output should be grepped as needed, maybe with a -B1 Warning for the first part and a -E "WARNING|ERROR" for the second.
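
For example, filtered roughly as the description suggests (the grep patterns are just a starting point):

find . -iname '*jpg' -print0 | xargs -0 exiftool -warning | grep -B1 Warning

find . -iname '*jpg' -print0 | xargs -0 jpeginfo -c | grep -E "WARNING|ERROR"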

for i in `pfiles pid|grep S_IFREG|awk '{print $5}'|awk -F":" '{print $2}'`; do find / -inum $i |xargs ls -lah; done
2013-01-24 13:57:19
User: giorger
Functions: awk find grep ls xargs
0

Executing pfiles will return a list of all descriptors utilized by the process.

We are interested in the S_IFREG entries, since they usually point to regular files.

Each such line contains the inode number of the file, which we use to find the filename.

The only drawback is that, to avoid searching from /, you need some idea of where the file might be.

Improvements more than welcome.

lsof was not available in my case
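
A sketch of a narrowed run, assuming the process ID is 1234 and you suspect the files live under /var; -xdev keeps find on a single filesystem, which fits since inode numbers are only unique per filesystem:

for i in `pfiles 1234|grep S_IFREG|awk '{print $5}'|awk -F":" '{print $2}'`; do find /var -xdev -inum $i |xargs ls -lah; done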

find-duplicates () { find "$@" -not -empty -type f -printf "%s\0" | sort -rnz | uniq -dz | xargs -0 -I{} -n1 find "$@" -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate; }
2013-01-23 23:20:26
User: mpeschke
Functions: find md5sum sort uniq xargs
-1

This is a modified version of the OP, wrapped into a bash function.

This version handles newlines and other whitespace correctly; the original has problems with the (thankfully rare) case of newlines in the file names.

It also allows checking an arbitrary number of directories against each other, which is nice when the directories that you think might have duplicates don't have a convenient common ancestor directory.
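
A typical invocation, with two hypothetical directories you suspect overlap:

find-duplicates ~/Pictures /mnt/backup/Pictures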

largest() { dir=${1:-"./"}; count=${2:-"10"}; echo "Getting top $count largest files in $dir"; du -sx "$dir/"* | sort -nk 1 | tail -n $count | cut -f2 | xargs -I file du -shx file; }
2013-01-21 09:45:21
User: jhyland87
Functions: cut du echo file sort tail xargs
1

You can simply run "largest" to list the top 10 files/directories in ./, or pass two parameters: the first is the directory, the second the number of entries to display.

Best to put this in your .bashrc or .bash_profile file.
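
For example (the /var/log path is only an illustration), the first call lists the 10 largest entries under the current directory and the second the 20 largest under /var/log:

largest

largest /var/log 20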

ps -xaw -o state,ppid | grep Z | grep -v PID | awk '{ print $2 }' | xargs kill -9
2013-01-09 04:21:54
User: terrywang
Functions: awk grep kill ps xargs
-4

Did some research and found the previous command was wrong: we don't kill a zombie but its parent. Just made some modifications to khashmeshab's command.
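
To see which parent owns each zombie before sending SIGKILL, a quick check in the same style (exact columns and option handling vary between ps implementations):

ps -xaw -o state,pid,ppid,comm | grep '^Z'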

find -type f | xargs file | grep ".*: .* text" | sed "s;\(.*\): .* text.*;\1;"
heroku manager:apps --org org-name | xargs -I {} heroku apps:delete {} --confirm {}
exipick -zi | xargs --max-args=1000 exim -Mrm
2012-12-12 20:46:22
User: jasen
Functions: xargs
Tags: bash awk exim
0

Do 1000 at a time so that, if your doodoo is deep, you can avoid the "command-line too big" error.
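
To gauge how deep the queue actually is before removing anything, you can count the matching message IDs first:

exipick -zi | wc -l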

find . | xargs perl -p -i.bak -e 's/oldString/newString/;'
2012-11-28 17:11:18
User: RedFox
Functions: find perl xargs
0

find . = sets up your recursive search. You can narrow the search to certain files by adding -name "*.ext", or limit it by using the same pattern with -prune, like -name "*.ext" -prune.

xargs = builds a command line from each file that find finds and invokes the next command, which is perl.

perl = invokes perl

-p sets up a while loop over the input

-i edits in place; the .bak suffix creates a backup file like filename.ext.bak

-e executes the following...

's/ / /;' is your basic substitute and replace.
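
Putting the pieces together with the -name narrowing mentioned above (the extension and strings are placeholders):

find . -name "*.txt" | xargs perl -p -i.bak -e 's/oldString/newString/;'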

grep -l <string-to-match> * | xargs grep -c <string-not-to-match> | grep '\:0'
find . \( -name \*.cgi -o -name \*.txt -o -name \*.htm -o -name \*.html -o -name \*.shtml \) -print | xargs grep -s pattern
find . -name "*" -print | xargs grep -s pattern
find / -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l
grep -rl string_to_find public_html/css/ | xargs -I '{}' vim +/string_to_find {} -c ":s/string_to_find/string_replaced"
2012-11-07 14:44:51
User: algol
Functions: grep vim xargs
-1

Open all files which contain some string, go directly to the first line where that string appears, and run a command on it.

Other examples:

Run vim only once with multiple files (and just go to string in the first one):

grep -rl string_to_find public_html/css/ | xargs vim +/string_to_find

Run vim for each file, go to string in every one and run command (to delete line):

grep -rl string_to_find public_html/css/ | xargs -I '{}' vim +/string_to_find {} -c ":delete"
ls /var/log/sa/sa[0-9]*|xargs -I '{}' sar -u -f {}|awk '/^[0-9]/&&!/^12:00:01|RESTART|CPU/{print "%user: "$4" %system: "$6" %iowait: "$7" %nice: "$5" %idle: "$9}'|sort -nk10|head
ls /var/log/sa/sa[0-9]*|xargs -I '{}' sar -q -f {}| awk '/Average/'|awk '{runq+=$2;plist+=$3}END{print "average runq-sz:",runq/NR; print "average plist-sz: "plist/NR}'
diff ../source-dir.orig/ ../source-dir.post/ | grep "Only in" | sed -e 's/^.*\:.\(\<.*\>\)/\1/g' | xargs rm -r
2012-10-17 14:12:32
User: bigc00p
Functions: diff grep rm sed xargs
0

Good for when you're working on building a clean source install for RPM packaging or what have you. After testing, run this command to compare the original extracted source to your working source directory; it will remove the files that get created when running './configure' and 'make'.
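
Since this deletes files, it may be worth running the pipeline once without the final xargs rm -r to review what would be removed:

diff ../source-dir.orig/ ../source-dir.post/ | grep "Only in" | sed -e 's/^.*\:.\(\<.*\>\)/\1/g'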

find /var/cache/apt -not -mtime -7 | sudo xargs rm
ls /dev/disk* | xargs -n 1 -t sudo zdb -l | grep GPTE_
2012-10-06 20:19:45
User: grahamperrin
Functions: grep ls sudo xargs
1

Show the UUID-based alternate device names of ZEVO-related partitions on Darwin/OS X. Adapted from the lines by dbrady at http://zevo.getgreenbytes.com/forum/viewtopic.php?p=700#p700 and following the disk device naming scheme at http://zevo.getgreenbytes.com/wiki/pmwiki.php?n=Site.DiskDeviceNames

find . -type d -maxdepth 1 | xargs du -sh
find site/ -type d | xargs sudo chmod 755
find ./ -type f | xargs sudo chmod 644
find /var/cache/pacman/pkg -not -mtime -7 | sudo xargs rm
2012-09-20 12:36:44
User: brejktru
Functions: find sudo xargs
1

Sometimes my /var/cache/pacman/pkg directory gets quite big. If that happens I run this command to remove old package files. Packages that were upgraded in the last 7 days are kept in case you are forced to downgrade a specific package. The command is obviously Arch Linux related.
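
To preview which package files would be removed, run the same find without the rm:

find /var/cache/pacman/pkg -not -mtime -7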

<cmd> | xargs -0 <cmd>
find . -type f -size -80k -print0|xargs -0 rm
2012-09-19 12:15:32
User: DeepThought
Functions: find xargs
0

Probably neither faster nor better than -delete in find. It's just that I generally dislike teaching find builtin actions.
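
The find-only equivalent the description alludes to (GNU and BSD find support -delete):

find . -type f -size -80k -delete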