
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using xargs - 612 results
\ls -1 | xargs -l readlink
2009-11-23 19:28:03
User: TeacherTiger
Functions: xargs
Tags: readlink
2

For those who don't have the symlinks command, you can use readlink. This command is not straightforward because readlink is very picky. The backslash in front of 'ls' means not to use an alias (e.g. colour escape codes from an aliased 'ls' could confuse readlink), and -1 (one) prints one entry per line. xargs -l (lowercase L) runs a separate readlink command for each input line.
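
If your find supports it, a rough equivalent that sidesteps ls aliases entirely (a sketch, not part of the original post); it resolves only the symlinks in the current directory:

find . -maxdepth 1 -type l -exec readlink {} \;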

du | sort -nr | cut -f2- | xargs du -hs
find <dir> -printf '%p : %A@\n' | awk -F' : ' '$2 < <time in epoch> {print $1}' | xargs rm --verbose -fr ;
2009-11-20 16:31:58
User: angleto
Functions: awk find rm xargs
-2

Remove files with an access time older than a given date.

If you want to remove files based on modification time instead, replace %A@ with %T@. %C@ gives the last status change (ctime).

The time is expressed in epoch seconds, but any other ordered format is easy to use.
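
As a sketch of how one might fill in the epoch placeholder (assumes GNU date; /some/dir is just an example path), deleting files not accessed in the last 30 days:

CUTOFF=$(date -d '30 days ago' +%s); find /some/dir -printf '%p : %A@\n' | awk -F' : ' -v t="$CUTOFF" '$2 < t {print $1}' | xargs rm --verbose -fr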

awk '{print $1}' "/proc/modules" | xargs modinfo | awk '/^(filename|desc|depends)/'
for item in *;do echo -n "$item - ";find "$item" -type f -print0 | xargs -0 file -iNf - | grep video | cut -d: -f1 | xargs -d'\n' /usr/share/doc/mplayer/examples/midentify | grep ID_LENGTH | awk -F= '{sum+=$2} END {print(sum/60)}'; done | grep -v ' - 0$'
2009-11-19 06:28:15
User: jnash
Functions: awk cut echo file grep xargs
0

I know this has been beaten to death but finding video files using mime types and printing the "hours of video" for each directory is (IMHO) easier to parse than just a single total. Output is in minutes.

Among other niceties, it omits printing non-video files/folders.

PS: Barely managed to fit it within the 255 character limit :D

find ./ -type f -print0 | xargs -0 file -iNf - | grep video | cut -d: -f1
2009-11-19 06:05:36
User: jnash
Functions: cut file find grep xargs
0

Uses mime-type of files rather than relying on file extensions to find files of a certain type.

This can obviously be extended to finding files of any other type as well, e.g. plain text, audio, etc.

In reference to displaying the total hours of video (which was posted earlier on commandlinefu, but relied on the user supplying all possible video file formats), we can now do better:

find ./ -type f -print0 | xargs -0 file -iNf - | grep video | cut -d: -f1 | xargs -d'\n' /usr/share/doc/mplayer/examples/midentify | grep ID_LENGTH | awk -F "=" '{sum += $2} END {print sum/60/60; print "hours"}'
getent passwd|cut -d: -f1|xargs -n1 passwd -e
2009-11-18 19:46:15
User: romulusnr
Functions: cut getent passwd xargs
0

Alternatively, for those without getent, or who only want to work on local users, it's even easier:

cut -d: -f1 /etc/passwd|xargs -n1 passwd -e

Note that not all implementations of passwd support -e. On RH it would be passwd -x0 (?) and on Solaris it would be passwd -f.
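
A sketch that limits the expiry to regular accounts, assuming Linux-style UIDs starting at 1000 and a passwd that supports -e:

getent passwd | awk -F: '$3 >= 1000 && $3 < 65534 {print $1}' | xargs -n1 passwd -e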

lsmod | sed -e '1d' -e 's/\(\([^ ]*\) \)\{1\}.*/\2/' | xargs modinfo | sed -e '/^dep/s/$/\n/g' -e '/^file/b' -e '/^desc/b' -e '/^dep/b' -e d
2009-11-17 22:51:08
User: marssi
Functions: lsmod modinfo sed xargs
1

Liked command 4077, so I improved it by doing all the text manipulation with sed.

"Run this as root, it will be helpful to quickly get information about the loaded kernel modules." THX mohan43u

lsmod | cut -d' ' -f1 | xargs modinfo | egrep '^file|^desc|^dep' | sed -e'/^dep/s/$/\n/g'
2009-11-17 02:13:34
User: mohan43u
2

Run this as root, it will be helpful to quickly get information about the loaded kernel modules.

(ls; mkdir subdir; echo subdir) | xargs mv
2009-11-08 11:40:55
User: mechmind
Functions: echo mkdir xargs
Tags: xargs pipes
4

With this form you don't need to cut out the target directory using grep/sed/etc.
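
If your mv supports -t (GNU coreutils), a sketch of the same idea that also copes with spaces in filenames; mv will simply complain that it cannot move subdir into itself and skip it:

mkdir subdir && mv -t subdir *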

mount | awk '/:/ { print $3 } ' | xargs sudo umount
find . -size 0 -print0 | xargs -0 rm
2009-10-29 14:10:02
User: osvaldofilho
Functions: find xargs
-1

Finds files with size zero and erases them.
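
With GNU find, a shorter sketch of the same cleanup, restricted to regular files:

find . -type f -empty -delete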

find . -name '*.java' | xargs -L 1 cpp -fpreprocessed | grep . | wc -l
2009-10-29 09:58:43
User: rbossy
Functions: cpp find grep wc xargs
2

I used Java just to keep the find command simple and to show that it works for any language supported by cpp.

cpp is the C/C++ preprocessor (it interprets macros, removes comments, inserts includes and resolves trigraphs). The -fpreprocessed option tells cpp to assume the input has already been preprocessed, so it only replaces comment lines with blank lines.

The -L 1 option tells xargs to launch one process per line; indeed, cpp can only process one file at a time...
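
For example, a sketch of the same count over C sources (the escaped parentheses group the name tests for find):

find . \( -name '*.c' -o -name '*.h' \) | xargs -L 1 cpp -fpreprocessed | grep . | wc -l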

git ls-files | xargs -n1 -d'\n' -i git-blame {} | perl -n -e '/\s\((.*?)\s[0-9]{4}/ && print "$1\n"' | sort -f | uniq -c -w3 | sort -r
2009-10-25 01:44:03
User: askedrelic
Functions: perl sort uniq xargs
Tags: statistics git
3

Figures out the total line contribution per author for an entire Git repo. Includes binary files, which kind of mess up the true count.

If it crashes or takes too long, mess with the ls-files options at the start:

git ls-files -x "*pdf" -x "*psd" -x "*tif" to remove really random binary files

git ls-files "*.py" "*.html" "*.css" to only include specific file types

Based off my original SVN version: http://www.commandlinefu.com/commands/view/2787/prints-total-line-count-contribution-per-user-for-an-svn-repository
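
A sketch of a similar per-author count using git blame's porcelain output, assuming a git recent enough to have dropped the dashed git-blame form:

git ls-files -z | xargs -0 -n1 git blame --line-porcelain | grep '^author ' | sort | uniq -c | sort -rn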

perl -e '$i=0;while($i<10){open(WGET,qq/|xargs lynx -dump/);printf WGET qq{http://www.google.com/search?q=site:g33kinfo.com&hl=en&start=$i&sa=N},$i+=10}'|grep '\/\/g33kinfo.com\/'
2009-10-16 12:20:17
User: op4
Functions: grep perl xargs
Tags: web browser
0

not my cmd... found on the web

seq 10 |xargs -n1 echo Printing line
2009-10-15 11:05:35
User: Waldirio
Functions: echo seq xargs
Tags: echo xargs seq
0

Nice command to create a list; you can do the same with a for loop, but this is faster.
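
For comparison, a sketch of the same output with a for loop:

for i in $(seq 10); do echo "Printing line $i"; done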

ls | xargs -n1 gzip
seq 4|xargs -n1 -i bash -c "echo -n 164.85.216.{} - ; nslookup 164.85.216.{} |grep name"|tr -s ' ' ' '|awk '{print $1" - "$5}'|sed 's/.$//'
dpkg --get-selections | cut -f1 | while read pkg; do dpkg -L $pkg | xargs -I'{}' bash -c 'if [ ! -d "{}" ]; then echo "{}"; fi' | tr '\n' '\000' | du -c --files0-from - | tail -1 | sed "s/total/$pkg/"; done
2009-10-12 14:57:54
User: pykler
Functions: bash cut du echo read sed tail tr xargs
Tags: Debian wajig
4

Calculates the size on disk for each package installed on the filesystem (or removed but not purged). This is missing the

| sort -rn

which would put the biggest packages on top. That was purposely left out as the command is slightly on the slow side.

Also, you may need to run this as root, since du can only check files you have permission to read ;)
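
If a quick ranking by the installed size recorded in the package database is enough (rather than measuring on disk), a sketch using dpkg-query (sizes are in KiB):

dpkg-query -Wf '${Installed-Size} ${Package}\n' | sort -rn | head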

find . -iname ".project"| xargs -I {} dirname {} | LC_ALL=C xargs -I {} svn info {} | grep "Last Changed Rev\|Path" | sed "s/Last Changed Rev: /;/" | sed "s/Path: //" | sed '$!N;s/\n//'
2009-10-07 16:13:27
User: hurz
Functions: dirname find grep info sed xargs
0

Searches for all .project files in the current folder and below and uses "svn info" to get the last changed revision. The last sed joins every two lines.

find /proc -user myuser -maxdepth 1 -type d -mtime +7 -exec basename {} \; | xargs kill -9
ls [FILENAME] | xargs openssl sha1
2009-10-03 02:05:43
User: m00dimus
Functions: ls xargs
1

Lists files and passes them to openssl to calculate the SHA-1 hash of each file.
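
A sketch that avoids parsing ls output and copes with odd filenames:

find . -maxdepth 1 -type f -exec openssl sha1 {} +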

sh -c 'S=askapache R=htaccess; find . -mount -type f|xargs -P5 -iFF grep -l -m1 "$S" FF|xargs -P5 -iFF sed -i -e "s%${S}%${R}%g" FF'
9

I needed a way to search all files in a web directory that contained a certain string, and replace that string with another string. In the example, I am searching for "askapache" and replacing that string with "htaccess". I wanted this to happen as a cron job, and it was important that this happened as fast as possible while at the same time not hogging the CPU since the machine is a server.

So this script uses the nice command to run the sh shell with the command, which makes the whole thing run with priority 19, meaning it won't hog CPU processing. And the -P5 option to the xargs command means it will run 5 separate grep and sed processes simultaneously, so this is much much faster than running a single grep or sed. You may want to do -P0 which is unlimited if you aren't worried about too many processes or if you don't have to deal with process killers in the bg.

Also, the -m1 option to grep means stop grepping this file for matches after the first match, which also saves time.
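
For reference, a sketch of the full invocation the description refers to, with the nice wrapper included (the snippet above omits it); askapache and htaccess are just the example search and replacement strings:

nice -n19 sh -c 'S=askapache R=htaccess; find . -mount -type f|xargs -P5 -iFF grep -l -m1 "$S" FF|xargs -P5 -iFF sed -i -e "s%${S}%${R}%g" FF'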

mysql -e 'show databases' | sed -n '2,$p' | xargs -I DB sh -c 'mysqldump DB > DB.sql'
2009-09-25 08:43:06
User: mislav
Functions: sed xargs
Tags: mysqldump
5

No need to loop when we have `xargs`. The sed command filters out the first line of `show databases` output, which is always "Database".

find -type f -name "*.avi" -print0 | xargs -0 mplayer -vo dummy -ao dummy -identify 2>/dev/null | perl -nle '/ID_LENGTH=([0-9\.]+)/ && ($t +=$1) && printf "%02d:%02d:%02d\n",$t/3600,$t/60%60,$t%60' | tail -n 1
2009-09-24 15:50:39
User: syssyphus
Functions: find perl printf tail xargs
8

Change the *.avi to whatever you want to match; you can remove it altogether if you want to check all files.