Terminal - Commands tagged xargs - 123 results
svn st | grep -e '^M' | awk '{print $2}' | xargs svn revert
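
Reverts every file that svn st flags as modified (the leading M): awk extracts the path and xargs hands the list to svn revert. Note that paths containing spaces will trip up the awk field split.
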
lsof /dev/snd/pcm*p /dev/dsp | awk ' { print $2 }' | xargs kill
2010-07-23 20:24:16
User: alustenberg
Functions: awk xargs
Votes: 2

For when a program is hogging the sound output: this finds and kills it. Add -9 to the end for wedged processes. Add 'grep ^program' after lsof to filter for a specific program.
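
A concrete filtered variant, with mplayer standing in for whatever program is actually holding the device:

lsof /dev/snd/pcm*p /dev/dsp | grep ^mplayer | awk '{print $2}' | xargs kill -9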

grep -R --include=*.cpp --include=*.h --exclude=*.inl.h "string" .
2010-07-14 16:32:28
User: sweinst
Functions: grep
Tags: find xargs grep
Votes: 0

GNU grep allows you to restrict a recursive search to files matching a given pattern, and likewise to exclude files.

find . -name '*.?pp' -exec grep -H "string" {} \;
find . -name '*.?pp' | xargs grep -H "string"
2010-07-14 14:41:07
User: cout
Functions: find grep xargs
Tags: find xargs grep
Votes: 2

I like this better than some of the alternatives using -exec, because if I want to change the string, it's right there at the end of the command line. That means less editing effort and more time to drink coffee.
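
For the record, find can also batch arguments itself using the POSIX + terminator, which avoids spawning one grep per file while keeping everything in a single command:

find . -name '*.?pp' -exec grep -H "string" {} +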

find ~/.thunderbird/*.default/ -name '*.msf' | sed 's/ /\\ /g' | xargs rm
2010-06-04 12:35:24
User: allrightname
Functions: find rm sed xargs
Votes: -1

The Thunderbird message index files sometimes become corrupt, causing random failures, failed compaction, and general sluggishness. Removing them makes Thunderbird rebuild the indexes, which makes things quick again.

find -type f -print0 | xargs -r0 stat -c %y\ %n | sort
2010-05-29 13:40:18
User: dooblem
Functions: find stat xargs
Votes: 2

Works with files containing spaces and for very large directories.
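
Since the listing sorts oldest-first, appending a tail shows only the most recently modified file:

find -type f -print0 | xargs -r0 stat -c %y\ %n | sort | tail -n 1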

killall -9 rogue-process
pgrep rogue-process | xargs sudo kill -9
2010-05-09 22:30:05
User: mheadd
Functions: kill sudo xargs
Tags: xargs pgrep
Votes: -3

Find and kill multiple instances of a process with one simple command.
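
The same effect with no pipeline at all, assuming a procps-style pkill is available:

sudo pkill -9 rogue-process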

find . -name 'pattern' | xargs du -hc
find * \( -name "*.[hc]pp" -or -name "*.py" -or -name "*.i" \) -print0 | xargs -0 wc -l | tail -n 1
2010-03-25 18:58:29
User: neologism
Functions: find tail wc xargs
Tags: find xargs wc
Votes: 1

Finds all C++, Python and SWIG files in the present directory (uses "*" rather than "." to exclude invisibles) and counts the lines in them. Returns only the last line (the total).
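
One caveat: if the file list exceeds the system's argument-length limit, xargs runs wc more than once and tail -n 1 reports only the final batch's total. Piping the file contents instead sidesteps that:

find * \( -name "*.[hc]pp" -or -name "*.py" -or -name "*.i" \) -print0 | xargs -0 cat | wc -l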

echo $(( `ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l` ))
2010-03-12 08:42:49
User: AskApache
Functions: echo wc
Votes: 1

There is a limit to how many processes each user can run at the same time, especially with web hosts. If the maximum number of processes for your user is 200, then the following sets OPTIMUM_P to 100.

OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

This is very useful in scripts because it is such a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for whichever user. The number of currently running processes is subtracted from the high limit set up for the account (see limits.conf, pam, initscript).

An easy-to-understand example: this searches the current directory for shell scripts, and runs up to 100 'file' commands at the same time, greatly speeding up the command.

find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and want to download all of them quickly with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:

cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jailed shells with a 20-process limit, some with 200, some with 8000, so for the jailed shells a hard-coded xargs -P10 would kill my shell or dump core. Using the above I can set the -P value dynamically, so xargs always works, like this:

cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

If you were building a process-killer (very common for cheap hosting) this would also be handy.

Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.

ls -d */* | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs mv -t $(pwd)
2010-03-01 23:43:26
User: leovailati
Functions: ls mv sed xargs
Votes: -1

You WILL have problems if any of the files share the same name.

Use cases: consolidate music library and unify photos (especially if your camera separates images by dates).

After running the command and verifying that there were no name clashes, you can use

ls -d */ | sed -e 's/^/\"/g' -e 's/$/\"/g' | xargs rm -r

to remove the now-empty subdirectories.
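
A sketch of a sturdier variant, assuming GNU find and mv and that only regular files need moving: NUL-delimited names survive any special character, and numbered backups prevent silent clobbering when names clash:

find . -mindepth 2 -maxdepth 2 -type f -print0 | xargs -0 mv --backup=numbered -t .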

ls . | xargs file | grep text | sed "s/\(.*\):.*/\1/" | xargs gedit
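
The above opens every plain-text file in the current directory in gedit, but falls apart on names containing spaces. A sturdier rendering under GNU userland (still assuming no ':' or newlines in file names) might look like:

find . -maxdepth 1 -type f -exec file {} + | grep text | cut -d: -f1 | xargs -d '\n' gedit
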
find . -type f | parallel -j+0 grep -i foobar
2010-01-30 02:08:46
Functions: find grep
Votes: 3

Parallel does not suffer from the risk of mixing output from different jobs that xargs suffers from. -j+0 will run as many jobs in parallel as you have cores.

With parallel you only need -0 (and -print0) if your filenames contain a '\n'.

Parallel is from https://savannah.nongnu.org/projects/parallel/

du -s * | sort -nr | head | cut -f2 | parallel -k du -sh
2010-01-28 12:59:14
Functions: cut du head sort
Tags: du xargs parallel
Votes: -2

If a directory name contains spaces, xargs will do the wrong thing. Parallel https://savannah.nongnu.org/projects/parallel/ deals better with that.
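
If parallel is not installed, GNU xargs can be told to split on newlines only, which copes with embedded spaces (though not with newlines in names):

du -s * | sort -nr | head | cut -f2 | xargs -d '\n' du -sh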

tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5 -type f` 2> /dev/null | parallel -X rm -f
2010-01-28 12:41:41
Functions: rm tar
Votes: -3

This deals nicely with files that have special characters in the name (space, ' or ").

Parallel is from https://savannah.nongnu.org/projects/parallel/

ls -t1 | sed 1d | parallel -X rm
2010-01-28 12:28:18
Functions: ls sed
Votes: -1

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
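
For completeness, plain GNU xargs is also safe here once the names are NUL-delimited, for example:

printf '%s\0' not* | xargs -0 rm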

svn st | cut -c 9- | parallel -X tar -czvf ../backup.tgz
2010-01-28 11:43:16
Functions: cut tar
Votes: -2

xargs deals badly with special characters (such as space, ' and "); in this case the problem appears if you have a file called '12" record'.

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

Both solutions behave badly if the file list exceeds the maximum command-line length: tar is then invoked more than once, and each later invocation overwrites the archive.

svn status |grep '\?' |awk '{print $2}'| parallel -Xj1 svn add
2010-01-28 08:47:54
Functions: awk grep
Tags: xargs parallel
Votes: -2

xargs deals badly with special characters (such as space, ' and "); see the important_file demonstration above. Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

grep -rl oldstring . | parallel sed -i -e 's/oldstring/newstring/'
2010-01-28 08:44:16
Functions: grep sed
Votes: 3

xargs deals badly with special characters (such as space, ' and "); see the important_file demonstration above. Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
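
An xargs rendering that also survives special characters, assuming GNU grep, whose -Z option NUL-terminates each file name:

grep -rlZ oldstring . | xargs -0 sed -i -e 's/oldstring/newstring/'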

find -not -empty -type f -printf "%s\n" | sort | uniq -d | parallel find -type f -size {}c | parallel md5sum | sort | uniq -w32 --all-repeated=separate
2010-01-28 08:40:18
Functions: find md5sum sort uniq
Tags: xargs parallel
Votes: -1

A bit shorter and parallelized. Depending on the speed of your CPU and your disk, this may run faster.

Parallel is from https://savannah.nongnu.org/projects/parallel/

tar -tf <file.tar.gz> | parallel rm
2010-01-28 08:28:16
Functions: tar
Votes: -2

xargs deals badly with special characters (such as space, ' and "); see the important_file demonstration above. Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

find . -iname "*.jpg" -print0 | tr '[A-Z]' '[a-z]' | xargs -0 cp --backup=numbered -dp -u --target-directory {location} &
2009-12-10 08:47:04
User: oracular
Functions: cp find tr xargs
Votes: 4

Use if you have pictures all over the place and you want to copy them to a central location

Synopsis:

Find jpg files (find -iname)

Translate all file names to lowercase (tr)

Back up existing files instead of overwriting, and only copy when the source is newer (--backup=numbered, -u); preserve mode, ownership and timestamps (-p)

Copy to a central location (--target-directory)

awk '{print $1}' "/proc/modules" | xargs modinfo | awk '/^(filename|desc|depends)/'
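
For every loaded kernel module listed in /proc/modules, this prints the filename, description and depends lines from modinfo's output.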