What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using xargs - 625 results
find . -name '*pdf*' -print0 | xargs -0 ls -lt | head -20
2013-10-03 21:58:51
User: fuats
Functions: find head ls xargs

Lists the 20 most recently modified files matching '*pdf*', looking in the current directory and all subdirectories.

wget -q -O- http://example-podcast-feed.com/rss | grep -o "<enclosure[ -~][^>]*" | grep -o "http://[ -~][^\"]*" | xargs wget -c
2013-09-24 12:38:08
User: talha131
Functions: grep wget xargs

This script can be used to download the enclosed files from an RSS feed. For example, it can be used to download the mp3 files from a podcast's RSS feed.
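Note that the second grep only matches http:// URLs, so enclosures served over https would be skipped. A hedged variant, written as a small function so the extraction step can be tested on its own (the feed URL below is a placeholder, not a real podcast):

```shell
# extract_enclosures: read RSS on stdin, print one enclosure URL per line.
# Matches both http and https URLs inside <enclosure ...> tags.
extract_enclosures() {
  grep -o '<enclosure[^>]*' | grep -o 'https\?://[^"]*'
}

# Usage (feed URL is a placeholder):
# wget -q -O- http://example-podcast-feed.com/rss | extract_enclosures | xargs wget -c
```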

echo '#! /usr/bin/ksh\ncat $2 | openssl dgst -sha256 | read hashish; if [[ $hashish = $1 ]]; then echo $2: OK; else echo $2: FAILED; fi;' > shacheck; chmod +x shacheck; cat hashishes.sha256 | xargs -n 2 ./shacheck;
2013-09-18 21:51:20
User: RAKK
Functions: cat chmod echo read xargs

This command is used to verify a sha256sum-formatted file hash list on IBM AIX or any other UNIX-like OS that has openssl but doesn't have sha256sum by default. Steps:

1: Save to the filesystem a script that:

A: Receives as arguments the two parts of one line of a sha256sum listing

B: Feeds a file into openssl on SHA256 standard input hash calculation mode, and saves the result

C: Compares the calculated hash against the one received as argument

D: Outputs the result in a sha256sum-like format

2: Make the script runnable

3: Feed the sha256sum listing to xargs, running the aforementioned script and passing 2 arguments at a time
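Following those steps, the generated script's logic can be sketched as a shell function (a hypothetical stand-in for the shacheck script above, using command substitution instead of ksh's pipe-into-read):

```shell
# shacheck sketch: $1 = expected hash, $2 = file name,
# exactly as delivered by `xargs -n 2`.
shacheck() {
  expected=$1
  file=$2
  # openssl prints "(stdin)= <hash>"; keep only the last field.
  actual=$(openssl dgst -sha256 < "$file" | awk '{print $NF}')
  if [ "$actual" = "$expected" ]; then
    echo "$file: OK"
  else
    echo "$file: FAILED"
  fi
}

# Saved to a file with a "#!/bin/sh" line and made executable, it is
# driven the same way: cat hashishes.sha256 | xargs -n 2 ./shacheck
```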

ipcs -q | grep foo | awk '{print $2}' | xargs -I ipcid ipcrm -q ipcid
db=example.mdb; backend=mysql; mdb-schema "$db" $backend | grep -v 'COMMENT ON COLUMN' && mdb-tables -1 "$db" | xargs -I {} mdb-export -I $backend "$db" {}
2013-08-25 18:39:27
User: garex
Functions: grep xargs

schema & tables

To export in another backend's style, just change $backend to one of: access, sybase, oracle, postgres, mysql or sqlite.

brew cleanup -n | awk '{print $3}' | xargs du -s | awk '{s+=$1} END {print s}'
ipcs -s | grep apache | awk ' { print $2 } ' | xargs ipcrm sem
2013-08-12 16:29:32
User: kernel01
Functions: awk grep ipcrm ipcs xargs

Solves these pesky errors you see in the Apache log:

[Fri Jun 28 17:51:00 2013] [emerg] (28)No space left on device: Couldn't create accept lock (/monsoon/opt/apache2/logs/accept.lock.356) (5)

Naturally, this can be used to get rid of other semaphores too. Note: change the apache user according to your environment.

pbpaste | xargs wget
2013-08-11 23:12:10
User: loopkid
Functions: xargs

On Linux substitute pbpaste with `xsel --clipboard --output` or `xclip -selection clipboard -o` (untested)

find . -name "*.h" -o -name "*.cpp" | xargs sed -i 's/addVertexes/addVertices/g'
find . -type f -print0 | xargs -0 stat -c'%Y :%y %12s %n' | sort -nr | cut -d: -f2- | head
2013-08-03 09:53:46
User: HerbCSO
Functions: cut find sort stat xargs

Goes through all files in the directory specified, uses `stat` to print out last modification time, then sorts numerically in reverse, then uses cut to remove the modified epoch timestamp and finally head to only output the last 10 modified files.

Note that on a Mac `stat` won't work like this, you'll need to use either:

find . -type f -print0 | xargs -0 stat -f '%m%t%Sm %12z %N' | sort -nr | cut -f2- | head

or alternatively do a `brew install coreutils` and then replace `stat` with `gstat` in the original command.

ls | xargs -I{} du -sh {}
curl -s $1 | grep -o -i '<a href="//images.4chan.org/[^>]*>' | sed -r 's%.*"//([^"]*)".*%\1%' | xargs wget
2013-07-22 10:33:55
User: bugmenot
Functions: grep xargs

First grep extracts all the image hrefs, then sed strips the markup down to the bare URL, then wget downloads each image.

find ./ -type f -print0 | xargs -0 md5sum
find ./ -type f | sed "s:[\ \',\"]:\\\&:g" | xargs md5sum
2013-07-17 18:54:14
User: crazedsanity
Functions: find sed xargs

Recursively list all files in the current directory and get their md5sums, even if a filename contains awkward characters.
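A quick way to convince yourself that the null-delimited variant above survives awkward names (a throwaway demo; the filename is invented for illustration):

```shell
# Make a file whose name contains a space and an apostrophe, then hash it
# through the null-delimited pipeline.
demo=$(mktemp -d)
printf 'data\n' > "$demo/weird name's.txt"
find "$demo" -type f -print0 | xargs -0 md5sum
```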

ssh user@remotehost "find basedir -type d" | xargs -I {} -t mkdir -p {}
2013-07-17 07:14:32
User: neomefistox
Functions: mkdir ssh xargs

The directories are created on the local host with the same structure as those below the remote base directory, including 'basedir' itself in case it does not exist.

You must replace user and remotehost (or IP address) with your own values.

ssh will prompt for the user's password on remotehost unless you have set up key-based authentication (your public key listed in the remote ~/.ssh/authorized_keys file).

qdbus | grep kscreenlocker_greet | xargs -I {} qdbus {} /MainApplication quit
2013-07-11 10:50:03
User: Murz
Functions: grep xargs

Unlocks a locked KDE screen-saver session under the lightdm display manager, as used in Kubuntu 12.10+.

find . | sort | awk 'NR%2==0' | xargs rm $1
2013-07-11 07:36:18
User: sucotronic
Functions: awk find rm sort xargs

If you have a directory with a lot of backups (full backups, that is), once it reaches a certain size you may want to free some space. This command removes every second file. It assumes that your backup file names start with YYYYMMDD, or at least sort into chronological order alphabetically.
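Because this deletes files outright, it is worth previewing the victims first: prefixing rm with echo turns the pipeline into a dry run (a sketch to run from inside the backup directory):

```shell
# Dry run: print which files would be removed (every second line of the
# sorted listing) instead of deleting them.
find . | sort | awk 'NR%2==0' | xargs echo rm
```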

find . -name ".DS_Store" -print0 | xargs -0 rm -rf
diff <(cd A; find -type f|xargs md5sum ) <(cd B; find -type f | xargs md5sum )
2013-07-02 18:02:05
User: glaudiston
Functions: cd diff find md5sum xargs

This is useful for diffing two paths, e.g. branches of the same software or different versions of the same zip file, so you get the real file-level differences.

find . -empty -type d -print0 | xargs -0 rmdir -p
2013-07-01 02:44:57
User: rafar
Functions: find rmdir xargs

It starts in the current working directory.

It removes the empty directory and its ancestors (unless the ancestor contains other elements than the empty directory itself).

It will print a failure message for every directory that isn't empty.

This command handles correctly directory names containing single or double quotes, spaces or newlines.

If you only want to remove the empty directories themselves, not their ancestors, just use:

find . -empty -type d -print0 | xargs -0 rmdir
find . -type d -print0 | xargs -0 du -s | sort -n | tail -10 | cut -f2 | xargs -I{} du -sh {} | sort -rn
find . -name '*.jpg' -o -name '*.JPG' -print0 | xargs -0 mogrify -resize 1024">" -quality 40
2013-06-20 16:09:41
User: minnmass
Functions: find xargs

The "find $stuff -print0 | xargs -0 $command" pattern causes both find and xargs to use null-delineated paths, greatly reducing the probability of either hiccuping on even the weirdest of file/path names.

It's also not strictly necessary to add the {} at the end of the xargs command line, as it'll put the files there automatically.

Mind, in most environments, you could use find's "-exec" option to bypass xargs entirely:

find . -name '*.jpg' -o -name '*.JPG' -exec mogrify -resize 1024">" -quality 40 {} +

will use xargs-like "make sure the command line isn't too long" logic to run the mogrify command as few times as necessary (to run once per file, use a ';' instead of a '+' - just be sure to escape it properly).
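One caveat worth spelling out: find's implicit -a binds tighter than -o, so in the form above the -exec only fires for the '*.JPG' branch. Escaped parentheses group the name tests, and the escaped ';' runs the command once per file. A sketch, with echo standing in for mogrify so it can be run harmlessly:

```shell
# Grouped per-file form (the real command would use mogrify):
#   find . \( -name '*.jpg' -o -name '*.JPG' \) -exec mogrify -resize 1024'>' -quality 40 {} \;

# The same mechanics, demonstrated with echo on throwaway files:
demo=$(mktemp -d)
touch "$demo/a.jpg" "$demo/b.JPG" "$demo/c.txt"
find "$demo" \( -name '*.jpg' -o -name '*.JPG' \) -exec echo resizing {} \;
```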

find . -name '*.jpg' -o -name '*.JPG' | xargs -I{} mogrify -resize 1024">" -quality 40 {}
2013-06-20 15:20:29
Functions: find xargs

First use find to find all the images that end with jpg or JPG in the current dir and all its children.

Then pipe that to xargs. The -I{} makes xargs consume one line at a time as a single argument, so spaces in filenames don't matter.

The 1024">" makes mogrify resize only images larger than 1024 pixels wide, scaling them down to a width of 1024 while keeping the aspect ratio.

Then it sets the image quality to 40.

Piping it through xargs means you avoid the argument-list length limit, and you could run this on your entire file system if you wanted.
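The -I substitution can be seen in isolation: each input line, spaces and all, becomes a single argument (the filename here is invented for illustration):

```shell
# With -I, xargs takes each whole line as one argument, so a name with
# spaces survives intact.
printf 'holiday photo 1.jpg\n' | xargs -I{} echo "[{}]"
# prints: [holiday photo 1.jpg]
```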

mysql -BNe 'select table_name from tables where table_schema="DB-NAME" and table_type="BASE TABLE" and table_name not like "PREFIX%";' information_schema | xargs mysqldump DB-NAME > test.sql
2013-06-20 13:26:18
User: sesom42
Functions: xargs

Replace DB-NAME and PREFIX with your settings. The MySQL username and password are handled by ~/.my.cnf.

mysql -uuser -ppass -e 'use information_schema; SELECT table_name FROM tables where table_schema="DB-NAME" and table_name NOT LIKE "PREFIX";' | grep -v table_name | xargs mysqldump DB-NAME -uuser -ppass > dump.sql
2013-06-17 13:44:15
User: moosak
Functions: grep xargs

Required : information_schema

** Replace "DB-NAME" and "PREFIX" with your database name and wildcard prefix match.

** Also replace user and pass with your own MySQL username and password.

This command uses information_schema to wildcard-match the tables we do not need from a database, and then pipes the remaining table names through xargs to the mysqldump utility, which dumps those tables into an SQL file.