What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.




Commands tagged parallel
Terminal - Commands tagged parallel - 30 results
lc() { od="$1"; nd="$2"; of=$3; nf=$4; cp -rl "$od" "$nd"; parallel -0 "ffmpeg -i {1} -loglevel error -q:a 6 {1.}.{2} && { rm {1}; echo {1.}.{2}; }" :::: <(find "$nd" -type f -iname \*$of -print0) ::: "$nf"; }
2017-03-02 17:37:34
User: snipertyler
Functions: cp find

Uses parallel processing: cp -rl first hard-links the whole directory tree, then each matching file is re-encoded with ffmpeg and the original removed.

Reiteration of my earlier command.



lc Old_Directory New_Directory Old_Format New_Format

lc ~/Music ~/Music_ogg mp3 ogg

timeDNS() { parallel -j0 --tag dig @{} "$*" ::: | grep Query | sort -nk5; }
lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22
2014-10-17 00:29:34
User: colemar
Functions: lftp

Mirror a remote directory using some tricks to maximize network speed.

lftp: coolest file transfer tool ever

-u: username and password (pwd is merely a placeholder if you have ~/.ssh/id_rsa)

-e: execute internal lftp commands

set sftp:connect-program: use some specific command instead of plain ssh


-a -x -T: disable agent forwarding, X11 forwarding and pseudo-terminal allocation, none of which is needed here

-c arcfour: use a fast but cryptographically weak cipher (note that arcfour has been removed from recent OpenSSH releases, where this option will fail)

-o Compression=no: disable compression to save CPU

mirror: copy remote dir subtree to local dir

-v: be verbose (cool progress bar and speed meter, one for each file in parallel)

-c: continue interrupted file transfers if possible

--loop: repeat mirror until no differences found

--use-pget-n=3: transfer each file with 3 independent parallel TCP connections

-P 2: transfer 2 files in parallel (totalling 6 TCP connections)

sftp://remotehost:22: use sftp protocol on port 22 (you can give any other port if appropriate)

You can play with values for --use-pget-n and/or -P to achieve maximum speed depending on the particular network.

If the files are compressible, removing "-o Compression=no" can be beneficial.

Better create an alias for the command.
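Rather than an alias, a small wrapper function is easier to parameterise. A minimal sketch under assumptions: the function name mirrorfast is invented, the password is a dummy (key-based auth assumed), and user, directories and host are supplied as arguments.

```shell
# Hypothetical wrapper around the lftp mirror command above.
# Usage: mirrorfast <user> <remote-dir> <local-dir> <host>
mirrorfast() {
  lftp -u "$1",dummy -e "set sftp:connect-program 'ssh -a -x -T -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 $2 $3; quit" "sftp://$4:22"
}
```

For example: mirrorfast alice /remote/dir/ /local/dir/ remotehost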

pbc () { parallel -C, -k -j100% "echo '$@' | bc -l"; }
2014-06-02 19:08:03
User: eroenj

Define a function that applies bc, the *nix calculator, with the specified expression to all rows of the input CSV. The first column is mapped to {1}, second one to {2}, and so forth. See sample output for an example. This function uses all available cores thanks to GNU Parallel.

Requires GNU Parallel

dd if=file | tee >(sha1sum) >(md5sum) >(sha256sum) >/dev/null
2013-11-07 17:43:54
User: dubbaluga
Functions: dd tee
Tags: tee parallel I/O

This is to overcome the issue of slow I/O by reading once and forwarding the output to several processes (e. g. 3 in the given command). One could also invoke grep or other programs to work on read data.
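A quick way to convince yourself the fan-out works: hash a small file through tee and inspect both digests. The temp paths are illustrative; process substitution requires bash.

```shell
tmp=$(mktemp -d)
printf 'hello\n' > "$tmp/data"
# Read the input once, feed it to two checksum tools in parallel.
bash -c "tee >(md5sum > '$tmp/md5') >(sha1sum > '$tmp/sha1') < '$tmp/data' > /dev/null"
sleep 1  # the substituted processes finish asynchronously
cut -d' ' -f1 "$tmp/md5" "$tmp/sha1"
```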

mysql -e 'show databases' -s --skip-column-names | egrep -v "^(test|mysql|performance_schema|information_schema)$" | parallel --gnu "mysqldump --routines {} > {}_daily.sql"
2013-07-24 15:37:58
User: intel352
Functions: egrep

Backs up all databases, excluding test, mysql, performance_schema, information_schema.

Requires parallel to work, install parallel on Ubuntu by running: sudo aptitude install parallel

cat item_list | xargs -n1 -P<n> process_item
alias sortfast='sort -S$(($(sed '\''/MemF/!d;s/[^0-9]*//g'\'' /proc/meminfo)/2048)) $([ `nproc` -gt 1 ]&&echo -n --parallel=`nproc`)'

sort is slow by default. This tells sort to use a buffer equal to half of the available free memory, and to run multiple sort processes (one per CPU) when the machine has more than one. For me, it is orders of magnitude faster.

If you put this in your .bash_profile or startup file, it will be set correctly when bash is started.

sort -S1 --parallel=2 <(echo) &>/dev/null && alias sortfast='sort -S$(($(sed '\''/MemF/!d;s/[^0-9]*//g'\'' /proc/meminfo)/2048)) $([ `nproc` -gt 1 ]&&echo -n --parallel=`nproc`)'


echo|sort -S10M --parallel=2 &>/dev/null && alias sortfast="command sort -S$(($(sed '/MemT/!d;s/[^0-9]*//g' /proc/meminfo)/1024-200)) --parallel=$(($(command grep -c ^proc /proc/cpuinfo)*2))"
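A quick sanity check that your sort accepts both a buffer size and a thread count (GNU coreutils 8.6 or later; the values here are illustrative, not tuned):

```shell
printf '3\n1\n2\n' | sort -S10M --parallel=2
```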
function xzv() { THREADS=$(grep -c ^processor /proc/cpuinfo); for file in "$@"; do pv -s "$(stat -c%s "$file")" < "$file" | pxz -q -T"$THREADS" > "$file.xz"; done; }
2011-12-14 08:22:08
User: oernii2
Functions: file wc

You need: pxz for the actual work (http://jnovy.fedorapeople.org/pxz/). The function could be better with better multifile and stdin/out support.

parallel echo -n {}"\ "\;echo '$(du -s {} | awk "{print \$1}") / $(find {} | wc -l)' \| bc -l ::: *
xargs -P 3 -n 1 <COMMAND> < <FILE_LIST>
2011-07-25 22:53:32
User: h3xx
Functions: xargs

For instance:

find . -type f -name '*.wav' -print0 |xargs -0 -P 3 -n 1 flac -V8

will encode all .wav files into FLAC in parallel.

Explanation of xargs flags:

-P [max-procs]: Max number of invocations to run at once. Set to 0 to run all at once [potentially dangerous re: excessive RAM usage].

-n [max-args]: Max number of arguments from the list to send to each invocation.

-0: Stdin is a null-terminated list.

I use xargs to build parallel-processing frameworks into my scripts like the one here: http://pastebin.com/1GvcifYa
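A self-contained toy run of the same pattern, with gzip standing in for flac (the file names and directory are invented for the demo):

```shell
tmp=$(mktemp -d); cd "$tmp"
printf 'a\nb\nc\n' > names
# Up to 3 compressions at once, one file name per invocation.
xargs -P 3 -I{} sh -c 'echo data > {}.txt && gzip -f {}.txt' < names
ls *.gz
```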

parallel -j4 cd {}\; pwd\; git pull :::: <(git submodule status | awk '{print $2}')
2011-06-20 00:20:26
User: clvv
Functions: awk cd

Make sure to run this command in your git toplevel directory. Modify `-j4` as you like. You can also run any arbitrary command beside `git pull` in parallel on all of your git submodules.

seq 1 255 | parallel -j+0 'nc -w 1 -z -v 192.168.1.{} 80'
2011-06-11 14:40:51
User: devrick0
Functions: seq

It takes over 5 seconds to scan a single port on a single host using nmap

time (nmap -p 80 &> /dev/null)

real 0m5.109s

user 0m0.102s

sys 0m0.004s

It took netcat about 2.5 minutes to scan port 80 on the class C

time (for NUM in {1..255} ; do nc -w 1 -z -v 192.168.1.${NUM} 80 ; done &> /dev/null)

real 2m28.651s

user 0m0.136s

sys 0m0.341s

Using parallel, I am able to scan port 80 on the entire class C in under 2 seconds

time (seq 1 255 | parallel -j255 'nc -w 1 -z -v 192.168.1.{} 80' &> /dev/null)

real 0m1.957s

user 0m0.457s

sys 0m0.994s

fdupes -r .
2011-02-19 17:02:30
User: Vilemirth
Tags: xargs parallel

If you have the fdupes command, you'll save a lot of typing. It can do recursive searches (-r,-R) and it allows you to interactively select which of the duplicate files found you wish to keep or delete.

parallel -j+0 "zcat {} | bzip2 >{.}.bz2 && rm {}" ::: *.gz
echo "uptime" | tee >(ssh host1) >(ssh host2) >(ssh host3)
echo "uptime" | pee "ssh host1" "ssh host2" "ssh host3"
2010-08-20 11:42:40
User: dooblem
Functions: echo
Tags: ssh parallel pee

The pee command is in the moreutils package.

xargs -n1 -P100 -I{} sh -c 'ssh {} uptime >output/{} 2>error/{}' <hostlist
2010-08-20 11:03:11
User: dooblem
Functions: sh uptime xargs

Do the same as pssh, just in shell syntax.

Put your hosts in hostlist, one per line.

Command outputs are gathered in output and error directories.
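The pattern can be tried locally by substituting a harmless echo for ssh (the hostnames are invented; output and error directories as in the command above):

```shell
tmp=$(mktemp -d); cd "$tmp"
mkdir output error
printf 'h1\nh2\nh3\n' > hostlist
# Each "host" gets its own stdout and stderr file.
xargs -P100 -I{} sh -c 'echo up > output/{} 2> error/{}' < hostlist
cat output/h1 output/h2 output/h3
```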

for host in host1 host2 host3; do ssh -n user@$host <command> > $host.log & done; wait
2010-07-14 14:55:31
User: cout
Functions: host ssh

SSH to host1, host2, and host3, executing <command> on each host and saving the output in {host}.log.

I don't have the 'parallel' command installed, otherwise it sounds interesting and less cryptic.
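The same background-and-wait pattern is runnable locally, with echo standing in for the remote command (host names invented):

```shell
tmp=$(mktemp -d)
# Launch each job in the background, then wait for all of them.
for host in host1 host2 host3; do echo "result from $host" > "$tmp/$host.log" & done; wait
cat "$tmp/host1.log"
```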

find . -type f | parallel -j+0 grep -i foobar
2010-01-30 02:08:46
Functions: find grep

Parallel does not suffer from the risk of mixed output that xargs suffers from: output from concurrent jobs is kept separate. -j+0 will run as many jobs in parallel as you have cores.

With parallel you only need -0 (and -print0) if your filenames contain a '\n'.

Parallel is from https://savannah.nongnu.org/projects/parallel/

du -s * | sort -nr | head | cut -f2 | parallel -k du -sh
2010-01-28 12:59:14
Functions: cut du head sort
Tags: du xargs parallel

If a directory name contains a space, xargs will do the wrong thing. Parallel https://savannah.nongnu.org/projects/parallel/ deals better with that.

tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5 -type f` 2> /dev/null | parallel -X rm -f
2010-01-28 12:41:41
Functions: rm tar

This deals nicely with files having special characters in the file name (space ' or ").

Parallel is from https://savannah.nongnu.org/projects/parallel/

ls -t1 | sed 1d | parallel -X rm
2010-01-28 12:28:18
Functions: ls sed

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
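If Parallel is not available, a null-delimited pipeline avoids the same pitfall. In this sketch only the intended file is deleted:

```shell
tmp=$(mktemp -d); cd "$tmp"
touch important_file 'not important_file'
# -print0 / -0 keeps the space-containing name as a single argument.
find . -maxdepth 1 -name 'not*' -print0 | xargs -0 rm
ls
```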

svn st | cut -c 9- | parallel -X tar -czvf ../backup.tgz
2010-01-28 11:43:16
Functions: cut tar

xargs deals badly with special characters (such as space, ' and "). In this case the problem shows up if, for example, you have a file called '12" record'.

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

Both solutions break if the list of files exceeds the shell's maximum command-line length.

svn status |grep '\?' |awk '{print $2}'| parallel -Xj1 svn add
2010-01-28 08:47:54
Functions: awk grep
Tags: xargs parallel

xargs deals badly with special characters (such as space, ' and "), as demonstrated earlier on this page. Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.