What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands tagged parallel
Terminal - Commands tagged parallel - 28 results
lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22
2014-10-17 00:29:34
User: colemar
Functions: lftp
0

Mirror a remote directory using some tricks to maximize network speed.

lftp:: coolest file transfer tool ever

-u: username and password (pwd is merely a placeholder if you have ~/.ssh/id_rsa)

-e: execute internal lftp commands

set sftp:connect-program: use some specific command instead of plain ssh

ssh::

-a -x -T: disable agent forwarding, X11 forwarding and pseudo-terminal allocation, none of which are needed here

-c arcfour: use a fast, CPU-cheap cipher (note that arcfour is considered weak and has been removed from recent OpenSSH releases)

-o Compression=no: disable compression to save CPU

mirror: copy remote dir subtree to local dir

-v: be verbose (cool progress bar and speed meter, one for each file in parallel)

-c: continue interrupted file transfers if possible

--loop: repeat mirror until no differences found

--use-pget-n=3: transfer each file with 3 independent parallel TCP connections

-P 2: transfer 2 files in parallel (totalling 6 TCP connections)

sftp://remotehost:22: use sftp protocol on port 22 (you can give any other port if appropriate)

You can play with values for --use-pget-n and/or -P to achieve maximum speed depending on the particular network.

If the files are compressible, removing "-o Compression=no" can be beneficial.

It is best to create an alias for the command.
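
A minimal sketch of such an alias; the name, credentials, paths and host are placeholders:

alias fastmirror='lftp -u user,pwd -e "set sftp:connect-program '\''ssh -a -x -T -c arcfour -o Compression=no'\''; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22'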

pbc () { parallel -C, -k -j100% "echo '$@' | bc -l"; }
2014-06-02 19:08:03
User: eroenj
0

Define a function that applies bc, the *nix calculator, to every row of the input CSV using the specified expression. The first column is mapped to {1}, the second to {2}, and so forth. See the example below. This function uses all available cores thanks to GNU Parallel.

Requires GNU Parallel
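
A hedged usage example, multiplying the two columns of each input row (results print in input order thanks to -k):

printf '3,4\n10,2\n' | pbc '{1}*{2}'

This should print 12 and 20, one result per row.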

dd if=file | tee >(sha1sum) >(md5sum) >(sha256sum) >/dev/null
2013-11-07 17:43:54
User: dubbaluga
Functions: dd tee
Tags: tee parallel I/O
0

This works around slow I/O by reading the data once and forwarding it to several processes (e.g. 3 in the given command). You could also invoke grep or other programs to work on the data as it is read.
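
For instance, a hedged variant that checksums and searches the same data in a single read (the pattern is a placeholder):

dd if=file | tee >(sha1sum) >(grep -c ERROR) >/dev/null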

mysql -e 'show databases' -s --skip-column-names | egrep -v "^(test|mysql|performance_schema|information_schema)$" | parallel --gnu "mysqldump --routines {} > {}_daily.sql"
2013-07-24 15:37:58
User: intel352
Functions: egrep
1

Backs up all databases, excluding test, mysql, performance_schema, information_schema.

Requires GNU Parallel. Install it on Ubuntu by running: sudo aptitude install parallel
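
A hedged example of restoring one of the resulting dumps (database name assumed):

mysql somedb < somedb_daily.sql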

cat item_list | xargs -n1 -P<n> process_item
alias sortfast='sort -S$(($(sed '\''/MemF/!d;s/[^0-9]*//g'\'' /proc/meminfo)/2048)) $([ `nproc` -gt 1 ]&&echo -n --parallel=`nproc`)'
3

sort is way slow by default. This tells sort to use a buffer equal to half of the available free memory. It will also use multiple processes for the sort, equal to the number of CPUs on your machine (if greater than 1). For me, it is orders of magnitude faster.

If you put this in your bash_profile or startup file, it will be set correctly when bash is started.

sort -S1 --parallel=2 <(echo) &>/dev/null && alias sortfast='sort -S$(($(sed '\''/MemF/!d;s/[^0-9]*//g'\'' /proc/meminfo)/2048)) $([ `nproc` -gt 1 ]&&echo -n --parallel=`nproc`)'

Alternative

echo|sort -S10M --parallel=2 &>/dev/null && alias sortfast="command sort -S$(($(sed '/MemT/!d;s/[^0-9]*//g' /proc/meminfo)/1024-200)) --parallel=$(($(command grep -c ^proc /proc/cpuinfo)*2))"
xzv() { THREADS=$(grep -c ^processor /proc/cpuinfo); for file in "$@"; do pv -s "$(stat -c%s "$file")" < "$file" | pxz -q -T "$THREADS" > "$file.xz"; done; }
2011-12-14 08:22:08
User: oernii2
Functions: file wc
0

You need pxz for the actual work (http://jnovy.fedorapeople.org/pxz/). The function could be improved with better multi-file and stdin/stdout support.
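
Hedged usage, with file names as placeholders:

xzv big1.log big2.log

This should leave big1.log.xz and big2.log.xz beside the originals, with a pv progress bar for each.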

parallel echo -n {}"\ "\;echo '$(du -s {} | awk "{print \$1}") / $(find {} | wc -l)' \| bc -l ::: *
xargs -P 3 -n 1 <COMMAND> < <FILE_LIST>
2011-07-25 22:53:32
User: h3xx
Functions: xargs
0

For instance:

find . -type f -name '*.wav' -print0 |xargs -0 -P 3 -n 1 flac -V8

will encode all .wav files into FLAC in parallel.

Explanation of xargs flags:

-P [max-procs]: Max number of invocations to run at once. Set to 0 to run all at once [potentially dangerous re: excessive RAM usage].

-n [max-args]: Max number of arguments from the list to send to each invocation.

-0: Stdin is a null-terminated list.

I use xargs to build parallel-processing frameworks into my scripts like the one here: http://pastebin.com/1GvcifYa

parallel -j4 cd {}\; pwd\; git pull :::: <(git submodule status | awk '{print $2}')
2011-06-20 00:20:26
User: clvv
Functions: awk cd
2

Make sure to run this command in your git toplevel directory. Modify `-j4` as you like. You can also run any arbitrary command beside `git pull` in parallel on all of your git submodules.
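
As a hedged example, the same pattern with a different command, printing a short status for every submodule:

parallel -j4 cd {}\; git status -s :::: <(git submodule status | awk '{print $2}')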

seq 1 255 | parallel -j+0 'nc -w 1 -z -v 192.168.1.{} 80'
2011-06-11 14:40:51
User: devrick0
Functions: seq
1

It takes over 5 seconds to scan a single port on a single host using nmap:

time (nmap -p 80 192.168.1.1 &> /dev/null)

real 0m5.109s

user 0m0.102s

sys 0m0.004s

It took netcat about 2.5 minutes to scan port 80 on the class C:

time (for NUM in {1..255} ; do nc -w 1 -z -v 192.168.1.${NUM} 80 ; done &> /dev/null)

real 2m28.651s

user 0m0.136s

sys 0m0.341s

Using parallel, I am able to scan port 80 on the entire class C in under 2 seconds:

time (seq 1 255 | parallel -j255 'nc -w 1 -z -v 192.168.1.{} 80' &> /dev/null)

real 0m1.957s

user 0m0.457s

sys 0m0.994s

fdupes -r .
2011-02-19 17:02:30
User: Vilemirth
Tags: xargs parallel
15

If you have the fdupes command, you'll save a lot of typing. It can do recursive searches (-r,-R) and it allows you to interactively select which of the duplicate files found you wish to keep or delete.
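
Once you have reviewed the results and trust them, a hedged non-interactive variant (assuming your fdupes build supports -d and -N) deletes all but the first file of each duplicate set:

fdupes -rdN .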

parallel -j+0 "zcat {} | bzip2 >{.}.bz2 && rm {}" ::: *.gz
echo "uptime" | tee >(ssh host1) >(ssh host2) >(ssh host3)
echo "uptime" | pee "ssh host1" "ssh host2" "ssh host3"
2010-08-20 11:42:40
User: dooblem
Functions: echo
Tags: ssh parallel pee
15

The pee command is in the moreutils package.
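
A hedged GNU Parallel alternative that tags each output line with its host:

parallel --tag ssh {} uptime ::: host1 host2 host3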

xargs -n1 -P100 -I{} sh -c 'ssh {} uptime >output/{} 2>error/{}' <hostlist
2010-08-20 11:03:11
User: dooblem
Functions: sh uptime xargs
3

Does the same as pssh, just in shell syntax.

Put your hosts in hostlist, one per line.

Command outputs are gathered in the output and error directories.
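
Note that both directories must exist before you run it:

mkdir -p output error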

for host in host1 host2 host3; do ssh -n user@$host <command> > $host.log & done; wait
2010-07-14 14:55:31
User: cout
Functions: host ssh
1

SSH to host1, host2, and host3, executing <command> on each host and saving the output in {host}.log.

I don't have the 'parallel' command installed; otherwise it sounds interesting and less cryptic.

find . -type f | parallel -j+0 grep -i foobar
2010-01-30 02:08:46
Functions: find grep
3

Parallel does not suffer from the risk of mixed output that xargs does. -j+0 will run as many jobs in parallel as you have cores.

With parallel you only need -0 (and -print0) if your filenames contain a '\n'.
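
A hedged sketch for that case:

find . -type f -print0 | parallel -0 -j+0 grep -i foobar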

Parallel is from https://savannah.nongnu.org/projects/parallel/

du -s * | sort -nr | head | cut -f2 | parallel -k du -sh
2010-01-28 12:59:14
Functions: cut du head sort
Tags: du xargs parallel
-2

If a directory name contains a space, xargs will do the wrong thing. Parallel https://savannah.nongnu.org/projects/parallel/ deals better with that.

tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5 -type f` 2> /dev/null | parallel -X rm -f
2010-01-28 12:41:41
Functions: rm tar
-3

This deals nicely with files that have special characters in their names (space, ' or ").

Parallel is from https://savannah.nongnu.org/projects/parallel/

ls -t1 | sed 1d | parallel -X rm
2010-01-28 12:28:18
Functions: ls sed
-1

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
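
With parallel, the same pipeline should remove only the intended file:

ls not* | parallel rm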

svn st | cut -c 9- | parallel -X tar -czvf ../backup.tgz
2010-01-28 11:43:16
Functions: cut tar
-2

xargs deals badly with special characters (such as space, ' and "). In this case the problem appears if you have a file called '12" record'.

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

Both solutions misbehave if the combined length of the file names exceeds the shell's maximum command-line length.

svn status |grep '\?' |awk '{print $2}'| parallel -Xj1 svn add
2010-01-28 08:47:54
Functions: awk grep
Tags: xargs parallel
-2

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

grep -rl oldstring . | parallel sed -i -e 's/oldstring/newstring/'
2010-01-28 08:44:16
Functions: grep sed
3

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.

find -not -empty -type f -printf "%s\n" | sort | uniq -d | parallel find -type f -size {}c | parallel md5sum | sort | uniq -w32 --all-repeated=separate
2010-01-28 08:40:18
Functions: find md5sum sort uniq
Tags: xargs parallel
-1

A bit shorter and parallelized: find prints the size of every non-empty file, uniq -d keeps the sizes that occur more than once, the candidate files are md5summed, and uniq -w32 --all-repeated=separate prints the groups whose first 32 characters (the md5 hash) match, separated by blank lines. Depending on the speed of your CPU and your disk, this may run faster.

Parallel is from https://savannah.nongnu.org/projects/parallel/