
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - All commands - 11,930 results
dumpe2fs -h /dev/xvda1 | egrep -i 'mount count|check'
2014-10-22 08:38:43
User: manju712
Functions: dumpe2fs egrep
0

Checks the total mount count, the maximum mount count before an fsck is forced, and when the last fsck was performed.
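
If the maximum mount count itself needs changing, tune2fs can adjust it. A small companion example (the count of 30 is an arbitrary choice):

sudo tune2fs -c 30 /dev/xvda1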

wait 536; anywait 536; anywaitd 537; anywaitp 5562 5563 5564
2014-10-22 06:31:47
User: colemar
Functions: wait
0

Silent:

anywait () { for pid in "$@"; do while kill -0 "$pid" >/dev/null 2>&1; do sleep 0.5; done; done }

Prints dots:

anywaitd () { for pid in "$@"; do while kill -0 "$pid" >/dev/null 2>&1; do sleep 0.5; echo -n '.'; done; done }

Prints process ids:

anywaitp () { for pid in "$@"; do while kill -0 "$pid" >/dev/null 2>&1; do sleep 0.5; echo -n $pid' '; done; echo; done }

You cannot anywait for other users' processes: kill -0 fails with a permission error on them, so the loop exits immediately even though the process is still alive.
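
For example, to block until a backgrounded job finishes, assuming the function above is loaded in the current shell:

sleep 30 & anywait $!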

rsync --recursive --info=progress2 <src> <dst>
2014-10-21 22:19:44
User: koter84
Functions: rsync
Tags: rsync progress
0

Updates the rsync output after each completed file without creating newlines; it just overwrites the last line. This looks a lot better in scripts where you want a progress indicator but not the lengthy logs.

This option is available since rsync 3.1.0.
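
A minimal fallback sketch for scripts that may also run against older rsync releases, assuming pre-3.1.0 versions simply reject the unknown --info option ($src and $dst are placeholder variables):

if rsync --info=progress2 --version >/dev/null 2>&1; then rsync --recursive --info=progress2 "$src" "$dst"; else rsync --recursive --progress "$src" "$dst"; fi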

<command> | curl -F 'clbin=<-' https://clbin.com
2014-10-21 13:02:18
User: colemar
0

Define alias for convenience:

alias clbin='curl -v -F "clbin=<-" https://clbin.com'

Paste man page:

man bash | clbin

Paste image:

curl -F 'clbin=@filename.jpg' https://clbin.com
smartctl -a /dev/sda |grep Writ |awk '{print $NF/2/1024/1024/1024 " TeraBytes Written"}'
2014-10-21 03:40:32
User: khyron320
Functions: awk grep
2

You must have smartmontools installed for this to work. It also assumes 512-byte sector sizes, which is pretty standard.
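
On drives that expose the Total_LBAs_Written SMART attribute, a hedged variant reads the raw value column directly, under the same 512-byte sector assumption:

smartctl -A /dev/sda | awk '/Total_LBAs_Written/ {print $10/2/1024/1024/1024 " TeraBytes Written"}'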

rename -fc *
find -not -empty -type f -printf "%-30s'\t\"%h/%f\"\n" | sort -rn -t$'\t' | uniq -w30 -D | cut -f 2 -d $'\t' | xargs md5sum | sort | uniq -w32 --all-repeated=separate
2014-10-19 02:00:55
User: fobos3
Functions: cut find md5sum sort uniq xargs
0

Finds duplicates based on MD5 sum, comparing only files of the same size. A performance improvement on:

find -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate

The new version takes around 3 seconds where the old version took around 17 minutes. The bottleneck in the old command was the second find, which searches again for files of each candidate size. The new version keeps the file path and size from the beginning.
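
If installing an extra tool is acceptable, fdupes does the whole job in one step (size and MD5 pre-checks followed by a byte-for-byte comparison):

fdupes -r .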

H="--header"; wget $H="Accept-Language: en-us,en;q=0.5" $H="Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" $H="Connection: keep-alive" -U "Mozilla/5.0 (Windows NT 5.1; rv:10.0.2) Gecko/20100101 Firefox/10.0.2" --referer=urlhere
lftp -u user,pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 /remote/dir/ /local/dir/; quit" sftp://remotehost:22
2014-10-17 00:29:34
User: colemar
Functions: lftp
0

Mirror a remote directory using some tricks to maximize network speed.

lftp:: coolest file transfer tool ever

-u: username and password (pwd is merely a placeholder if you have ~/.ssh/id_rsa)

-e: execute internal lftp commands

set sftp:connect-program: use some specific command instead of plain ssh

ssh::

-a -x -T: disable things that are useless here (agent forwarding, X11 forwarding, pseudo-terminal allocation)

-c arcfour: use a fast, CPU-cheap cipher (note: arcfour is cryptographically weak and is no longer offered by recent OpenSSH versions)

-o Compression=no: disable compression to save CPU

mirror: copy remote dir subtree to local dir

-v: be verbose (cool progress bar and speed meter, one for each file in parallel)

-c: continue interrupted file transfers if possible

--loop: repeat mirror until no differences found

--use-pget-n=3: transfer each file with 3 independent parallel TCP connections

-P 2: transfer 2 files in parallel (totalling 6 TCP connections)

sftp://remotehost:22: use sftp protocol on port 22 (you can give any other port if appropriate)

You can play with values for --use-pget-n and/or -P to achieve maximum speed depending on the particular network.

If the files are compressible, removing "-o Compression=no" can be beneficial.

Better to wrap the command in an alias or shell function.
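
A minimal sketch of such a wrapper as a shell function (the name lmirror and the argument order are arbitrary; pwd stays a placeholder as above):

lmirror () { lftp -u "$1",pwd -e "set sftp:connect-program 'ssh -a -x -T -c arcfour -o Compression=no'; mirror -v -c --loop --use-pget-n=3 -P 2 $2 $3; quit" "sftp://$4:22"; }

Usage: lmirror user /remote/dir/ /local/dir/ remotehost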

sudo hdparm -B 200 /dev/sda
g?g?
ls | tr '[[:punct:][:space:]]' '\n' | grep -v "^\s*$" | sort | uniq -c | sort -bn
2014-10-14 09:52:28
User: qdrizh
Functions: grep ls sort tr uniq
Tags: sort uniq ls grep tr
3

I'm sure there's a more elegant sed version for the tr + grep section.
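
One candidate, assuming GNU grep: -o prints each match on its own line, so the splitting step and the empty-line filter collapse into a single command:

ls | grep -oE '[[:alnum:]]+' | sort | uniq -c | sort -bn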

uname -p
youtube-dl -tci --write-info-json "https://www.youtube.com/watch?v=dQw4w9WgXcQ"
2014-10-13 21:18:34
User: wires
1

Download video files from a bunch of sites (here is a list https://rg3.github.io/youtube-dl/supportedsites.html).

The options say: base the filename on the title, ignore errors, and continue partial downloads. Also, store some metadata into a .json file.

Paste YouTube user and playlist URLs for extra fun.

Protip: git-annex loves these files

gcloud components list | grep "^| Not" | sed "s/|\(.*\)|\(.*\)|\(.*\)|/\2/" | xargs echo gcloud components update
2014-10-13 20:52:25
User: wires
Functions: echo grep sed xargs
0

Google Cloud SDK comes with a package manager `gcloud components` but it needs a bit of `sed` to work. Modify the "^| Not" bit to change the package selection. (The gcloud --format option is currently broken)

ip a s eth0 | awk -F'[/ ]+' '/inet[^6]/{print $3}'
dd if=/dev/hda | ssh root@4.2.2.2 'dd of=/root/server.img'
2014-10-13 13:43:47
User: suyashjain
Functions: dd ssh
0

This command takes a (full) snapshot of your hard disk and creates an image, which is stored directly on a remote server over ssh. Here the image of /dev/hda is created and saved on 4.2.2.2 as /root/server.img.
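
A hedged variant for slow links compresses the stream in transit, storing the image gzip-compressed on the remote side:

dd if=/dev/hda | gzip -c | ssh root@4.2.2.2 'cat > /root/server.img.gz'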

cat /etc/httpd/logs/access.log | awk '{ print $6}' | sed -e 's/\[//' | awk -F'/' '{print $1}' | sort | uniq -c
2014-10-13 13:39:53
User: suyashjain
Functions: awk cat sed sort uniq
0

The command reads the Apache log file and reports each requested virtual host together with its number of requests.
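
A similar one-liner counts requests per HTTP status code instead, assuming the standard common/combined log format where the status is field 9:

awk '{print $9}' /etc/httpd/logs/access.log | sort | uniq -c | sort -rn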

sed -e '/4.2.2.2/ s/^;//' -i test.txt
2014-10-13 13:37:53
User: suyashjain
Functions: sed
Tags: sed
0

This sed command searches every line of test.txt for 4.2.2.2 and removes the leading comment symbol ";" where it matches. You can use it for other purposes as well.
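
The reverse operation (commenting matching lines back out) is the same pattern with an insertion instead of a deletion:

sed -e '/4.2.2.2/ s/^/;/' -i test.txt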

psql -U quassel quassel -c "SELECT message FROM backlog ORDER BY time DESC LIMIT 1000;" | grep my-query
2014-10-12 19:53:06
User: Tatsh
Functions: grep
0

Replace the psql credentials if necessary, and the my-query part with your search term.
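
The filtering can also be pushed into the query itself, which avoids pulling 1000 rows just to grep them (ILIKE is PostgreSQL's case-insensitive LIKE):

psql -U quassel quassel -c "SELECT message FROM backlog WHERE message ILIKE '%my-query%' ORDER BY time DESC LIMIT 1000;"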

curl -s http://pages.cs.wisc.edu/~ballard/bofh/bofhserver.pl |grep 'is:' |awk 'BEGIN { FS=">"; } { print $10; }'
2014-10-10 21:17:33
User: toj
Functions: awk grep
Tags: curl BOFH
0

Sure, it's dirty, but it's quick, it only displays the excuse, and it works.

ip addr show enp3s0 | awk '/inet[^6]/{print $2}' | awk -F'/' '{print $1}'
<ctrl+u>
for f in */*.ape; do avconv -i "$f" "${f%.ape}.flac"; done
2014-10-10 12:33:00
User: qdrizh
0

Converts all Monkey's Audio files in subdirectories of the current directory to FLAC.

For only current directory, use `for f in *.ape; do avconv -i "$f" "${f%.ape}.flac"; done`

To remove APE files afterward, use `rm */*.ape`
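
For arbitrarily deep directory trees, a find-based variant along these lines should work:

find . -name '*.ape' -exec sh -c 'avconv -i "$0" "${0%.ape}.flac"' {} \;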

mtr www.google.com