
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - All commands - 12,342 results
links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1`
2016-07-26 12:54:53
User: mogoh
Functions: grep head sort uniq
2

sort -R randomizes the order (identical lines still end up adjacent, so the uniq that follows can remove duplicates).

head -n1 takes the first one.
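
A shorter sketch of the same idea, assuming GNU coreutils' shuf is available, lets one tool do both the shuffling and the picking:

links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -u | shuf -n1`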

links $( a=( $( lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort | uniq ) ) ; amax=${#a[@]} ; n=$(( `date '+%s'` % $amax )) ; echo ${a[n]} )
2016-07-26 11:52:12
User: pascalv
Functions: echo grep sort uniq
1

Access a random news web page on the internet.

The Links browser can of course be replaced by Firefox or any modern graphical web browser.
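
A variant sketch (not from the original author) indexes the array with bash's built-in $RANDOM instead of the epoch-seconds trick:

links $( a=( $( lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -u ) ) ; echo ${a[RANDOM % ${#a[@]}]} )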

!!:s/foo/bar/
2016-07-25 18:52:42
User: malathion
0

Replaces the first instance of 'foo' with 'bar' in the previous command. To replace all instances of 'foo' with 'bar': !!:gs/foo/bar/
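
A hypothetical session showing the substitution in action:

$ cat /etc/hosst
cat: /etc/hosst: No such file or directory
$ !!:s/hosst/hosts/
cat /etc/hosts
127.0.0.1 localhost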

f=correlation.png; s=400; t=50; a=1.2; convert $f -resize ${s}% -threshold ${t}% bmp:- | potrace -o ${f%.*}.svg -b svg -z black --fillcolor "#FFFFFF" --alphamax ${a}
2016-07-25 10:45:28
User: T4b
0

Uses ImageMagick and potrace to vectorize the input image, with parameters optimized for xkcd-like pictures.
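
The same pipeline, wrapped as a reusable function for illustration (the name vectorize is made up; the parameters are the defaults from above):

vectorize() { local f=$1 s=400 t=50 a=1.2; convert "$f" -resize ${s}% -threshold ${t}% bmp:- | potrace -o "${f%.*}.svg" -b svg -z black --fillcolor "#FFFFFF" --alphamax $a; }

vectorize correlation.png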

while true ; do wget --quiet --no-check-certificate --post-data 'login=my_account_number&password=my_password&submit=Valider' 'https://wifi.free.fr/Auth' -O '/dev/null' ; sleep 600; done
2016-07-23 16:34:42
User: pascalv
Functions: sleep true wget
Tags: wifi France
0

Logs in to FreeWifi hotspots (Free is a French ISP) and keeps the connection active by re-posting the credentials every 10 minutes.

hpacucli controller all show config detail | grep -A 7 Fail | egrep '(Failed|Last|Serial Number|physicaldrive)'
2016-07-20 17:42:40
User: operat0r
Functions: egrep grep
0

This dumps the serial numbers of all the drives, but the HP warranty check does not report them as valid ...

tree -i -L 1
curl -s http://whatismyip.org/ | grep -oP '(\d{1,3}\.){3}\d+'
lsa() { ls -lart; history -s "joe \"$(\ls -apt|grep -v /|head -1)\"" ; }
2016-07-07 21:27:55
User: knoppix5
Functions: ls
2

Display a file list sorted by modification time (newest file at the end) and get instant access to the newest file in the list: simply press the up-arrow key and open it with the joe editor.

BTW, IMHO a list of files with the newest ones at the end is often more informative.

Put this 'lsa' function somewhere in your .bashrc and issue

. ~/.bashrc

or

source ~/.bashrc

to have access to the 'lsa' command immediately.

(The function appends the command "joe last_file_in_the_list" to the end of the command history, so pressing arrow-up recalls it.)
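
If joe is not your editor, a hypothetical variant substitutes whatever $EDITOR is set to:

lsa() { ls -lart; history -s "${EDITOR:-vi} \"$(\ls -apt|grep -v /|head -1)\"" ; }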

while true; do A=$(stat -c%s FILE); sleep 1; B=$(stat -c%s FILE); echo -en "\r"$(($B-$A))" Bps"; done
2016-06-27 20:39:06
User: Zort
Functions: echo sleep stat
-2

Shows a file's growth in bytes per second.

Replace "FILE" with the name of the file to monitor.

Uses the stat command.
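
A sketch of a more readable variant, assuming GNU coreutils' numfmt is available, prints the rate in human-friendly units:

while true; do A=$(stat -c%s FILE); sleep 1; B=$(stat -c%s FILE); printf '\r%s/s ' "$(numfmt --to=iec $((B-A)))"; done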

while true; do A=$(ls -l FILE | awk '{print $5}'); sleep 1; B=$(ls -l FILE | awk '{print $5}'); echo -en "\r"$(($B-$A))" Bps"; done
2016-06-27 20:33:02
User: Zort
Functions: awk echo ls sleep
-3

Shows a file's growth in bytes per second.

Replace "FILE" with the name of the file to monitor.

Uses ls + awk.

netstat -n | grep ESTAB |grep :80 | tee /dev/stderr | wc -l
2016-06-26 11:37:19
User: rubenmoran
Functions: grep netstat tee wc
2

Count established connections to port 80 while still seeing the netstat output.

Using tee and /dev/stderr, the command's output reaches the terminal before wc consumes it, so the summary count appears at the bottom of the output.
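
On systems where netstat is deprecated, a rough equivalent using iproute2's ss (the -H flag, which suppresses the header line so wc counts only connections, needs a reasonably recent ss):

ss -Htn state established '( sport = :80 or dport = :80 )' | tee /dev/stderr | wc -l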

sed '/INSERT INTO `unwanted_table`/d' mydb.sql > reduced.sql
2016-06-24 20:13:47
User: sudopeople
Functions: sed
0

Starting with a large MySQL dump file (*.sql), remove any lines that contain inserts for the specified table. Sometimes one or two tables are very large and unneeded, e.g. log tables. To exclude multiple tables you can get fancy with sed (see the sketch below), or just run the command again on subsequently generated files.
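
For instance, to drop the inserts for several tables in one pass (the table names here are made up):

sed -E '/INSERT INTO `(log_table|cache_table|sessions)`/d' mydb.sql > reduced.sql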

less -X /var/log/insecure
2016-06-24 13:53:49
User: sonic
Functions: less
1

From the manpage:

man less

-X or --no-init

Disables sending the termcap initialization and deinitialization strings to the terminal. This is sometimes desirable if the deinitialization string does something unnecessary, like clearing the screen.

Bonus:

If you want to clear the screen after viewing a file this way that had sensitive information, hit Ctrl-L or just type clear. Since Ctrl-L is readily available, I don't know why less bothers to automatically clear. If you're viewing the file at all, chances are you want to see the output from it after you quit.

echo '"\e\C-i": "\C-awhile true; do ( \C-e ); inotifywait -q -e modify -e close_write *; done\e51\C-b"' >>~/.inputrc
2016-06-10 08:06:50
User: unhammer
Functions: echo
0

Assuming you've written all of

make -j hfst-tokenize && echo doavtter gr?dakursa|./hfst-tokenize --gtd tokeniser-gramcheck-gt-desc.pmhfst

and want that to execute every time you :w in vim (or C-xC-s in Emacs), just hit the bound key (\e\C-i, i.e. Esc followed by Tab, or Alt-Tab) and it'll turn into

while true; do ( make -j hfst-tokenize && echo doavtter gr?dakursa|./hfst-tokenize --gtd tokeniser-gramcheck-gt-desc.pmhfst ); inotifywait -q -e modify -e close_write *; done

with the cursor right before the ')'. Hit enter, and it'll run on each save.

Requires the package inotify-tools installed.
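
The heart of the macro is just a save-triggered rebuild loop, which also works on its own (same dependency on inotify-tools):

while true; do ( make -j ); inotifywait -q -e modify -e close_write *; done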

echo "https://www.google.com/maps/place/$(exiftool -ee -p '$gpslatitude, $gpslongitude' -c "%d?%d'%.2f"\" image.jpg 2> /dev/null | sed -e "s/ //g")"
rsync -a --delete empty-dir/ target-dir/
2016-06-07 16:56:55
User: malathion
Functions: rsync
Tags: delete rsync
7

This command works by rsyncing the target directory (containing the files you want to delete) with an empty directory. The '--delete' switch instructs rsync to remove files that are not present in the source directory. Since there are no files there, all the files will be deleted.

I'm not clear on why it's faster than 'find -delete', but it is.

Benchmarks here: https://web.archive.org/web/20130929001850/http://linuxnote.net/jianingy/en/linux/a-fast-way-to-remove-huge-number-of-files.html
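
Typical usage, with placeholder directory names:

mkdir -p /tmp/empty
rsync -a --delete /tmp/empty/ target-dir/
rmdir /tmp/empty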

sudo mount -o remount,rw / && sudo cp /etc/hosts /etc/hosts.old && wget http://winhelp2002.mvps.org/hosts.txt && cp /etc/hosts ~/ && cat hosts.txt >> hosts && sudo cp hosts /etc/hosts
2016-06-06 15:01:19
User: bugmenot
Functions: cat cp mount sudo wget
1

Will append lines to the hosts file to do some basic ad blocking.
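
A tidier sketch of the same idea (omitting the remount step, which only matters on a read-only root filesystem) that backs up the hosts file first:

sudo cp /etc/hosts /etc/hosts.old && wget -qO- http://winhelp2002.mvps.org/hosts.txt | sudo tee -a /etc/hosts > /dev/null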

curl --silent --head "${url}" | grep 'Last-Modified:' | cut -c 16- | date -f - +'%s'
2016-06-02 22:20:55
User: odoepner
Functions: cut date grep
0

This command line assumes that "${url}" is the URL of the web resource, and prints its Last-Modified header as Unix epoch seconds (the -f option requires GNU date).

It can be useful to check the "freshness" of a download URL before a GET request.
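
Wrapped as a function for reuse (a sketch; the name url_mtime is made up, and -f requires GNU date):

url_mtime() { curl --silent --head "$1" | grep -i '^Last-Modified:' | cut -c 16- | date -f - +'%s'; }

url_mtime "https://example.com/index.html"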

curl http://url/rss | grep -o '<enclosure url="[^"]*' | grep -o '[^"]*$' | xargs wget -c
tree -isafF /var|grep -v "/$"|tr '[]' ' '|sort -k1nr|head
du -a /var | sort -n -r | head -n 10
echo "$USER"|rev | espeak
awk '/CurrConns/{print $NF}' <<< "$(echo "show info" | sudo nc -U /var/lib/haproxy/stats)"
ps -u jboss -o nlwp= | awk '{ num_threads += $1 } END { print num_threads }'