What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Terminal - Commands using grep - 1,653 results
links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1`
2016-07-26 12:54:53
User: mogoh
Functions: grep head sort uniq
2

sort -R randomizes the list.

head -n1 takes the first.
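
Note that uniq only removes adjacent duplicates, so running it after sort -R can leave repeats; if your coreutils provides shuf, a variant along these lines (a sketch of the same idea) sidesteps that:

links "$(lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -u | shuf -n1)"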

links $( a=( $( lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort | uniq ) ) ; amax=${#a[@]} ; n=$(( `date '+%s'` % $amax )) ; echo ${a[n]} )
2016-07-26 11:52:12
User: pascalv
Functions: echo grep sort uniq
1

Access a random news web page on the internet.

The Links browser can of course be replaced by Firefox or any modern graphical web browser.
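
In bash, the date arithmetic can be replaced with the $RANDOM builtin for picking the index (a minimal sketch of the same approach):

links $( a=( $( lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -u ) ) ; echo ${a[RANDOM % ${#a[@]}]} )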

hpacucli controller all show config detail | grep -A 7 Fail | egrep '(Failed|Last|Serial Number|physicaldrive)'
2016-07-20 17:42:40
User: operat0r
Functions: egrep grep
0

This dumps the serial numbers of all the drives, but the HP warranty check does not accept them as valid ...

curl -s http://whatismyip.org/ | grep -oP '(\d{1,3}\.){3}\d+'
netstat -n | grep ESTAB |grep :80 | tee /dev/stderr | wc -l
2016-06-26 11:37:19
User: rubenmoran
Functions: grep netstat tee wc
2

Summarize established connections after netstat output.

Using tee and /dev/stderr you can send the command's output to the terminal before running wc, so the summary appears at the bottom of the output.
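
The same tee trick works in any pipeline where you want to see both the lines and a summary; for example, to list a directory and print the entry count below it:

ls /etc | tee /dev/stderr | wc -l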

curl --silent --head "${url}" | grep 'Last-Modified:' | cut -c 16- | date -f - +'%s'
2016-06-02 22:20:55
User: odoepner
Functions: cut date grep
0

This command line assumes that "${url}" is the URL of the web resource.

It can be useful to check the "freshness" of a download URL before a GET request.
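
For example, to act only when the server copy is newer than a local download (a sketch; local.file is a placeholder, and stat -c %Y is the GNU form):

remote=$(curl --silent --head "${url}" | grep 'Last-Modified:' | cut -c 16- | date -f - +'%s')
[ "$remote" -gt "$(stat -c %Y local.file)" ] && echo "server copy is newer"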

curl http://url/rss | grep -o '<enclosure url="[^"]*' | grep -o '[^"]*$' | xargs wget -c
tree -isafF /var|grep -v "/$"|tr '[]' ' '|sort -k1nr|head
ASN=32934; for s in $(whois -H -h riswhois.ripe.net -- -F -K -i $ASN | grep -v "^$" | grep -v "^%" | awk '{ print $2 }' ); do echo " blocking $s"; sudo iptables -A INPUT -s $s -j REJECT &> /dev/null || sudo ip6tables -A INPUT -s $s -j REJECT; done
ss -t -o state established '( dport = :443 || dport = :80 )' | grep -Po '([0-9a-z:.]*)(?=:http[s])' | sort -u|netcat whois.cymru.com 43|grep -v "AS Name"|sort -t'|' -k3
ss -t -o state established '( dport = :443 || dport = :80 )'|grep tcp|awk '{ print $5 }'|sed s/:http[s]*//g|sort -u|netcat whois.cymru.com 43|grep -v "AS Name"|sort -t'|' -k3
egrep -v '^\s*($|#)' $(git grep -l '#!/bin/.*sh' *) | wc -l
2016-02-15 11:15:48
User: Natureshadow
Functions: egrep grep wc
Tags: git grep count code
0

Uses git grep for speed and relies on a valid shebang; leading whitespace is ignored when stripping comments and blank lines.
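
A variant under the same shebang assumption, reporting a count per file instead of one total:

egrep -vc '^\s*($|#)' $(git grep -l '#!/bin/.*sh' *)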

ps -eo pmem,comm | grep java | awk '{sum+=$1} END {print sum " % of RAM"}'
2016-02-10 09:00:56
User: bugmenot
Functions: awk grep ps sum
5

This command will add up the RAM usage of all processes whose name contains "java" and print the sum of the percentages in human-readable form. Also, unlike the original #15430, it won't fail on processes with a usage of >9.9%.

Please note that this command won't work reliably in use cases where a significant portion of the processes involved use less than 0.1% of RAM, because they are counted as "0" even though a great number of them could add up to a significant amount.
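
If that rounding matters, summing the resident set size directly avoids it (a sketch using GNU ps, where rss is reported in KiB):

ps -eo rss,comm | awk '/java/ {sum+=$1} END {printf "%.1f MiB\n", sum/1024}'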

pacman -Ss python | paste - - | grep --color=always -e '/python' | less -R
2016-01-25 14:29:31
User: hute37
Functions: grep less paste python
Tags: less paste pacman
1

Alternative1 (grep support):

pacman -Ss python | paste - - | grep --color=always -e '/python' | less -R

Alternative2 (eye-candy, no grep):

pacman --color=always -Ss "python" | paste - - | less -R

in ~/.bashrc:

pkg-grep() { pacman -Ss "$1" | paste - - | grep --color=always -e "${2:-$1}" | less -R ; }

pkg-search() { pacman --color=always -Ss "$1" | paste - - | less -R; }
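
Hypothetical usage of the first helper: search for python, then filter the paired results for numpy:

pkg-grep python numpy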

ps -eo pmem,comm | grep chrome | cut -d " " -f 2 | paste -sd+ | bc
get_iplayer --type=radio --channel "Radio 4 Extra" | grep : | awk '{ if ( NR > 1 ) { print } }'|sed 's/:.*//' |sed '$ d' > pidlist && while read p; do get_iplayer --get --fields=pid $p; done <pidlist && rm pidlist
2016-01-16 17:20:54
User: dunryc
Functions: awk grep read rm sed
0

Use get_iplayer to download all listed content from http://www.bbc.co.uk/radio4extra. Run it every night to make sure no episodes are missed.
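
To run it nightly as described, a crontab entry along these lines would do (the script path is a placeholder):

0 3 * * * /home/user/bin/radio4extra.sh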

for f in `git status | grep new | awk '{print $3}'`; do git reset HEAD $f ; done
pdf2txt myfile.pdf | grep mypattern
2015-11-23 17:46:22
User: grinob
Functions: grep
Tags: pipe grep pdf
2

This is a good alternative to pdftotext for Ubuntu. To install it:

sudo apt-get install python-pdfminer
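
If you have poppler-utils instead, the pdftotext equivalent writes to stdout via -:

pdftotext myfile.pdf - | grep mypattern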

netstat -np | grep -v ^unix
2015-11-09 17:22:30
User: UnklAdM
Functions: grep netstat
6

I often have to google this so I put it here for quick reference.
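
On newer systems where netstat has been retired, ss gives roughly the same view:

ss -tunp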

find / -name \*.php -exec grep -Hn .1.=.......0.=.......3.=.......2.=.......5.= {} \;
2015-10-28 20:58:53
User: UnklAdM
Functions: find grep
0

If this matches any files on your web server, expect to find a lot of malware spread throughout your server folders. It seems to target WordPress sites. Be sure to check your themes/theme-name/header.php files manually for redirect scripting, usually on the line right above the closing head tag.

Good luck!
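
With \; grep is started once per file; ending the -exec with + instead batches many files per grep invocation, which is noticeably faster on a large tree:

find / -name \*.php -exec grep -Hn .1.=.......0.=.......3.=.......2.=.......5.= {} +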

find / -name \*.php -exec grep -Hn preg_replace {} \;|grep /e|grep POST
grep -Pooh .*t..r,.* /etc/init.d/*
2015-10-23 17:35:28
User: drewbenn
Functions: grep
0

Someone quoted Pooh in an init script. Let's see it!

(Probably only works on Debian & friends)

while true; do (echo -n $(date +"%F %T"):\ ; xwininfo -id $(xprop -root|grep "ACTIVE_WINDOW("|cut -d\ -f 5) | grep "Window id" | cut -d\" -f 2 ) >> logfile; sleep 60; done
2015-09-23 23:00:14
User: BeniBela
Functions: cut date echo grep sleep
1

This logs the titles of the active windows, so you can monitor what you worked on and when. (It is not hard to also log the executable name, but then the log gets too long.)
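
The executable name mentioned above can be looked up through _NET_WM_PID (a sketch; not every window sets that property):

win=$(xprop -root | grep "ACTIVE_WINDOW(" | cut -d\ -f 5)
pid=$(xprop -id "$win" _NET_WM_PID | awk '{print $3}')
ps -p "$pid" -o comm=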

weather() { curl -s "http://www.wunderground.com/q/zmw:$1.1.99999" | grep "og:title" | cut -d\" -f4 | sed 's/&deg;/ degrees F/'; }
followers() { curl -s https://twitter.com/$1 | grep -o '[0-9,]* Followers'; }
2015-09-19 07:07:36
Functions: grep
Tags: CLFUContest
0

See how many people are following you (or anyone) on Twitter.

followers cadejscroggins