
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users are automatically assigned a username, which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Psst. Open beta.

Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot. A couple of things:

  • » The open beta is running a copy of the database that will not carry over to the final version. Don't post anything you don't mind losing.
  • » If you wish to use your user account, you will probably need to reset your password.
Your feedback is appreciated via the form on the beta page. Thanks! -Jon & CLFU Team

Terminal - Commands using grep - 1,656 results
pip freeze --local | grep -v '^\-e' | cut -d = -f 1 | xargs -n1 pip install -U
host -t srv _ldap._tcp | sed "s/.*[ ]\([^ ]*\)[.]$/\1/g" | xargs -i ping -c 1 {} | grep -E "(statistics|avg)" | sed "s/^--- \([^ ]*\).*/,\1:/g"|tr -d "\n" | tr "," "\n" | sed "1d;s|^\([^:]*\).*=[^/]*/\([^/]*\).*|\2\t\1|g" |sort -n
2016-09-02 03:26:29
User: glaudiston
Functions: grep host ping sed sort tr xargs
Tags: ldap
0

This command detects LDAP hosts via their mandatory DNS SRV records, pings each one to measure the average response time, then sorts by that average and prints the fastest server on the first output line.

tail -v -f $(php -i | grep "^[ \t]*error_log" | awk -F"=>" '{ print $2; }' | sed 's/^[ ]*//g')
2016-08-31 12:13:31
User: paulera
Functions: awk grep sed tail
0

Runs "php -i", extracts the error_log location, then watches it using "tail".

printf '%s-%s-%s-%s\n' $(grep -v "[A-Z]\|'" /usr/share/dict/british | shuf -n 4)
2016-08-15 08:13:10
User: hendry
Functions: grep printf
0

https://xkcd.com/936/ introduced us to what a good password actually looks like. Here's one such implementation.

Credit: quinq on #suckless
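The same idea can be sketched with an inline word list, since the dictionary path varies by distro (the words below are stand-ins, not a real dictionary):

```shell
# Pick four random words from a small inline list and join them
# with hyphens, xkcd-936 style; real use would shuf a system dictionary
printf '%s-%s-%s-%s\n' $(printf 'correct\nhorse\nbattery\nstaple\norange\nlemon\n' | shuf -n 4)
```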

ps auxw | grep -E 'sbin/(apache|httpd)' | awk '{print"-p " $2}' | xargs strace -F
2016-08-04 10:59:58
User: gormux
Functions: awk grep ps strace xargs
Tags: awk grep ps strace
0

Attaches strace to all Apache processes, on systems using sbin/apache (Debian) or sbin/httpd (Red Hat), and follows newly created threads.
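The argument construction can be seen in isolation with fake ps output (the PIDs here are made up):

```shell
# awk turns each PID (field 2) into a separate "-p PID" pair;
# xargs then hands them all to a single strace invocation
printf 'www-data 1234 apache2\nwww-data 5678 apache2\n' | awk '{print "-p " $2}'
# prints:
# -p 1234
# -p 5678
```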

links `lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort -R | uniq | head -n1`
2016-07-26 12:54:53
User: mogoh
Functions: grep head sort uniq
0

sort -R randomizes the list.

head -n1 takes the first entry.
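As an aside, GNU shuf can do the randomize-and-pick-one step on its own; a sketch with a throwaway URL list (file name and URLs are hypothetical):

```shell
# Build a small list of URLs to choose from
printf 'http://a.example\nhttp://b.example\nhttp://c.example\n' > /tmp/urls.txt

# shuf -n1 prints one line chosen uniformly at random,
# replacing the sort -R ... | head -n1 pipeline in one step
shuf -n1 /tmp/urls.txt
```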

links $( a=( $( lynx -dump -listonly "http://news.google.com" | grep -Eo "(http|https)://[a-zA-Z0-9./?=_-]*" | grep -v "google.com" | sort | uniq ) ) ; amax=${#a[@]} ; n=$(( `date '+%s'` % $amax )) ; echo ${a[n]} )
2016-07-26 11:52:12
User: pascalv
Functions: echo grep sort uniq
1

Access a random news web page on the internet.

The Links browser can of course be replaced by Firefox or any modern graphical web browser.

hpacucli controller all show config detail | grep -A 7 Fail | egrep '(Failed|Last|Serial Number|physicaldrive)'
2016-07-20 17:42:40
User: operat0r
Functions: egrep grep
1

This dumps the serial numbers of all the drives, but the HP warranty check does not say they are valid ...

curl -s http://whatismyip.org/ | grep -oP '(\d{1,3}\.){3}\d+'
netstat -n | grep ESTAB |grep :80 | tee /dev/stderr | wc -l
2016-06-26 11:37:19
User: rubenmoran
Functions: grep netstat tee wc
2

Summarizes established connections from netstat output.

Using tee and /dev/stderr you can send the command's output to the terminal before wc consumes it, so the line count appears at the bottom of the output.
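The tee /dev/stderr trick can be demonstrated with any input; a minimal sketch:

```shell
# tee copies each line to the terminal via /dev/stderr while also
# forwarding it to wc -l, which prints the count on stdout
printf 'a\nb\nc\n' | tee /dev/stderr | wc -l
# stdout: 3 (the lines a, b, c appear on stderr)
```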

curl --silent --head "${url}" | grep 'Last-Modified:' | cut -c 16- | date -f - +'%s'
2016-06-02 22:20:55
User: odoepner
Functions: cut date grep
0

This command line assumes that "${url}" is the URL of the web resource.

It can be useful to check the "freshness" of a download URL before a GET request.

curl http://url/rss | grep -o '<enclosure url="[^"]*' | grep -o '[^"]*$' | xargs wget -c
tree -isafF /var|grep -v "/$"|tr '[]' ' '|sort -k1nr|head
ASN=32934; for s in $(whois -H -h riswhois.ripe.net -- -F -K -i $ASN | grep -v "^$" | grep -v "^%" | awk '{ print $2 }' ); do echo " blocking $s"; sudo iptables -A INPUT -s $s -j REJECT &> /dev/null || sudo ip6tables -A INPUT -s $s -j REJECT; done
ss -t -o state established '( dport = :443 || dport = :80 )' | grep -Po '([0-9a-z:.]*)(?=:http[s])' | sort -u|netcat whois.cymru.com 43|grep -v "AS Name"|sort -t'|' -k3
ss -t -o state established '( dport = :443 || dport = :80 )'|grep tcp|awk '{ print $5 }'|sed s/:http[s]*//g|sort -u|netcat whois.cymru.com 43|grep -v "AS Name"|sort -t'|' -k3
egrep -v '^\s*($|#)' $(git grep -l '#!/bin/.*sh' *) | wc -l
2016-02-15 11:15:48
User: Natureshadow
Functions: egrep grep wc
Tags: git grep count code
0

Uses git grep for speed; relies on a valid shebang; ignores leading whitespace when stripping comments and blank lines.
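The filtering regex can be checked on a toy script; note that the shebang line itself is also stripped, since it starts with #:

```shell
# egrep -v '^\s*($|#)' drops blank lines and lines whose first
# non-whitespace character is # (including the shebang)
printf '#!/bin/sh\n\n# a comment\n   # indented comment\necho hi\n' | egrep -v '^\s*($|#)' | wc -l
# prints: 1  (only "echo hi" survives)
```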

ps -eo pmem,comm | grep java | awk '{sum+=$1} END {print sum " % of RAM"}'
2016-02-10 09:00:56
User: bugmenot
Functions: awk grep ps sum
5

This command adds up the RAM usage of all processes whose name contains "java" and prints the sum of the percentages in human-readable form. Also, unlike the original #15430, it won't fail on processes with a usage above 9.9%.

Please note that this command won't work reliably when a significant portion of the processes involved use less than 0.1% of RAM, because they are counted as "0" even though many of them together could add up to a significant amount.
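The rounding caveat is easy to reproduce by feeding synthetic pmem values to the same awk summation:

```shell
# Four processes each reported by ps as 0.0 sum to zero here,
# even though their real combined usage could approach 0.4%
printf '0.0\n0.0\n0.0\n0.0\n' | awk '{sum+=$1} END {print sum " % of RAM"}'
# prints: 0 % of RAM
```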

pacman -Ss python | paste - - | grep --color=always -e '/python' | less -R
2016-01-25 14:29:31
User: hute37
Functions: grep less paste python
Tags: less paste pacman
1

Alternative 1 (grep support):

pacman -Ss python | paste - - | grep --color=always -e '/python' | less -R

Alternative 2 (eye-candy, no grep):

pacman --color=always -Ss "python" | paste - - | less -R

in ~/.bashrc:

pkg-grep() { pacman -Ss "$1" | paste - - | grep --color=always -e "${2:-$1}" | less -R ; }

pkg-search() { pacman --color=always -Ss "$1" | paste - - | less -R; }

ps -eo pmem,comm | grep chrome | cut -d " " -f 2 | paste -sd+ | bc
get_iplayer --type=radio --channel "Radio 4 Extra" | grep : | awk '{ if ( NR > 1 ) { print } }'|sed 's/:.*//' |sed '$ d' > pidlist && while read p; do get_iplayer --get --fields=pid $p; done <pidlist && rm pidlist
2016-01-16 17:20:54
User: dunryc
Functions: awk grep read rm sed
0

Uses get_iplayer to download all listed content from http://www.bbc.co.uk/radio4extra. Run it every night to make sure no episodes are missed.

for f in `git status | grep new | awk '{print $3}'`; do git reset HEAD $f ; done
pdf2txt myfile.pdf | grep mypattern
2015-11-23 17:46:22
User: grinob
Functions: grep
Tags: pipe grep pdf
2

This is a good alternative to pdftotext on Ubuntu. To install it:

sudo apt-get install python-pdfminer

netstat -np | grep -v ^unix
2015-11-09 17:22:30
User: UnklAdM
Functions: grep netstat
6

I often have to google this so I put it here for quick reference.

find / -name \*.php -exec grep -Hn .1.=.......0.=.......3.=.......2.=.......5.= {} \;
2015-10-28 20:58:53
User: UnklAdM
Functions: find grep
0

If this matches any files on your web server, expect to find a lot of malware spread throughout your server folders. It seems to target WordPress sites. Be sure to check your themes/theme-name/header.php files manually for redirect scripting, usually on the line right above the closing head tag.

Good luck!