
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands using wc - 151 results
echo $((`eix --only-names -I | wc -l` * 100 / `eix --only-names | wc -l`))%
find . -maxdepth 1 -type f | wc -l
2009-07-31 14:53:29
User: guckes
Functions: find wc
Tags: wc
Votes: 6

A simple "ls" lists files *and* directories. So we need to "find" the files (type 'f') only.

As "find" is recursive by default we must restrict it to the current directory by adding a maximum depth of "1".

If you are using zsh, you can use the dot (.) as a globbing qualifier to denote plain files:

zsh> ls *(.) | wc -l

For more info, see the zsh manual on expansion and substitution: "man zshexpn".
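
If filenames might contain newlines (which would throw off a line count), a sketch using GNU find's -printf prints one character per file and counts characters instead:

find . -maxdepth 1 -type f -printf '.' | wc -c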

pacman -Q|wc -l
find . -type f -name "*.c" -exec cat {} \; | wc -l
2009-07-30 10:06:51
User: foremire
Functions: cat find wc
Votes: 1

Use find to gather all .c files under the target directory, cat them into one stream, then pipe that to wc to count the lines.
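
A minor variant (not the original submission): terminating -exec with + instead of \; batches many files into each cat call, which should give the same count with fewer processes:

find . -type f -name "*.c" -exec cat {} + | wc -l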

head -$(($RANDOM % $(wc -l < file.txt) +1 )) file.txt | tail -1
lsof -p <process_id> | wc -l
grep -or string path/ | wc -l
find . -name "*.sql" -print0 | wc -l --files0-from=-
2009-06-22 17:45:03
User: vincentp
Functions: find wc
Tags: find wc count line
Votes: 2

This command gives you the number of lines of every file in the folder and its subfolders matching the search options specified in the find command. It also gives the total number of lines of these files.

The combination of print0 and files0-from options makes the whole command simple and efficient.
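
For comparison, a sketch of the more familiar xargs approach; -0 also copes with awkward filenames, but with very many files wc may run several times and print more than one "total" line:

find . -name "*.sql" -print0 | xargs -0 wc -l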

sort -n <( for i in $(find . -maxdepth 1 -mindepth 1 -type d); do echo $(find $i | wc -l) ": $i"; done;)
count() { find $@ -type f -exec cat {} + | wc -l; }
Q="reddit|digg"; F=*.log; awk -F\" '{print $4}' $F | egrep $Q | wc -l
2009-05-05 21:51:16
User: jaymzcd
Functions: awk egrep wc
Votes: 0

I use this (well, I normally just drop the F=*.log bit and put that straight into the awk command) to count how many times I get referred from another site. I know it's rough; it's just to give me an idea of where any posts I make are ending up. The reason I do the Q="query" bit is that I often want to check another domain quickly, and it's quick to hit CTRL+A to jump to the start and then CTRL+F to move forward the 3 steps to change the grep query. (I find this easier than moving backwards, because if you group a lot of domains with the pipe your command line can get quite messy, so it's normally easier to have it all at the front where you just have to edit it and hit enter.)

For people new to the shell, it does the following. The Q= and F= bits just define names we can refer to later. awk -F\" '{print $4}' $F reads the file(s) specified by $F and splits each line on double quotes, printing the fourth field for egrep to work on; in the log, that field is the referer. egrep then matches our query against this list from awk. Finally, wc -l gives us the total number of lines (i.e. matches).
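
As a worked illustration (assuming an Apache-style combined log, which wraps the request, referer and user agent in double quotes), a hypothetical line such as

1.2.3.4 - - [05/May/2009:21:51:16 +0100] "GET /post HTTP/1.1" 200 512 "http://www.reddit.com/r/linux" "Mozilla/5.0"

splits on \" so that the fourth field is http://www.reddit.com/r/linux, which is what egrep filters and wc -l counts.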

ps -L -p <pid> | wc -l
find ./ -not -type d | xargs wc -l | cut -c 1-8 | awk '{total += $1} END {print total}'
wc -l `find . -name *.php`
date -d "@$(find dir -type f -printf '%C@\n' | sort -n | sed -n "$(($(find dir -type f | wc -l)/2))p")" +%F
2009-03-24 18:48:49
User: allengarvin
Functions: date dir find wc
Votes: -1

I needed to get a feel for how "old" different websites were, based on their directories.

find . -name "*.EXT" | xargs grep -n "TODO" | wc -l
cmp -l file1.bin file2.bin | wc -l
# wc -l /var/log/security/writable.today
2009-03-19 12:25:52
User: mpb
Functions: wc
Votes: 0

Mandriva Linux includes a security tool called "msec" (configurable via "draksec").

One of the many things it regularly checks for is world-writable files.

If any are found, it writes the list to /var/log/security/writable.today.

"wc -l" simply counts the number of lines in the file.

This number should be low.

Browse through /var/log/security/writable.today and consider whether any of those files *need* to be world-writable (and if not, modify the permissions, e.g. "chmod o-w $file").

A large number of world-writable files may indicate that umask is not correctly set in /etc/profile (or ${HOME}/.bash_profile), but it could also indicate poor security configuration or even malicious activity.
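
On systems without msec, a similar count can be produced directly with find (a sketch; run as root, with -xdev keeping the scan on the current filesystem):

find / -xdev -type f -perm -0002 2>/dev/null | wc -l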

for file in `find . -type f`; do cat $file; done | wc -l
find . -type f | wc -l
while [ $(deborphan | wc -l) -gt 0 ]; do dpkg --purge $(deborphan); done
2009-02-18 22:31:22
User: mulad
Functions: wc
Votes: 6

Upgraded Debian/Ubuntu/etc. systems may have a number of "orphaned" packages which are just taking up space; these can be found with the "deborphan" command. While you could just run "dpkg --purge $(deborphan)", the act of purging orphans will often create more orphans, so this command keeps purging until none remain.

grep "processor" /proc/cpuinfo | wc -l
2009-02-17 05:39:49
User: jbcurtis
Functions: grep wc
Votes: 4

/proc/cpuinfo contains information about the CPU.

Searching for "processor" in /proc/cpuinfo prints one line per logical processor.

wc -l then counts those lines.
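
Two shorter equivalents that are common today (assuming GNU grep and coreutils; nproc may report fewer if CPU affinity is restricted):

grep -c ^processor /proc/cpuinfo

nproc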

i=0; f=$(find . -type f -iregex ".*jpg");c=$(echo $f|sed "s/ /\n/g"| wc -l);for x in $f;do i=$(($i + 1));echo "$x $i of $c"; mogrify -strip $x;done
find . \( -name '*.h' -o -name '*.cc' \) | xargs grep . | wc -l
2009-02-09 11:44:35
User: dgomes
Functions: find grep wc xargs
Votes: 3

Counts the number of non-blank lines in *.h and *.cc files (grep . drops empty lines before wc -l counts what remains).

find . -name "*.py" | xargs wc -l