


Commands tagged uniq
Terminal - Commands tagged uniq - 49 results
find . -type f | awk -F'.' '{print $NF}' | sort| uniq -c | sort -g
ls | grep -Eo "\..+" | sort -u
ls -Xp | grep -Eo "\.[^/]+$" | sort | uniq
2011-02-10 20:47:59
User: Amarok
Functions: grep ls sort
Tags: uniq ls grep
4

Works on the current directory, with built-in sorting.
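A quick way to see the pipeline in action (the scratch directory and file names below are invented for illustration):

```shell
# Run in a scratch directory with a few sample files (names are arbitrary).
cd "$(mktemp -d)"
touch a.txt b.txt c.log

# Count files per extension; awk takes the last dot-separated field,
# and the final sort -g orders by ascending count.
counts=$(find . -type f | awk -F'.' '{print $NF}' | sort | uniq -c | sort -g)
echo "$counts"
```

The least common extension prints first; swap sort -g for sort -gr to put the most common on top.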

svn log -q | grep '^r[0-9]' | cut -f2 -d "|" | sort | uniq -c | sort -nr
2011-01-03 15:23:08
User: kkapron
Functions: cut grep sort uniq
2

List top committers (and the number of their commits) of an svn repository.

In this example it counts revisions of the current directory.
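The parsing stage can be tried without a repository by feeding it canned `svn log -q`-style output (the committer names below are invented):

```shell
# Three fake lines in `svn log -q` format: rNNN | author | date
log='r3 | alice | 2011-01-03
r2 | bob | 2011-01-02
r1 | alice | 2011-01-01'

# Keep revision lines, take the author field, count and rank descending.
top=$(printf '%s\n' "$log" | grep '^r[0-9]' | cut -f2 -d '|' | sort | uniq -c | sort -nr)
echo "$top"
```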

tr -cs A-Za-z '\n' | sort | uniq -ci
2010-10-20 04:12:58
Functions: sort tr uniq
Tags: sort uniq tr
0

Gives the same results as the command by putnamhill using nine fewer characters.

tr A-Z a-z | tr -cs a-z '\n' | sort | uniq -c
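A self-contained run on a sample sentence (the text is invented; `-f` is added to sort here so that case-insensitive duplicates land adjacent even in the C locale, which uniq -ci requires):

```shell
text='The quick brown fox. THE the'

# Squeeze every non-letter run into a newline, fold case while sorting,
# then count case-insensitively.
freq=$(printf '%s\n' "$text" | tr -cs A-Za-z '\n' | sort -f | uniq -ci)
echo "$freq"
```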
ls | perl -lne '++$x{lc $1} if /[.](.+)$/ }{ print for keys %x'
2010-08-13 20:05:15
User: recursiverse
Functions: ls perl
-3

All with only one pipe. Should be much faster as well (sort is slow). Use find instead of ls for recursion or reliability.

Edit: case insensitive

ls -Xp /path/to/dir | grep -Eo "\.[^/]+$" | uniq
2010-08-12 16:32:54
User: karpoke
Functions: grep ls
Tags: uniq ls grep
0

If, for files with more than one extension, like .tar.gz, we want only the last one (.gz) to appear:

ls -Xp /path/to/dir | grep -Eo "\.[^./]+$" | uniq
find /path/to/dir -type f -name '*.*' | sed 's@.*/.*\.@.@' | sort | uniq
2010-08-12 15:48:54
User: putnamhill
Functions: find sed sort
1

If your grep doesn't have an -o option, you can use sed instead.

find /path/to/dir -type f | grep -o '\.[^./]*$' | sort | uniq
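The same idea can be checked in a scratch directory (file names invented):

```shell
d=$(mktemp -d)
touch "$d/a.txt" "$d/b.tar.gz" "$d/c.txt"

# grep -o prints only the match: a dot followed by non-dot, non-slash
# characters at end of line, i.e. the final extension.
exts=$(find "$d" -type f | grep -o '\.[^./]*$' | sort | uniq)
echo "$exts"
```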
alias sorth='sort --help|sed -n "/^ *-[^-]/s/^ *\(-[^ ]* -[^ ]*\) *\(.*\)/\1:\2/p"|column -ts":"'
3

Once you get into advanced/optimized scripts, functions, or CLI usage, you will use the sort command a lot. Its options are difficult to master/memorize, however, and when you use sort as much as I do (some examples below), it's useful to have the help available with a simple alias. I love this alias, as I never seem to remember all the options for sort, and I use sort constantly (it's much better than uniq, for example).

# Sorts by file permissions

find . -maxdepth 1 -printf '%.5m %10M %p\n' | sort -k1 -r -g -bS 20%

00761 drwxrw---x ./tmp

00755 drwxr-xr-x .

00701 drwx-----x ./askapache-m

00644 -rw-r--r-- ./.htaccess

# Shows uniq history fast

history 1000 | sed 's/^[0-9 ]*//' | sort -fubdS 50%

exec bash -lxv

export TERM=putty-256color

Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html
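The permission-sorting example above can be reproduced in a scratch directory (GNU find's -printf with zero-padded %.5m is assumed; file names are invented):

```shell
d=$(mktemp -d); cd "$d"
touch secret.txt public.txt
chmod 600 secret.txt
chmod 644 public.txt

# %.5m = zero-padded octal mode, %10M = symbolic mode, %p = path;
# sort -g -r ranks the loosest permissions first.
perms=$(find . -maxdepth 1 -type f -printf '%.5m %10M %p\n' | sort -k1 -r -g)
echo "$perms"
```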

pgrep -cu ioggstream
tail -n2000 /var/www/domains/*/*/logs/access_log | awk '{print $1}' | sort | uniq -c | sort -n | awk '{ if ($1 > 20)print $1,$2}'
netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c
2010-05-06 17:04:37
User: Kered557
Functions: awk netstat sort uniq
1

Counts TCP states from netstat output and displays them in an ordered list.
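The counting stage can be exercised without touching the network by substituting a few canned lines in `netstat -an` format (addresses invented):

```shell
sample='tcp 0 0 10.0.0.1:80 10.0.0.9:51000 ESTABLISHED
tcp 0 0 10.0.0.1:80 10.0.0.8:51001 ESTABLISHED
tcp 0 0 10.0.0.1:22 0.0.0.0:* LISTEN'

# Field 6 of each tcp line is the connection state.
states=$(printf '%s\n' "$sample" | awk '/tcp/ {print $6}' | sort | uniq -c)
echo "$states"
```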

grep <something> logfile | cut -c2-18 | uniq -c
2010-04-29 11:26:09
User: buzzy
Functions: cut grep uniq
Tags: uniq grep cut
1

The cut should match the relevant timestamp part of the logfile; uniq then counts the number of occurrences within each time interval.
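For example, with Apache-style timestamps, cutting characters 2-18 keeps everything down to the minute (log lines below are invented):

```shell
log='[29/Apr/2010:11:26:09] GET /a
[29/Apr/2010:11:26:41] GET /b
[29/Apr/2010:11:27:02] GET /c'

# Characters 2-18 span "29/Apr/2010:11:26", so uniq -c counts hits per minute.
hits=$(printf '%s\n' "$log" | cut -c2-18 | uniq -c)
echo "$hits"
```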

tail -f access_log | cut -c2-21 | uniq -c
2010-04-29 11:16:54
User: buzzy
Functions: cut tail uniq
Tags: uniq tail cut
4

Change the cut range for hits per 10 seconds, per minute, and so on. grep can be used to filter by URL or source IP.

ps hax -o user | sort | uniq -c
grep current_state= /var/log/nagios/status.dat|sort|uniq -c|sed -e "s/[\t ]*\([0-9]*\).*current_state=\([0-9]*\)/\2:\1/"|tr "\n" " "
curl -s http://tinyurl.com/create.php?url=http://<website.url>/ | sed -n 's/.*\(http:\/\/tinyurl.com\/[a-z0-9][a-z0-9]*\).*/\1/p' | uniq
wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
2009-05-04 16:00:39
User: alperyilmaz
Functions: awk grep perl sed sort tr uniq wget
-4

This command might not be useful for most of us; I just wanted to share it to show the power of the command line.

It downloads the plain-text version of the novel David Copperfield from Project Gutenberg and then generates a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination.

The command removes numbers and single characters from the count. I'm sure you can write a shorter version.

netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
2009-03-28 21:02:26
User: tiagofischer
Functions: awk cut netstat sort uniq
14

Here is a command line to run on your server if you think it is under attack. It prints out a list of open connections to your server and sorts them by count.

BSD Version:

netstat -na |awk '{print $5}' |cut -d "." -f1,2,3,4 |sort |uniq -c |sort -nr
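The per-address counting can be sanity-checked on canned `netstat -ntu`-style lines (addresses invented):

```shell
sample='tcp 0 0 10.0.0.1:80 192.168.1.5:51000 ESTABLISHED
tcp 0 0 10.0.0.1:80 192.168.1.5:51001 ESTABLISHED
tcp 0 0 10.0.0.1:80 192.168.1.7:51002 ESTABLISHED'

# Field 5 is the foreign address; cut strips the port, and the final
# sort -n puts the busiest address last.
ips=$(printf '%s\n' "$sample" | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n)
echo "$ips"
```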
cat file.txt | sort | uniq -dc
2009-03-21 18:15:14
User: Vadi
Functions: cat sort uniq
1

Displays the duplicated lines in a file and their frequency of occurrence.
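A minimal run (sample file contents invented; the sort is essential, since uniq only collapses adjacent lines):

```shell
f=$(mktemp)
printf 'apple\nbanana\napple\ncherry\nbanana\napple\n' > "$f"

# -d prints only duplicated lines, -c prefixes each with its count.
dups=$(sort "$f" | uniq -dc)
echo "$dups"
```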

zgrep "Failed password" /var/log/auth.log* | awk '{print $9}' | sort | uniq -c | sort -nr | less
2009-03-03 13:45:56
User: dbart
Functions: awk sort uniq zgrep
8

This command checks how many times someone has tried to log in to your server and failed. If there are many attempts, that user is being targeted on your system, and you might want to make sure the account either has remote logins disabled, or has a strong password, or both. If your output has an "invalid" line, it is a summary of all login attempts from users that don't exist on your system.
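The extraction can be tested against a few fabricated auth.log lines (plain grep on a temp file stands in for zgrep over the rotated logs):

```shell
f=$(mktemp)
cat > "$f" <<'EOF'
Mar  3 13:45:56 host sshd[111]: Failed password for root from 1.2.3.4 port 22 ssh2
Mar  3 13:46:01 host sshd[112]: Failed password for root from 1.2.3.4 port 22 ssh2
Mar  3 13:46:09 host sshd[113]: Failed password for invalid user admin from 5.6.7.8 port 22 ssh2
EOF

# Field 9 is the user name ("invalid" for nonexistent accounts).
targets=$(grep "Failed password" "$f" | awk '{print $9}' | sort | uniq -c | sort -nr)
echo "$targets"
```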

netstat -alpn | grep :80 | awk '{print $4}' |awk -F: '{print $(NF-1)}' |sort | uniq -c | sort -n