What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using sort - 608 results
ps axo rss,comm,pid | awk '{ proc_list[$2] += $1; } END { for (proc in proc_list) { printf("%d\t%s\n", proc_list[proc],proc); }}' | sort -n | tail -n 10
ps axo rss,comm,pid | awk '{ proc_list[$2]++; proc_list[$2 "," 1] += $1; } END { for (proc in proc_list) { printf("%d\t%s\n", proc_list[proc "," 1],proc); }}' | sort -n | tail -n 10
2010-03-03 16:41:05
User: d34dh0r53
Functions: awk ps sort tail
5

This command loops over all of the processes on a system and creates an associative array in awk with the process name as the key and the sum of the RSS as the value. The associative array has the effect of summing a parent process and all of its children. It then prints the top ten processes sorted by size.
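A rough variant of the same idea, assuming ps reports RSS in kilobytes (as on Linux), that prints the totals in megabytes instead:

ps axo rss,comm | awk '{ a[$2] += $1 } END { for (p in a) printf("%.1f MB\t%s\n", a[p]/1024, p) }' | sort -n | tail -n 10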

git log --reverse --pretty=oneline | cut -c41- | nl | sort -nr
lsof -c $processname | egrep 'w.+REG' | awk '{print $9}' | sort | uniq
2010-02-24 16:47:49
User: alustenberg
Functions: awk egrep sort
1

Lists all files that are opened by processes named $processname.

egrep 'w.+REG' filters out non-file listings in lsof, awk gets the filenames, and sort | uniq removes duplication.
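For example, with a hypothetical process name substituted in (sort | uniq could equally be written sort -u):

lsof -c nginx | egrep 'w.+REG' | awk '{print $9}' | sort | uniq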

du -x --max-depth=1 | sort -n | awk '{ print $2 }' | xargs du -hx --max-depth=0
2010-02-18 19:46:47
User: d34dh0r53
Functions: awk du sort xargs
4

Provides numerically sorted, human-readable du output. I so wish there were just a du flag for this.
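If your sort is new enough (GNU coreutils 7.5 or later), its -h option compares human-readable sizes directly and gets the same result in one pass:

du -xh --max-depth=1 | sort -h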

find . -type f |sed "s#.*/##g" |sort |uniq -c -d
2010-02-17 11:59:54
User: shadycraig
Functions: find sed sort uniq
0

Useful for C projects where header file names must be unique (e.g. when using autoconf/automake), or when diagnosing whether the wrong header file is being used (due to duplicate file names).
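A sketch of the same check limited to header files, assuming the usual .h suffix:

find . -type f -name "*.h" | sed "s#.*/##g" | sort | uniq -c -d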

ps aux | sed -n '/USER/!s/\([^ ]\) .*/\1/p' | sort -u
2010-02-10 05:56:26
User: infinull
Functions: ps sed sort
1

This is different from `who` in that who only cares about logged-in users running shells; this command will also show daemon users and the like, as well as users logged in remotely via SSH who are running only SFTP/SCP and not a shell.
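A rough equivalent that uses awk to skip the header line instead of sed:

ps aux | awk 'NR > 1 { print $1 }' | sort -u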

find /path/to/dir -type f -printf "%T@|%p\n" 2>/dev/null | sort -n | tail -n 1| awk -F\| '{print $2}'
du -sk ./* | sort -nr
2010-02-04 04:08:05
User: op4
Functions: du sort
4

The full command is below; the text box would not accept it in full:

du -sk ./* | sort -nr | awk 'BEGIN{ pref[1]="K"; pref[2]="M"; pref[3]="G";} { total = total + $1; x = $1; y = 1; while( x > 1024 ) { x = (x + 1023)/1024; y++; } printf("%g%s\t%s\n",int(x*10)/10,pref[y],$2); } END { y = 1; while( total > 1024 ) { total = (total + 1023)/1024; y++; } printf("Total: %g%s\n",int(total*10)/10,pref[y]); }'

while read l; do echo $RANDOM "$l"; done | sort -n | cut -d " " -f 2-
2010-02-03 22:36:34
User: ketil
Functions: cut echo read sort
0

If you need to randomize the lines in a file, but have an old sort command that doesn't support the -R option, this could be helpful. It's easy enough to remember that you can create it as a script and use that.

It ain't real fast. It ain't safe. It ain't super random. Do not use it on untrusted data. It requires bash for the $RANDOM variable to work.
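Where GNU coreutils is available, shuf does the same job natively:

shuf <file>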

export QQ=$(mktemp -d);(cd $QQ; curl -s -O http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/[0-2400:25];for i in $(perl -ne 'print "$1\n" if( /^(\w+\(\))/ )' *|sort -u);do grep -h -m1 -B1 $i *; done)|grep -v '^--' > clf.sh;rm -r $QQ
2010-01-30 19:47:42
User: bartonski
Functions: cd export grep mktemp perl sort
8

Each shell function has its own summary line, as a comment. If there are multiple shell functions with the same name, the function with the highest number of votes is put into the file.

Note: added 'grep -v' to the end of the pipeline, to eliminate extraneous lines containing only '--'. Thanks to matthewbauer for pointing this out.
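The generated clf.sh can then be sourced to load the collected functions into the current shell:

. clf.sh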

for i in `netstat -rn |grep lan |cut -c55-60 |sort |uniq`; do ifconfig $i; done
2010-01-28 17:35:20
User: Kaio
Functions: cut grep ifconfig sort
-4

HP-UX doesn't have a -a switch in the ifconfig command.

This line emulates the result shown on Solaris, AIX, or Linux.

du -s * | sort -nr | head | cut -f2 | parallel -k du -sh
2010-01-28 12:59:14
Functions: cut du head sort
Tags: du xargs parallel
-2

If a directory name contains spaces, xargs will do the wrong thing; Parallel (https://savannah.nongnu.org/projects/parallel/) deals better with that.
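With GNU xargs, setting the delimiter to a newline also copes with spaces in names:

du -s * | sort -nr | head | cut -f2 | xargs -d '\n' du -sh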

find -type d -name ".svn" -prune -o -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type d -name ".svn" -prune -o -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate
2010-01-28 09:45:29
User: 2chg
Functions: find md5sum sort uniq xargs
2

Improvement of the command "Find Duplicate Files (based on size first, then MD5 hash)" when searching for duplicate files in a directory containing a subversion working copy. This way the (multiple) duplicates in the meta-information directories are ignored.

Can easily be adapted for other VCSs as well. For CVS, for example, change ".svn" into "CVS":

find -type d -name "CVS" -prune -o -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type d -name "CVS" -prune -o -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate
find -not -empty -type f -printf "%s\n" | sort | uniq -d | parallel find -type f -size {}c | parallel md5sum | sort | uniq -w32 --all-repeated=separate
2010-01-28 08:40:18
Functions: find md5sum sort uniq
Tags: xargs parallel
-1

A bit shorter and parallelized. Depending on the speed of your CPU and your disk, this may run faster.

Parallel is from https://savannah.nongnu.org/projects/parallel/

sudo netselect -v -s3 $(curl -s http://dns.comcast.net/dns-ip-addresses2.php | egrep -o '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' | sort | uniq)
2010-01-27 00:03:44
User: hackerb9
Functions: egrep sort sudo
2

Comcast is an ISP in the United States that has started hijacking DNS requests as a "service" for its customers. For example, in Firefox, one used to be able to do a quick "I'm Feeling Lucky" Google search by typing a single word into the URL field, assuming the word is not an existing domain when surrounded by www.*.com. Comcast customers never receive the correct NX (non-existent domain) error from DNS. Instead, they are shown a page full of advertising. There is a way to "opt out" from their service, but that requires having the account password and the MAC address of your modem handy. For me, it was easier just to set static DNS servers. But the problem is, which ones to choose? That's what this command answers. It'll show you the three _non-hijacked_ Comcast DNS servers that are the shortest distance away.

Perhaps you don't have Comcast (lucky you!), but hopefully this command can serve as an example of using netselect to find the fastest server from a list. Note that, although this example doesn't show it, netselect will actually perform the uniq and DNS resolution for you.

Requires: netselect, curl, sort, uniq, grep
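The same approach works for any candidate list. For example, to pick the three closest of a few well-known public resolvers (addresses given only as an illustration):

sudo netselect -v -s3 8.8.8.8 8.8.4.4 208.67.222.222 208.67.220.220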

nmap -sP <subnet>.* | egrep -o '[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+' > results.txt ; for IP in {1..254} ; do echo "<subnet>.${IP}" ; done >> results.txt ; cat results.txt | sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4 | uniq -u
for i in $(ps -ef | awk '{print $2}') ; { swp=$( awk '/Swap/{sum+=$2} END {print sum}' /proc/$i/smaps ); if [[ -n $swp && 0 != $swp ]] ; then echo -n "\n $swp $i "; cat /proc/$i/cmdline ; fi; } | sort -nr
paste -d "." <(curl http://.../dist.female.first http://.../dist.male.first | cut -d " " -f 1 | sort -uR) <(curl http://..../dist.all.last | cut -d " " -f 1 | sort -R | head -5163) | tr "[:upper:]" "[:lower:]" | sed 's/$/@test.domain/g'
2010-01-21 19:52:28
User: connorsy
Functions: cut head paste sed sort tr
0

** Replace the ... in the URLs with:

www.census.gov/genealogy/www/data/1990surnames

(The full command couldn't fit in 256 characters.)

Created on Ubuntu 9.10, but nothing out of the ordinary; it should work anywhere with a little tweaking. 5163 is the number of unique first names you get when you combine the male and female first-name files from http://www.census.gov/genealogy/www/data/1990surnames/names_files.html

cat -n <file> | sort -k 2 | uniq -f 1 | sort -n | cut -f 2-
2010-01-21 18:55:58
User: fpunktk
Functions: cat cut sort uniq
4

I wanted to delete all duplicate lines from .bash_history while keeping the order of the other lines.

The command cats the file and adds line numbers, then sorts by the second column. Afterwards, uniq omits repeated lines, skipping the first field (the line number). It then sorts by the line numbers again and finally cuts the numbers off.
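An alternative one-liner that keeps the first occurrence of each line without any sorting:

awk '!seen[$0]++' <file>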

lynx --width=200 --dump 'http://quake.usgs.gov/recenteqs/Maps/San_Francisco_eqs.htm'|sed -ne '/MAG.*/,/^References/{;s/\[[0-9][0-9]*\]//;1,/h:m:s/d;/Back to map/,$d;/^$/d;/^[ \t][ \t]*[3-9]\.[0-9][0-9]*[ \t][ \t]*/p; }'|sort -k1nr
2010-01-08 20:52:28
User: KevinM
Functions: sed sort
1

To see only earthquakes for today, add another pipe to egrep "`date '+%Y/%m/%d'`"
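That is, with $(...) in place of the backticks:

lynx --width=200 --dump 'http://quake.usgs.gov/recenteqs/Maps/San_Francisco_eqs.htm'|sed -ne '/MAG.*/,/^References/{;s/\[[0-9][0-9]*\]//;1,/h:m:s/d;/Back to map/,$d;/^$/d;/^[ \t][ \t]*[3-9]\.[0-9][0-9]*[ \t][ \t]*/p; }'|sort -k1nr|egrep "$(date '+%Y/%m/%d')"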

git log --format='%aN' | sort -u
2010-01-07 18:55:21
User: jedahan
Functions: sort
Tags: git
14

This should work even if the output format changes.
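A close relative, assuming a reasonably recent git, which also shows each author's commit count:

git shortlog -s -n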

(bzcat BZIP2_FILES && cat TEXT_FILES) | grep -E "Invalid user|PAM" | grep -o -E "from .+" | awk '{print $2}' | sort | uniq >> /etc/hosts.deny
2010-01-03 04:41:51
User: jayhawkbabe
Functions: awk cat grep sort uniq
3

Searches all log files (including archived bzip2 files) for invalid user and PAM authentication errors, both of which are indicative of brute-force attempts at logging into the computer. A list of all unique IP addresses and domain names is appended to hosts.deny. The command (and the grep error messages) will work on Mac OS X 10.6; small adjustments may be needed for other OSs.

find . \! -type d | rev | sort | rev | tar c --files-from=- --format=ustar | bzip2 --best > a.tar.bz2
2009-12-20 14:04:39
User: pornel
Functions: bzip2 find rev sort tar
2

Avoids creating useless directory entries in the archive, and sorts files by (roughly) extension, which is likely to group similar files together for better compression. 1%-5% improvement.
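The effect of rev | sort | rev is easy to see in isolation:

printf 'main.c\nutil.h\nutil.c\n' | rev | sort | rev

prints the two .c files together, ahead of util.h.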

zcat access_log.*.gz | awk '{print $7}' | sort | uniq -c | sort -n | tail -n 20
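Reading the pipeline: in Apache's common/combined log format, $7 is the request path, so this appears to print the 20 most requested paths across the rotated, gzipped access logs.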