
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/



Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Commands using sort - 652 results
sort -s -b -t' ' -k 4.9,4.12n -k 4.5,4.7M -k 4.2,4.3n -k 4.14,4.15n -k 4.17,4.18n -k 4.20,4.21n access.log*
2015-07-16 00:22:03
User: sesom42
Functions: sort
Tags: sort log apache
0

Sort Apache access logs by date and time using sort's key field feature.
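
For reference (assuming the default Apache timestamp format), field 4 of each log line looks like:

[16/Jul/2015:00:22:03

so the keys 4.9,4.12 / 4.5,4.7 / 4.2,4.3 / 4.14,4.15 / 4.17,4.18 / 4.20,4.21 select the year, month name, day, hour, minute and second, in that order.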

ps -auxf | sort -nr -k 4 | head -10
mosth() { history | awk '{CMD[$2]++;count++;}END { for (a in CMD)print CMD[a] " " CMD[a]/count*100 "% " a;}' | grep -v "./" | column -c3 -s " " -t | sort -nr | nl | head -n10; }
2015-05-11 17:41:55
User: nnsense
Functions: awk column grep head nl sort
0

I copied this (let's be honest) from somewhere on the internet and just turned it into a function, ready to be used like an alias. It shows the 10 most-used commands from your history. This may seem like just another "most used commands from history", but hey... this is a function!!! :D
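
A possible way to use it (assuming the function is pasted into ~/.bashrc):

# reload the shell configuration, then run the function
source ~/.bashrc
mosth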

du -hs .[^.]* * | sort -h
2015-05-10 12:19:29
User: liminal
Functions: du sort
Tags: du usage disk
2

Same result as 'du -ks .[^.]* * | sort -n', but with sizes in human-readable format (e.g., 1K, 234M, 2G).

locate -i /pattern/ | xargs -n1 dirname | sort -u
2015-05-09 21:22:05
User: dardo1982
Functions: dirname locate sort xargs
Tags: find case
0

Uses "locate" instead of "find", "sort -u" instead of "sort | uniq" and it's case insensitive.

du -ks .[^.]* * | sort -n
2015-05-08 12:26:34
User: rdc
Functions: du sort
Tags: du usage disk
0

This command summarizes the disk usage across the files and folders in a given directory, including hidden files and folders beginning with ".", but excluding the directories "." and ".."

It produces a sorted list with the largest files and folders at the bottom.
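
To see only the largest entries, the same pipeline can simply be capped with tail (a small variation, not from the original post):

du -ks .[^.]* * | sort -n | tail -10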

function summaryIP() { < $1 awk '{print $1}' | while read ip ; do verifyIP ${ip} && echo ${ip}; done | awk '{ip_array[$1]++} END { for (ip in ip_array) printf("%5d\t%s\n", ip_array[ip], ip)}' | sort -rn; }
2015-05-01 16:45:05
User: mpb
Functions: awk echo read sort
1

Working with lists of IP addresses, it is sometimes useful to summarize a count of how many times each IP address appears in the file.

This example, summaryIP, uses another function, verifyIP, previously defined on commandlinefu.com, to ensure only valid IP addresses get counted. The summary list is presented in count order, starting with the highest count.
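
verifyIP itself is not shown here; a minimal sketch of what such a check could look like (an assumption, not the original function) is:

verifyIP() { local re='^([0-9]{1,3}\.){3}[0-9]{1,3}$'; [[ $1 =~ $re ]] || return 1; local o; for o in ${1//./ }; do (( o <= 255 )) || return 1; done; }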

timeDNS() { parallel -j0 --tag dig @{} "$*" ::: 208.67.222.222 208.67.220.220 198.153.192.1 198.153.194.1 156.154.70.1 156.154.71.1 8.8.8.8 8.8.4.4 | grep Query | sort -nk5; }
du -sk -- * | sort -n | perl -pe '@SI=qw(K M G T P); s:^(\d+?)((\d\d\d)*)\s:$1." ".$SI[((length $2)/3)]."\t":e'
2015-04-26 08:07:27
Functions: du perl sort
2

Tested on MacOS and GNU/Linux.

It works in dirs containing files starting with '-'.

It runs 'du' only once.

It sorts according to size.

It treats 1K=1000 (and not 1024).
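
To illustrate the perl substitution (a worked example, not from the original post): for a du -sk value of 1234567, $1 captures "1" and $2 captures "234567", so length($2)/3 = 2 selects $SI[2] = "G" and the line is printed as "1 G" followed by a tab and the directory name.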

du -h -d 1 | ack '\d+\.?\d+G' | sort -hr
du -hsx * | sort -rh
sed -n '/url/s#^.*url=\(.*://.*\)#\1#p' ~/.mozilla/firefox/*.[dD]efault/SDBackups/*.speeddial | sort | uniq
2015-02-17 20:56:28
User: return13
Functions: sed sort
0

For all users of https://addons.mozilla.org/de/firefox/addon/speed-dial/

ls -l /dev/disk/by-id |grep -v "wwn-" |egrep "[a-zA-Z]{3}$" |sed 's/\.\.\/\.\.\///' |sed -E 's/.*[0-9]{2}:[0-9]{2}\s//' |sed -E 's/->\ //' |sort -k2 |awk '{print $2,$1}' |sed 's/\s/\t/'
2015-01-25 19:29:40
User: lig0n
Functions: awk egrep grep ls sed sort
Tags: zfs disk info
0

This is much easier to parse and do something else with (e.g. automagically create ZFS vols) than anything else I've found. It also helps me keep track of which disks are which, for example when I want to replace a disk or image headers in different scenarios. Being able to match a disk to the kernel's mapping of that drive via the disk's serial number is very helpful.

ls -l /dev/disk/by-id

Normal `ls` command to list contents of /dev/disk/by-id

grep -v "wwn-"

Perform an inverse search - that is, only output non-matches to the pattern 'wwn-'

egrep "[a-zA-Z]{3}$"

A regex grep, looking for three letters at the end of a line (to filter out fluff)

sed 's/\.\.\/\.\.\///'

Utilize sed (stream editor) to remove all occurrences of "../../"

sed -E 's/.*[0-9]{2}:[0-9]{2}\s//'

Strip out all user and permission fluff. The -E option lets us use extended (modern) regex notation (larger control set)

sed -E 's/->\ //'

Strip out ascii arrows "-> "

sort -k2

Sort the resulting information alphabetically, on column 2 (the disk letters)

awk '{print $2,$1}'

Swap the order of the columns so it's easier to read/utilize output from

sed 's/\s/\t/'

Replace the space between the two columns with a tab character, making the output more friendly

For large ZFS pools, this made creating my vdevs immeasurably easier. By keeping track of which disks were in which slot (in a spreadsheet) via their serial numbers, I was able to create my vols simply by copying the full ID of the disk (not the letter) and pasting it into my command, thereby allowing me to know exactly which disk, in which slot, was going into the vdev. Example command below.

zpool create tank raidz2 -o ashift=12 ata-... ata-... ata-... ata-... ata-... ata-...
ps axo pcpu,args | awk '/[p]hp.*pool/ { sums[$4] += $1 } END { for (pool in sums) { print sums[pool], pool } }' | sort -rn | column -t
find . -printf '%.5m %10M %#9u %-9g %TY-%Tm-%Td+%Tr [%Y] %s %p\n'|sort -nrk8|head
( ps -U nms -o pid,nlwp,cmd:500 | sort -n -k2) && (ps h -U nms -o nlwp | paste -sd+ | bc)
grep 'font-family:[^;]*' <input file.svg> | sed 's/.*font-family:\([^;]*\).*/\1/g' | sort | uniq
2014-11-03 20:38:08
User: caiosba
Functions: grep sed sort
Tags: fonts svg
0

List all fonts used by an SVG file. Useful to find out which fonts you need to have installed in order to open/edit an SVG file appropriately.

netstat -nr|egrep -v "Routing|Interface|lo0"|awk '{print $5}'|sort -u| while read l; do ifconfig $l ; echo " Station Addr: `lanscan -ia|grep "$l "|cut -d ' ' -f 1`" ; done
find -not -empty -type f -printf "%-30s'\t\"%h/%f\"\n" | sort -rn -t$'\t' | uniq -w30 -D | cut -f 2 -d $'\t' | xargs md5sum | sort | uniq -w32 --all-repeated=separate
2014-10-19 02:00:55
User: fobos3
Functions: cut find md5sum sort uniq xargs
1

Finds duplicates based on MD5 sum, comparing only files with the same size. This is a performance improvement over:

find -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate

The new version takes around 3 seconds, whereas the old version took around 17 minutes. The bottleneck in the old command was the second find: it searches for the files with the specified file size. The new version keeps the file path and size from the beginning.

ls | tr '[[:punct:][:space:]]' '\n' | grep -v "^\s*$" | sort | uniq -c | sort -bn
2014-10-14 09:52:28
User: qdrizh
Functions: grep ls sort tr uniq
Tags: sort uniq ls grep tr
3

I'm sure there's a more elegant sed version for the tr + grep section.
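
One possible sed-only replacement for the tr + grep section (an untested sketch, assuming GNU sed):

ls | sed 's/[[:punct:][:space:]]\+/\n/g' | sed '/^[[:space:]]*$/d' | sort | uniq -c | sort -bn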

cat /etc/httpd/logs/access.log | awk '{ print $6}' | sed -e 's/\[//' | awk -F'/' '{print $1}' | sort | uniq -c
2014-10-13 13:39:53
User: suyashjain
Functions: awk cat sed sort uniq
1

The command reads the Apache log file and reports each virtual host requested along with the number of requests.

git reflog --date=local | grep "Oct 2 .* checkout: moving from .* to" | grep -o "[a-zA-Z0-9\-]*$" | sort | uniq
2014-10-03 15:12:22
User: Trindaz
Functions: grep sort
0

Replace "Oct 2" in the first grep pattern to be the date to view branch work from

(ps -U nms -o pid,nlwp,cmd:500 | sort -n -k2) && (ps -U nms -o nlwp | tail -n +2 | paste -sd+ | bc)
2014-09-30 18:25:56
User: cmullican
Functions: paste ps sort tail
0

I occasionally need to see if a machine is hitting the ulimit for threads, and which process is responsible. This gives me the thread count per process, sorted low to high so the worst offender is at the end, and then the total number of threads, for convenience.
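
To compare against the limit itself, something like the following can be run as that user (a hedged aside: on Linux, threads count toward the max-user-processes limit):

ulimit -u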

history|awk '{print $2}'|sort|uniq -c|sort -rn|head -30|awk '!max{max=$1;}{r="";i=s=100*$1/max;while(i-->0)r=r"#";printf "%50s %5d %s %s",$2,$1,r,"\n";}'
2014-09-29 12:40:43
User: injez
Functions: awk head printf sort uniq
0

Top 30 commands from history, with a histogram display.

tcpdump -tnn -c 2000 -i eth0 | awk -F "." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | awk ' $1 > 10 '
2014-09-26 01:15:23
User: hochmeister
Functions: awk sort tcpdump uniq
1

Capture 2000 packets and print the source addresses that appear more than 10 times (the top talkers).
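
If an exact top-10 list is wanted instead of a threshold, a minor variation (not the original intent) would be:

tcpdump -tnn -c 2000 -i eth0 | awk -F "." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -10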