
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get at least 3 or 10 votes, respectively, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!

Commands using awk (1,209 results)
for i in `cat /etc/passwd | awk -F : '{ print $1 }';`; do passwd -e $i; done
ompload() { curl -# -F file1=@"$1" http://ompldr.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;print}';}
2009-11-07 20:56:52
User: eightmillion
Functions: awk
8

This function uploads images to http://omploader.org and then prints out the links to the file.

Some coloring can also be added to the command with:

ompload() { curl -F file1=@"$1" http://omploader.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;sub(/^/,"\033[0;34m");sub(/:/,"\033[0m:");print}';}
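Usage is just the function name followed by the file to upload (the filename here is only an example):

ompload screenshot.png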
ifconfig | awk '/HW/ {print $5}'
2009-11-05 18:00:50
User: Cont3mpo
Functions: awk ifconfig
0

Simple MAC address, thanks to ifconfig.

mount | awk '/:/ { print $3 } ' | xargs sudo umount
ip link show eth0 | grep "link/ether" | awk '{print $2}'
2009-11-05 17:06:15
User: maxmanders
Functions: awk grep link
Tags: mac
0

...or for a particular interface...

ip link | grep 'link/ether' | awk '{print $2}'
2009-11-04 19:41:26
User: markdrago
Functions: awk grep link
Tags: mac
1

I much prefer using /sbin/ip over /sbin/ifconfig for most everything. I find the interface and output to be much more consistent, and it has many abilities that ifconfig, route, etc. do not. To get the MAC address for only one interface, add 'show dev [interface]' to the 'ip link' part of the command:

ip link show dev eth0 | grep 'link/ether' | awk '{print $2}'

Also, neither this command nor the ifconfig one requires root access to run, so the sudo is not necessary.

ps -ec -o command,rss | grep Stainless | awk -F ' ' '{ x = x + $2 } END { print x/(1024) " MB."}'
2009-11-04 19:01:22
Functions: awk grep ps
0

Adds up the total memory used by all Stainless processes: 1 Stainless, 1 StainlessManager and 1 StainlessClient per tab open.
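The same idea can be generalized to any process name. A rough sketch, not from the original entry (the memsum name is made up, and ps column options differ slightly between BSD/OS X and GNU ps):

memsum() { ps -eo rss,comm | awk -v name="$1" '$2 ~ name { sum += $1 } END { printf "%.1f MB\n", sum/1024 }'; }

memsum Stainless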

svn ci `svn stat |awk '/^A/{printf $2" "}'`
ifconfig eth1 | grep inet\ addr | awk '{print $2}' | cut -d: -f2 | sed s/^/eth1:\ /g
2009-11-03 19:26:40
User: TuxOtaku
Functions: awk cut grep ifconfig sed
2

Sometimes, you don't really care about all the other information that ifconfig spits at you (however useful it may otherwise be). You just want an IP. This strips out all the crap and gives you exactly what you want.
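On systems with iproute2, a shorter variant is possible. This is a sketch rather than part of the original entry, and assumes the interface is eth1:

ip -4 addr show eth1 | awk '/inet /{print $2}' | cut -d/ -f1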

mysql -u <user> --password=<password> -e "SHOW COLUMNS FROM <table>" <database> | awk '{print $1}' | tr "\n" "," | sed 's/,$//g'
2009-10-29 13:42:17
User: maxmanders
Functions: awk sed tr
-1

Useful when you need to write e.g. an INSERT for a table with a large number of columns. This command retrieves the column names and comma-separates them, ready for INSERT INTO (...), removing the trailing comma.
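As a sketch of how the result might be used (placeholders kept from the command above; the -N flag, which tells the mysql client to skip the header row, is an assumption about the client being used):

COLS=$(mysql -N -u <user> --password=<password> -e "SHOW COLUMNS FROM <table>" <database> | awk '{print $1}' | tr "\n" "," | sed 's/,$//g')

echo "INSERT INTO <table> ($COLS) VALUES (...);"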

ps -eo user,pcpu,pmem | tail -n +2 | awk '{num[$1]++; cpu[$1] += $2; mem[$1] += $3} END{printf("NPROC\tUSER\tCPU\tMEM\n"); for (user in cpu) printf("%d\t%s\t%.2f\t%.2f\n",num[user], user, cpu[user], mem[user]) }'
2009-10-29 12:49:01
User: georgz
Functions: awk ps tail
7

The original version gives an error; this is the corrected version.
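A possible variation, not from the original entry: drop the header line so the per-user totals can be piped through sort, here ordered by total CPU:

ps -eo user,pcpu,pmem | tail -n +2 | awk '{num[$1]++; cpu[$1] += $2; mem[$1] += $3} END{for (u in cpu) printf("%d\t%s\t%.2f\t%.2f\n", num[u], u, cpu[u], mem[u])}' | sort -k3,3nr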

TIMEUNIT=$(awk '/timescale/{print NR}' a)
TIMEUNIT=$( cat a | grep -n "timescale" | awk -F ":" '{ print $1 } ' )
awk 'BEGIN {for(i=1;i<=100;i++)sum+=i}; END {print sum}' /dev/null
2009-10-26 18:24:57
User: dennisw
Functions: awk
Tags: awk
0

Calculating series with awk alone, no need for seq: this adds the numbers from 1 to 100.

Variations:

1+3+...+(2n-1) = n^2

awk 'BEGIN {for(i=1;i<=19;i+=2)sum+=i}; END {print sum}' /dev/null # displays 100

1/2 + 1/4 + ... = 1

awk 'BEGIN {for(i=1;i<=10;i++)sum+=1/(2**i)}; END {print sum}' /dev/null # displays 0.999023
calc(){ awk "BEGIN{ print $* }" ;}
2009-10-23 06:03:07
User: twfcc
Functions: awk
12

A simple calculator function; floating-point numbers are supported.
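Usage sketch; quoting the expression keeps the shell from expanding * as a glob:

calc '2*3 + 10/4' # displays 8.5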

awk 'BEGIN{dir=DIR?DIR:ENVIRON["PWD"];l=split(dir,parts,"/");last="";for(i=1;i<l+1;i++){d=last"/"parts[i];gsub("//","/",d);system("ls -ld \""d"\"");last=d}}'
2009-10-22 16:28:07
User: arcege
Functions: awk
-1

Handled entirely within awk: it takes the value of $PWD, builds up each parent directory path, and runs ls -ld against it. The gsub() call is not necessary, but is added for better visibility.

If a variable DIR is given on the awk command-line, then that directory is used instead:

awk -vDIR=$HOME/.ssh 'BEGIN{dir=DIR?...}'
awk 'FNR==5' <file>
2009-10-20 22:52:41
User: dennisw
Functions: awk
1

Just one character longer than the sed version ('FNR==5' versus -n 5p). On my system, without using "exit" or "q", the awk version is over four times faster on a ~900K file using the following timing comparison:

testfile="testfile"; for cmd in "awk 'FNR==20'" "sed -n '20p'"; do echo; echo $cmd; eval "$cmd $testfile"; for i in {1..3}; do time for j in {1..100}; do eval "$cmd $testfile" >/dev/null; done; done; done

Adding "exit" or "q" made the difference between awk and sed negligible and produced a four-fold improvement over the awk timing without the "exit".

For long files, an exit can speed things up:

awk 'FNR==5{print;exit}' <file>
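For comparison, a sed version with the same early quit (my addition, not part of the entry's benchmark):

sed -n '5{p;q}' <file>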
awk '{if (NR == 3) print}' <file>
2009-10-19 15:58:09
User: yooreck
Functions: awk
-5

I don't know if it's better, but it works fine :)

for i in $(netstat --inet -n|grep ESTA|awk '{print $5}'|cut -d: -f1);do geoiplookup $i;done
2009-10-18 20:41:47
Functions: awk cut grep netstat
3

Sample command to obtain the geographic location of each established connection extracted from netstat. Needs the geoiplookup command (part of the GeoIP package on CentOS).
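On newer systems where netstat has been replaced by ss, a rough IPv4-only equivalent might look like this (the column position and default state filter are assumptions about ss output):

for i in $(ss -tn | awk 'NR>1 {split($5,a,":"); print a[1]}' | sort -u); do geoiplookup $i; done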

for file in *.iso; do mkdir `basename $file | awk -F. '{print $1}'`; sudo mount -t iso9660 -o loop $file `basename $file | awk -F. '{print $1}'`; done
HDD=$(df | awk ' NR>3 (S=$5) (M=$6) { if (S>90) print "Your Systems "M" is """S" Full" } ') ; [[ $HDD ]] && echo "$HDD" | mail -s "Hard-Drives Full" [email protected] -- -f [email protected] >/dev/null
seq 4|xargs -n1 -i bash -c "echo -n 164.85.216.{} - ; nslookup 164.85.216.{} |grep name"|tr -s ' ' ' '|awk '{print $1" - "$5}'|sed 's/.$//'
URL=[target.URL]; curl -q -d "url=$URL" http://untr.im/api/ajax/api | awk -F 'href="' '{print $3}' | awk -F '" rel="' '{print $1}'
find . -type d -print | sed -e 's;[^/]*/;..........;g'|awk '{print $0"-("NR-1")"}'
awk '{print length, $0;}' | sort -nr