
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/



Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 or 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using awk - 1,117 results
ip addr list | grep global | awk '{print $7"\t"$2}'
randchannelurl=$(lynx -dump http://www.tvcatchup.com/channels.html | grep watch | sed 's/^......//'| awk 'BEGIN { srand() } int(rand() * NR) == 0 { x = $0 } END { print x }') && firefox -new-window $randchannelurl
2013-08-01 10:38:10
User: dunryc
Functions: awk grep sed watch
0

Because I'm lazy and can't be bothered looking at the TV guide to choose a channel. Any improvements or comments appreciated.
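The awk part of this is a classic one-line reservoir sample: each line overwrites the kept line with probability 1/NR, so every input line ends up equally likely. A sketch of just that idiom, with the channel list swapped for a fixed, made-up three-line input:

```shell
# Reservoir-sample one random line: line NR survives with probability 1/NR.
printf 'one\ntwo\nthree\n' |
  awk 'BEGIN { srand() } int(rand() * NR) == 0 { x = $0 } END { print x }'
```
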

ifconfig | egrep [0-9A-Za-z]{2}\(:[0-9A-Za-z]{2}\){5} | awk '{print $1 ":\t" $5}'
2013-07-30 17:02:07
User: jaimeanrm
Functions: awk egrep ifconfig
1

This is the better option on an openSUSE box.

curl -sL http://www.dell.com/support/troubleshooting/us/en/555/Servicetag/$(dmidecode -s system-serial-number) | html2text -style pretty | awk -F\. '/with an end date of/ { print $1 "."}'
2013-07-30 14:46:12
User: mhollick
Functions: awk date
1

pretty much the same.

I use awk rather than grep and perl.

It looks like the URL has been updated.

The service tag can also be retrieved via snmp - potential for a for loop over a list of servers. I might have a look into doing an example.

for m in `df -P | awk -F ' ' '{print $NF}' | sed -e "1d"`;do n=`df -P | grep "$m$" | awk -F ' ' '{print $5}' | cut -d% -f1`;i=0;if [[ $n =~ ^-?[0-9]+$ ]];then printf '%-25s' $m;while [ $i -lt $n ];do echo -n '=';let "i=$i+1";done;echo " $n";fi;done
2013-07-29 20:12:39
User: drockney
Functions: awk cut echo grep printf sed
Tags: bash
5

Automatically drops mount points that have non-numeric sizes (e.g. /proc). Tested in bash on Linux and AIX.
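The bar itself comes from the inner while loop, which prints one '=' per percentage point. A stripped-down sketch of just that step, with a hypothetical usage value of 5:

```shell
# Print a bar of '=' of the given length, as the inner loop does.
n=5
i=0
while [ "$i" -lt "$n" ]; do
  printf '='
  i=$((i+1))
done
echo " $n"
```
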

while read X ; do printf "$X --"; virsh dumpxml $X | egrep "source dev|source file"; done< <(virsh list | awk '$1 ~ /^[1-9]/ { print $2 }')
2013-07-29 17:32:59
User: hugme
Functions: awk egrep printf read
0

This will strip out the relevant disk information from KVM. I'm using it to find disks on a SAN which are no longer in use.
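The awk filter at the end can be tried on its own against a canned `virsh list` header plus rows (the domain names here are made up): it keeps only lines whose first field starts with a digit, i.e. running domains, and prints the name column.

```shell
# Keep rows whose first field is a domain ID (starts 1-9), print the name.
printf ' Id Name State\n----------------\n 1 guest01 running\n 2 guest02 running\n' |
  awk '$1 ~ /^[1-9]/ { print $2 }'
```
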

curl -s http://www.census.gov/popclock/data/population/world | awk -F'[:,]' '{print $7}'
dpkg-query --show --showformat='${Package;-50}\t${Installed-Size}\n' `aptitude --display-format '%p' search '?installed!?automatic'` | sort -k 2 -n | grep -v deinstall | awk '{printf "%.3f MB \t %s\n", $2/(1024), $1}'
2013-07-26 23:18:20
User: EvilDennisR
Functions: awk grep sort
0

The other commands were good, but they included packages that were installed and then removed.

This command only shows packages that are currently installed, sorts smallest to largest, and formats the sizes to be human readable.

uptime | awk -F ',' ' {print $1" "$2}'|awk ' {print $3" "$4" "$5}' | sed '1,$s/:/ /' | awk ' {if ($4 =="user") print $1*60 + $2;else if ($2=="mins") print $1;else print $1*24*60 + $2*60 + $3}'
2013-07-19 13:28:29
User: tatgren
Functions: awk sed uptime
Tags: uptime minutes
0

Find the uptime and convert it to minutes; works on AIX and Linux too.
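`uptime` output varies between OSes and with how long the box has been up, so which branch of the final awk fires depends on the format. As a sketch, here is the hours:minutes branch fed with a fixed, made-up sample line:

```shell
# "up 3:42" with 1 user -> 3*60 + 42 = 222 minutes.
echo ' 10:15:02 up 3:42,  1 user,  load average: 0.00, 0.01, 0.05' |
  awk -F ',' '{print $1" "$2}' | awk '{print $3" "$4" "$5}' |
  sed '1,$s/:/ /' |
  awk '{if ($4=="user") print $1*60+$2; else if ($2=="mins") print $1; else print $1*24*60+$2*60+$3}'
```
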

cut -d, -f1 /var/opt/example/dumpfile.130610_subscriber.csv | cut -c3-5 | sort | uniq -c | sed -e 's/^ *//;/^$/d' | awk -F" " '{print $2 "," $1}' > SubsxPrefix.csv
2013-07-17 07:58:56
User: neomefistox
Functions: awk cut sed sort uniq
Tags: Linux UNIX
0

dumpfile is a CSV file whose 1st field is a phone number in the format CC+10 digits.

Empty lines are deleted before the output, which is in the format "prefix,occurrences".
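With the file path swapped for a made-up three-line sample in the same CC+10-digit format, the pipeline produces the "prefix,occurrences" pairs like so:

```shell
# Two numbers share prefix 115 (characters 3-5), one has prefix 911.
printf '541155550001,x\n541155550002,x\n549115550003,x\n' |
  cut -d, -f1 | cut -c3-5 | sort | uniq -c |
  sed -e 's/^ *//;/^$/d' | awk -F" " '{print $2 "," $1}'
```
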

find . | sort | awk 'NR%2==0' | xargs rm $1
2013-07-11 07:36:18
User: sucotronic
Functions: awk find rm sort xargs
-1

If you have a directory with a lot of backups (full backups, I mean), once it reaches a certain size you may want to free some space. With this command you'll remove half of the files. The command assumes that your backup file names start with YYYYMMDD, or at least sort in alphabetical order.
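The selection step reduces to `awk 'NR%2==0'`, which keeps every second line of the sorted list. A sketch with made-up date-stamped names (printing the selection rather than deleting anything):

```shell
# Of four date-ordered backups, the 2nd and 4th are selected (and would be removed).
printf 'backup-20130101\nbackup-20130102\nbackup-20130103\nbackup-20130104\n' |
  sort | awk 'NR%2==0'
```
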

history | awk '{$1="";print substr($0,2)}'
2013-07-07 08:00:26
User: Fagood
Functions: awk
Tags: history awk
0

alias h="history | awk '{\$1=\"\";print substr(\$0,2)}'"

# h

[ 07/07/2013 10:04:53 ] alias h="history | awk '{\$1=\"\";print substr(\$0,2)}'"
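The trick here: assigning to $1 makes awk rebuild $0 with single spaces, leaving a leading separator that substr($0,2) then trims. On a couple of made-up history lines:

```shell
# Blank the history number, then drop the leftover leading space.
printf ' 1001 ls -lh\n 1002 git status\n' | awk '{$1="";print substr($0,2)}'
```
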

ls -lh file-name | awk '{ print $5}'
history | awk '{CMD[$4]++;count++;} END { for (a in CMD )print CMD[a] " " CMD[a]/count*100 "% " a }' | sort -nr | nl | column -t | head -n 10
ls -lt | awk '{sum+=$5} END {print sum}'
2013-07-03 20:12:54
User: martinmorono
Functions: awk ls
0

Use awk to sum and print the space used by a group of files.

It works well as long as the space used is not bigger than 79094548.80...

I found that upper limit when trying to find out what was the total amount of recoverable space from a set of directories:

user@servername:/home/user/scripts>for dirName in aleph_bin aleph_sh aleph_work dailycheck INTERFAZ ; do echo "${dirName} = $(cat /tmp/purge_ocfs_dir.*.log | awk '{sum+=$5} END {printf "%4.2f", sum}') "; done

aleph_bin = 79094548.80

aleph_sh = 79094548.80

aleph_work = 79094548.80

dailycheck = 79094548.80

INTERFAZ = 79094548.80

In the worst case scenario, the total number might be almost 137G.

user@servername:/home/user/scripts>df -h /ocfs/*

Filesystem                   Size  Used Avail Use% Mounted on
//argalephfsprod/aleph_bin$  137G   38G   99G  28% /ocfs/aleph_bin
//argalephfsprod/aleph_sh$   137G   38G   99G  28% /ocfs/aleph_sh
//argalephfsprod/aleph_work$ 280G  135G  146G  49% /ocfs/aleph_work
//argalephfsprod/dailycheck$ 137G   38G   99G  28% /ocfs/dailycheck
//argalephfsprod/INTERFAZ/   137G   38G   99G  28% /ocfs/INTERFAZ

Any suggestions on how to get the correct amount of space for totals over 80 Mbytes?
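For what it's worth: awk does its arithmetic in double-precision floats, so sums well past 79094548.80 format fine with printf. The identical totals above are more likely because every loop iteration sums the same /tmp/purge_ocfs_dir.*.log glob, rather than a per-directory log. A quick check that the awk side copes with much larger sums:

```shell
# Two ~90 GB values sum and print without truncation.
printf '90000000000.40\n90000000000.40\n' |
  awk '{sum+=$1} END {printf "%4.2f\n", sum}'
```
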

awk '{if(!seen[$0]++) {print $0;}}'
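This one prints each line only the first time it appears: seen[$0]++ is 0 (false) on first sight and non-zero afterwards. Unlike sort -u it needs no sorting and preserves input order:

```shell
# 'b' and 'a' repeat; only first occurrences survive, order kept.
printf 'b\na\nb\nc\na\n' | awk '{if(!seen[$0]++) {print $0;}}'
```
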
netstat -tn | awk '{print $5}' | egrep -v '(localhost|\*\:\*|Address|and|servers|fff|127\.0\.0)' | sed 's/:[0-99999999].*//g'
2013-06-13 14:35:38
User: kehansen
Functions: awk egrep netstat sed
0

I used this to get all the remote IP addresses connected to my server. I had to start storing and tracking this data, so that's why I built this out. Probably not optimal as far as the egrep regex goes, but it works ;)

git branch -r | awk '{print $1}' | egrep -v -f /dev/fd/0 <(git branch -vv | grep origin) | awk '{print $1}' | xargs git branch -d
sudo /usr/sbin/exim -bp | sed -n '/\*\*\* frozen \*\*\*/,+1!p' | awk '{print $1}' | tr -d [:blank:] | grep @ | sort | uniq -c | sort -n
sudo lsof -p `sudo ps aux | grep -i neo4j | grep -v grep | awk '{ print $2 }'`
2013-06-02 10:15:30
User: andycunn
Functions: awk grep ps sudo
0

The inner "ps...grep..." pipeline searches for a process matching the given name (here neo4j) and prints its PID.

"lsof -p" then lists all file descriptors owned by that process: open files, sockets, devices, etc.

cd /srcfolder; tar -czf - . | pv -s `du -sb . | awk '{print $1}'` | ssh -c arcfour,blowfish-cbc -p 50005 root@destination.com "tar -xzvf - -C /dstfolder"
2013-05-30 07:21:06
User: bhbmaster
Functions: awk cd ssh tar
Tags: ssh tar pv
0

NOTE: while running these commands, prompts may get mixed in with the flowing progress-bar text from pv; just continue typing as if it's not there (close your eyes if it helps). There might be a yes or no question - type "yes" and ENTER - and you will also be asked for a password: just put in your password and press ENTER.

I talk a lot more about this, and a lot of other variations of this command, on my site:

http://www.kossboss.com/linuxtarpvncssh

grep '.tag =' <file> | awk '{print $3}' | awk 'sub(/[;]/, x)' | sort -n
2013-05-21 15:58:16
User: pbriggeman
Functions: awk grep sort
0

I use this one-liner to search my source code to find where tags are named, since there's no easy way in Xcode to see what values have already been used.

svn info | grep ^URL | awk -F\/ '{print $NF}'
more restart_weblogic.log | grep "LISTEN" | awk '{ print $7 }' | uniq | wc -l
ls | paste --delimiters='*' - ./zzz | awk ' BEGIN{FS="*";} { system("mv " $1 " \"" $2 "\"") }'
2013-05-13 15:44:07
User: skilowatt
Functions: awk ls paste
0

Rename all files in the current directory using the names from the text file 'zzz'.
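A dry-run sketch with made-up names: paste glues each stdin line to the matching line of the rename list with '*', and the awk stage builds the mv commands. Printing them instead of calling system() shows what would run:

```shell
# Build (but do not execute) the rename commands for two files.
list=$(mktemp)
printf 'new one\nnew two\n' > "$list"
printf 'old1\nold2\n' |
  paste --delimiters='*' - "$list" |
  awk 'BEGIN{FS="*";} { print "mv " $1 " \"" $2 "\"" }'
rm -f "$list"
```
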