
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands using awk, sorted by date
Terminal - Commands using awk - 1,112 results
ifconfig eth0|awk '/HWaddr/{gsub(/:/,"",$5);print $5}'
ps aux | grep <process> | grep -v grep | awk '{print $2}' | xargs -i -t kill -9 {}
ls -lT -rt | grep "^-" | awk 'BEGIN {START=2002} (START <= $9){ print $10 ;START=$9 }' | tail -1
2013-02-24 23:39:22
User: Glamdring
Functions: awk grep ls tail
Tags: ls date osx
Votes: 0

On the Mac, 'ls' can sort by month/day/time, but it seems unable to filter on the year field (field #9 in the long listing). The sorted list keeps raising the 'START' year to that of the most recently modified files seen, so the final month printed is the latest month within that final START year. The command works on the current directory and discards all entries that are themselves directories. If you expect files dating from before 2002, lower the START year accordingly.

ps auxw | grep sbin/apache | awk '{print"-p " $2}' | xargs strace -f
2013-02-19 19:14:57
User: msealand
Functions: awk grep ps strace xargs
Votes: 1

This version also attaches to new processes forked by the parent apache process. That way you can trace all current and *future* apache processes.
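
The same attach-to-everything idea can be written more directly with pgrep (a sketch, assuming pgrep is available; awk turns the PID list into repeated -p flags):

```shell
# Attach strace to all current apache processes; -f follows future forks too.
# awk converts one PID per line into "-p PID -p PID ..." for strace.
sudo strace -f $(pgrep -f sbin/apache | awk '{ printf "-p %s ", $1 }')
```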

more blast.out | grep virus | awk '{print $1}' > virus_id.txt
sudo ifconfig wlan0 | grep inet | awk 'NR==1 {print $2}' | cut -c 6-
2013-02-18 14:10:07
User: mouths
Functions: awk cut grep ifconfig sudo
Votes: -1

On wired connections, use 'eth0' instead of 'wlan0'
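
On systems without ifconfig, the same address can be pulled from iproute2's `ip` (a sketch; the field positions are assumed from typical `ip -4 addr` output):

```shell
# $2 on the "inet" line is "ADDR/PREFIX"; sub() strips the /PREFIX part.
ip -4 addr show wlan0 | awk '/inet /{ sub(/\/.*/, "", $2); print $2 }'
```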

dpkg-query -Wf '${Installed-Size}\t${Package}\n' | grep "\-dev" | sort -n | awk '{ sum+=$1} END {print sum/1024 "MB"}'
dpkg -l | grep ^rc | awk '{ print $2}' | xargs apt-get -y remove --purge
2013-02-15 01:34:37
User: Richzendy
Functions: awk grep xargs
Votes: 0

Completely removes packages that were removed but not purged (marked 'rc' by dpkg) on Debian/Ubuntu, cleaning up the config files that traditional tools leave behind.
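
The grep|awk pair can collapse into a single awk; a cautious sketch is to print the list first, then purge once it looks right:

```shell
# List packages in "rc" state (removed, config files remain):
dpkg -l | awk '/^rc/ { print $2 }'
# Once reviewed, purge them:
#   sudo apt-get -y remove --purge $(dpkg -l | awk '/^rc/ { print $2 }')
```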

awk '{for (i=9;i<=NF;i++) {printf "%s",$i; printf "%s", " ";}; printf "\n"}'
2013-02-12 13:57:43
User: adimania
Functions: awk printf
Tags: awk
Votes: 0

It prints the file names, preserving the spaces in their names and adding a newline after every filename.

I wrote this to quickly find out how many files in a directory are owned by a particular user. It can be extended with pipes and grep to do much more.
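
The field-joining idiom can be seen on a canned `ls -l`-style line (the sample line below is made up):

```shell
# Fields 1-8 are permissions/links/owner/group/size/date; 9..NF is the name,
# which may itself contain spaces and so span several fields.
printf '%s\n' '-rw-r--r-- 1 user group 120 Feb 12 13:00 my file.txt' |
  awk '{for (i=9;i<=NF;i++) {printf "%s",$i; printf "%s", " ";}; printf "\n"}'
# → my file.txt
```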

msgfilter --keep-header --input input.po awk '{}' | sed '/^#$/d; /^#[^\:\~,\.]/d' >empty.po
2013-02-08 08:05:32
User: seanf
Functions: awk sed
Votes: 0

Also removes translator comments. You can remove the header by omitting --keep-header, but if your msgids contain non-ASCII characters you will need the header to specify a suitable charset.

df -H | grep -vE '^Filesystem|tmpfs|cdrom|none' | awk '{ print $5 " " $1 }'
find . -name '*.jpg' | awk 'BEGIN{ a=0 }{ printf "mv %s name%01d.jpg\n", $0, a++ }' | bash
2013-02-07 06:12:37
User: doublescythe
Functions: awk find printf
Votes: 0

This command will take the .jpg files in a directory, rename them, and number them from 0 to N-1 (a starts at 0 and a++ increments after printing).

Black belt stuff.

Hell of a time saver.
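
A more cautious sketch of the same trick: quote the filenames so spaces survive, and inspect the generated mv commands before appending `| bash` to actually run them.

```shell
# NR numbers files from 1; %03d zero-pads to name001.jpg, name002.jpg, ...
# The \" quoting keeps filenames with spaces intact when piped to bash.
find . -name '*.jpg' | awk '{ printf "mv \"%s\" name%03d.jpg\n", $0, NR }'
```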

sudo dd if=/dev/sdc bs=4096 | pv -s `sudo mount /dev/sdc /media/sdc && du -sb /media/sdc/ |awk '{print $1}' && sudo umount /media/sdc`| sudo dd bs=4096 of=~/USB_BLACK_BACKUP.IMG
load=`uptime|awk -F',' '{print $3}'|awk '{print $3}'`; if [[ $(echo "if ($load > 1.0) 1 else 0" | bc) -eq 1 ]]; then notify-send "Load $load";fi
2013-02-06 08:30:24
User: adimania
Functions: awk echo
Votes: 0

I run this via crontab every minute on my machine, when I occasionally want to see if a process is eating up my system's resources.
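
The bc step can be avoided by letting awk do the numeric comparison itself (a sketch; the field layout is assumed from typical `uptime` output, which varies between systems):

```shell
# Grab the 1-minute load average, then compare inside awk:
# exit !(l > 1.0) makes awk exit 0 (success) only when the load exceeds 1.0.
load=$(uptime | awk -F'load average:' '{ print $2 }' | awk -F',' '{ gsub(/ /,"",$1); print $1 }')
awk -v l="$load" 'BEGIN { exit !(l > 1.0) }' && notify-send "Load $load"
```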

for x in `ps -u 500 u | grep java | awk '{ print $2 }'`;do ls /proc/$x/fd|wc -l;done
dpkg-query -Wf '${Package}\n' | xargs dpkg --status | sed '/^Conffiles:/,/^Description:/!d;//d' | awk '{print $2 " " $1}' | md5sum -c 2>/dev/null | grep FAILED$ | cut -f1 -d':'
2013-01-31 16:52:38
User: hallmarc
Functions: awk cut grep md5sum sed xargs
Votes: 0

This functionality seems to be missing from commands like dpkg. Ideally, I want to duplicate the behavior of rpm --verify, but it seems difficult to do this in one relatively short command pipeline.

xprop | awk '/PID/ {print $3}'
for i in `pfiles pid|grep S_IFREG|awk '{print $5}'|awk -F":" '{print $2}'`; do find / -inum $i |xargs ls -lah; done
2013-01-24 13:57:19
User: giorger
Functions: awk find grep ls xargs
Votes: 0

Executing pfiles returns a list of all descriptors utilized by the process.

We are interested in the S_IFREG entries, since they usually point to regular files.

Each such line contains the inode number of the file, which we use to find the filename.

The only drawback is that, to avoid searching from /, you have to guess where the file might be.

Improvements more than welcome.

lsof was not available in my case

find $folder -name "[1-9]*" -type f -print|while read file; do echo $file $(sed -e '/^$/Q;:a;$!N;s/\n //;ta;s/ /_/g;P;D' $file|awk '/^Received:/&&!r{r=$0}/^From:/&&!f{f=$0}r&&f{printf "%s%s",r,f;exit(0)}');done|sort -k 2|uniq -d -f 1
2013-01-21 22:50:51
User: lpb612
Functions: awk echo find read sed sort uniq
Votes: 1

# find assumes email files start with a number 1-9

# sed joins the lines starting with " " to the previous line

# awk prints the Received and From lines

# sort sorts according to the second field (received+from)

# uniq prints the duplicated filenames

# a message is viewed as a duplicate if it was received at the same time as another message, and from the same person.

The command was intended to be run under cron. If run in a terminal, mutt can be used:

mutt -e "push otD~=xq" -f $folder

expandurl() { curl -s "http://api.longurl.org/v2/expand?url=${1}&format=php" | awk -F '"' '{print $4}' }
2013-01-19 10:40:46
User: atoponce
Functions: awk
Tags: curl longurl
Votes: 2

This relies on a public API from http://longurl.org. So, this has the weakness that if the service disappears, the function will break. However, it has the advantage that the shortened URL service will not be tracking your IP address and other metrics, but instead will track longurl.org. Thus, you can remain anonymous from the shortened URL services (although not anonymous from longurl.org). It does no sanity checking that you have provided an argument. If you do not provide one, "message" is displayed to STDOUT.
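
A sketch of the same function with a minimal argument check added (the endpoint is unchanged from the original; continued availability of the longurl.org API is assumed, not guaranteed):

```shell
expandurl() {
  # Fail early with a usage message if no URL was given.
  [ -n "$1" ] || { echo "usage: expandurl <short-url>" >&2; return 1; }
  # The API returns PHP-serialized text; field 4 (split on ") is the long URL.
  curl -s "http://api.longurl.org/v2/expand?url=${1}&format=php" | awk -F '"' '{ print $4 }'
}
```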

tshark -r *.eth -S -R "ajp13" -d tcp.port==9009,ajp13 -s 0 -l -V | awk '/Apache JServ/ {p=1} /^ *$/ {p=0;printf "\n"} (p){printf "%s\n", $0} /^(Frame|Internet Pro|Transmission Control)/ {print $0}'
2013-01-10 21:12:51
User: tsureshkumar
Functions: awk printf
Tags: tshark
Votes: 0

If you have a capture file *.eth and the AJP protocol is in use on port 9009, you can paste the above command, changing the file and port names as needed.

awk '{s+=$1}END{print s}' <file>
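
A quick sanity check of the column-summing one-liner above, fed from seq instead of a file:

```shell
# Accumulate field 1 of every line; print the total at end of input.
seq 1 10 | awk '{s+=$1} END {print s}'
# → 55
```
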
curl http://en.wikipedia.org/wiki/List_of_programming_languages | grep "<li>" | awk -F"title=" '{ print $2 }' | awk -F\" '{ print $2 }'
2013-01-09 21:40:11
User: sxiii
Functions: awk grep
Votes: 0

Requirements: curl, grep, awk, internet connection with access to wikipedia

Loaded page: http://en.wikipedia.org/wiki/List_of_programming_languages

If you can make a shorter version of this list-getter, you are welcome to paste it here :)
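
One possible shortening: fold the two awk passes into a single one (a sketch, assuming the same title="..." markup on the fetched page):

```shell
# Split on 'title="'; anything after it up to the next quote is the title.
curl -s http://en.wikipedia.org/wiki/List_of_programming_languages |
  grep "<li>" | awk -F'title="' 'NF>1 { split($2, a, "\""); print a[1] }'
```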

find . -name "*.pdf" -exec pdftk {} dump_data output \; | grep NumberOfPages | awk '{s+=$2} END {print s}'
aptitude purge $(dpkg -l|grep ^rc|awk '{ print $2 }')