
Commands tagged awk - 291 results
history | awk '{a[$'$(echo "1 2 $HISTTIMEFORMAT" | wc -w)']++}END{for(i in a){print a[i] " " i}}' | sort -rn | head
2010-05-02 21:48:53
User: bandie91
Functions: awk echo sort wc
Tags: history awk wc
0

If you use the HISTTIMEFORMAT environment variable (e.g. to timestamp typed commands), $(echo "1 2 $HISTTIMEFORMAT" | wc -w)

gives the number of leading fields per history line that contain non-command parts.

This should make the command work regardless of that setting.
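
For example, assuming HISTTIMEFORMAT='%F %T ', each history line gains two extra fields for the date and time, so the computed field index becomes 4 and the command name is what gets counted:

echo "1 2 $HISTTIMEFORMAT" | wc -w   # prints 2 with HISTTIMEFORMAT unset, 4 with '%F %T '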

tr -d "\n\r" | grep -ioEm1 "<title[^>]*>[^<]*</title" | cut -f2 -d\> | cut -f1 -d\<
awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' file.html | sed '/^$/d'
2010-04-20 13:27:47
User: tamouse
Functions: awk sed
Tags: Linux awk html
1

The previous version leaves lots of blank lines.
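
The first (tr-based) pipeline reads standard input, so a usage sketch is to feed it a page with curl (example.com is just a placeholder URL):

curl -s http://example.com | tr -d "\n\r" | grep -ioEm1 "<title[^>]*>[^<]*</title" | cut -f2 -d\> | cut -f1 -d\<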

awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}' file.html
2010-04-20 10:54:03
User: sata
Functions: awk
Tags: Linux awk html
4

Case-insensitive, and works even if the "<title>...</title>" element spans multiple lines.

Simple! :-)
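
A usage sketch, feeding a downloaded page in on stdin (the URL is a placeholder; IGNORECASE requires GNU awk, and RS=EOF works because EOF is an unset variable, so the record separator is empty and awk reads whole blank-line-separated paragraphs as single records):

curl -s http://example.com | awk 'BEGIN{IGNORECASE=1;FS="<title>|</title>";RS=EOF} {print $2}'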

find ~/.mozilla/firefox/*/Cache -exec file {} \; | awk -F ': ' 'tolower($2)~/mpeg/{print $1}'
2010-04-19 06:59:55
User: sata
Functions: awk file find
2

Grab a list of MP3s (with full path) out of Firefox's cache

Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the *full path* of those tunes straight out of FF's cache in a clean list.

A shorter, more intuitive version of the command submitted by TuxOtaku.
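
A hedged variant: to also copy the matches somewhere, pipe the list into a loop (the destination directory ~/cached-audio is hypothetical and must already exist):

find ~/.mozilla/firefox/*/Cache -exec file {} \; | awk -F ': ' 'tolower($2)~/mpeg/{print $1}' | while read -r f; do cp "$f" ~/cached-audio/; done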

ifconfig -a| awk '/^wlan|^eth|^lo/ {;a=$1;FS=":"; nextline=NR+1; next}{ if (NR==nextline) { split($2,b," ")}{ if ($2 ~ /[0-9]\./) {print a,b[1]}; FS=" "}}'
2010-04-15 04:34:28
User: alf
Functions: awk ifconfig
Tags: ifconfig awk IP
5

Interfaces like lo can be omitted from the pattern at the beginning. There are probably better ways of doing this; I'm an awk novice.
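
As an aside, if iproute2 is available, a sketch of the same idea (interface name plus IPv4 address) is much shorter; lo is included here as well:

ip -o -4 addr show | awk '{print $2, $4}'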

for i in ~/.mozilla/firefox/*/Cache/*; do file "$i" | grep -i mpeg | awk '{print $1}' | sed 's/.$//'; done
2010-04-11 23:14:18
User: TuxOtaku
Functions: awk file grep sed
4

Ever gone to a site that has an MP3 embedded into a pesky flash player, but no download link? Well, this one-liner will yank the names of those tunes straight out of FF's cache in a nice, easy to read list. What you do with them after that is *ahem* no concern of mine. ;)

ffmpeg -f alsa -itsoffset 00:00:02.000 -ac 2 -i hw:0,0 -f x11grab -s $(xwininfo -root | grep 'geometry' | awk '{print $2;}') -r 10 -i :0.0 -sameq -f mp4 -s wvga -y intro.mp4
2010-03-31 09:33:05
User: mohan43u
Functions: awk grep
4

Yet another x11grab using ffmpeg. I also added mic input to the captured video stream using ALSA. I still need to find out how to capture the audio that is currently playing.

ffmpeg -f x11grab -s `xdpyinfo | grep 'dimensions:'|awk '{print $2}'` -r 25 -i :0.0 -sameq /tmp/out.mpg > /root/howto/capture_screen_video_ffmpeg
alias busy='my_file=$(find /usr/include -type f | sort -R | head -n 1); my_len=$(wc -l $my_file | awk "{print \$1}"); let "r = $RANDOM % $my_len" 2>/dev/null; vim +$r $my_file'
2010-03-09 21:48:41
User: busybee
Functions: alias awk find head sort vim wc
22

This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc (one common line for that is shown below).
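
A minimal sketch of the .bashrc hook; this is the stock guard line many distributions already ship:

[ -f ~/.bash_aliases ] && . ~/.bash_aliases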

SITE="www.google.com"; curl --silent "http://www.shadyurl.com/create.php?myUrl=$SITE&shorten=on" | awk -F\' '/is now/{print $6}'
function avg { awk "/$2/{sum += \$$1; lc += 1;} END {printf \"Average over %d lines: %f\n\", lc, sum/lc}"; }
2010-02-18 10:20:22
User: vimzard
Functions: awk
Tags: awk
0

Computes the average of a column in a file. Input parameters: the column number and an optional filter pattern.
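
A usage sketch (access.log and numbers.txt are hypothetical input files):

avg 5 GET < access.log   # average of column 5 over lines matching GET
avg 1 < numbers.txt      # average of column 1 over all lines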

curl -s http://www.commandlinefu.com/commands/by/$1/xml | awk -F'</?div[^>]*>' '/class=\"command\"/{gsub(/&quot;/,"\"",$2); gsub(/&lt;/,"<",$2); gsub(/&gt;/,">",$2); gsub(/&amp;/,"\\&",$2); cmd=$2} /class=\"num-votes\"/{printf("%3i %s\n", $2, cmd)}'
2010-02-16 17:24:45
User: putnamhill
Functions: awk
2

This version prints the current votes and commands for a user. Pass the user as an argument. While this technically "fits" as a one-liner, it is really easier to read as a shell script with extra whitespace. :)
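
Saved as a script, a usage sketch might look like this (clfu-votes.sh is a hypothetical file name holding the command above, with $1 as the user):

sh clfu-votes.sh putnamhill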

xrandr -q | awk -F'current' -F',' 'NR==1 {gsub("( |current)","");print $2}'
fbemailscraper YourFBEmail Password
2010-01-31 00:44:35
User: dabom
4

(Apparently the command is too long, so I put it in the sample output; I hope that is OK.)

Run the long command from the sample output (or put it in your .bashrc), then run:

fbemailscraper YourFBEmail Password

Voila! Your contacts' emails will appear.

Facebook seems to have gotten rid of the picture encoding of emails and replaced it with a text-based version, making it easy to scrape!

Needs curl to run, and it was made pretty quickly, so there might be bugs.

renice +5 -p $(pidof <process name>)
file='path to file'; tar -cf - "$file" | pv -s $(du -sb "$file" | awk '{print $1}') | gzip -c | ssh -c blowfish user@host tar -zxf - -C /opt/games
2010-01-19 16:02:45
User: starchox
Functions: awk du file gzip ssh tar
3

Set the file/dirname to transfer in the variable; the destination path is set at the end point. This command uses pipe viewer (pv) to show progress, compresses the output on the fly, and changes the ssh cipher. Supports dirnames with spaces.

Merges ideas and comments from http://www.commandlinefu.com/commands/view/4379/copy-working-directory-and-compress-it-on-the-fly-while-showing-progress and http://www.commandlinefu.com/commands/view/3177/move-a-lot-of-files-over-ssh
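
One caveat: recent OpenSSH releases have dropped the blowfish cipher, so on current systems a sketch of the same pipeline would swap in a cipher that is still shipped (paths unchanged from the original):

file='path to file'; tar -cf - "$file" | pv -s $(du -sb "$file" | awk '{print $1}') | gzip -c | ssh -c aes128-ctr user@host tar -zxf - -C /opt/games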

lsof -p $(pidof firefox) | awk '/.mozilla/ { s = int($7/(2^20)); if(s>0) print (s)" MB -- "$9 | "sort -rn" }'
2010-01-13 22:45:53
User: tzk
Functions: awk pidof
10

Just refining the last proposal for this check, showing awk's ability to do more complex math (2^20 instead of /1024/1024). We don't need to declare a variable before running lsof, because $(command) substitutes its output in place. Also, awk can filter by regexp instead of calling grep. I changed the messy 0.0000xxxx output to a more readable form, dropping all fractional parts and files smaller than 1 MB.
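
The integer-megabyte conversion in isolation, as a quick sanity check:

awk 'BEGIN{ print int(5242880/(2^20)) " MB" }'   # prints "5 MB"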

exipick -zi | xargs exim -Mrm
for i in `mailq | awk '$6 ~ /^frozen$/ {print $3}'`; do exim -Mrm $i; done
2010-01-13 21:28:45
User: rjamestaylor
Functions: awk
Tags: bash awk exim
-2

Although Exim will purge frozen (undeliverable) messages over time, the command "exim -Mrm #id#", where #id# is a particular message ID, will purge a message immediately. Being lazy, I don't want to type the command for each frozen message, so I wrote this one-liner to do it for me.
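
For reference, a made-up mailq line showing why the awk fields line up ($3 is the message ID and $6 is the word "frozen"):

 25m   1.5K 1aBcDe-000001-AB <sender@example.com> *** frozen ***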

class=ExampleClass; path=src; for constant in `grep ' const ' $class.php | awk '{print $2;}'`; do grep -r "$class::$constant" $path; done
curl -s search.twitter.com | awk -F'</?[^>]+>' '/\/intra\/trend\//{print $2}'
tar -cf - . | pv -s $(du -sb . | awk '{print $1}') | gzip > out.tgz
2009-12-18 17:09:08
User: opertinicy
Functions: awk du gzip tar
25

What happens here is we tell tar to create "-c" an archive of all files in the current dir "." (recursively) and output the data to stdout "-f -". Next we pass the size "-s" of all files in the current dir to pv. The "du -sb . | awk '{print $1}'" returns the number of bytes in the current dir, and it gets fed as the "-s" parameter to pv. Next we gzip the whole content and output the result to the out.tgz file. This way "pv" knows how much data is still left to be processed and shows us that it will take yet another 4 mins 49 secs to finish.

Credit: Peteris Krumins http://www.catonmat.net/blog/unix-utilities-pipe-viewer/
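
The reverse direction works the same way; a minimal sketch for watching progress while extracting:

pv out.tgz | tar -xzf -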

curl -sL 'dynamic.xkcd.com/comic/random/' | awk -F\" '/^<img/{printf("<?xml version=\"1.0\"?>\n<xkcd>\n<item>\n <title>%s</title>\n <comment>%s</comment>\n <image>%s</image>\n</item>\n</xkcd>\n", $6, $4, $2)}'
wget -q http://dynamic.xkcd.com/comic/random/ -O-| sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}' | awk '{system ("wget -q " $1 " -O- | display -title $(basename " $1") -write /tmp/$(basename " $1")");}'
2009-12-09 13:41:25
User: laugg
Functions: awk sed wget
Tags: sed awk wget comic
0

Only the ImageMagick package needs to be installed (it provides the display command).

Displays an xkcd comic with its title and saves it in the /tmp directory.

If you prefer to view the newest xkcd, use this command:

wget -q http://xkcd.com/ -O-| sed -n '/<img src="http:\/\/imgs.xkcd.com\/comics/{s/.*\(http:.*\)" t.*/\1/;p}' | awk '{system ("wget -q " $1 " -O- | display -title $(basename " $1") -write /tmp/$(basename " $1")");}'