
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.
Terminal - Commands using awk - 1,150 results
ipcs -a | grep 0x | awk '{printf( "-Q %s ", $1 )}' | xargs ipcrm
svn status |grep '\?' |awk '{print $2}'| parallel -Xj1 svn add
2010-01-28 08:47:54
Functions: awk grep
Tags: xargs parallel
-2

xargs deals badly with special characters (such as space, ' and "). To see the problem try this:

touch important_file

touch 'not important_file'

ls not* | xargs rm

Parallel https://savannah.nongnu.org/projects/parallel/ does not have this problem.
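If parallel isn't available, a common workaround (not part of the original tip) is to use null-delimited input, which both find and xargs support:

find . -maxdepth 1 -name 'not*' -print0 | xargs -0 rm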

if [ $(synclient -l | grep TouchpadOff | awk '{print $3}') = "2" ]; then synclient TouchpadOff=1; elif [ $(synclient -l | grep TouchpadOff | awk '{print $3}') == "1" ]; then synclient TouchpadOff=2; else synclient TouchpadOff=2; fi
2010-01-26 07:52:55
User: GinoMan2440
Functions: awk grep
-3

This command toggles the touchpad on and off. When it's on, the (annoying) right-side scroll strip and tap-clicking are disabled; you can change this by changing the occurrences of 2 in the command to 0. The whole command can then be given a keyboard shortcut so that the touchpad can be disabled without using a special Fn key (which Linux doesn't recognize on some computers) or a separate button.
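For reference, the synclient values used here are (a summary, not part of the original command):

synclient TouchpadOff=0    # touchpad fully enabled

synclient TouchpadOff=1    # touchpad disabled

synclient TouchpadOff=2    # tapping and scrolling disabled, pointer movement still works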

dpkg -l | grep ^rc | awk '{print $2}' | sudo xargs dpkg -P
find . -type f -exec stat \{\} \; | grep Modify: | awk '{a[$2]++}END{for(i in a){print i " : " a[i] }}' | sort
for i in $(ps -ef | awk '{print $2}') ; { swp=$( awk '/Swap/{sum+=$2} END {print sum}' /proc/$i/smaps ); if [[ -n $swp && 0 != $swp ]] ; then echo -n "\n $swp $i "; cat /proc/$i/cmdline ; fi; } | sort -nr
curl -u <username> http://app.boxee.tv/api/get_queue | xml2 | grep /boxeefeed/message/description | awk -F= '{print $2}'
2010-01-20 16:17:19
User: Strawp
Functions: awk grep
Tags: curl xml boxee
0

Might be able to do it in fewer steps with xmlstarlet, although whether that would end up being shorter overall I don't know - xmlstarlet syntax confuses the heck out of me.

Prompts for your password, or if you're a bit mental you can add your password into the command itself in the format "-u user:password".
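For example, with hypothetical credentials supplied inline:

curl -u myuser:mypassword http://app.boxee.tv/api/get_queue | xml2 | grep /boxeefeed/message/description | awk -F= '{print $2}'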

dir='path to file'; tar cpf - "$dir" | pv -s $(du -sb "$dir" | awk '{print $1}') | tar xpf - -C /other/path
2010-01-19 19:05:45
User: starchox
Functions: awk dir du tar
Tags: copy tar cp
-2

This may seem like a long command, but it is great for making sure all file permissions are kept intact. It streams the files through tar in a sub-shell and then untars them in the target directory. Please note that the -z option should not be used for local copies: there is no performance gain, and the extra CPU overhead of compression will actually slow the copy down.

You can also keep it simple with the following, but then you don't get the progress info:

cp -rpf /some/directory /other/path
file='path to file'; tar -cf - "$file" | pv -s $(du -sb "$file" | awk '{print $1}') | gzip -c | ssh -c blowfish user@host tar -zxf - -C /opt/games
2010-01-19 16:02:45
User: starchox
Functions: awk du file gzip ssh tar
3

You set the file/directory name in the transfer variable and the destination path at the end. The command uses pipe viewer (pv) to show progress, compresses the output with gzip, and switches the ssh cipher to blowfish to cut CPU overhead. Directory names with spaces are supported.

Merges ideas and comments from http://www.commandlinefu.com/commands/view/4379/copy-working-directory-and-compress-it-on-the-fly-while-showing-progress and http://www.commandlinefu.com/commands/view/3177/move-a-lot-of-files-over-ssh
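For example, with a hypothetical directory name containing spaces and a hypothetical destination host:

file='My Save Games'; tar -cf - "$file" | pv -s $(du -sb "$file" | awk '{print $1}') | gzip -c | ssh -c blowfish user@backuphost tar -zxf - -C /opt/games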

lgrep() { string=$1; file=$2; awk -v String="${string}" '$0 ~ String' "${file}"; }
2010-01-19 09:42:19
User: dopeman
Functions: awk
1

This is a handy way to circumvent the "Maximum line length of 2048 exceeded" grep error.

Once you have run the above command (or put it in your .bashrc), files can be searched using:

lgrep search-string /file/to/search
lsof -p $(pidof firefox) | awk '/.mozilla/ { s = int($7/(2^20)); if(s>0) print (s)" MB -- "$9 | "sort -rn" }'
2010-01-13 22:45:53
User: tzk
Functions: awk pidof
10

Just refining the previous proposal for this check, showing awk's power to do more complex math (2^20 instead of /1024/1024). We don't need to declare a variable before running lsof, because $(command) substitutes its output. Also, awk can filter by regexp itself instead of calling grep. I changed the messy 0.0000xxxx output to a more readable form by dropping the fractional part and files smaller than 1 MB.
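A quick check (not part of the original) that the exponent form matches the usual divisor:

awk 'BEGIN{print 2^20, 1024*1024}'    # prints "1048576 1048576"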

for i in `mailq | awk '$6 ~ /^frozen$/ {print $3}'`; do exim -Mrm $i; done
2010-01-13 21:28:45
User: rjamestaylor
Functions: awk
Tags: bash awk exim
-2

Although Exim will purge frozen (undeliverable) messages over time, the command "exim -Mrm #id#" where #id# is a particular message ID will purge a message immediately. Being lazy, I don't want to type the command for each frozen message, so I wrote the one-liner to do it for me.
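For a single message, the manual form looks like this (the message ID here is hypothetical):

exim -Mrm 1a2B3c-000001-XY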

tail -f /var/log/apache2/access.log | awk -W interactive '!x[$1]++ {print $1}'
2010-01-12 15:23:03
User: pykler
Functions: awk tail
1

Prints the unique IP Addresses as they arrive from an Apache `access.log` file.

The '-W interactive' tells awk to start writing to stdout immediately and not buffer the output.

This command builds on the uniq lines without sorting command (http://www.commandlinefu.com/commands/view/4389/remove-duplicate-entries-in-a-file-without-sorting.)
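The underlying idiom for printing unique lines without sorting is simply:

awk '!x[$0]++' file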

grep -e `date +%Y-%m-%d` /var/log/dpkg.log | awk '/install / {print $4}' | uniq | xargs apt-get -y remove
ps -ef | grep user | awk '{print $2}' | while read pid; do echo $pid ; pfiles $pid| grep portnum; done
2010-01-11 12:34:51
User: sharfah
Functions: awk echo grep ps read
0

My old Solaris server does not have lsof, so I have to use pfiles.

awk 'NR%3==1' file
2010-01-08 18:52:24
Functions: awk
4

A better way to show lines 3n + 1 of a file.

awk '{print NR,$0}'
awk '{if (NR % 3 == 1) print $0}' foo > foo_every3_position1; awk '{if (NR % 3 == 2) print $0}' foo > foo_every3_position2; awk '{if (NR % 3 == 0) print $0}' foo > foo_every3_position3
2010-01-08 04:20:06
User: oshazard
Functions: awk
-1

Extract every nth line with awk.

The generic form is:

awk '{if (NR % LINE == POSITION) print $0}' foo

where the "last" position is always 0 (zero).

awk 'BEGIN{while (a++<50) s=s "-"; print s}'
2010-01-06 16:16:35
User: SuperFly
Functions: awk
0

Change the number 50 to whatever number of characters you want. Change the character inside the double quotes to whatever you want printed.
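For example, a line of 20 equals signs:

awk 'BEGIN{while (a++<20) s=s "="; print s}'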

for x in `ptree | awk '{print $1}'`; do pfiles $x | grep ${PORT} > /dev/null 2>&1; if [ x"$?" == "x0" ]; then ps -ef | grep $x | grep -v grep; fi; done 2> /dev/null
2010-01-05 17:02:23
User: bpfx
Functions: awk grep ps
0

Can use lsof, but since it's not part of the base OS, it's not always available.

pfiles -F /proc/* 2>/dev/null | awk '/^[0-9]+/{proc=$1};/[s]ockname: AF_INET/{print proc $0}'
2010-01-04 21:21:32
User: hunternet
Functions: awk
1

Command line to find out which PID has a socket open on a given IP and port.

Only useful under Solaris.

(bzcat BZIP2_FILES && cat TEXT_FILES) | grep -E "Invalid user|PAM" | grep -o -E "from .+" | awk '{print $2}' | sort | uniq >> /etc/hosts.deny
2010-01-03 04:41:51
User: jayhawkbabe
Functions: awk cat grep sort uniq
3

Searches all log files (including archived bzip2 files) for invalid user and PAM authentication errors, both of which are indicative of brute-force attempts at logging into the computer. A list of all unique IP addresses and domain names is appended to hosts.deny. The command (and grep error messages) will work on Mac OS X 10.6; small adjustments may be needed for other OSs.
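To review the list before touching /etc/hosts.deny, the same pipeline can be run without the final append (a cautious variant, not part of the original):

(bzcat BZIP2_FILES && cat TEXT_FILES) | grep -E "Invalid user|PAM" | grep -o -E "from .+" | awk '{print $2}' | sort | uniq | less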

BACKUP_FILE_SIZE=$(ls -l "${BACKUP_FILE}" | awk '{print $5}'); if [ "$BACKUP_FILE_SIZE" -le 20 ]; then echo "it's empty"; else echo "it's not empty"; fi
2009-12-29 08:34:37
User: Redrocket
Functions: awk echo ls
-2

If you gzip an empty file it becomes 20 bytes. Some backup checks I do look at whether the file is greater than zero size (the -s flag), but that is no good here. I'm sure someone has a better check than me for this? There is no check to see if the file exists before checking its size.
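One possible tightening, assuming GNU stat is available (an editor's sketch, not the original poster's check), tests existence and size together:

[ -f "$BACKUP_FILE" ] && [ "$(stat -c %s "$BACKUP_FILE")" -gt 20 ] && echo "not empty" || echo "empty or missing"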

calc() { awk 'BEGIN { OFMT="%f"; print '"$*"'; exit}'; }
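Usage example (OFMT="%f" gives six decimal places):

calc '2*8 + 3.5'    # prints 19.500000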
class=ExampleClass; path=src; for constant in `grep ' const ' $class.php | awk '{print $2;}'`; do grep -r "$class::$constant" $path; done