What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands using tail

Terminal - Commands using tail - 231 results
wget -qO - --post-data "data[Row][clear]=text" http://md5-encryption.com | grep -A1 "Md5 encrypted state" | tail -n1 | cut -d '"' -f3 | sed 's/>//g; s/<\/b//g'
2011-10-13 03:44:48
User: samhagin
Functions: cut grep sed tail wget
Tags: md5
0

Get the MD5 hash of any text: replace "text" in the POST data with the string you want to convert to MD5.
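
A local-only sketch of the same idea, assuming GNU coreutils is available: hash the string on your own machine instead of posting it to md5-encryption.com.

printf '%s' "text" | md5sum | awk '{print $1}'

Here printf '%s' avoids hashing a trailing newline.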

sudo dpkg -i `ls -tr *.deb | tail -n4`
2011-10-09 14:20:11
User: _john
Functions: sudo tail
0

After a kernel build with make deb-pkg, I like to install the 4 newest packages in the directory. Beware: there might be fewer for you...
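
To preview which packages the command substitution will pick up before actually installing anything (the same pattern, just without dpkg):

ls -tr *.deb | tail -n4

Running the sudo dpkg -i line afterwards installs exactly that list.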

tail -n +<N> <file> | head -n 1
2011-09-30 08:30:30
User: qweqq
Functions: head tail
-5

tail is much faster than sed or awk here because it doesn't evaluate regular expressions.
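
For example, to print line 13 of /etc/services with this approach:

tail -n +13 /etc/services | head -n 1

tail -n +13 starts output at line 13, and head -n 1 keeps only that line.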

diff -U 9999 file_a file_b | tail -n +3 | grep -P "^(\ Header|\-|\+)"
2011-09-21 21:33:40
User: nnutter
Functions: diff grep tail
Tags: diff
0

Maybe very limited in its applicability but could be of use at times.

tail -f LOGFILE | awk '{system("say \"" $0 "\"");}'
2011-09-16 06:20:06
User: tamouse
Functions: awk tail
Tags: awk tail say
-1

Like #9295, but awkish instead of perlish.
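
On Linux, where say is usually not available, a rough equivalent of the same idea can be sketched with espeak (assuming the espeak package is installed):

tail -f LOGFILE | while read -r line; do espeak "$line"; done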

tail -f LOGFILE | perl -ne '`say "$_"`;'
2011-09-16 05:33:22
User: tamouse
Functions: perl tail
Tags: perl tail say
0

say only processes a complete file at EOF, so following a file isn't possible directly. This is a quick and dirty Perl one-liner to feed each line from the tail -f to say. Yes, it's expensive to launch a new process for each line.

This little ditty was prompted by a discussion on how horrible it is to use VoiceOver on ncurses programs such as irssi.

head -n 13 /etc/services | tail -n 1
2011-09-15 19:39:49
User: muonIT
Functions: head tail
Tags: goto
-5

Silly approach, but easy to remember...
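
An alternative that is just as easy to remember is sed's line addressing (not part of the original tip):

sed -n '13p' /etc/services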

tail -f ~/.bash_history
2011-09-15 19:35:09
User: totti
Functions: tail
Tags: watch
-1

Changes are displayed as they are written to the file.

Press Ctrl+C to exit.

ls -trF | grep -v \/ | tail -n 1
2011-09-14 20:05:37
User: mrpollo
Functions: grep ls tail
Tags: find stat mtime
-1

Sort by time and reverse to get ascending order, display a type marker next to each file name (-F), filter out directories, and select only the last result.

fn=$(find . -type f -printf "%T@\t%p\n"|sort -n|tail -1|cut -f2); echo $(date -r "$fn") "$fn"
sudo netstat|head -n2|tail -n1 && sudo netstat -a|grep udp && echo && sudo netstat|head -n2|tail -n1 && sudo netstat -a|grep tcp
search="whatyouwant";data=$(grep "$search" * -R --exclude-dir=.svn -B2 -A2);for((i=$(echo "$data" | wc -l);$i>0;i=$(($i-6)) )); do clear;echo "$data"| tail -n $i | head -n 5; read;done
2011-08-29 18:14:16
User: Juluan
Functions: echo grep head tail wc
-2

Not perfect, but working (at least on the project I wrote it for ;) )

Specify what you want to search for in the search variable; the command then greps the folder and shows one result at a time (a simpler less-based variant is sketched after this entry).

Press Enter to show the next result.

It can misbehave on results in the first few lines, and it could be improved to allow going back.

But in my case (a large project where I was checking that a value wasn't used without its corresponding const; the value was "1000", so there were a lot of results ...) it was perfect ;)
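
A simpler variant of the same idea, paging through the matches with less instead of a read loop (an alternative, not the author's approach):

search="whatyouwant"; grep -R -B2 -A2 --exclude-dir=.svn "$search" . | less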

mount |tail -1 | less -p "/dev/[^ ]*"
cd $(ls -ltr|grep ^d|head -1|sed 's:.*\ ::g'|tail -1)
2011-08-10 03:39:35
Functions: cd grep head ls sed tail
-1

Replace head -1 with head -n, where n is the n-th item you want to go to.

Replace head with tail to go to the last dir you listed.

You can also change the parameters of ls.
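
For example, following the description above, entering the third directory in that time-sorted listing would look like this (it shares the original's limitation with directory names containing spaces):

cd $(ls -ltr | grep ^d | head -3 | sed 's:.*\ ::g' | tail -1)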

netstat -an |grep ":80" |awk '{print $5}' | sed s/::ffff://g | cut -d: -f1 |sort |uniq -c |sort -n | tail -1000 | grep -v "0.0.0.0"
NAME=`wget --quiet URL -O - | grep util-vserver | tail -n 1 | sed 's|</a>.*||;s/.*>//'`; wget URL$UTILVSERVER;
ps aux | awk {'sum+=$3;print sum'} | tail -n 1
tail -f /var/log/squid/access.log | ccze -CA
tail -n0 -f /var/log/messages | while read line; do notify-send "System Message" "$line"; done
2011-07-11 22:33:24
User: hukketto
Functions: read tail
Tags: notify-send
1

It will pop up a message for each new entry in /var/log/messages.

Found on the notify-send howto page on ubuntuforums.org.

Posted here only because it is one of my favourites.
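
A small variation on the same idea, if you only want popups for lines matching a pattern (a sketch, assuming GNU grep for --line-buffered):

tail -n0 -f /var/log/messages | grep --line-buffered -i error | while read line; do notify-send "System Message" "$line"; done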

tail -f /var/log/squid/access.log | perl -p -e 's/^([0-9]*)/"[".localtime($1)."]"/e'
tail -f /var/log/logfile|perl -e 'while (<>) {$l++;if (time > $e) {$e=time;print "$l\n";$l=0}}'
2011-06-21 10:28:26
User: madsen
Functions: perl tail time
Tags: perl tail
2

Uses tail to follow the file and standard Perl to count and print the lines per second (lps) as lines are written to the logfile.
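
The same lines-per-second counter can be sketched in GNU awk, assuming gawk for its systime() function:

tail -f /var/log/logfile | gawk '{ l++; if (systime() > e) { e = systime(); print l; l = 0 } }'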

head -n1 sample.txt | tail -n1
2011-06-14 17:45:04
User: gtcom
Functions: head tail
Tags: tail HEAD
-1

You can actually do the same thing with a combination of head and tail. For example, in a file of four lines, if you just want the middle two lines:

head -n3 sample.txt | tail -n2

Line 1 --\
Line 2    } These three lines are selected by head -n3,
Line 3 --/  this feeds the following filtered list to tail:
Line 4

Line 1
Line 2 \___ These two lines are filtered by tail -n2,
Line 3 /    this results in:

Line 2
Line 3

being printed to screen (or wherever you redirect it).
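
The same middle slice can also be taken in a single command with sed's line addressing (an alternative, not part of the tip above):

sed -n '2,3p' sample.txt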

history | tail -(n+1) | head -(n) | sed 's/^[0-9 ]\{7\}//' >> ~/script.sh
2011-06-08 13:40:58
Functions: head sed tail
1

Uses history to get the last n+1 entries (since this command itself will appear as the most recent), drops this command with head, strips the leading line numbers with sed, and appends the remaining commands to a file.
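
As a concrete instance, with n = 5 the command becomes:

history | tail -6 | head -5 | sed 's/^[0-9 ]\{7\}//' >> ~/script.sh

tail -6 grabs the last six history entries, head -5 drops the entry for this command itself, and sed removes the leading history numbers.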

tail /var/log/auth.log -n 100
ls -atr /home/reports/*.csv -o --time-sty=+%s | tail -1 | awk '{print systime()-$5}'