
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands tagged perl - 164 results
perl -e '@F = `ls -1`;while (<@F>){@T = stat($_);print "$_ = " . localtime($T[8]) . "\n";}'
2010-05-20 15:02:51
User: hckhckhck
Functions: perl
0

The Solaris 'ls' command does not have a nice '--full-time' argument to keep showing the time once a file is more than a year old, so I spat this out quickly. It hates spaces in file names.
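
A possible variant that avoids shelling out to ls, and therefore copes with spaces in file names (a sketch that reads mtime, field 9 of stat; not verified on Solaris):

perl -e 'opendir(my $d, "."); for (sort readdir $d) { next if /^\.\.?$/; print "$_ = " . localtime((stat($_))[9]) . "\n"; }'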

perl -MDigest::SHA -e 'print substr( Digest::SHA::sha256_base64( time() ), 0, $ARGV[0] ) . "\n"' <length>
2010-04-30 21:45:46
User: udog
Functions: perl
1

Of course you will have to install Digest::SHA and perl before this will work :)

Maximum length is 43 for SHA256. If you need more, use SHA512 or the hexadecimal form: sha256_hex()
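
For longer output, the same pattern works with the SHA-512 or hexadecimal functions that Digest::SHA also exports (sketches, again assuming the module is installed):

perl -MDigest::SHA -e 'print substr( Digest::SHA::sha512_base64( time() ), 0, $ARGV[0] ) . "\n"' <length>

perl -MDigest::SHA -e 'print substr( Digest::SHA::sha256_hex( time() ), 0, $ARGV[0] ) . "\n"' <length>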

echo $hex | perl -pe 's/(..)/chr(hex($1))/ge'
perl -i -pe 's/\r/\n/g' file
perldoc perllocal
2010-04-14 10:57:56
User: octopus
Tags: version perl
3

This command gives you detailed information about the installed Perl modules, i.e. installed path, link type, version, files, etc.

curl -s http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}|xmlstarlet sel -E utf-8 -t -m //forecast/txt_forecast/forecastday -v fcttext -n
perl -MStatistics::Descriptive -alne 'my $stat = Statistics::Descriptive::Full->new; $stat->add_data(@F[1..4]); print $stat->variance' filename
2010-04-02 21:16:12
User: alperyilmaz
Functions: perl
1

In this example, the file contains five columns, where the first column is text. The variance is calculated for columns 2-5 using the Perl module Statistics::Descriptive. There are many more statistical functions available in the module.
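
Other methods of Statistics::Descriptive::Full can be swapped in the same way; for example, a sketch printing the mean and standard deviation of the same columns:

perl -MStatistics::Descriptive -alne 'my $stat = Statistics::Descriptive::Full->new; $stat->add_data(@F[1..4]); print $stat->mean, " ", $stat->standard_deviation' filename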

perl -pi -e 's/\r\n?/\n/g'
2010-03-18 17:48:16
User: putnamhill
Functions: perl
Tags: perl
3

This method will also convert mac line endings.
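
To keep a copy of the original file, the -i switch accepts a backup suffix, e.g.:

perl -pi.bak -e 's/\r\n?/\n/g' file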

perl -ne 'BEGIN{undef $/}; print "$ARGV\t$.\t$1\n" if m/(first line.*\n.*second line)/mg'
2010-03-18 15:46:10
User: hfs
Functions: perl
Tags: perl grep
7

Using perl you can search for patterns spanning several lines, a thing that grep can't do. Append the list of files to the above command or pipe a file through it, just as with regular grep. If you add the 's' modifier to the regex, the dot '.' also matches line endings, which is useful if you don't know how many lines lie between the parts of your pattern. Change '*' to '*?' to make it non-greedy, that is, match only as few characters as possible.

See also http://www.commandlinefu.com/commands/view/1764/display-a-block-of-text-with-awk to do a similar thing with awk.

Edit: The undef has to be put in a begin-block, or a match in the first line would not be found.
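
For example, with the 's' modifier the dot spans line endings, so any number of lines may sit between the two parts of the pattern (a sketch derived from the command above; "first line" and "second line" are placeholders):

perl -ne 'BEGIN{undef $/}; print "$ARGV\t$1\n" if m/(first line.*?second line)/sg'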

echo "$url" | perl -MURI::Escape -ne 'chomp;print uri_escape($_),"\n"'
2010-02-13 00:44:48
User: eightmillion
Functions: echo perl
Tags: perl
5

Converts reserved characters in a URI to their percent encoded counterparts.

Alternate python version:

echo "$url" | python -c 'import sys,urllib;print urllib.quote(sys.stdin.read().strip())'
weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "%s: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}
2010-02-10 01:23:39
User: eightmillion
Functions: perl
7

This shell function grabs the weather forecast for the next 24 to 48 hours from weatherunderground.com. Replace <YOURZIPORLOCATION> with your zip code or your "city, state" or "city, country"; calling the function without any arguments then returns the weather for that location. Calling the function with a zip code or place name as an argument returns the weather for that location instead of your default.

To add a bit of color formatting to the output, use the following instead:

weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "\x1B[0;34m%s\x1B[0m: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}

Requires: perl, curl
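
Once the function is defined in your shell (and <YOURZIPORLOCATION> filled in), call it with no arguments for your default location, or pass a place name; the example location below is illustrative:

weather "London, UK"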

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/^<li>//g'|nl|/usr/bin/perl -MHTML::Entities -pe 'decode_entities($_)';}
2010-01-30 13:08:03
User: gthb
Functions: grep sed
7

This version works on Mac (avoids grep -P, adding a sed step instead, and invokes /usr/bin/perl with full path in case you have another one installed).

Still requires that you install the Perl module HTML::Entities; here's how: http://www.perlmonks.org/?node_id=640489

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
2010-01-29 05:01:11
User: eightmillion
Functions: grep perl
18

This function takes a word or a phrase as arguments and then fetches definitions using Google's "define" syntax. The "nl" and perl portion isn't strictly necessary. It just makes the output a bit more readable, but this also works:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Po '(?<=<li>)[^<]+';}

If your version of grep doesn't have perl compatible regex support, then you can use this version:

define(){ local y="$@";curl -sA"Opera" "http://www.google.com/search?q=define:${y// /+}"|grep -Eo '<li>[^<]+'|sed 's/<li>//g'|nl|perl -MHTML::Entities -pe 'decode_entities($_)' 2>/dev/null;}
podwebserver& sleep 2; elinks 'http://127.0.0.1:8020'
2010-01-27 10:57:34
User: vlan7
Functions: sleep
5

Prerequisites: the module Pod::Webserver must be installed. You can install it by typing:

sudo perl -MCPAN -e 'install Pod::Webserver'

You can replace elinks with your favourite browser. For Firefox:

podwebserver& sleep 2; firefox -remote 'openurl( http://127.0.0.1:8020/, new-tab )'

If you have Firefox open, this will pop up the index page in a new tab.
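
On a desktop with xdg-utils available, you could also let the system choose the default browser (a sketch along the same lines):

podwebserver& sleep 2; xdg-open 'http://127.0.0.1:8020'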

grepp() { [ $# -eq 1 ] && perl -00ne "print if /$1/i" || perl -00ne "print if /$1/i" < "$2";}
2010-01-12 04:30:15
User: eightmillion
Functions: perl
13

This is a command that I find myself using all the time. It works like regular grep, but returns the paragraph containing the search pattern instead of just the line. It operates on files or standard input.

grepp <PATTERN> <FILE>

or

<SOMECOMMAND> | grepp <PATTERN>
perl -wl -e '@f=<>; for $i (0 .. $#f) { $r=int rand ($i+1); @f[$i, $r]=@f[$r,$i] if ($i!=$r); } chomp @f; print join $/, @f;' try.txt
perl -lne 'print for /url":"\K[^"]+/g' $(ls -t ~/.mozilla/firefox/*/sessionstore.js | sed q)
2009-12-14 00:51:54
User: sputnick
Functions: ls perl sed
0

If you want all the URLs from all the sessions, you can use:

perl -lne 'print for /url":"\K[^"]+/g' ~/.mozilla/firefox/*/sessionstore.js

Thanks to tybalt89 (idea of the "for" statement).

For Perl purists, there are the JSON and File::Slurp modules, but those are not installed by default.
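
For the record, a rough, untested sketch of that module-based approach (assuming the sessionstore layout of windows containing tabs containing entries, and reading only the first matching profile):

perl -MJSON -MFile::Slurp -le 'my $s = decode_json(scalar read_file(shift)); print $_->{url} for map { @{$_->{entries}} } map { @{$_->{tabs}} } @{$s->{windows}};' ~/.mozilla/firefox/*/sessionstore.js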

x=IO::Interface::Simple; perl -e 'use '$x';' &>/dev/null || cpan -i "$x"; perl -e 'use '$x'; my $ip='$x'->new($ARGV[0]); print $ip->address,$/;' <INTERFACE>
2009-12-13 02:23:40
User: sputnick
Functions: perl
1

Please comment to say whether this works for you or not...

If you have already run that snippet, or you know you already have the IO::Interface::Simple Perl module, you can type only the last command:

perl -e 'use IO::Interface::Simple; my $ip=IO::Interface::Simple->new($ARGV[0]); print $ip->address,$/;' <INTERFACE>

(The first perl command will install the module if it's not there already...)

perl -e 'use Date::Calc qw(Today Week_Number); $weekn = Week_Number(Today); print "$weekn\n"'
cho "(Something like http://foo.com/blah_blah)" | awk '{for(i=1;i<=NF;i++){if($i~/^(http|ftp):\/\//)print $i}}'
2009-11-28 03:31:41
Functions: awk
-1

Doesn't have to be that complicated.

echo "(Something like http://foo.com/blah_blah)" | grep -oP "\b(([\w-]+://?|www[.])[^\s()<>]+(?:\([\w\d]+\)|([^[:punct:]\s]|/)))"
perl -pe 's/%([0-9a-f]{2})/sprintf("%s", pack("H2",$1))/eig'
utime(){ perl -e "print localtime($1).\"\n\"";}
2009-11-06 12:58:10
User: MoHaG
Functions: perl
1

A shell function using perl to easily convert Unix-time to text.

Put it in your ~/.bashrc or equivalent.

Tested with Bourne shell, bash and zsh on Linux and Solaris, using Perl 5.6 and higher.

(Does not require GNU date like some other commands)
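
For example (the epoch value is arbitrary):

utime 1234567890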

dpigs
perl -ne '$pkg=$1 if m/^Package: (.*)/; print "$1\t$pkg\n" if m/^Installed-Size: (.*)/;' < /var/lib/dpkg/status | sort -rn | less
2009-10-19 12:55:59
User: hfs
Functions: perl sort
0

List packages and their disk usage in decreasing order. This uses the "Installed-Size" from the package metadata. It may differ from the actual used space, because e.g. data files (think of databases) or log files may take additional space.
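
A similar listing can be produced without parsing the status file by hand, by letting dpkg-query format the fields (a sketch using its standard format strings):

dpkg-query -W -f='${Installed-Size}\t${Package}\n' | sort -rn | less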