What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).


News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - Commands using perl - 342 results
find . -type f -exec grep -l XXX {} \;|tee /tmp/fileschanged|xargs perl -pi.bak -e 's/XXX/YYY/g'
2009-02-16 02:55:23
User: drossman
Functions: find grep perl tee xargs
6

Find all files that contain the string XXX, change every XXX to YYY, keep a .bak backup copy of each modified file, and save the list of changed files to /tmp/fileschanged.
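
If the replacement turns out to be wrong, the .bak copies can be put back using the saved list; a minimal sketch, assuming /tmp/fileschanged is unchanged since the run:

while read -r f; do mv "$f.bak" "$f"; done < /tmp/fileschanged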

cat file.php | perl -p -e 's/(\$|->)(str|arr|obj|int|flt|boo|bool|mix|res)([A-Z])/$1\L$3/g'
2009-02-10 14:37:12
User: root
Functions: cat perl
0

This removes the type prefix used in Hungarian notation (very bad) from PHP variables. E.g. variables of the form $intDays, $fltPrice, $arrItems, $objLogger become $days, $price, $items, $logger.
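
To preview what the substitution does, it can be fed a sample line on stdin (a made-up snippet, not part of the original command):

echo '$objLogger->strMessage;' | perl -p -e 's/(\$|->)(str|arr|obj|int|flt|boo|bool|mix|res)([A-Z])/$1\L$3/g'

which prints: $logger->message;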

perl -e "''=~('(?{'.('-^@.]|(;,@/{}/),[\\\$['^'],)@)[\`^@,@[*@[@?}.|').'})')"
find . -name "*.jpg" | perl -ne'chomp; $name = $_; $quote = chr(39); s/[$quote\\!]/_/ ; print "mv \"$name\" \"$_\"\n"'
perl -e "use SOAP::Lite"
2009-02-06 15:26:37
User: leprasmurf
Functions: perl
0

Quick command to check whether a Perl module is installed on your server.
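
The check is silent and exits 0 when the module loads, and dies with a "Can't locate ..." error otherwise, so the exit status can drive a message; an equivalent short form as a sketch:

perl -MSOAP::Lite -e1 && echo "SOAP::Lite installed" || echo "SOAP::Lite missing"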

perl -pe 's/.+;//' ~/.zsh_history | sort | uniq -c | sort -r|head -10
perl -le 'print join ", ", map { chomp; $_ } <>'
2009-02-06 12:50:43
User: jozef
Functions: join perl
-1

Joins multiple lines into a single line of comma-separated values. For example, if you have email addresses one per line (copied and pasted from a spreadsheet), it outputs one line with comma-separated addresses ready to paste into an email client.
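
A quick demonstration with two made-up addresses:

printf 'alice@example.com\nbob@example.com\n' | perl -le 'print join ", ", map { chomp; $_ } <>'

which outputs: alice@example.com, bob@example.com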

perl -MHTML::Entities -ne 'print encode_entities($_)' /tmp/subor.txt
2009-02-06 12:44:24
User: jozef
Functions: perl
1

Encodes HTML entities from input (a file or stdin), so it's possible to paste the result directly into a blog post or HTML source file.
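
For example (a made-up input line, not from the original post):

echo '<a href="x.html">Fish & Chips</a>' | perl -MHTML::Entities -ne 'print encode_entities($_)'

prints: &lt;a href=&quot;x.html&quot;&gt;Fish &amp; Chips&lt;/a&gt;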

grep -r -l xxxxx . | xargs perl -i -pe "s/xxxxx/yyyyy/g"
2009-02-06 08:18:50
User: hassylin
Functions: grep perl xargs
-1

This first finds, recursively, all files which contain the word xxxxx, then replaces xxxxx with yyyyy in those files (a variant that copes with spaces in filenames is sketched after the use cases below).

Use case:

- Web site domain change

- Function name change throughout a program
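
If some of the matched filenames contain spaces, a NUL-separated variant is safer; a sketch assuming GNU grep and xargs:

grep -rlZ xxxxx . | xargs -0 perl -i -pe "s/xxxxx/yyyyy/g"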

find . -name "*.txt" | xargs perl -pi -e 's/old/new/g'
2009-02-06 00:28:03
User: neztach
Functions: find perl xargs
6

The syntax follows normal Perl regular-expression syntax.

Example: let's say you have a directory (with subdirs) containing some 4000 .php files.

All of these files were generated by a script, but uh-oh, there was a typo!

If the typo is "let's go jome!" but you meant it to say "let's go home!":

find . -name "*.php" | xargs perl -pi -e "s/let\'s\ go\ jome\!/let\'s\ go\ home\!/g"

all better :)

multiline: find . -name "*.php" | xargs perl -p0777i -e 's/knownline1\nknownline2/replaced/m'

indiscriminate line replace: find ./ -name '*.php' | xargs perl -pi -e 's/\".*$\"/\new\ line\ content/g'

perl -pi -e 's/THIS/THAT/g' fileglob*
2009-02-05 19:19:52
User: elofland
Functions: perl
2

Changes THIS to THAT in all files matching fileglob*, without using temporary files.

perl -e '$b="bork"; while(<STDIN>){$l=`$_ 2>&1`; $l=~s/[A-Za-z]+/$b/g; print "$l$b\@$b:\$ ";}'
2009-02-05 18:33:40
User: fonik
Functions: perl
3

Bork, bork, bork!

To keep it short, the first terminal line doesn't show a prompt.

perl -pi -e 's/localhost/replacementhost/g' *.php
perl -i -pe "s/old/new/g" *
2009-02-05 12:07:27
User: pandres
Functions: perl
-1

Replaces every occurrence of 'old' with 'new' in all the files specified. Directly after the 'i' you can put a '~' (or any other suffix) to create a backup of each modified file, named as the original plus that suffix.
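
For example, with a hypothetical '.orig' suffix, each original is kept alongside the edited file:

perl -i.orig -pe "s/old/new/g" *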

function sshdel { perl -i -n -e "print unless (\$. == $1)" ~/.ssh/known_hosts; }
2009-02-03 16:20:50
User: xsawyerx
Functions: perl
-1

Sometimes you get conflicts when using SSH (a host changed its IP, or an IP now belongs to a different machine) and you need to remove the offending line from known_hosts. This function makes that much easier than editing the file by hand.
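
Usage sketch: if SSH warns about an offending key on, say, line 42 of known_hosts, that line can be dropped with:

sshdel 42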

perl -pi.bk -e's/foo/bar/g' file1 file2 fileN
2009-01-29 09:51:11
User: xsawyerx
Functions: perl
10

The addition of ".bk" to the regular "pie" idiom makes perl create a backup of every file, with the extension ".bk", in case it b0rks something and you want the originals back.
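
If it does b0rk something, the originals can be put back from the .bk copies; a minimal sketch assuming the backups still sit next to the edited files:

for f in *.bk; do mv -- "$f" "${f%.bk}"; done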

perl -pi -e's/foo/bar/g' file1 file2 fileN
2009-01-29 09:47:01
User: xsawyerx
Functions: perl
0

The "g" at the end is for global, meaning replace all occurrences and not just the first one.