Commands using perl (369)

  • This shell function grabs the weather forecast for the next 24 to 48 hours from Weather Underground (wunderground.com). Replace <YOURZIPORLOCATION> with your zip code, your "city, state" or your "city, country"; calling the function without arguments then returns the weather for that location. Calling it with a zip code or place name as an argument returns the weather for that location instead of your default. To add a bit of color formatting to the output, use the following instead: weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "\x1B[0;34m%s\x1B[0m: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';} Requires: perl, curl. A usage sketch follows this entry.


    7
    weather(){ curl -s "http://api.wunderground.com/auto/wui/geo/ForecastXML/index.xml?query=${@:-<YOURZIPORLOCATION>}"|perl -ne '/<title>([^<]+)/&&printf "%s: ",$1;/<fcttext>([^<]+)/&&print $1,"\n"';}
    eightmillion · 2010-02-10 01:23:39 16
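    A minimal usage sketch, assuming the function above has been defined in your shell, the placeholder filled in, and that the old wunderground auto/wui XML endpoint still answers (it may since have been retired, so treat this purely as a sketch):
    weather                   # forecast for your default location
    weather 90210             # forecast for a specific zip code
    weather "Portland, OR"    # forecast for a named place (illustrative location)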
  • Converts reserved characters in a URI to their percent-encoded counterparts. An alternate (Python 2) version: echo "$url" | python -c 'import sys,urllib;print urllib.quote(sys.stdin.read().strip())' A usage example follows this entry.


    7
    echo "$url" | perl -MURI::Escape -ne 'chomp;print uri_escape($_),"\n"'
    eightmillion · 2010-02-13 00:44:48 32
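    A quick usage example; uri_escape's default leaves only unreserved characters untouched, so the reserved ones come back percent-encoded:
    echo 'foo bar/baz?x=1&y=2' | perl -MURI::Escape -ne 'chomp;print uri_escape($_),"\n"'
    # prints: foo%20bar%2Fbaz%3Fx%3D1%26y%3D2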
  • Using perl you can search for patterns spanning several lines, something grep can't do. Append the list of files to the above command or pipe a file through it, just as with regular grep. If you add the 's' modifier to the regex, the dot '.' also matches line endings, which is useful if you don't know how many lines lie between the parts of your pattern. Change '*' to '*?' to make it non-greedy, that is, match as few characters as possible. See also http://www.commandlinefu.com/commands/view/1764/display-a-block-of-text-with-awk to do a similar thing with awk. Edit: The undef has to be put in a BEGIN block, or a match in the first line would not be found. A worked example follows this entry.


    7
    perl -ne 'BEGIN{undef $/}; print "$ARGV\t$.\t$1\n" if m/(first line.*\n.*second line)/mg'
    hfs · 2010-03-18 15:46:10 33
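    A worked example against a throwaway test file (the path is just an illustration). Because the whole file is slurped, $. stays at 1, and the capture keeps the embedded newline, so the match prints across two lines:
    printf 'first line of text\nsecond line of text\nthird line\n' > /tmp/twolines.txt
    perl -ne 'BEGIN{undef $/}; print "$ARGV\t$.\t$1\n" if m/(first line.*\n.*second line)/mg' /tmp/twolines.txt
    # prints (tab-separated): /tmp/twolines.txt   1   first line of text
    #                         second line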
  • Works only if modules are installed "the right way", i.e. with .packlist files for ExtUtils::Installed to find. A version-printing variant follows this entry.


    7
    perl -MExtUtils::Installed -e '$inst = ExtUtils::Installed->new(); @modules = $inst->modules(); print join("\n", @modules);'
    braak · 2010-07-20 15:47:32 9
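    A sketch of a variation under the same assumption (modules installed with .packlist files) that also reports each module's version; ExtUtils::Installed's version() returns undef when it cannot determine one, hence the fallback:
    perl -MExtUtils::Installed -e '$inst = ExtUtils::Installed->new(); printf "%s %s\n", $_, $inst->version($_) || "?" for $inst->modules();'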
  • The syntax is a regular Perl substitution expression. Example: say you have a directory (with subdirs) containing some 4000 .php files, all generated by a script, but uh-oh, there was a typo! If the files say "let's go jome!" but you meant "let's go home!": find . -name "*.php" | xargs perl -pi -e "s/let\'s\ go\ jome\!/let\'s\ go\ home\!/g" all better :) Multiline: find . -name "*.php" | xargs perl -p0777i -e 's/knownline1\nknownline2/replaced/m' Indiscriminate line replace: find ./ -name '*.php' | xargs perl -pi -e 's/\".*$\"/\new\ line\ content/g' A whitespace-safe variant follows this entry.


    6
    find . -name "*.txt" | xargs perl -pi -e 's/old/new/g'
    neztach · 2009-02-06 00:28:03 28
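    If any file names contain whitespace, the plain find | xargs pipeline will split them apart; with GNU (or modern BSD) find and xargs, a null-separated variant is safer:
    find . -name "*.txt" -print0 | xargs -0 perl -pi -e 's/old/new/g'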
  • Find all files that contain the string XXX, change it to YYY, make a backup copy of each changed file, and save the list of changed files in /tmp/fileschanged. A restore sketch follows this entry.


    6
    find . -type f -exec grep -l XXX {} \;|tee /tmp/fileschanged|xargs perl -pi.bak -e 's/XXX/YYY/g'
    drossman · 2009-02-16 02:55:23 59
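    A follow-up sketch, assuming the command above has just been run: it restores every changed file from the .bak copy that perl -pi.bak left behind, driven by the saved list.
    while read -r f; do mv "$f.bak" "$f"; done < /tmp/fileschanged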

  • 6
    nmap -sS -O -oX /tmp/nmap.xml 10.1.1.0/24 -v -v && perl nmap2nagios.pl -v -r /tmp/nmap.xml -o /etc/nagios/10net.cfg
    scubacuda · 2009-02-19 18:42:37 8
  • Finds all directories containing more than 99MB of files and prints them in human-readable format. The directory sizes do not include their subdirectories (du -S), so this is very useful for finding any single directory with a lot of large files. An alternative using sort follows this entry.


    6
    du -hS / | perl -ne '(m/\d{3,}M\s+\S/ || m/G\s+\S/) && print'
    Alioth · 2009-03-25 18:06:53 5
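    On systems with GNU coreutils, sort -h understands the human-readable suffixes, so an alternative is to rank every directory instead of filtering on a size pattern:
    du -hS / | sort -h | tail -n 20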

  • 6
    perl -lne '$l{$_}=length;END{for(sort{$l{$a}<=>$l{$b}}keys %l){print}}' < /usr/share/dict/words | tail
    grokskookum · 2009-09-10 14:49:03 3
  • Cool but useless. A sample run follows this entry.


    6
    perl -nle 'printf "%0*v8b\n"," ",$_;'
    forcefsck · 2011-01-21 09:20:54 6
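    What it produces, for the curious: each byte of the input rendered as zero-padded 8-bit binary, space-separated.
    echo hi | perl -nle 'printf "%0*v8b\n"," ",$_;'
    # prints: 01101000 01101001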
  • I use this in my bashrc to expand hosts defined in ~/.ssh/config: function _ssh_completion() { perl -ne 'print "$1 " if /^Host (.+)$/' ~/.ssh/config } complete -W "$(_ssh_completion)" ssh Here's a great article on how to set up your own ~/.ssh/config: http://blogs.perl.org/users/smylers/2011/08/ssh-productivity-tips.html A wildcard-aware variant follows this entry.


    6
    perl -ne 'print "$1 " if /^Host (.+)$/' ~/.ssh/config
    bashrc · 2011-08-21 14:51:20 7
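    A variant sketch of the same idea: ~/.ssh/config allows several names on one Host line as well as wildcard patterns such as "Host *"; this splits multi-name lines and drops anything containing a wildcard so it doesn't pollute the completion list.
    perl -ne 'print join(" ", grep { !/[*?]/ } split), " " if s/^Host\s+//' ~/.ssh/config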

  • 5
    u=`curl -d 'dl.start=Free' $(curl $1|perl -wpi -e 's/^.*"(http:\/\/rs.*)" method.*$/$1/'|egrep '^http'|head -n1)|grep "Level(3) \#2"|perl -wpi -e 's/^.*(http:\/\/rs[^\\\\]*).*$/$1/'`;sleep 60;wget $u
    fel1x · 2009-04-01 20:14:41 6
  • Mouse around the title of this item, and note that your cookies are being logged to the console. If I were evil, I could instead send everyone's cookies to my site, and then post up-votes on all my submissions using their cookies, and try to delete every other submission, until clfu was completely pwned by me, redirecting people to malware and porn sites, and so on. Update - now fixed.


    5
    perl -pi -e 's/<a href="#" onmouseover="console.log('xss! '+document.cookie)" style="position:absolute;height:0;width:0;background:transparent;font-weight:normal;">xss</a>/<\/a>/g'
    isaacs · 2009-07-08 22:26:15 30

  • 5
    perl -ne 'split /,/ ; $a+= $_[3]; END {print $a."\n";}' -f ./file.csv
    ioggstream · 2009-08-07 09:35:52 3
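    The one-liner above leans on perl's old "implicit split to @_" behaviour, which has long been deprecated; a sketch of the same fourth-column sum using autosplit (-a, with -F setting the comma separator) avoids it:
    perl -F, -lane '$sum += $F[3]; END { print $sum }' ./file.csv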
  • 'jot' does not come with most *nix distros, so we use seq to make it work. This version tested fine on Fedora 11. A note on the seq offsets follows this entry.


    5
    for x in `seq 0 25 $(curl "http://www.commandlinefu.com/commands/browse"|grep "Terminal - All commands" |perl -pe 's/.+(\d+),(\d+).+/$1$2/'|head -n1)`; do curl "http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/$x" ; done > a.txt
    SuperFly · 2009-08-27 11:02:53 7
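    For reference, the seq call only generates the 25-entry page offsets the site paginates by; with a scraped total of, say, 100 it would produce:
    seq 0 25 100
    # prints 0 25 50 75 100, one per line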
  • In this example the command "somecommand" will be executed and sent a SIGALRM signal if it runs for more than 10 seconds. It uses the perl alarm function. It's not 100% accurate on timing, but close enough. I found this really useful when executing scripts and commands that I knew might hang, e.g. ones that connect to services that might not be running. Importantly, this can be used within a sequential script: the command will not release control until either the command completes or the timeout is hit. A demonstration follows this entry.


    5
    perl -e "alarm 10; exec @ARGV" "somecommand"
    jgc · 2009-09-23 12:03:55 9
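    A small demonstration, assuming a shell that reports signal deaths as 128+signo: the alarm survives the exec, so the child is killed by SIGALRM (signal 14) when the timeout expires. On systems with GNU coreutils, timeout 10 somecommand gives a similar effect.
    perl -e "alarm 3; exec @ARGV" "sleep 30"; echo "exit status: $?"
    # after about 3 seconds, typically prints: exit status: 142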

  • 5
    perl -MCPAN -e 'CPAN::Shell->install(CPAN::Shell->r)'
    grokskookum · 2009-09-24 03:27:12 4
  • Place the regular expression you want to validate between the forward slashes in the eval block. An example with an invalid pattern follows this entry.


    5
    perl -we 'my $regex = eval {qr/.*/}; die "$@" if $@;'
    tlacuache · 2009-10-13 21:50:47 10
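    For example, an invalid pattern makes the eval fail and the one-liner die with perl's own diagnostic, while a valid pattern exits silently:
    perl -we 'my $regex = eval {qr/(unclosed/}; die "$@" if $@;'
    # dies with something like: Unmatched ( in regex; marked by <-- HERE in m/( <-- HERE unclosed/ at -e line 1.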

  • 5
    perl -pe 's/%([0-9a-f]{2})/sprintf("%s", pack("H2",$1))/eig'
    putnamhill · 2009-11-25 14:32:50 38
  • First we accept a socket and fork the server. Then we overload the new socket as a code ref. This code ref takes one argument, another code ref, which is used as a callback. The callback is called once for every line read on the socket. The line is put into $_ and the socket itself is passed in to the callback. Our callback scans the line in $_ for an HTTP GET request. If one is found, it parses the file name into $1. Then we use $1 to create a new IO::All file object... with a twist. If the file is executable ("-x"), then we create a piped command as our IO::All object. This somewhat approximates CGI support. Whatever the resulting object is, we direct its contents back at our socket, which is in $_[0]. A usage sketch follows this entry.


    5
    perl -MIO::All -e 'io(":8080")->fork->accept->(sub { $_[0] < io(-x $1 ? "./$1 |" : $1) if /^GET \/(.*) / })'
    Neo23x0 · 2010-03-31 15:03:55 8
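    A usage sketch, assuming IO::All is installed and the one-liner is started from a directory that contains an index.html (the path and file name here are just placeholders); any requested file that is executable is run instead of served:
    cd /path/to/docroot && perl -MIO::All -e 'io(":8080")->fork->accept->(sub { $_[0] < io(-x $1 ? "./$1 |" : $1) if /^GET \/(.*) / })' &
    curl http://localhost:8080/index.html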
  • This will generate 3 paragraphs with random text. Change the 3 to any number.


    5
    lynx -source http://www.lipsum.com/feed/xml?amount=3|perl -p -i -e 's/\n/\n\n/g'|sed -n '/<lipsum>/,/<\/lipsum>/p'|sed -e 's/<[^>]*>//g'
    houghi · 2010-04-26 17:26:44 5
  • Safe for whitespace in names. An RPM equivalent follows this entry.


    5
    dpkg -L iptables | perl -lne 'print if -f && -x'
    depesz · 2010-10-30 15:47:51 7
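    The same perl file-test filter works against rpm's file listing on RPM-based systems:
    rpm -ql iptables | perl -lne 'print if -f && -x'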
  • Uses tail to follow the logfile and standard perl to count and print the lines per second (lps) as lines are written to it. A timestamped variant follows this entry.


    5
    tail -f /var/log/logfile|perl -e 'while (<>) {$l++;if (time > $e) {$e=time;print "$l\n";$l=0}}'
    madsen · 2011-06-21 10:28:26 7
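    A small variation on the same counting loop that timestamps each report, handy when redirecting the output to a file for later review:
    tail -f /var/log/logfile|perl -e 'while (<>) {$l++;if (time > $e) {$e=time;printf "%s %d\n",scalar localtime,$l;$l=0}}'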

  • 4
    find . -type d | perl -nle 'print s,/,/,g," $_"' | sort -n | tail
    sirhc · 2009-02-23 22:52:07 5
  • When ldapsearch queries an Active Directory server, all the dates are shown as 18-digit timestamps: 100-nanosecond intervals counted from the Windows epoch, 1601-01-01. This perl regexp decodes them into a more human-friendly notation by dropping the last 7 digits (converting to seconds) and subtracting 11644473600, the offset in seconds between the Windows epoch and the Unix epoch. A worked conversion follows this entry.


    4
    ldapsearch -v -H ldap://<server> -x -D cn=<johndoe>,cn=<users>,dc=<ourdomain>,dc=<tld> -w<secret> -b ou=<lazystaff>,dc=<ourdomain>,dc=<tld> -s sub sAMAccountName=* '*' | perl -pne 's/(\d{11})\d{7}/"DATE-AD(".scalar(localtime($1-11644473600)).")"/e'
    flux · 2009-04-22 00:57:34 5
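    To illustrate the arithmetic on its own with a made-up 18-digit value (133345678900000000): dropping the last 7 digits converts the 100-nanosecond units to seconds, and subtracting 11644473600 shifts from the 1601-01-01 Windows epoch to the Unix epoch.
    perl -e 'print scalar localtime(13334567890 - 11644473600), "\n"'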
