No need for extra file descriptors or process substitution just to split the input: simply use read a b.
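As a minimal illustration (the input string here is made up), read a b puts the first whitespace-separated word into a and everything remaining into b:

echo "alpha beta gamma" | { read a b; echo "first=$a rest=$b"; }

This prints first=alpha rest=beta gamma, with no cut or awk needed.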
This uses Text::Highlight to output the specified Perl file with syntax highlighting. A better alternative is my App::perlhl - find it on the CPAN: http://p3rl.org/App::perlhl
This works by reading in two lines of input, turning each into a list of one-character matches that are sorted and compared.
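A rough sketch of that idea (not necessarily the original one-liner), checking whether two lines contain the same characters:

printf 'listen\nsilent\n' | perl -e 'chomp(my @l = (scalar <STDIN>, scalar <STDIN>)); my @a = map { join "", sort /./g } @l; print $a[0] eq $a[1] ? "same characters\n" : "different\n"'

Each line is split into single-character matches with /./g, sorted, rejoined, and the two results are compared.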
This is from perldoc -q random.*line, which says: "This has a significant advantage in space over reading the whole file in. You can find a proof of this method in The Art of Computer Programming, Volume 2, Section 3.4.2, by Donald E. Knuth." Who am I to argue with Don Knuth?
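The method described in perlfaq5 looks like this (somefile.txt is a placeholder): as the N-th line is read, it replaces the kept line with probability 1/N, so only one line is ever held in memory:

perl -e 'srand; rand($.) < 1 && ($line = $_) while <>; print $line;' somefile.txt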
Recursively delete empty directories. Use with care.
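The original command isn't shown here, but with GNU find the usual way to do this is:

find . -type d -empty -delete

-delete implies -depth, so directories that become empty once their empty children are removed are deleted in the same pass.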
This example converts subtitles that were originally created for a 24 fps movie to 25 fps.
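The original command isn't reproduced here; assuming frame-based MicroDVD-style .sub subtitles (filenames are placeholders), the conversion amounts to scaling every frame number by 25/24, roughly:

perl -pe 's{\{(\d+)\}}{"{" . int($1 * 25 / 24 + 0.5) . "}"}ge' in.sub > out.sub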
Much better alternatives: grep-alikes that use Perl regexps, with more options and nicer output.
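If you just want the core behaviour without installing anything, a bare-bones Perl grep-alike is simply (pattern and filenames are placeholders):

perl -ne 'print "$ARGV:$_" if /some_pattern/' file1 file2

Under -n, $ARGV holds the name of the file currently being read, so each match is prefixed with its source file.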
From http://stackoverflow.com/questions/1030787/multiline-search-replace-with-perl, with the non-greedy wildcard match (.*?) added, as described at http://www.troubleshooters.com/codecorn/littperl/perlreg.htm#Greedy
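A sketch of that pattern (the marker strings and filename are placeholders): slurp the whole file with -0777 so the regexp can span lines, use /s so . matches newlines, and use .*? so the match stops at the first end marker:

perl -0777 -pi -e 's/BEGIN_MARK.*?END_MARK/replacement text/s' file.txt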
The output of svn log is annoying to grep, since it spreads the useful info over multiple lines. This compacts the output so that, e.g., you can grep for a comment and see the rev, date and committer straight away. Updated: MUCH shorter, easier to remember. Now it just replaces newlines with spaces, except on '---' lines.
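The idea is roughly this (it may differ in detail from the actual command):

svn log | perl -pe 's/\n/ / unless /^-+$/'

Newlines inside an entry become spaces, while the dashed separator lines keep theirs, so each log entry ends up on roughly one greppable line.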
This fixes a bug found in the other scripts, which fail when a branch has the same name as a file or directory in the current directory.
The command creates an alias called 'path', so it's useful to add it to your .profile or .bash_profile. The path command then prints the full path of any file, directory, or list of files given. Soft links will be resolved to their true location. This is especially useful if you use scp often to copy files across systems. Now, rather than using pwd to get a directory and then doing a separate cut and paste to get a file's name, you can just type 'path file' and get the full path in one operation.
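On systems with GNU coreutils, one way to get this behaviour (the alias name matches the description, the rest is an assumption) is simply:

alias path='readlink -f'

path somefile then prints the absolute, symlink-resolved path of somefile.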
Sets the @ A record for your domain hosted by namecheap to your current internet-facing IP address, logs success or failure with syslog, and logs the data returned to /root/dnsupdate. Change the XXX's as appropriate. More info at: http://www.namecheap.com/support/knowledgebase/article.aspx/29/
Converts control codes and spaces (ASCII code ≤ 32) to visible Unicode Control Pictures, U+2400 through U+2420. It skips \n characters, which is probably a good thing.
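A sketch of the technique (it may differ from the original command): map each byte in the 0x00-0x20 range, except newline, to U+2400 plus its code, with -CS turning on UTF-8 output:

printf 'a\tb c\n' | perl -CS -pe 's/([\x00-\x09\x0b-\x1f\x20])/chr(0x2400 + ord($1))/ge'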
This is based on __unixmonkey73469__'s answer. You will need to supply the `--multiline 1` option to the JSON importer if your .json is multiline (i.e. it was prettified). You still need Catmandu installed, via `cpanm Catmandu`.
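Assuming a pretty-printed records.json (the filename and target format are placeholders), the invocation looks something like:

catmandu convert JSON --multiline 1 to YAML < records.json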
An advantage is that this doesn't modify the remaining string at all. You can change {0,1} to {0,n} to drop several columns.
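The command itself isn't reproduced above, but the shape of the trick is a substitution like the following (the column count and input are purely illustrative):

echo "one two three four" | perl -pe 's/^(\S+\s+){0,2}//'

This prints "three four": the first two columns are stripped while the rest of the line is left untouched.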
single-column-numbers.txt is a text file with 22658 rows (numbers) in a single column. Each number can range from 0 to 134298679.533591, with a dot for the decimals. This is done with Perl because awk can't sum such high numbers.
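The summing part presumably looks something like:

perl -lne '$sum += $_; END { print $sum }' single-column-numbers.txt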
Converts a Windows line-ending-style file to Unix; see http://en.wikipedia.org/wiki/Newline. It can also be used the other way round (unix2dos): just swap \r\n and \n.
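For reference, the classic in-place form is likely close to this (filenames are placeholders):

perl -pi -e 's/\r\n/\n/' dosfile.txt    # DOS/Windows -> Unix
perl -pi -e 's/\n/\r\n/' unixfile.txt   # Unix -> DOS/Windows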
This command will output 1 if the given argument is a valid IP address and 0 if it is not.
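The original one-liner isn't shown; a purely regex-based version of the same idea (IPv4 only, address is a placeholder) would be:

perl -e '$ip = shift; print(($ip =~ /^((25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.){3}(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/) ? 1 : 0, "\n")' 192.168.0.1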
Substitute the URL with your private/public XML URL from the calendar sharing settings, substitute the dates (YYYY-mm-dd), and adjust the Perl parsing part for your needs.
This finds all the PowerPC apps recognized by OS X.
A better version is:
system_profiler SPApplicationsDataType 2> /dev/null | perl -wnl -e '$i=$j=$k=$p=0; @al=<>; $c=@al; while($j<$c) {$apps[$i].=$al[$j]; $i++ if ($al[$j]) =~ /^\s\s\s\s\S.*:$/; $j++} while($k<$i) {$_=$apps[$k++]; if (/Kind: PowerPC/s) {print; $p++;}} print "$i applications, $p PowerPC applications\n\n"'
but that is more than 255 characters...
print scalar gmtime