Requires the date command. This also works with some other comics. Here's a bash script that displays daily Garfield, Id, and Andy Capp: http://putnamhill.net/cgi-bin/yahoo-comics.sh?ga,crwiz,crcap
Might be able to do it in fewer steps with xmlstarlet, although whether that would end up being shorter overall I don't know - xmlstarlet syntax confuses the heck out of me. Prompts for your password, or if you're a bit mental you can add your password into the command itself in the format "-u user:password".
Ever wanted to stream your favorite podcast across the network? Well, now you can. This command parses an iTunes-enabled podcast feed and streams the latest episode across the network through an encrypted ssh connection.
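A rough sketch of the idea, assuming xmlstarlet and mpv are installed; "mediabox" is a hypothetical ssh host and the feed URL is a placeholder:
feed='http://example.com/podcast.rss'; episode=$(curl -s "$feed" | xmlstarlet sel -t -v '//item[1]/enclosure/@url'); ssh mediabox "curl -s '$episode'" | mpv -
Here the episode is fetched on the remote host and piped back through the ssh tunnel into a local player reading from stdin.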
Gets the latest show from your favorite podcast. Uses curl and xmlstarlet. Make sure you change out the items between brackets.
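A minimal sketch along those lines, assuming xmlstarlet; swap in your own feed URL:
curl -s 'http://example.com/podcast.rss' | xmlstarlet sel -t -v '//item[1]/enclosure/@url' | xargs curl -sO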
Splitting on tags in awk is a handy way to parse HTML.
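For example, a minimal sketch that pulls the page title out of some HTML by using the tags themselves as the field separator:
curl -s https://example.com | awk -F'</?title>' 'NF > 2 { print $2 }'
Only the line containing both the opening and closing tag splits into more than two fields, so $2 is the title text.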
This function displays the latest comic from xkcd.com. One of the best things about xkcd is the title text that appears when you hover over the comic, so this function also displays that after you close the comic.
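A rough sketch of how such a function might look, assuming ImageMagick's display viewer is installed; the HTML scraping is brittle by nature:
xkcd() { local l; l=$(curl -s https://xkcd.com/ | grep -m1 'imgs.xkcd.com/comics/'); curl -s "https:$(sed 's/.*src="\([^"]*\)".*/\1/' <<<"$l")" | display -; sed 's/.*title="\([^"]*\)".*/\1/' <<<"$l"; }
The viewer blocks until you close the window, and only then is the title text printed.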
To get a random xkcd comic use the following:
xkcdrandom() { wget -qO- http://dynamic.xkcd.com/comic/random | sed -n 's#^<img src="\(http://imgs.[^"]\+\)"\s\+title="\(.\+\?\)"\salt.\+$#eog "\1"\necho '"'\2'#p" | bash; }
These are just a bit shorter than the ones eigthmillion wrote; however, his version didn't work as expected on my laptop for some reason (I got the title tag first), so these build a command which is then executed by bash.
Note: you need to replace the email address with your private Instapaper email address. There are a bunch of possible improvements, such as:
- Not writing a temp file
- Doesn't strip tags (though Instapaper does, thankfully)
- Shouldn't require 2 curls
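A rough sketch of the kind of pipeline being described, assuming a working local mail setup; the article URL and the you@instapaper.com address are placeholders, and it still has the temp file and the two curls noted above:
url='http://example.com/article'; tmp=$(mktemp); curl -s "$url" > "$tmp"; mail -s "$(curl -s "$url" | sed -n 's:.*<title>\(.*\)</title>.*:\1:p')" you@instapaper.com < "$tmp"; rm "$tmp"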
Just a few minor changes. First, the use of lynx instead of curl, so no sed is needed to revert the spaces. Then the use of egrep instead of grep -e to save a few characters, and finally the removal of the extra 0.
This simply curls the feed and runs an XPath query on it.
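For instance, a minimal sketch with xmlstarlet and a placeholder feed URL:
curl -s http://example.com/feed.rss | xmlstarlet sel -t -v '//item[1]/title' -n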
Changed wget to curl, so it doesn't create a file anymore.
yt2mp3(){ for j in `seq 1 301`;do i=`curl -s gdata.youtube.com/feeds/api/users/$1/uploads\?start-index=$j\&max-results=1|grep -o "watch[^&]*"`;ffmpeg -i `wget youtube.com/$i -qO-|grep -o 'url_map"[^,]*'|sed -n '1{s_.*|__;s_\\\__g;p}'` -vn -ab 128k "`youtube-dl -e ${i#*=}`.mp3";done;}
Squeezed the monster (and nifty ☺) command 7776 from 531 characters down to 284, but I don't see a way to get it down to 255. This is definitely a kludge!
If your version of curl does not support the --compressed option, use
curl -s http://funnyjunk.com | gunzip
instead of
curl -s --compressed http://funnyjunk.com
Expand a URL, i.e. do a HEAD request and get the target URL, then copy that value to the clipboard.
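A minimal sketch of that idea, assuming X11 with xclip installed (the function name is made up):
urlclip() { curl -sI "$1" | awk 'tolower($1)=="location:" {print $2}' | tr -d '\r' | xclip -selection clipboard; }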
JSON version. Additionally, it may give your geolocation if it's known by hostip.info.
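A sketch of what's being described, assuming hostip.info's JSON endpoint; position=true asks for the geolocation when it is known:
curl -s 'http://api.hostip.info/get_json.php?position=true'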
This shell function uses curl(1), as it is more portable than wget(1) across Unices, to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. It is a refinement of www.commandlinefu.com/commands/view/9515/expand-shortened-urls to make it better for use in scripts. It only displays the final result.
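A sketch of one way such a function might look (not necessarily the original implementation); it lets curl follow every redirect and prints only the URL it ends up at:
expandurl() { curl -sIL -o /dev/null -w '%{url_effective}\n' "$1"; }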
expandurl http://t.co/LDWqmtDM
Grabs your external public IP.
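For example, a minimal sketch assuming the icanhazip.com service is reachable:
curl -s https://icanhazip.com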
Sometimes it can be very useful to obtain the final URL you'll end up at after several redirects. (I use this command line in my automated tests to check that all the redirections are OK.)
Gets the info on the latest Linux kernel versions available (previously obtainable with finger @kernel.org).
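A sketch of one way to get similar information, assuming jq is installed and using kernel.org's releases.json endpoint:
curl -s https://www.kernel.org/releases.json | jq -r '.releases[] | "\(.moniker)\t\(.version)"'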
Reciprocally, we could get the node name from a given Tor IP address:
ip2node() { curl -s -d "QueryIP=$1" http://torstatus.blutmagie.de/tor_exit_query.php | grep -oP "Server name:.*'>\K\w+" ; }
ip2node 204.8.156.142
BostonUCompSci
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: