
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that reach a minimum of 3 and 10 votes respectively - that way, only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).



News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try and put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Expand shortened URLs

Terminal - Expand shortened URLs
expandurl() { wget -S "$1" 2>&1 | grep ^Location; }
2011-10-18 18:50:54
User: atoponce
Functions: grep wget
Votes: 0
Expand shortened URLs

This shell function uses wget(1) to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. This is a great way to test whether or not the shortened URL is sending you to a malicious site, or somewhere nasty that you don't want to visit. The sample output is from:

expandurl http://t.co/LDWqmtDM
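The actual sample output is not reproduced here, but the function prints one Location line per redirect hop, something like this (the URLs below are illustrative placeholders, not the real output):

Location: http://example.net/abc123 [following]
Location: http://www.example.com/the-final-page [following]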

Alternatives

There are 16 alternatives - vote for the best!

Terminal - Alternatives
expandurl() { curl -sIL "$1" | grep ^Location; }
2011-10-19 00:56:53
User: atoponce
Functions: grep
Tags: curl
Votes: 5

curl(1) is more portable than wget(1) across Unices, so here is an alternative that does the same thing with greater portability. This shell function uses curl(1) to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. This is a great way to test whether or not the shortened URL is sending you to a malicious site, or somewhere nasty that you don't want to visit. The sample output is from:

expandurl http://t.co/LDWqmtDM
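For readers unfamiliar with the flags, here is the same function written out with the curl options annotated; behaviour is unchanged, this is just a readability sketch:

expandurl() {
  # -s  silent: suppress the progress meter
  # -I  fetch response headers only (HEAD request), not the body
  # -L  follow redirects, printing the headers of every hop
  curl -sIL "$1" | grep '^Location'
}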
expandurl() { curl -s "http://api.longurl.org/v2/expand?url=${1}&format=php" | awk -F '"' '{print $4}' }
2013-01-19 10:40:46
User: atoponce
Functions: awk
Tags: curl longurl
Votes: 2

This relies on a public API from http://longurl.org, so it has the weakness that if the service disappears, the function will break. However, it has the advantage that the URL-shortening service will not be tracking your IP address and other metrics; it will be tracking longurl.org instead. Thus, you can remain anonymous from the shortened URL services (although not from longurl.org). It does no sanity checking that you have provided an argument; if you do not provide one, "message" is printed to STDOUT.
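Since the missing argument check is called out above, here is a hedged sketch of a variant that refuses to run without one (the usage message wording is my own, not part of the original):

expandurl() {
  if [ -z "$1" ]; then
    echo "usage: expandurl <shortened-url>" >&2
    return 1
  fi
  curl -s "http://api.longurl.org/v2/expand?url=${1}&format=php" | awk -F '"' '{print $4}'
}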

expandurl() { curl -sIL "$1" 2>&1 | awk '/^Location/ {print $2}' | tail -n1; }
2011-10-19 01:35:33
Functions: awk tail
Tags: curl
Votes: 0

This shell function uses curl(1), as it is more portable than wget(1) across Unices, to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. It is a refinement of www.commandlinefu.com/commands/view/9515/expand-shortened-urls that is better suited to use in scripts, as it only displays the final result.

expandurl http://t.co/LDWqmtDM
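Because only the final URL is printed, the result drops easily into scripts; a minimal usage sketch:

target=$(expandurl http://t.co/LDWqmtDM)
printf 'Expands to: %s\n' "$target"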

Know a better way?

If you can do better, submit your command here.

What others think

sweet!

Comment by linuxrawkstar 153 weeks and 6 days ago

Nice; however, with bash you'll need a ; at the end of the grep part.

Comment by flatcap 153 weeks and 6 days ago

Nice, but I would like to see a version of this that only prints the last line and/or drops the "Location: " and " [following]" bits.

Comment by defiantredpill 153 weeks and 6 days ago

@flatcap updated. thx. simple typo. glad someone is paying attention

@defiantredpill seems simple enough to use by replacing grep(1) with awk(1). submit an alternative!

Comment by atoponce 153 weeks and 6 days ago

@atoponce I forked your curl version instead, just waiting for a mod to look at it as it was my first command.

Comment by defiantredpill 153 weeks and 6 days ago

The wget version not only prints locations, it also downloads.

Comment by wipu 153 weeks and 6 days ago

expandurl() { curl -sIL $1 2>&1 | awk '/^Location/ {print $2}' | tail -n1; }

Comment by defiantredpill 152 weeks and 5 days ago
