Get all links of a website

lynx -dump http://example.com | awk '/http/{print $2}' | sort -u
This will get all links from a given URL, remove any duplicates, and output the result.
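The pipeline relies on lynx's dump format: the References section at the end of the dump lists each link as a numbered entry, so awk's second field is the URL itself. A minimal sketch of the filtering stage, using printf to stand in for lynx's output (the example.com URLs are placeholders):

```shell
# Simulated tail of a `lynx -dump` output: numbered references, one per line.
# awk keeps lines containing "http" and prints the second field (the URL);
# sort -u then sorts the list and drops the duplicate.
printf '%s\n' \
  '   1. http://example.com/b' \
  '   2. http://example.com/a' \
  '   3. http://example.com/b' |
  awk '/http/{print $2}' | sort -u
# Prints http://example.com/a and http://example.com/b, each once.
```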

By: mathias
2011-10-13 09:49:36


What Others Think

Got an error with the sort -u bit, as follows:

qsort: string comparison failed: Illegal byte sequence
sort: Set LC_ALL='C' to work around the problem.
sort: The strings compared were `\273' and `[1486]'.

I removed the 'sort -u' and sent the output to a file. Then I looked at the file and this is what I saw:

=== START ===
[3]
[9]
[10]
[11]
[448]http
[449]httpd
[450]http_code
[1486] -dump
80/tcp
paste(){
listhw(){
=== END ===

Any explanation?
fukr · 553 weeks and 2 days ago
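The failure above is sort choking on a raw non-UTF-8 byte in the page dump (octal \273 is 0xBB, likely a right angle-quote character emitted by lynx). As the error message itself suggests, forcing the C locale makes sort compare plain bytes instead of locale collation, so it never rejects the input. A sketch, with a raw 0xBB byte standing in for the problem character and example.com as a placeholder URL:

```shell
# A lone 0xBB byte is invalid UTF-8, so sort can fail in a UTF-8 locale;
# under LC_ALL=C it compares raw bytes and sorts the line after 'a' and 'b'.
printf 'b\n\273\na\n' | LC_ALL=C sort -u

# Applied to the original command:
#   lynx -dump http://example.com | awk '/http/{print $2}' | LC_ALL=C sort -u
```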
Ok, so some of the output between the START and END tags was removed. There are two lines with <BB> immediately prior to the line that starts with [1486].
fukr · 553 weeks and 2 days ago
works with elinks, too
ctapyxa · 549 weeks and 5 days ago
