
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as for virtually every other subset (users, tags, functions, …).


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Commands tagged website - 6 results
while true; do curl -s http://sensiblepassword.com/?harder=1 | tail -n 15 | head -n 1 | sed 's;<br/>;;' | cut -c 5- | cb; sleep 1; done
2012-01-30 20:52:14
User: supervacuo
Functions: cut head sed sleep tail
Votes: 1

Use the excellent sensiblepasswords.com to generate a random (yet easy-to-remember) password every second, and copy it to the clipboard. Useful for generating a list of passwords and pasting them into a spreadsheet.

This script uses "madebynathan"'s "cb" function (http://madebynathan.com/2011/10/04/a-nicer-way-to-use-xclip/); you could also replace "cb" with

xclip -selection c

Remove "while true; do" and "; done" to generate and copy only 1 password.

HTMLTEXT=$( curl -s http://www.page.de/test.html > /tmp/new.html ; diff /tmp/new.html /tmp/old.html ); if [ "x$HTMLTEXT" != x ] ; then echo $HTMLTEXT | mail -s "Page has changed." mail@mail.de ; fi ; mv /tmp/new.html /tmp/old.html
2010-07-04 21:45:37
User: Emzy
Functions: diff echo mail mv
Votes: 2

Checks if a web page has changed. Put it into cron to check periodically.

Change http://www.page.de/test.html and mail@mail.de to suit your needs.
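As a sketch of the cron setup (the script path and schedule here are assumptions, not part of the original entry), save the one-liner as a script and run it, say, every 15 minutes:

# crontab -e entry; /usr/local/bin/check-page.sh contains the one-liner above
*/15 * * * * /usr/local/bin/check-page.sh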

urls=('www.ubuntu.com' 'google.com'); for i in ${urls[@]}; do http_code=$(curl -I -s $i -w %{http_code}); echo $i status: ${http_code:9:3}; done
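A closely related sketch (my variant, not from the original poster) prints the status code directly via curl's --write-out option, avoiding the string slicing:

# print just the HTTP status code for each site
for i in www.ubuntu.com google.com; do echo "$i status: $(curl -sI -o /dev/null -w '%{http_code}' "$i")"; done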
wget --server-response --spider http://www.example.com/
2009-03-31 18:49:14
User: penpen
Functions: wget
Votes: 5

Let me suggest using wget to obtain only the HTTP header as a last resort, because it generates considerable textual overhead. The first ellipsis of the sample output stands for

Spider mode enabled. Check if remote file exists.

--2009-03-31 20:42:46-- http://www.example.com/

Resolving www.example.com... 208.77.188.166

Connecting to www.example.com|208.77.188.166|:80... connected.

HTTP request sent, awaiting response...

and the second one stands for

Length: 438 [text/html]

Remote file exists and could contain further links,

but recursion is disabled -- not retrieving.
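If that overhead is unwanted, a quieter alternative (an aside of mine, using curl as in the entries above) is to request just the headers:

curl -sI http://www.example.com/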

lynx -dump -head http://www.example.com/
2009-03-31 18:41:36
User: penpen
Votes: -1

Without the -dump option the header is displayed inside lynx. You can also use w3m; the command is then

w3m -dump_head http://www.example.com/
wget --http-user=YourUsername --http-password=YourPassword http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-`date +"%-m-%d-%Y"`.tar.gz
2009-03-31 17:50:41
User: nadavkav
Functions: wget
Votes: 4

This will connect to your hosted website service through the cPanel interface and use its backup tool to back up and download the entire website locally.

(Do not forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)
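The backquoted date command fills in the current date for the backup filename; as a quick check of what it expands to (the printed date is just an example):

date +"%-m-%d-%Y"
# prints e.g. 3-31-2009, giving backup-YourWebsiteUrl-3-31-2009.tar.gz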