
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they will appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.


Terminal - Commands using wget - 238 results
wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip
wget -O- http://example.com/mytarball.tgz | tee mytarball.tgz | tar xzv
2013-03-06 11:11:28
Functions: tar tee wget
0

Very similar to doing "wget -O- http://example.com/mytarball.tgz | tar xzv", this one adds the "tee" command in between, which simultaneously writes the tarball to disk and copies it to stdout. So this command saves the tarball locally and extracts it, both at the same time, while it downloads.
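A minimal sketch of the same stream-splitting idea, additionally computing a checksum on the fly with bash process substitution (hypothetical URL; assumes bash and sha256sum are available). tee writes the stream to both mytarball.tgz and the process substitution, while stdout still feeds tar:

wget -O- http://example.com/mytarball.tgz | tee mytarball.tgz >(sha256sum > mytarball.tgz.sha256) | tar xzv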

wget -q -O - http://listen.di.fm/public2 | sed 's/},{/\n/g' | perl -n -e '/"key":"([^"]*)".*"playlist":"([^"]*)"/; print "$1\n"; system("wget -q -O - $2 | grep -E '^File' | cut -d= -f2 > di_$1.m3u")'
2013-02-20 03:37:41
User: Zort
Functions: perl sed wget
1

1.- Enter into the playlist path.

2.- Run the command.

3.- Playlists created!

wget -c or wget --continue
2013-02-17 21:12:00
User: sonic
Functions: wget
2

I couldn't find this on the site and it's a useful switch. Great for large files.
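A minimal usage sketch (hypothetical URL): re-running the same command resumes a partially downloaded file in the current directory instead of starting over.

wget -c http://example.com/big-file.iso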

http_proxy=http://127.0.0.1:3128 wget --http-user='domain\account' --http-password='###' -p -r -l 8 --no-remove-listing -P . 'http://sp.corp.com/teams/Team/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fteams%2fTeam%2fShared%20Documents%2fFolder'
2012-12-26 09:03:55
User: mhs
Functions: wget
1

Useful if you have to deal with MS SharePoint, which is (rarely, let's hope) used in e.g. certain corporate environments.

This uses Cntlm.

For single files, just use cURL -- its NTLM authentication works quite well.

# /etc/cntlm.conf:
# Username account
# Domain domain
# Password ############
# Proxy 10.20.30.40 (IP of the sharepoint site)
# NoProxy *
# Listen 3128
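For the single-file cURL route mentioned above, a minimal sketch (hypothetical file path and credentials) using curl's built-in NTLM authentication, with no Cntlm proxy needed:

curl --ntlm -u 'domain\account' -O 'http://sp.corp.com/teams/Team/Shared%20Documents/SomeFile.docx'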

wget --recursive --page-requisites --convert-links www.moyagraphix.co.za
wget --input-file=~/downloads.txt --user="$USER" --password="$(gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null)"
2012-12-13 00:14:55
User: kyle0r
Functions: wget
Tags: GPG password
1

In this example, where the user's gpg keyring has a password, the user will be interactively prompted for the keyring password.

If the keyring has no password, same as above, sans the prompt. Suitable for cron jobs.

~/.gnupg/passwd/http-auth.gpg is the encrypted http auth password, for this particular wget use case.

This approach has many use cases.

example bash functions:

function http_auth_pass() { gpg2 --decrypt ~/.gnupg/passwd/http-auth.gpg 2>/dev/null; }

function decrypt_pass() { gpg2 --decrypt ~/.gnupg/passwd/"$1" 2>/dev/null; }
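A usage sketch tying the helper back to the original command (assumes the same keyring layout; the list file name is hypothetical):

wget -i ~/downloads.txt --user="$USER" --password="$(http_auth_pass)"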

wget --no-check-certificate https://code.google.com/p/msysgit/downloads/list -O - 2>nul | sed -n "0,/.*\(\/\/msysgit.googlecode.com\/files\/Git-.*\.exe\).*/s//http:\1/p" | wget -i - -O Git-Latest.exe
2012-11-14 08:17:50
User: michfield
Functions: sed wget
Tags: git windows wget
-1

This command should be copy-pasted on Windows, but a very similar one will work on Linux.

It uses wget and sed.
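A sketch of that Linux variant, with the Windows-only "2>nul" redirection swapped for /dev/null and everything else unchanged:

wget --no-check-certificate https://code.google.com/p/msysgit/downloads/list -O - 2>/dev/null | sed -n "0,/.*\(\/\/msysgit.googlecode.com\/files\/Git-.*\.exe\).*/s//http:\1/p" | wget -i - -O Git-Latest.exe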

function ip-where { wget -qO- -U Mozilla http://www.ip-adress.com/ip_tracer/$1 | html2text -nobs -style pretty | sed -n /^$1/,/^$/p;}
2012-10-22 21:39:53
User: tox2ik
Functions: sed wget
0

Tries to avoid the fragile nature of scrapers by looking for user-input in the output as opposed to markup or headers on the web site.
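A usage sketch (hypothetical address; assumes html2text is installed and the site's markup hasn't changed):

ip-where 8.8.8.8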

ompload() { wget -O- - "$1" --quiet|curl -# -F file1=@- http://ompldr.org/upload|awk '/Info:|File:|Thumbnail:|BBCode:/{gsub(/<[^<]*?\/?>/,"");$1=$1;print}';}
2012-10-19 22:54:55
User: dzup
Functions: awk wget
-2

As I said, I haven't tested it yet because my internet is so slow; if you try it and it works, please share. It would also be nice to do this using the direct URL link.

apt-popcon() { (echo \#rank; apt-cache search "$@" |awk '$1 !~ /^lib/ {print " "$1" "}') |grep -Ff- <(wget -qqO- http://popcon.debian.org/by_inst.gz |gunzip); }
2012-09-08 00:29:31
User: khopesh
Functions: apt awk echo grep wget
4

This will take the packages matching a given `apt-cache search` query (a collection of AND'd words or regexps) and tell you how popular they are. This is particularly nice for those times you have to figure out which solution to use for e.g. a PDF reader or a VNC client.

Substitute "ubuntu.com" for "debian.org" if you want this to use Ubuntu's data instead. Everything else will work perfectly.

wget -mk -w 20 http://www.example.com/
wb(){ for i in $(wget -O- -U "" "http://wallbase.cc/random/23/eqeq/1920x1080/0/" --quiet|grep wallpaper/|grep -oe 'http://wallbase.[^"]*'); do if (( n > "$1" )); then break;fi;let n++;wget $(wget -O- -U "" $i --quiet|grep -oe 'http://[^"]*\.jpg');done;}
for i in $(wget -O- -U "" "http://wallbase.cc/random/23/e..." --quiet|grep wallpaper/|grep -oe 'http://wallbase.cc[^"]*'); do wget $(wget -O- -U "" $i --quiet|grep -oe 'http://[^"]*\.jpg');done
wget --no-use-server-timestamps $(curl $(curl http://wallbase.cc/random/23/eqeq/1920x1080/0/100/20 | grep 'wallpaper/' | awk -F'"' '{print $2}' | head -n1) | grep -A4 bigwall | grep img | awk -F'"' '{print $2}'); feh --bg-center $(ls -1t | head -n1)
for i in {1..10};do wget $(wget -O- -U "" "http://images.google.com/images?imgsz=xxlarge&hl=en&q=wallpaper+HD&start=$(($RANDOM%900+100))" --quiet | grep -oe 'http://[^"]*\.jpg' | head -1);done
2012-07-26 10:42:13
User: dzup
Functions: grep head wget
6

You may want to change &hl=en to &hl=es (or another code) for the language.

You may want to change imgsz=xxlarge to imgsz=large or whatever size filter you prefer.

You may want to change q=wallpaper+HD to q=apples or whatever query you like.
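A sketch of the same loop with a few of those substitutions applied (hypothetical query; fetches 5 large "apples" results with Spanish-language settings):

for i in {1..5};do wget $(wget -O- -U "" "http://images.google.com/images?imgsz=large&hl=es&q=apples&start=$(($RANDOM%900+100))" --quiet | grep -oe 'http://[^"]*\.jpg' | head -1);done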

while pgrep wget || sudo shutdown -P now; do sleep 1m; done
cat urls.txt | wget -i- -T 10 -t 3 --waitretry 1
wget -O - http://www.reddit.com/r/wallpapers.rss | grep -Eo 'http://i.imgur.com[^&]+jpg' | head -1 | xargs wget -O background.jpg
2012-04-25 11:15:26
User: untitaker
Functions: grep head wget xargs
0

Doesn't depend on curl and doesn't use a thumbnail as the wallpaper (though it has the unfortunate effect of only allowing imgur links).

wget -S --spider http://osswin.sourceforge.net/ 2>&1 | grep Mod
2012-04-18 03:43:33
User: dmmst19
Functions: grep wget
6

I used to use the Firefox "View page info" feature a lot to determine how stale the web page I was looking at was. Now that I use mostly Chrome I miss that feature, so here is a command-line alternative using wget. The -S says to display the server response, and --spider says to not download any files/pages, just fetch the headers. The output goes to stderr, so to grep it you use 2>&1 to combine the stderr stream with stdout, then pipe that to grep for Last-Modified.

You can use curl instead if you have it installed, like this:

curl --head -s http://osswin.sourceforge.net | grep Mod
wget -q ip.nu && cat index.html
wget -m -k -K -E http://url/of/web/site
wget -qO - http://whatismyip.org | tail
2012-03-17 10:13:05
User: Flolagale
Functions: wget
0

Uses the standard GNU wget utility. Prints only your IP.

for i in $(seq 1 `curl http://megatokyo.com 2>/dev/null|grep current|cut -f6 -d\"`);do wget http://megatokyo.com/`curl http://megatokyo.com/strip/${i} 2>/dev/null|grep src=\"strips\/|cut -f4 -d\"`;done
2012-03-04 22:52:36
User: akira88
Functions: cut grep seq wget
Tags: wget comic
0

A simple script to download all the MegaTokyo strips, from the first to the last one.

wget -qO- -U '' 'google.com/search?q=weather' | grep -oP '(-)?\d{1,3}\xB0[FC]'
2012-02-28 22:27:38
User: slaufer
Functions: grep wget
-1

Grabs the current weather in your area (or their best guess of your area). Change the query to your zip code/location (e.g. google.com/search?q=weather+jakarta,+indonesia) to get the weather somewhere else. Change google.com to google.ca or google.co.uk for metric units.
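A sketch with both substitutions applied (hypothetical location; metric units via the .co.uk domain):

wget -qO- -U '' 'google.co.uk/search?q=weather+jakarta,+indonesia' | grep -oP '(-)?\d{1,3}\xB0[FC]'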