Commands using wget - 252 results
wget -mk -w 20 http://www.example.com/
wb(){ for i in $(wget -O- -U "" "http://wallbase.cc/random/23/eqeq/1920x1080/0/" --quiet|grep wallpaper/|grep -oe 'http://wallbase.[^"]*'); do if (( n > "$1" )); then break;fi;let n++;wget $(wget -O- -U "" $i --quiet|grep -oe 'http://[^"]*\.jpg');done;}
for i in $(wget -O- -U "" "http://wallbase.cc/random/23/e..." --quiet|grep wallpaper/|grep -oe 'http://wallbase.cc[^"]*'); do wget $(wget -O- -U "" $i --quiet|grep -oe 'http://[^"]*\.jpg');done
wget --no-use-server-timestamps $(curl $(curl http://wallbase.cc/random/23/eqeq/1920x1080/0/100/20 | grep 'wallpaper/' | awk -F'"' '{print $2}' | head -n1) | grep -A4 bigwall | grep img | awk -F'"' '{print $2}'); feh --bg-center $(ls -1t | head -n1)
for i in {1..10};do wget $(wget -O- -U "" "http://images.google.com/images?imgsz=xxlarge&hl=en&q=wallpaper+HD&start=$(($RANDOM%900+100))" --quiet | grep -oe 'http://[^"]*\.jpg' | head -1);done
2012-07-26 10:42:13
User: dzup
Functions: grep head wget
6

You may want to change &hl=en to &hl=es (or another code) for the language.

You may want to change imgsz=xxlarge to imgsz=large or whatever size filter you prefer.

You may want to change q=wallpaper+HD to q=apples or whatever query you like.
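
As a sketch, those three knobs can be exposed as arguments of a small shell function (the name gimg and its defaults are illustrative, not part of the original):

# gimg [hl] [imgsz] [query] - same pipeline with the tunables as arguments
gimg() {
  local hl="${1:-en}" imgsz="${2:-xxlarge}" q="${3:-wallpaper+HD}"
  # pick a random result page, extract the first .jpg URL, download it
  wget $(wget -O- -U "" "http://images.google.com/images?imgsz=$imgsz&hl=$hl&q=$q&start=$(($RANDOM%900+100))" --quiet | grep -oe 'http://[^"]*\.jpg' | head -1)
}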

while pgrep wget || sudo shutdown -P now; do sleep 1m; done
cat urls.txt | wget -i- -T 10 -t 3 --waitretry 1
wget -O - http://www.reddit.com/r/wallpapers.rss | grep -Eo 'http://i.imgur.com[^&]+jpg' | head -1 | xargs wget -O background.jpg
2012-04-25 11:15:26
User: untitaker
Functions: grep head wget xargs
0

Doesn't depend on curl and doesn't use thumbnails as wallpaper (which has the unfortunate effect of only allowing imgur links)
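
To actually apply the downloaded image, you could follow up with feh (assuming feh is installed, as in the wallbase entry above):

feh --bg-center background.jpg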

wget -S --spider http://osswin.sourceforge.net/ 2>&1 | grep Mod
2012-04-18 03:43:33
User: dmmst19
Functions: grep wget
6

I used to use the Firefox "View page info" feature a lot to determine how stale the web page I was looking at was. Now that I mostly use Chrome I miss that feature, so here is a command-line alternative using wget. The -S flag displays the server response, and --spider tells wget not to download any files/pages, just fetch the headers. The output goes to stderr, so use 2>&1 to merge the stderr stream into stdout, then pipe that to grep for Last-Modified.

You can use curl instead if you have it installed, like this:

curl --head -s http://osswin.sourceforge.net | grep Mod
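
A minimal sketch wrapping either variant in a reusable function (the name lastmod is hypothetical):

# lastmod <url> - print just the Last-Modified header without downloading the page
lastmod() { wget -S --spider "$1" 2>&1 | grep 'Last-Modified'; }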
wget -q ip.nu && cat index.html
wget -m -k -K -E http://url/of/web/site
wget -qO - http://whatismyip.org | tail
2012-03-17 10:13:05
User: Flolagale
Functions: wget
0

Uses the standard GNU wget utility. Prints only your IP.

for i in $(seq 1 `curl http://megatokyo.com 2>/dev/null|grep current|cut -f6 -d\"`);do wget http://megatokyo.com/`curl http://megatokyo.com/strip/${i} 2>/dev/null|grep src=\"strips\/|cut -f4 -d\"`;done
2012-03-04 22:52:36
User: akira88
Functions: cut grep seq wget
Tags: wget comic
1

A simple script to download all the MegaTokyo strips, from the first to the last one.
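
Unrolled for readability, the one-liner's logic is roughly this (a sketch that assumes the site markup still matches the greps above):

# find the number of the current (latest) strip on the front page
last=$(curl http://megatokyo.com 2>/dev/null | grep current | cut -f6 -d\")
# fetch every strip image from 1 up to that number
for i in $(seq 1 "$last"); do
  img=$(curl "http://megatokyo.com/strip/${i}" 2>/dev/null | grep 'src="strips/' | cut -f4 -d\")
  wget "http://megatokyo.com/${img}"
done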

wget -qO- -U '' 'google.com/search?q=weather' | grep -oP '(-)?\d{1,3}\xB0[FC]'
2012-02-28 22:27:38
User: slaufer
Functions: grep wget
-1

Grabs the current weather in your area (or Google's best guess of your area). Change the query to your zip code or location (e.g. google.com/search?q=weather+jakarta,+indonesia) to get the weather somewhere else. Change google.com to google.ca or google.co.uk for metric units.
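
As a sketch, the location can be made an argument of a small function (the name weather is hypothetical; quoting the URL keeps the + intact):

# weather <location> - scrape the temperature from Google's weather result
weather() { wget -qO- -U '' "google.com/search?q=weather+$1" | grep -oP '(-)?\d{1,3}\xB0[FC]'; }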

wget http://icanhazip.com -qO-
wget -nd -r -l 2 -A jpg,jpeg,png,gif http://website-url.com
wget -U Mozilla http://example.com/foo.tar.gz
for fn in xkcd*.png xkcd*.jpg; do echo $fn; read xw xh <<<$(identify -format '%w %h' $fn); nn="$(echo $fn | sed 's/xkcd-\([^-]\+\)-.*/\1/')"; wget -q -O xkcd-${nn}.json http://xkcd.com/$nn/info.0.json; tt="$(sed 's/.*"title": "\([^"]\+\)",.*/\1/' ...
2012-01-06 20:26:11
User: fpunktk
Functions: echo read wget
-2

full command:

for fn in xkcd*.png xkcd*.jpg; do echo $fn; read xw xh <<<$(identify -format '%w %h' $fn); nn="$(echo $fn | sed 's/xkcd-\([0-9]\+\)-.*/\1/')"; wget -q -O xkcd-${nn}.json http://xkcd.com/$nn/info.0.json; tt="$(sed 's/.*"title": "\([^"]*\)", .*/\1/' xkcd-${nn}.json)"; at="$(sed 's/.*alt": "\(.*\)", .*/\1/' xkcd-${nn}.json)"; convert -background white -fill black -font /usr/share/fonts/truetype/freefont/FreeSansBold.ttf -pointsize 26 -size ${xw}x -gravity Center caption:"$tt" tt.png; convert -background '#FFF9BD' -border 1x1 -bordercolor black -fill black -font /usr/share/fonts/truetype/freefont/FreeSans.ttf -pointsize 16 -size $(($xw - 2))x -gravity Center caption:"$at" at.png; th=$(identify -format '%h' tt.png); ah=$(identify -format '%h' at.png); convert -size ${xw}x$(($xh+$th+$ah+5)) "xc:white" tt.png -geometry +0+0 -composite $fn -geometry +0+$th -composite at.png -geometry +0+$(($th+$xh+5)) -composite ${fn%\.*}_cmp.png; echo -e "$fn $nn $xw $xh $th $ah \n$tt \n$at\n"; done

This assumes that all comics are saved as xkcd-[number]-[title].{png|jpg}.

It then downloads the title and alt text, renders them as images, and composites everything into a new PNG file.

It's not perfect, but it worked for nearly all my comics.

It uses the xkcd JSON interface.

Though it's poorly written, it doesn't completely break on http://xkcd.com/859/
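
The JSON interface is one file per strip; as a sketch, the two fields used above can be fetched and extracted for a single strip like this (the sed patterns mirror the ones in the full command):

# download the metadata for strip 859 and pull out the title and alt text
wget -q -O xkcd-859.json http://xkcd.com/859/info.0.json
sed 's/.*"title": "\([^"]*\)", .*/\1/' xkcd-859.json
sed 's/.*alt": "\(.*\)", .*/\1/' xkcd-859.json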

wget ifconfig.me/ip -q -O -
url="put_url_here";audio=$(youtube-dl -s -e $url);wget -q -O - `youtube-dl -g $url`| ffmpeg -i - -f mp3 -vn -acodec libmp3lame - > "$audio.mp3"
2011-11-15 19:09:52
User: o0110o
Functions: wget
Tags: youtube mp3
1

Make your own MP3s from YouTube videos.
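
Note that youtube-dl can also extract the audio itself (assuming ffmpeg is installed), which avoids the manual pipe:

youtube-dl -x --audio-format mp3 "$url"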

wget -A mp3,mpg,mpeg,avi -r -l 3 http://www.site.com/
tpb() { wget -U Mozilla -qO - $(echo "http://thepiratebay.org/search/$@/0/7/0" | sed 's/ /\%20/g') | grep -o 'http\:\/\/torrents\.thepiratebay\.org\/.*\.torrent' | tac; }
2011-10-26 12:15:55
User: Bonster
Functions: echo grep sed wget
3

usage: tpb searchterm

example: tpb the matrix trilogy

This searches thepiratebay for torrents and displays the top results in reverse order, so the first result is at the bottom instead of the top -- which is more convenient for command-line users.

geoip() { wget -qO - http://freegeoip.net/xml/$1 | sed '3,12!d;s/<//g;s/>/: /g;s/\/.*//g' ; }
wget -b http://dl.google.com/android/android-sdk_r14-linux.tgz