What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions, …).

News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Terminal - Commands using wget - 234 results
read -p "Please enter the 4chan thread url: " url; wget -qO- "$url" | grep -Eo '//i\.4cdn\.org/[a-z0-9]+/src/[0-9]+\.(jpg|png|gif)' | sed 's,^,https:,' | sort -u | wget -nc --random-wait -i -
wget -q -O- http://bitinfocharts.com/markets/btc-e/btc-usd.html |grep -o -P 'lastTrade">([0-9]{1,})(.){0,1}[0-9]{0,}' |grep -o -P '(\d)+(\.){0,1}(\d)*' |head -n 1
wget -r -l1 -H -t1 -nd -N -np -A.mp3 -erobots=off [url of website]
wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us"|awk '/driverResults.aspx/ {print $4}'|cut -d "'" -f2|head -n 1)"|awk '/url=/ {print $2}'|cut -d '=' -f3|cut -d '&' -f1)"
2013-11-21 03:04:59
User: lowjax
Functions: awk cut head wget
Votes: 1

Downloads the latest NVIDIA GeForce x64 Windows 7/8 driver from NVIDIA's website. Pulls the latest download version (which includes betas). This is the "English" version. The following command includes a 'sed' step to replace "english" with "international" if needed. You can also replace the starting subdomain with "eu.", "uk." and others. Enjoy this one-liner! 1 character under the max :)

wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us" | awk '/driverResults.aspx/ {print $4}' | cut -d "'" -f2 | head -n 1)" | awk '/url=/ {print $2}' | sed -e "s/english/international/" | cut -d '=' -f3 | cut -d '&' -f1)"
wget --no-check-certificate https://www.kernel.org/$(wget -qO- --no-check-certificate https://www.kernel.org | grep tar | head -n1 | cut -d\" -f2)
wget --recursive --no-clobber --page-requisites --html-extension --convert-links --domains website.org --no-parent www.website.org/folder
t2s() { wget -q -U Mozilla -O $(tr ' ' _ <<< "$1"| cut -b 1-15).mp3 "http://translate.google.com/translate_tts?ie=UTF-8&tl=en&q=$(tr ' ' + <<< "$1")"; }
2013-10-16 23:29:59
User: snipertyler
Functions: cut tr wget
Votes: 12

Usage: t2s 'How are you?'

Nice because it automatically names the mp3 file using the first 15 characters of the text.

wget http://ipecho.net/plain -O - -q ; echo
2013-10-02 21:18:40
Functions: wget
Votes: 0

Returns your external IP address to the command line using only wget

wget -q -O- http://example-podcast-feed.com/rss | grep -o "<enclosure[ -~][^>]*" | grep -o "http://[ -~][^\"]*" | xargs wget -c
2013-09-24 12:38:08
User: talha131
Functions: grep wget xargs
Votes: 0

This script can be used to download enclosed files from an RSS feed. For example, it can be used to download mp3 files from a podcast's RSS feed.
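
If the feed serves its enclosures over https, the second grep above will miss them; here is a slightly more permissive variant (a sketch, keeping the placeholder feed URL):

wget -q -O- http://example-podcast-feed.com/rss | grep -o "<enclosure[^>]*" | grep -Eo "https?://[^\"]*" | xargs wget -c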

wget -qO- http://utils.admin-linux.fr/ip.php
wget -qO - http://www.asciiartfarts.com/random.cgi | sed -n '/<pre>/,/<\/pre>/p' | sed -n '/<table*/,/<\/table>/p' | sed '1d' | sed '$d' | recode html..ascii
word="apple"; wget http://ssl.gstatic.com/dictionary/static/sounds/de/0/$word.mp3
site=http://www.duden.de; wort="Apfel"; wget -O $wort.mp3 $(wget -O - "$site/rechtschreibung/$wort" | grep -o "$site/_media_/audio/[^\.]*\.mp3")
function 4chandl () { wget -e robots=off -nvcdp -t 0 -Hkrl 0 -I \*/src/ -P . "$1"; }
2013-07-28 11:29:53
User: 89r
Functions: wget
Tags: wget 4chan
Votes: 1

First (and only) argument should be a 4chan thread URL.
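
For example, with a hypothetical thread URL:

4chandl "http://boards.4chan.org/wg/thread/1234567"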

wget -r -l1 -H -nd -A mp3 -e robots=off http://example/url
2013-07-13 02:00:23
User: trizko
Functions: wget
Tags: wget music
Votes: 1

This will download all files of the type specified after "-A" from a website. Here is a breakdown of the options:

-r turns on recursion and downloads all links on the page

-l1 goes only one level of links into the page (this is really important when using -r)

-H spans domains, meaning it will download from links to sites that don't share the starting domain

-nd means put all the downloads in the current directory instead of recreating the site's directory structure

-A mp3 filters to only download links that are mp3s (this can be a comma-separated list of different file formats to search for multiple types; see the example after this list)

-e robots=off just means to ignore the robots.txt file, which stops programs like wget from crashing the site... sorry http://example/url lol..
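
For instance, to grab several audio formats in one pass (a sketch, using the same placeholder URL):

wget -r -l1 -H -nd -A mp3,ogg,flac -e robots=off http://example/url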

wget -qO - whatismyipaddress.com/ip/<type ip address> | grep -E "City:|Country:" | sed 's:<tr><th>::'| sed 's</th>::' | sed 's:</td>::' | sed 's:</tr>::' | sed 's:<img*::'
2013-06-21 03:27:09
User: pentester
Functions: grep sed wget
Votes: 0

This command looks up the city and country that an IP address originates from.

I will be happy if someone can shrink the sed command
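
One possible shortening: the five sed calls only strip markup, so a single expression that deletes every HTML tag should produce much the same output (a sketch, untested against the live page):

wget -qO - whatismyipaddress.com/ip/<type ip address> | grep -E "City:|Country:" | sed 's/<[^>]*>/ /g'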

url=http://mirrors.mageia.org/api/mageia.4.i586.list; wget -q ${url} -O - | grep rsync:
2013-05-20 16:19:05
User: mpb
Functions: grep wget
Votes: 1

Need to find a Mageia Linux mirror server providing Mageia 4 via rsync?

Modify the "url=" string for the version you want. This shows i586, which is the 32-bit version.

If you want the 64-bit version, it is:

url=http://mirrors.mageia.org/api/mageia.4.x86_64.list; wget -q ${url} -O - | grep rsync:

wget -O "output-filename.mp4" $( youtube-dl -g -f "format-number" "youtube-video-link" )
2013-05-19 16:25:30
Functions: wget
Votes: 1

In place of "output-filename.mp4" put the name you want the file to be saved with.

In place of "youtube-video-link" put the link of the video page, e.g. http://www.youtube.com/watch?v=AclA-7YntvE

In place of "format-number" put the number of the file format you would like.

How to get the "format-number": run the command below before running this command, and it will list all the available formats with their format numbers. For example, to download the 360p mp4, use the number "18".

youtube-dl -F "youtube-video-link"

To automatically let it fetch the best quality available, just remove -f "format-number" and you are good to go.
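
A worked example, assuming format 18 (360p mp4) shows up in the listing for the sample link above ("video.mp4" is just a placeholder filename):

youtube-dl -F "http://www.youtube.com/watch?v=AclA-7YntvE"
wget -O "video.mp4" $( youtube-dl -g -f 18 "http://www.youtube.com/watch?v=AclA-7YntvE" )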

rm -f index.html; wget www.google.com; sed 's/<script>/\n\n<script>\n\n/g; s/<\/script>/<\/script>\n\n/g' index.html
2013-04-10 04:05:30
User: lbhack
Functions: cat rm sed wget
Votes: 0

Removes the old index.html if you are downloading it again, then breaks the <script> tags in index.html onto their own lines for readability.

if wget https://twitter.com/users/username_available?username=xmuda -q -O - | grep -q "\"reason\":\"taken\""; then echo "Username taken"; else echo "Free / Banned Name"; fi
2013-03-23 17:39:15
User: Joschasa
Functions: echo grep wget
Votes: 0

Reason can be: taken, available, contains_banned_word
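
To check a batch of candidate names in one go (a sketch; alice, bob and carol are placeholder usernames):

for u in alice bob carol; do if wget -q -O - "https://twitter.com/users/username_available?username=$u" | grep -q "\"reason\":\"taken\""; then echo "$u: taken"; else echo "$u: free / banned"; fi; done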

wget -O - "[PICASA ALBUM RSS LINK]" |sed 's/</\n</g' | grep media:content |sed 's/.*url='"'"'\([^'"'"']*\)'"'"'.*$/\1/' |awk -F'/' '{gsub($NF,"d/"$NF); print $0}'|wget -i -
wget --output-document=/dev/null http://speedtest.wdc01.softlayer.com/downloads/test500.zip
wget -O- http://example.com/mytarball.tgz | tee mytarball.tgz | tar xzv
2013-03-06 11:11:28
Functions: tar tee wget
Votes: 0

Very similar to doing "wget -O- http://example.com/mytarball.tgz | tar xzv", this one inserts the "tee" command between the two, which simultaneously writes the tarball to disk and copies it to stdout. So this command saves the tarball locally and extracts it, both at the same time, while it downloads.
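
tee can fan out to more than one destination, so in bash you could also checksum the exact bytes you saved while extracting, via process substitution (a sketch with the same placeholder URL):

wget -O- http://example.com/mytarball.tgz | tee mytarball.tgz >(sha256sum) | tar xzv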

wget -q -O - http://listen.di.fm/public2 | sed 's/},{/\n/g' | perl -n -e '/"key":"([^"]*)".*"playlist":"([^"]*)"/; print "$1\n"; system("wget -q -O - $2 | grep -E '^File' | cut -d= -f2 > di_$1.m3u")'
2013-02-20 03:37:41
User: Zort
Functions: perl sed wget
Votes: 0

1. Change into the directory where you want the playlists created.

2. Run the command.

3. Playlists created!

wget -c or wget --continue
2013-02-17 21:12:00
User: sonic
Functions: wget
Votes: 2

I couldn't find this on the site and it's a useful switch: it resumes a partially completed download instead of starting over. Great for large files.
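
For example (hypothetical URL), start a large download; if the transfer is interrupted, rerun the exact same command and wget resumes from where it stopped:

wget -c http://example.com/big.iso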