What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.

Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…).

News

May 19, 2015 - A Look At The New Commandlinefu
I've put together a short writeup on what kind of newness you can expect from the next iteration of clfu. Check it out here.
March 2, 2015 - New Management
I'm Jon, I'll be maintaining and improving clfu. Thanks to David for building such a great resource!
Terminal - Commands using wget - 252 results
expandurl() { wget -S "$1" 2>&1 | grep ^Location; }
2011-10-18 18:50:54
User: atoponce
Functions: grep wget
Tags: wget
0

This shell function uses wget(1) to show what site a shortened URL is pointing to, even if there are many nested shortened URLs. This is a great way to test whether or not the shortened URL is sending you to a malicious site, or somewhere nasty that you don't want to visit. The sample output is from:

expandurl http://t.co/LDWqmtDM
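
A similar check can be done with curl if it is installed; a minimal sketch, assuming the servers return standard Location headers (the function name expandurl_curl is just an illustration):

expandurl_curl() { curl -sIL "$1" | grep -i '^location'; }
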
wget -O - -q http://whatismyip.org/
2011-10-15 11:36:56
User: ztank1013
Functions: wget
-5

This just outputs your external IP address, with no extra characters at the end of the line.

wget -qO - --post-data "data[Row][cripted]=1cb251ec0d568de6a929b520c4aed8d1" http://md5-decrypter.com/ | grep -A1 "Decrypted text" | tail -n1 | cut -d '"' -f3 | sed 's/>//g; s/<\/b//g'
2011-10-13 03:48:54
User: samhagin
Functions: cut grep sed tail wget
Tags: md5
0

Reverse an MD5 hash using the md5-decrypter.com lookup service; replace 1cb251ec0d568de6a929b520c4aed8d1 with the MD5 hash you want to look up.

wget -qO - --post-data "data[Row][clear]=text" http://md5-encryption.com | grep -A1 "Md5 encrypted state" | tail -n1 | cut -d '"' -f3 | sed 's/>//g; s/<\/b//g'
2011-10-13 03:44:48
User: samhagin
Functions: cut grep sed tail wget
Tags: md5
0

Hash any text with MD5 using the md5-encryption.com service; replace text with the string you want to convert to MD5.
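
Note that MD5 is a one-way hash, not encryption, so if you only need the hash there is no need for a web service; it can be computed locally (replace text as above):

echo -n "text" | md5sum | cut -d' ' -f1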

wget -qO- "http://www.amazon.com/b?ie=UTF8&node=163856011" | grep Daily | sed -e 's/<[^>]*>//g' -e 's/^ *//' -e 's/\&[^;]*;/ /'
2011-10-06 20:27:02
User: winsbe01
Functions: grep sed wget
0

grabs and prints the AmazonMP3 daily album deal

wget -nd -nH -r -A pdf -I library/mac/documentation/ "http://developer.apple.com/library/mac/navigation/#section=Resource%20Types&topic=Reference"
2011-10-02 04:56:25
User: GinoMan2440
Functions: wget
Tags: mac os x
-2

Downloads every last PDF reference doc from the Apple Developer Connection; it takes about half an hour on a fast connection. Enjoy!

wget http://www.commandlinefu.com/commands/by/e7__7dal
wgetall () { ext=$1; shift; wget -r -l2 -nd -Nc -A ".$ext" "$@"; }
2011-09-28 09:43:25
Functions: wget
0

Recursively download all files of a certain type down to two levels, ignoring directory structure and local duplicates.

Usage:

wgetall mp3 http://example.com/download/

wget http://tools.web4host.net/versions.tmp --quiet -O - | grep PHPMYADMIN | sed 's/PHPMYADMIN=//' | cat
2011-09-22 04:11:44
User: wr8cr8
Functions: grep sed wget
-2

Prints the latest phpMyAdmin version number, so that scripts can check for the latest version of the software when they want to update automatically.
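
A hedged alternative, assuming phpMyAdmin's releases are still published on GitHub under phpmyadmin/phpmyadmin, is to ask the GitHub releases API for the latest tag:

wget -qO- https://api.github.com/repos/phpmyadmin/phpmyadmin/releases/latest | grep '"tag_name"' | cut -d'"' -f4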

p=$(echo "hello world, how r u?"|sed 's/ /+/g');wget -U Mozilla -q -O - "$@" translate.google.com/translate_tts?tl=en\&q=$p|mpg123 -
2011-09-19 23:06:15
User: jhansen
Functions: echo mpg123 sed wget
4

Same idea, but redirecting the audio straight to a player so you can put in whatever line of text you like. Works on my Ubuntu machine.

wget -O - -q http://www.chisono.it/ip.asp && echo
2011-09-18 15:38:02
User: scanepa
Functions: wget
-2

The echo at the end is for pretty printing, as the output is just the IP address without any HTML.

wget -O - -q http://checkip.dyndns.org/ | cut -d':' -f2 | cut -d'<' -f1| cut -c2-
2011-09-17 13:42:01
User: ztank1013
Functions: cut wget
-2

This is just a "cut"-addicted variant of the previous command by unixmonkey24730...

wget http://checkip.dyndns.org/ -q -O - | grep -Eo '\<[[:digit:]]{1,3}(\.[[:digit:]]{1,3}){3}\>'
curl -sm1 http://www.website.com/ | grep -o 'http://[^"]*jpg' | sort -u | wget -qT1 -i-
say() { wget -q -U Mozilla -O output.mp3 "http://translate.google.com/translate_tts?tl=en&q=$1"; gnome-terminal -x bash -c "totem output.mp3"; sleep 4; totem --quit;}
2011-09-07 19:48:53
User: totti
Functions: bash sleep wget
1

No need to install additional packages.

e.g.:

say hello

For multiword phrases:

say how+are+you
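
A small wrapper sketch that reuses the say name but URL-encodes spaces for you, so you can pass a plain sentence instead of joining words with +; it assumes mpg123 is installed, as in the earlier translate_tts one-liner:

say() { wget -q -U Mozilla -O - "http://translate.google.com/translate_tts?tl=en&q=$(echo "$*" | sed 's/ /+/g')" | mpg123 -; }

say how are you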

wget -r -k -l 7 -p -E -nc http://site.com/
2011-08-20 10:16:06
User: realjkeee
Functions: wget
-4

-r tells wget to follow links on the site recursively in order to download the pages.

-k makes wget convert all links in the downloaded files so that they can be followed on the local machine (offline).

-p tells wget to download all files needed to display the pages (images, CSS and so on).

-l sets the maximum depth of pages that wget should download (the default is 5; in this example we set it to 7). Most sites have deeply nested pages, and wget can simply "dig itself in" downloading ever more of them; the -l option prevents that.

-E appends the .html extension to downloaded files.

-nc keeps existing files from being overwritten. This is handy when you need to resume a site download that was interrupted earlier.
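
If you only want a subtree of the site rather than everything reachable from the front page, adding --no-parent keeps wget from climbing above the starting directory (site.com/docs/ is just a placeholder):

wget -r -k -l 7 -p -E -nc --no-parent http://site.com/docs/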

wget -q -O - http://www.perl.org/get.html | grep -m1 '\.tar\.gz' | sed 's/.*perl-//; s/\.tar\.gz.*//'
for i in `seq -w 1 50`; do wget --continue http://commandline.org.uk/images/posts/animal/$i.jpg; done
wget --mirror -p --convert-links -P ./<LOCAL-DIR> <WEBSITE-URL>
2011-08-18 08:27:28
User: tkembo
Functions: wget
1

--mirror : turn on options suitable for mirroring.

-p : download all files that are necessary to properly display a given HTML page.

--convert-links : after the download, convert the links in the document for local viewing.

-P ./LOCAL-DIR : save all the files and directories to the specified directory.
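
For example, with placeholder values filled in (example.com and ./example.com-mirror are illustrative only):

wget --mirror -p --convert-links -P ./example.com-mirror http://example.com/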

wget http://somesite.com/somestream.pls; cvlc somestream.pls&sleep 5; rm somestream.pls*
2011-08-04 19:24:18
User: tomjrace
Functions: rm wget
-1

I wanted to play a song from the shell and get the shell back, and I also don't want to keep the file if it is not needed.

Edit: not sure if I need to mention it... use killall vlc to stop playback.
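
A sketch that avoids the temporary file entirely, assuming the .pls playlist contains a File1= line pointing at the stream URL:

cvlc "$(wget -qO- http://somesite.com/somestream.pls | grep -m1 '^File' | cut -d= -f2-)" &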

NAME=`wget --quiet URL -O - | grep util-vserver | tail -n 1 | sed 's|</a>.*||;s/.*>//'`; wget "URL$NAME";
wget --spider $URL 2>&1 | awk '/Length/ {print $2}'
2011-07-03 00:14:58
User: d3Xt3r
Functions: awk wget
5

- Where $URL is the URL of the file.

- Replace the $2 by $3 at the end to get a human-readable size.

Credits to svanberg @ ArchLinux forums for original idea.

Edit: Replaced command with better version by FRUiT. (removed unnecessary grep)
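
A roughly equivalent check with curl, assuming it is installed and the server reports a Content-Length header on a HEAD request:

curl -sI "$URL" | awk '/[Cc]ontent-[Ll]ength/ {print $2}'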

wget -O/dev/sdb ftp://ftp.debian.org/debian/dists/stable/main/installer-amd64/current/images/netboot/mini.iso
2011-06-12 21:58:13
Functions: wget
0

This wgets the ISO directly to the USB device; replace /dev/sdb with the device name of the USB stick. After wget finishes you will be able to boot the image from the USB stick.

wget -r -A .pdf -l 5 -nH --no-parent http://example.com
2011-06-09 17:17:03
User: houghi
Functions: wget
Tags: wget pdf
7

See man wget if you want linked files and not only those hosted on the website.
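
As the note above suggests, adding -H (span hosts) will also fetch PDFs that are linked to but hosted on other sites; use it with care, since the crawl can wander off-site:

wget -r -A .pdf -l 5 -nH -H --no-parent http://example.com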

curl -s http://example.com | grep -o -P "<a.*href.*>" | grep -o "http.*.pdf" | xargs -d"\n" -n1 wget -c
2011-06-09 14:42:46
User: b_t
Functions: grep wget xargs
0

This example command fetches the example.com web page and then fetches and saves all PDF files linked from that page.

[Note: of course there are no PDFs on example.com; this is just an example.]
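
A pure-wget sketch of the same idea: -r -l1 follows links one level deep from the page, -nd flattens the directory structure, and -A.pdf keeps only the PDFs (the HTML page itself is deleted once the accept filter is applied):

wget -r -l1 -nd -A.pdf http://example.com/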