Change *.ext to the appropriate extension
This command crawls a domain with the typical wget output. It logs everything to a wget-log file, with any errors repeated at the end. It also has the added benefit of not flooding your terminal with output, so it is safe to run in the background.
Download all PDF files from a website using wget. You can change which file type is downloaded by changing the extension; for example, replace pdf with txt in the command. Show Sample Output
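A minimal sketch of this kind of filtered recursive download; the URL is a placeholder and the exact flags (-r recurse, -l1 one level, -nd no directory tree, -A accept-list) are one reasonable choice, not the original poster's exact command:

```shell
# Change ext to txt, mp3, etc. to grab a different file type.
ext=pdf
# Built as a string and echoed here so it can be inspected before running.
cmd="wget -r -l1 -nd --no-parent -A $ext http://example.com/docs/"
echo "$cmd"
```

Run the echoed command once it looks right; --no-parent keeps wget from wandering up out of the target directory.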
Returns the global weighted BTC rate in EUR. Requires the "jq" JSON parser. Show Sample Output
Just added a little URL encoding with sed, since URLs with spaces don't work well. This also works against links instead of enclosures, and adds a sample to show that you can filter for links at a certain domain. Show Sample Output
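The sed-based URL encoding mentioned above can be sketched like this (the URL is a made-up example; only spaces are handled, which is usually the practical problem):

```shell
# Replace spaces with %20 so wget/curl accept the URL.
url='http://example.com/my file name.mp3'
encoded=$(printf '%s' "$url" | sed 's/ /%20/g')
echo "$encoded"
```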
Get all files of particular type (say, mp3) listed on some web page (say, audio.org)
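One way to sketch the link-extraction half of this: in practice the HTML would come from `wget -qO- http://audio.org/page`; here a literal sample stands in for it so the grep/sed step can be shown on its own:

```shell
# Sample page content standing in for wget output.
html='<a href="song1.mp3">one</a> <a href="notes.txt">x</a> <a href="song2.mp3">two</a>'
# Keep only hrefs ending in .mp3, then strip the href="..." wrapper.
mp3s=$(printf '%s\n' "$html" | grep -o 'href="[^"]*\.mp3"' | sed 's/^href="//; s/"$//')
printf '%s\n' "$mp3s"
```

The resulting list can be fed back to wget, e.g. `... | wget -i -`.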
A function for retrieving and displaying a list of synonyms for a German word or phrase. Show Sample Output
This will send the web page at $u to recipient@example.com . To send the web page to oneself, recipient@example.com can be replaced by $(whoami) or $USER. The charset is UTF-8 here, but any alternative charset of your choice would work. `wget -O - -o /dev/null $u` may be considered instead of `curl $u` . On some systems the complete path to sendmail may be necessary, for instance /sys/pkg/libexec/sendmail/sendmail on some NetBSD systems.
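A sketch of how such a message could be assembled; the URL and recipient are placeholders, and a literal string stands in for the `curl $u` fetch so the header construction is visible without network access:

```shell
u='http://example.com/page.html'
# In practice: html=$(curl -s "$u")
html='<html><body>demo</body></html>'
# Build headers plus body; Content-Type carries the charset.
msg=$(printf 'To: recipient@example.com\nSubject: %s\nMIME-Version: 1.0\nContent-Type: text/html; charset=UTF-8\n\n%s\n' "$u" "$html")
# To actually send it: printf '%s\n' "$msg" | sendmail -t
printf '%s\n' "$msg"
```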
Bash script to test whether a server is up; you can use this before wget'ing a file to make sure a blank one isn't downloaded.
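A minimal sketch of such a check using curl --head (assumes curl is installed; the URL and 5-second timeout are placeholder choices, not the original script):

```shell
# Fetch only headers; --fail makes curl exit non-zero on HTTP errors too.
is_up() {
  curl -s --head --fail --max-time 5 "$1" >/dev/null
}

# Port 1 on localhost is almost certainly closed, so this prints "down".
if is_up 'http://127.0.0.1:1/'; then
  echo up
else
  echo down
fi
```

In a script you would wrap the real wget in `is_up "$url" && wget "$url"`.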
The command was too long for the command box, so here it is:
echo $(( `wget -qO - http://i18n.counter.li.org/ | grep 'users registered' | sed 's/.*\<font size=7\>//g' | tr '\>' ' ' | sed 's/<br.*//g' | tr ' ' '\0'` + `curl --silent http://www.dudalibre.com/gnulinuxcounter?lang=en | grep users | head -2 | tail -1 | sed 's/.*<strong>//g' | sed 's/<\/strong>.*//g'` ))
This took me about an hour to do. It uses both wget and curl because dudalibre.com blocks wget, while wget worked nicely for counter.li.org. Show Sample Output
Just added viewing with the eog image viewer.
Substitute that 724349691704 with the UPC of a CD you have at hand, and (hopefully) this one-liner will return $Artist - $Title, querying discogs.com. Yes, I know, all that head/tail/grep crap can be improved with a single sed command; feel free to send "patches" :D Enjoy! Show Sample Output
This uses wget instead of curl
This command should be copy-pasted in Windows, but a very similar one will work on Linux. It uses wget and sed.
The "-k" flag will tell wget to convert links for local browsing; it works with mirroring (ie with "-r") or single-file downloads.
This is just a "cut"-addicted variant of the previous unixmonkey24730 command...
The preferred way for scripts (and easier to parse) Show Sample Output
I use this command in my Conky script to display the number of messages in my Gmail inbox and to list the from: and subject: fields. Show Sample Output
This command might not be useful for most of us; I just wanted to share it to show the power of the command line. It downloads the plain-text version of the novel David Copperfield from Project Gutenberg and then generates a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination. The command also removes numbers and single characters from the count. I'm sure you can write a shorter version. Show Sample Output
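The counting pipeline can be sketched locally like this; a short literal string stands in for the novel, which would really come from something like `wget -qO- https://www.gutenberg.org/...`, and awk is used here for the length filter (the original may have used grep):

```shell
text='the cat and the dog and the cat'
# Split on non-letters into one word per line (this also drops numbers),
# lowercase everything, drop single-character words, then count.
freq=$(printf '%s\n' "$text" \
  | tr -cs 'A-Za-z' '\n' \
  | tr 'A-Z' 'a-z' \
  | awk 'length > 1' \
  | sort | uniq -c | sort -rn)
echo "$freq"
```

The most frequent word ("the", 3 times) comes out on the first line.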