Commands using wget (286)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


Check These Out

check open ports without netstat or lsof
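A minimal sketch, assuming bash with /dev/tcp support: probe the first 1023 TCP ports on localhost and report the ones that accept a connection.

    for port in {1..1023}; do
        (echo > /dev/tcp/127.0.0.1/$port) 2>/dev/null && echo "port $port is open"
    done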

Bulk install
Like 7172, but much easier.

grep (or anything else) many files with multiprocessor power
xargs -P N spawns up to N worker processes, and -n 40 means each grep invocation receives up to 40 file names on its command line.
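A sketch of that pattern (the search pattern and starting directory are placeholders):

    find . -type f -print0 | xargs -0 -P 4 -n 40 grep -H 'pattern'

-print0 and -0 keep file names with spaces intact, and -H makes grep print the file name even when an invocation receives only one file.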

Copying files, excluding certain files
Preserve the file structure when copying and exclude certain file or directory patterns.
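One way to do this is rsync; a sketch, with example exclude patterns and paths:

    rsync -av --exclude='*.tmp' --exclude='.git/' source/ destination/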

Rename files in batch
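As an illustration, assuming plain bash and example extensions, a loop with parameter expansion renames every .jpeg file to .jpg:

    for f in *.jpeg; do mv -- "$f" "${f%.jpeg}.jpg"; done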

Add page numbers to a PDF
Put this code in a bash script. The script expects the PDF file as its only parameter. It will add a header containing the page numbers to the PDF and write the result to a file with the suffix "-header.pdf". Requires enscript, ps2pdf and pdftk.
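A sketch of such a script, assuming typical usage of those three tools: enscript renders one empty line per page with a page-number header, ps2pdf turns that into a PDF, and pdftk stamps it over the original.

    #!/bin/bash
    # usage: ./pdf-header.sh input.pdf   ->   input-header.pdf
    input="$1"
    pages=$(pdftk "$input" dump_data | awk '/NumberOfPages/ {print $2}')
    # one blank line per page, each rendered by enscript with a "Page x of y" header
    yes '' | head -n "$pages" \
      | enscript -L1 --header='||Page $% of $=' --output=- \
      | ps2pdf - \
      | pdftk "$input" multistamp - output "${input%.pdf}-header.pdf"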

make directory with current date
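For example (the date format is just one common choice):

    mkdir "$(date +%Y-%m-%d)"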

Find the package that installed a command
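A sketch, assuming a Debian/Ubuntu-style system and using lsof as an example command; rpm-based systems would use rpm -qf instead:

    dpkg -S "$(command -v lsof)"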

List, per command, the total size in megabytes of deleted files that are still held open and therefore still consume disk space
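A sketch using lsof and awk, assuming lsof's default column layout (the size is the 7th field); run it as root to see every process:

    lsof -nP | awk '/\(deleted\)/ {sum[$1] += $7} END {for (cmd in sum) printf "%s %.1f MB\n", cmd, sum[cmd]/1048576}'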

Resize all JPG images in a folder and create new images (without overwriting)
Convert all JPEGs in the current directory into JPEGs of roughly 1024x768 pixels and about 150 KB each.
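An illustrative sketch with ImageMagick, writing to new files with a prefix so the originals are untouched; the jpeg:extent setting caps the output size at roughly 150 KB:

    for f in *.jpg; do convert "$f" -resize 1024x768 -define jpeg:extent=150kb "resized-$f"; done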


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that receive at least 3 and at least 10 votes, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
