All commands (14,187)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Traceroute w/TCP to get through firewalls.
man tcptraceroute
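A minimal usage sketch (host and port are placeholders): tcptraceroute sends TCP SYN probes instead of the UDP/ICMP probes classic traceroute uses, so it can get through firewalls that only allow traffic to open service ports.

tcptraceroute example.com 443    # trace the path a TCP connection to port 443 would take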

Signals list by NUMBER and NAME
This command seems to achieve a similar (or the same) goal.
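The command itself isn't reproduced here, but one common way to get a numbered signal list is the bash builtin:

kill -l    # in bash, prints signal numbers with their names, e.g. 1) SIGHUP  2) SIGINT ...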

Quick notepad
Quickly write some notes to a file with cat. Press Ctrl+C when you have finished.
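For example (the filename is just a placeholder):

cat > notes.txt    # type your notes, then press Ctrl+C (or Ctrl+D on an empty line) to finish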

Ripping VCD in Linux

send DD a signal to print its progress
Sends the USR1 signal every second (-n 1) to the process named exactly "dd", which makes dd print its progress. On some systems the signal is INFO or SIGINFO instead; see the signal list in: man kill
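A sketch consistent with that description, assuming watch and pkill are available (killall -USR1 dd works similarly; the exact original command may differ):

watch -n 1 pkill -USR1 -x dd    # every second, send USR1 to the process named exactly 'dd'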

Create an audio test CD of sine waves from 1 to 99 Hz
This command creates and burns a gapless audio CD with 99 tracks. Each track is a 30-second sine wave: the first is 1 Hz, the second 2 Hz, and so on, up to 99 Hz. This is useful for testing audio systems (how low can your bass go?) and for creating the constant vibrations needed to make non-Newtonian fluids (like cornstarch and water) crawl around.

Note that this temporarily creates about 500 MB of .cdda files in the current directory. If you don't use the "rm" at the end of the command, you can burn more discs later with:
cdrdao write cdrdao.toc

Prerequisites: a blank CD-R in /dev/cdrw, sox (http://sox.sourceforge.net/), and cdrdao (http://cdrdao.sourceforge.net/). I'm also assuming a recent version of bash for the brace expansion (which just looks nicer than using seq(1), but isn't necessary).
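A hedged reconstruction of the kind of pipeline described above; the file names, TOC layout and burner device are assumptions rather than the exact original command:

( echo CD_DA
  for f in {01..99}; do
    sox -n -r 44100 -c 2 $f.cdda synth 30 sine $f    # 30-second sine wave at $f Hz
    echo "TRACK AUDIO"
    echo "FILE \"$f.cdda\" 0"
  done ) > cdrdao.toc && cdrdao write --device /dev/cdrw cdrdao.toc && rm *.cdda cdrdao.toc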

Brute force discover
Shows the number of failed login attempts per account. If the account does not exist, it is marked with an asterisk (*).
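The command itself isn't reproduced here; a hedged sketch of the idea against a Debian-style /var/log/auth.log (field positions depend on your syslog format, so treat this as illustrative):

sudo grep 'Failed password' /var/log/auth.log | awk '{ if ($9 == "invalid") print "*" $11; else print $9 }' | sort | uniq -c | sort -rn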

Make a dedicated folder for each zip file
${f%*.zip} strips off the extension from zip filenames
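A minimal sketch of the whole loop this parameter expansion would sit in:

for f in *.zip; do mkdir -p "${f%*.zip}" && unzip "$f" -d "${f%*.zip}"; done    # extract each archive into a folder named after it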

Send an email from the terminal when job finishes
Might as well include the status code it exited with so you know right away if it failed or not.
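One possible form, assuming a working mail(1) setup; the job name, subject and address are placeholders:

./long-job.sh; echo "Job finished with exit status $?" | mail -s "job finished" you@example.com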

Retrieve a list of all webpages on a site
This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run, and you may want to throttle the spidering so as to play nicely.
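A hedged variant of the same idea that parses wget's log instead of the directory tree (the URL and log name are placeholders; add --wait=1 to throttle the crawl):

wget --spider --recursive --no-verbose --no-directories --output-file=spider.log http://example.com/
grep -oE 'https?://[^ ]+' spider.log | sort -u > url-list.txt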


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the three Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
