Commands using tcpdump (52)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Randomly run command
Randomly decide whether to run a command, or fail. Useful for testing. Usage: ran PERCENTAGE COMMAND [ARGS]. Note that in this version the percentage is required. This is like @sesom42 and @snipertyler's commands, but in a usable form. For example, in a complicated shell script, put "ran 99" before a crucial component; it will then fail 1% of the time, letting you exercise the failure code path.

$ ran 99 my_complex_program arg1 arg2
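
The ran function itself isn't shown above; a minimal bash sketch matching the described usage, assuming $RANDOM modulo 100 as the coin-flip (the original's method may differ), could be:

    ran() {
      local pct=$1; shift
      # run the command pct% of the time; otherwise fail without running it
      if [ $((RANDOM % 100)) -lt "$pct" ]; then
        "$@"
      else
        return 1
      fi
    }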

Convert CSV to JSON
Replace 'csv_file.csv' with your filename.
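
The command itself isn't reproduced here; one way to do the conversion with only Python's standard library (a sketch; note the output is an array of row arrays, not keyed objects):

$ python3 -c 'import csv,json,sys; print(json.dumps(list(csv.reader(open(sys.argv[1])))))' csv_file.csv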

Show the UUID of a filesystem or partition
Shows the UUID of the given partition (here /dev/sda7). Doesn't need to be root.
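
One command that fits this description uses lsblk from util-linux, which reads cached device metadata and so works unprivileged (a sketch; the original may have used a different tool):

$ lsblk -no UUID /dev/sda7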

Copy all files. All normal files, all hidden files and all files starting with - (minus).
./* copies normal files, including those starting with - (the leading ./ prevents a minus from being read as an option). .[!.]* copies hidden files while skipping . and .. (so nothing is copied from the parent directory). ..?* copies files whose names start with .. (while still avoiding the directory .. itself). /path/to/dir is the directory the files should be copied to. This can also be used as a script, taking /path/to/dir as its argument. In tcsh, replace .[!.]* with .[^.]*. The assembled command is sketched below.
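
Putting the globs described above together with cp, the command is presumably along these lines (add -r if subdirectories should be copied as well):

$ cp ./* .[!.]* ..?* /path/to/dir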

Install pip with Proxy
Installs pip packages through a proxy.
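
The flag involved is pip's --proxy option; a sketch with hypothetical placeholder values for the proxy and package:

$ pip install --proxy=http://user:password@proxyserver:8080 package_name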

Run iMacros from terminal
Runs an iMacros browser macro from the terminal.
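
iMacros registers an imacros:// URL scheme that the browser can be launched with; a sketch assuming Firefox and a macro file named macro.iim (both hypothetical):

$ firefox "imacros://run/?m=macro.iim"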

backup your entire hosted website using cPanel backup interface and wget
This connects to your hosted website service through the cPanel interface and uses its backup tool to back up and download the entire website locally. (Don't forget to replace YourUsername, YourPassword and YourWebsiteUrl for it to work.)
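
A sketch of the idea, assuming the classic cPanel getbackup endpoint on port 2082 (the exact URL varies by cPanel version):

$ wget --http-user=YourUsername --http-password=YourPassword "http://YourWebsiteUrl:2082/getbackup/backup-YourWebsiteUrl-$(date +%-m-%d-%Y).tar.gz"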

Retrieve a list of all webpages on a site
This spiders the given site without downloading the HTML content. The resulting directory structure is then parsed to output a list of the URLs to url-list.txt. Note that this can take a long time to run, and you may want to throttle the spidering so as to play nicely.
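
One way to get the same result parses wget's spider log instead of the directory tree (a sketch; example.com stands in for the target site, and --wait throttles the requests):

$ wget -r --spider --no-verbose --wait=1 http://example.com 2>&1 | grep -o 'http[s]*://[^ ]*' | sort -u > url-list.txt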

Find out how much data is waiting to be written to disk
Ever ask yourself "How much data would be lost if I pressed the reset button?" Scary, isn't it?
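
The answer lives in /proc/meminfo: the Dirty and Writeback lines show data that has not yet reached the disk:

$ grep -E '^(Dirty|Writeback):' /proc/meminfo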

Download entire commandlinefu archive to single file
'jot' does not come with most *nix distros, so we use seq to make it work. This version was tested on Fedora 11.
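
The seq-based loop presumably walks the site's plaintext browse pages in steps of 25; a sketch (the sort order, the 2500 upper bound, the sleep, and the output filename are assumptions):

$ for i in $(seq 0 25 2500); do curl -s "http://www.commandlinefu.com/commands/browse/sort-by-votes/plaintext/$i"; sleep 1; done > commandlinefu-archive.txt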


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively, so that only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
