Check These Out
Then hit ^C to stop the capture, fetch the file with scp, and open it in wireshark like this:
$ wireshark /tmp/sniff.pcap
If you have tshark on the remote host, you can use that instead:
$ wireshark -k -i
In this example, the command "somecommand" will be executed and sent a SIGALRM signal if it runs for more than 10 seconds. It uses the Perl alarm function. The timing is not 100% accurate, but close enough. I found this really useful when executing scripts and commands that I knew might hang, e.g. ones that connect to services that might not be running. Importantly, this can be used within a sequential script: control is not released until either the command completes or the timeout is hit.
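The one-liner itself isn't reproduced in this excerpt; a minimal sketch of the same idea, with `sleep 3` standing in for "somecommand":

```shell
# The technique behind the tip: Perl arms a 10-second alarm and then exec()s
# the command. alarm() survives exec, so if the command is still running when
# the timer fires, SIGALRM terminates it. Here `sleep 3` stands in for
# "somecommand" and finishes before the alarm, so the exit status is 0.
perl -e 'alarm 10; exec @ARGV' sleep 3
echo "exit status: $?"
```

If the command outlives the alarm, it dies with SIGALRM and the shell sees a non-zero exit status, so the timeout is easy to detect in a script.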
This command will give you the same list of files as "find /etc/ -name '*killall' | xargs ls -l".
In its simplest form, just do 'ls /etc/**/file'.
It uses shell globbing, so it will also work with other commands, like "cp /etc/**/sshd sshd_backup".
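One caveat worth knowing: zsh supports recursive ** out of the box, while bash only does so with globstar enabled. A quick demonstration on a scratch directory:

```shell
# bash needs globstar for recursive ** (zsh has it by default)
shopt -s globstar
d=$(mktemp -d)
mkdir -p "$d/sub/deeper"
touch "$d/sub/deeper/killall5"
ls -l "$d"/**/*killall*    # matches however deeply the file is nested
rm -rf "$d"
```

Without globstar, bash treats ** like a single *, so the pattern silently stops matching nested files.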
Normally, when a site is blocked through /etc/hosts, traffic is simply redirected to a non-existent server that is never going to respond. This gets your point across a little more clearly than a browser timeout.
Of course you could use any number of codes: http://en.wikipedia.org/wiki/List_of_HTTP_status_codes
Obviously, this command can be added to init-rc.d, and more sophisticated responses can be given. It is worth noting that the information sent by the browser can be parsed with the bash 'read' builtin (e.g. 'while read -t 1 statement; do parsing'), and that the connection stays open until the script exits. Take care that you must use EXEC:'bash -c foo.sh', as execvp (socat's method for executing scripts) invokes 'sh', not 'bash'.
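The one-liner itself isn't shown in this excerpt; here is a hedged sketch of the idea, following the EXEC:'bash -c ...' advice above. Port 8080 stands in for 80 (which would require root), and the script path is just for the demo:

```shell
# Hedged sketch, not necessarily the original command: answer every HTTP
# request with an explicit 403 instead of letting the browser time out.
# Skips cleanly if socat isn't installed.
command -v socat >/dev/null || { echo "socat not installed"; exit 0; }
cat > /tmp/blocked.sh <<'EOF'
#!/bin/bash
echo -e "HTTP/1.1 403 Forbidden\r\n\r\nThis site is blocked."
EOF
chmod +x /tmp/blocked.sh
# EXEC:'bash -c ...' per the tip: socat's execvp would otherwise use sh
socat TCP4-LISTEN:8080,reuseaddr,fork EXEC:'bash -c /tmp/blocked.sh' &
listener=$!
sleep 1
# one test request through bash's /dev/tcp, then stop the listener
exec 3<>/dev/tcp/127.0.0.1/8080
printf 'GET / HTTP/1.0\r\n\r\n' >&3
head -n1 <&3
kill "$listener" 2>/dev/null || true
```

Each connection forks a fresh copy of the script, so the responder keeps serving until the listener is killed.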
Very effective, and it uses only the DNS protocol. The @ part is optional if you have already set the OpenDNS servers as your default nameservers.
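The command this note refers to isn't reproduced in the excerpt; a hedged reconstruction of the usual form looks like this:

```shell
# Hedged reconstruction: OpenDNS resolvers answer the special name
# myip.opendns.com with the querying client's public address -- pure DNS,
# no HTTP service involved. Degrades gracefully without dig or network.
command -v dig >/dev/null || { echo "dig not installed"; exit 0; }
ip=$(dig +short +time=2 +tries=1 myip.opendns.com @resolver1.opendns.com) \
  || { echo "no network"; exit 0; }
echo "$ip"
```

Because it is a single UDP query, this is much faster than the HTTP-based "what is my IP" services.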
List files with their permissions written in octal form.
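The tip's exact one-liner isn't shown in this excerpt; one hedged way to get the same effect is GNU stat's %a format, which prints each entry's octal mode next to its name:

```shell
# Sketch using GNU stat (coreutils); the temp file is just for the demo
f=$(mktemp)
chmod 640 "$f"
stat -c '%a %n' "$f"    # prints the octal mode, e.g. "640 /tmp/tmp.XXXXXX"
rm -f "$f"
```

In everyday use you would run something like `stat -c '%a %n' *` in the directory of interest.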
Using -u standardizes the date output to UTC, which helps when comparing servers in different timezones.
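For instance, a log-friendly UTC timestamp (the format string here is just one common choice):

```shell
# ISO-8601 style timestamp in UTC -- identical output on every server,
# whatever its local timezone happens to be
date -u +%Y-%m-%dT%H:%M:%SZ
```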
Useful when you want to cron a daily deletion task in order to keep only files less than one year old. The command prunes the .snapshot directory to prevent backups from being deleted.
One can append -delete to this command to actually delete the files:
$ find /path/to/directory -not \( -name .snapshot -prune \) -type f -mtime +365 -delete
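A quick way to verify the prune behavior on a scratch directory (the file names are just for the demo; GNU touch's -d option is assumed):

```shell
# Only the year-old file outside .snapshot is deleted; the fresh file
# and everything under .snapshot survive the -delete.
d=$(mktemp -d)
mkdir -p "$d/.snapshot"
touch -d '400 days ago' "$d/old.log" "$d/.snapshot/old.bak"
touch "$d/new.log"
find "$d" -not \( -name .snapshot -prune \) -type f -mtime +365 -delete
ls -A "$d"    # .snapshot and new.log remain; old.log is gone
rm -rf "$d"
```

The -prune short-circuits descent into .snapshot, and the surrounding -not makes the whole parenthesized test false there, so nothing under it ever reaches -delete.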