This command uses mutt to send the mail. You must pipe in a body, otherwise mutt will drop into its interactive prompts. If you don't have mutt, it should be easy to install from your distribution's packages.
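A minimal sketch of that pattern, assuming mutt is installed and using placeholder subject and address:
echo "Backup finished" | mutt -s "Backup report" you@example.com
To attach a file as well, add -a /path/to/file; on recent mutt versions the attachment list must be terminated with -- before the recipient addresses.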
It's very common to have cron jobs that send emails as their output, but the From: address is whatever account the cron job is running under, which is often not the address you want replies to go to. Here's a way to change the From: address right on the command line. What's happening here is that the "--" separates the options to the mail client from options for the sendmail backend. So the -f and -F get passed through to sendmail and interpreted there. This works even on a system where postfix is the active mailer; it looks like postfix supports the same options. I think it's possible to customize the From: address using mutt as a command line mailer also, but most servers don't have mutt preinstalled.
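A sketch of the pattern being described, with placeholder addresses and names (this assumes a mail/mailx that passes everything after the -- through to sendmail, as explained above):
echo "Nightly job output attached below." | mail -s "Nightly report" admin@example.com -- -f reports@example.com -F "Cron Reports"
Here -f sets the envelope/From address and -F the sender's full name; both are sendmail options, not mail options.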
Joker wants an email if the Brand X server is down. Set a cron job for every 5 mins with this line and he gets an email when/if a ping takes longer than 3 seconds.
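A sketch of such a cron line, with placeholder host and address (-W is the Linux iputils timeout flag; BSD ping spells it differently):
*/5 * * * * ping -c 1 -W 3 brandx.example.com > /dev/null || echo "Brand X did not answer a ping within 3 seconds" | mail -s "Brand X server down" joker@example.com
The mail half only runs when the ping fails or times out.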
Copies an entire hierarchy of mailboxes from the named POP3/IMAP/etc. source to the named destination. Mailboxes are created on the destination as needed. NOTE: The 'mailutil' is the University of Washington's 'mailutil' (apt-get install uw-mailutils). More examples:
mailutil transfer {imap.gmail.com/ssl/user=you@gmail.com}INBOX Gmail/ ; mailutil check {imap.gmail.com/ssl/user=you@gmail.com}\[Gmail\]/Spam
If you need verbose or debugging output, append the -v or -d flag(s) to the end of the commands above (see man mailutil).
Replace "user@domain.com" with the target e-mail address. Thanks to alediaz for "$HOSTNAME" which is very useful when running the command with Apple Remote Desktop to multiple machines simultaneously.
An easy one but nice to keep in mind.
Note: this works because an SMTP server is running on the machine.
This just reads in a local file and sends it via email. Works with text or binary. *Requires* local mail server.
Alternative to the ping check if your firewall blocks ping. Uses curl to fetch the landing page silently, failing with an error code if it can't. You can probably do this with wget as well.
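A sketch of that check, with placeholder host and address:
curl -sf http://brandx.example.com/ -o /dev/null || echo "Brand X web server unreachable" | mail -s "Brand X server down" joker@example.com
-s keeps curl quiet and -f makes it exit non-zero on HTTP errors, so the mail side only fires on failure. The wget equivalent would be wget -q -O /dev/null with the same URL.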
The 2&>1 form does not work for me (the correct spelling is 2>&1, placed after the stdout redirect), but the shorter combined stdout/stderr redirection >& works perfectly (Ubuntu 10.04).
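For example, these two are equivalent ways of sending both streams to the same file (command and file name are placeholders; >& is a bash shorthand):
some_command > cron.log 2>&1
some_command >& cron.log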
dsniff is a general purpose password sniffer; it handles *lots* of different protocols, but it also accepts tcpdump-style filter expressions for limiting the analyzed traffic, so I can restrict it to POP3 only.
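A sketch of that restriction (interface name is a placeholder, and dsniff needs root):
dsniff -i eth0 port 110
The trailing port 110 is an ordinary tcpdump filter expression, so only POP3 traffic is examined.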
Usage: mailme message
This is a useful function if you want to get notified about process completion or failure. e.g.
mailme "process X completed"
This takes a picture (with the web cam) every 5 minutes and sends the picture to your e-mail. Some systems support mail -a "References: " so that all video surveillance emails are grouped in a single email thread. To keep your inbox clean, it is still possible to filter video surveillance emails and move them to the trash (and restore these emails only if you really get robbed!). For instance with Gmail, emails sent to me+trash@gmail.com can be filtered with "Matches: DeliveredTo:me+trash@gmail.com".
if "mail -a" fail, try "mutt -a" or "nail -a"
As unixmonkey7109 pointed out, the first awk parse replaces three steps.
Full Command:
google contacts list name,name,email|perl -pne 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx'|grep -oP '^((?!N\/A)(.+?)) <[a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+)>' | sort
You'll need googlecl and python-gdata. First set up googlecl by running:
google
Then give your PC access
google contacts list name,email
Then run the command and save the output, or use this one to dump it into the cone-address.txt file in your home dir:
google contacts list name,name,email | perl -p -n -e 's%^((?!N\/A)(.+?)),((?!N\/A)(.+?)),([a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+))%${1}:${3} <${5}>%imx' | grep -o -P '^((?!N\/A)(.+?)) <[a-z0-9\._-]+\@([a-z0-9][a-z0-9-]*[a-z0-9]\.)+([a-z]+\.)?([a-z]+)>' | sort > ~/cone-address.txt
Then import into cone.
It filters out duplicate emails, and contacts without an email address, which show up as N/A (Picasa photo contacts without an email, for example).
This version uses netcat to check a particular service.
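A sketch of the netcat variant, probing an SMTP port on a placeholder host:
nc -z -w 5 mail.example.com 25 || echo "SMTP on mail.example.com is not answering" | mail -s "mail server down" joker@example.com
-z only probes the port without sending data and -w 5 caps the wait at 5 seconds.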
Run "ps -x" (process status) in the background every hour (in this example).
The output of both "nohup" and "ps -x" is sent to the e-mail (instead of ending up in nohup.out or on stdout/stderr).
If you like it, replace "ps -x" with the command of your choice and 3600 (1 hour) with the period of your choice.
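A sketch of the loop being described, with a placeholder address (the same idea is visible inside the sample output below):
nohup bash -c 'while true; do ps -x | mail -s "hourly ps report" you@example.com; sleep 3600; done' &
nohup keeps the loop alive after you log out, and the trailing & puts it in the background.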
You can run the command in the loop any time by killing the sleep process. For example
ps -x
2925 ? S 0:00.00 sh -c unzip E.zip >/dev/null 2>&1
11288 ? O 0:00.00 unzip E.zip
25428 ? I 0:00.00 sleep 3600
14346 pts/42- I 0:00.01 bash -c while true; do ps -x | mail (...); sleep 3600; done
643 pts/66 Ss 0:00.03 -bash
14124 pts/66 O+ 0:00.00 ps -x
kill 25428
You have mail in /mail/(...)
Export the members of all Mailman mailing lists to separate .txt files, excluding the "Mailman" and "Test" lists; add your own exclusions with && $1!="myDontWannaList".
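A sketch of that idea, assuming Mailman's tools live in /usr/lib/mailman/bin (the path varies by distribution):
/usr/lib/mailman/bin/list_lists -b | awk '$1!="mailman" && $1!="test"' | while read l; do /usr/lib/mailman/bin/list_members "$l" > "$l.txt"; done
list_lists -b prints bare list names, the awk filter drops the excluded lists, and list_members dumps each remaining list's subscribers into its own file.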
This will email user@example.com a message with the body "rsync done" when no rsync processes are running. This can be changed for other uses by changing $(pgrep rsync) to something else, and echo "rsync done" | mailx user@example.com to another command.
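A sketch of that pattern as a one-off wait-and-notify loop (the address is the placeholder from the description):
while [ -n "$(pgrep rsync)" ]; do sleep 60; done; echo "rsync done" | mailx user@example.com
Run it after starting the transfer; it polls once a minute and sends the mail as soon as pgrep no longer finds an rsync process.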
uuencode the file so that it appears as an attachment.
This way we can define the body too.
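A sketch of both variants, with placeholder file names and address:
uuencode report.pdf report.pdf | mail -s "monthly report" user@example.com
( echo "Report attached below."; uuencode report.pdf report.pdf ) | mail -s "monthly report" user@example.com
uuencode takes the input file and the name the attachment should carry; the second form prepends a plain-text body before the encoded block.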
The command is useful for monitoring the use of the boxes and their connection IPs. The resulting file "sniff" is readable with the GUI program Wireshark, or on the CLI with the command: tcpdump -r sniff -XX
This can be used to delete or archive old mails. For archiving it is a bit different: first archive the mails with whatever tool you like (e.g. archivemail), then delete them (if you want!). Here we use -path ".*/cur/*" to avoid the argument limit you would hit with bash globbing and to search every inbox (e.g. .mymail, .spam, .whatever). ! -newermt "1 week ago" can be read as "all files older than one week"; adapt it accordingly.
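A sketch of the kind of find invocation being described, assuming a Maildir layout under ~/Mail (path and age are placeholders; -newermt is GNU find):
find ~/Mail -type f -path "*/cur/*" ! -newermt "1 week ago" -delete
Replace -delete with -print first to review what would go, or feed the list to archivemail before removing anything.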