This command tells lynx to read keystrokes from the specified file, which can be used in a cron job to auto-login on websites that give you points for logging in once a day *cough cough* (which is why I used -accept_all_cookies).
For creating your keystroke file, use:
lynx -cmd_log yourfile
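As a sketch, a cron entry along these lines would replay the recorded keystrokes once a day (the schedule, URL and keystroke-file path below are placeholders, not from the original):
0 9 * * * lynx -accept_all_cookies -cmd_script=/home/user/login.keys http://example.com/login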
It's very common to have cron jobs that send email as their output, but the From: address is whatever account the cron job runs under, which is often not the address you want replies to go to. Here's a way to change the From: address right on the command line. What's happening here is that the "--" separates the options for the mail client from the options for the sendmail backend, so the -f and -F get passed through to sendmail and interpreted there. This works even on a system where postfix is the active mailer; it looks like postfix supports the same options. I think it's also possible to customize the From: address using mutt as a command-line mailer, but most servers don't have mutt preinstalled.
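For example, a line like this passes -f and -F through to the sendmail backend (the addresses and subject here are placeholders):
df -h | mail -s "Nightly disk report" ops@example.com -- -f cron-noreply@example.com -F "Cron Reports"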
This command logs the output of your simple cron jobs to syslog, and syslog takes it from there. It works great for monitoring scripts that only produce simple output. Advantages:
* It can be used by regular users, without modifying system files like /etc/syslog.conf
* It reduces cron spam to root@localhost (please stop spamming the sysadmins)
* It uses common tools like syslog (and logrotate), so you don't need to maintain yet another crufty logfile
* It still ensures that the output is logged somewhere, for posterity. Perhaps it's stored on the secure, central syslog server, for example.
* Seems to work fine on Ubuntu, CentOS, FreeBSD & MacOSX
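A minimal sketch of such a crontab entry, using logger (the script path and tag are placeholders):
*/5 * * * * /home/user/bin/check_disk.sh 2>&1 | logger -t check_disk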
With this cron job, rsync synchronizes the contents of the local directory /[VIPdirectory] with the directory /backup/[VIPdirectory] on the remote server X.X.X.X. You first need to set up SSH public/private key authentication to guarantee access to the remote server at X.X.X.X.
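A crontab line in that spirit might look like this (the rsync flags, schedule and remote user are my assumptions; the paths and host are from the description):
0 2 * * * rsync -az -e ssh /[VIPdirectory]/ user@X.X.X.X:/backup/[VIPdirectory]/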
I find it ugly and sexy at the same time, don't you?
Chronic Bash function:
chronic 3600 time # Print the time in your shell every hour
chronic 60 updatedb > /dev/null # update slocate every minute
Note: use 'jobs' to list background tasks and fg/bg to take control of them.
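The function itself isn't shown above; one way it might be written (a sketch, assuming the first argument is the delay in seconds and the rest is the command to repeat) is:
chronic () {
  local delay=$1; shift      # first argument: seconds to sleep between runs
  while true; do
    "$@"                     # run the remaining arguments as a command
    sleep "$delay"
  done &                     # background the loop so 'jobs'/fg/bg apply
}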
Pauses exactly long enough to wake at the top of the hour.
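One way to do that (assuming a date that supports +%s and a whole-hour timezone offset) is to sleep for the remainder of the current hour:
sleep $((3600 - $(date +%s) % 3600))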
This is helpful for shell scripts; I use it in my custom PHP install script to schedule deletion of the build files in 3 hours, as the install script is completely automated and made to run slowly. It does require at, which some environments without crontab still have. You can add as many commands to the at job as you want. Here's how I delete the pending jobs in case the script gets killed (trapped): atq | awk '{print $1}' | xargs -iJ atrm J &>/dev/null
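Scheduling the cleanup itself is a one-liner; the build path below is just a placeholder:
echo "rm -rf /usr/local/src/php-build" | at now + 3 hours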
If you need to install cron jobs in a given time range.
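For instance, standard crontab ranges already cover the common case (the job path is hypothetical):
*/10 9-17 * * 1-5 /path/to/job   # every 10 minutes, 09:00-17:59, Monday through Friday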
This one-liner is for cron jobs that need to provide some basic information about a filesystem and the time it takes to complete the operation. You can swap out the di command for df or du if that's your thing. The |& redirects both stderr and stdout to the mail command. How to configure the variables:
TOFSCK=/path/to/mount
FSCKDEV=/dev/path/device or FSCKDEV=`grep $TOFSCK /proc/mounts | cut -f1 -d" "`
MAILSUB="weekly file system check $TOFSCK "
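The original one-liner isn't reproduced here, but with the variables above it might be assembled roughly like this (the fsck -fn flags, bash's |&, and the root recipient are assumptions on my part):
( time fsck -fn "$FSCKDEV"; di "$TOFSCK" ) |& mail -s "$MAILSUB" root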
Usage: crontest "/path/to/bin". This version of the function echoes back the entire command so it can be copied and pasted into crontab. It should be possible to append it to crontab automatically with a bit more work. Tested on bash and zsh on Linux, FreeBSD, and AIX.
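A sketch of what such a helper might look like (the crontest name comes from the description above; the two-minute offset and GNU date -d syntax are my assumptions):
crontest () {
  local min hour
  min=$(date -d '+2 minutes' +%M)    # GNU date; adjust for BSD/AIX date
  hour=$(date -d '+2 minutes' +%H)
  echo "$min $hour * * * $1"         # paste this line into 'crontab -e' to test
}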
This is not exhaustive, but after checking /etc/cron* it is a good way to see whether any other users have jobs set. Note: this is a repost of a comment "flatcap" made on http://www.commandlinefu.com/commands/view/3726/print-crontab-entries-for-all-the-users-that-actually-have-a-crontab#comment, for which I am grateful and for which I take no credit.
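The usual complement to checking /etc/cron* is to dump every user's crontab, which needs root (a common idiom, not quoted verbatim from the linked comment):
for u in $(cut -f1 -d: /etc/passwd); do crontab -u "$u" -l 2>/dev/null; done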
Use it to stagger cron jobs, or just to get a random number; increase the range by replacing 100 with your own maximum value.
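For example, prefixing a job with a random sleep staggers its start by up to 100 seconds (the job path is a placeholder):
sleep $((RANDOM % 100)); /path/to/job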