Download schedule

echo 'wget url' | at 12:00

Queues a one-off job with at(1): at 12:00 the atd daemon runs the saved shell command and wget downloads the given URL.

9
kayowas · 2009-04-14 21:10:20
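
A minimal sketch of the pattern, assuming at is installed and the atd daemon is running (the URL and the job number are placeholders):

    # schedule a one-off download for 12:00 (today, or tomorrow if noon has passed)
    echo 'wget http://example.com/file.iso' | at 12:00
    # list pending jobs
    atq
    # show exactly what job 4 would run (saved environment plus the command)
    at -c 4
    # cancel job 4
    atrm 4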

These Might Interest You

  • Schedule your Mac to sleep at any future time; the same syntax also takes wake, poweron, shutdown, and wakeorpoweron. Repeating schedules work too, and pmset -g sched queries what is currently scheduled (see the pmset sketch after this list). Lots more at http://www.macenterprise.org/articles/powermanagementandschedulingviathecommandline

    4
    sudo pmset schedule sleep "08/31/2009 00:00:00"
    hobzcalvin · 2009-05-14 09:31:32
  • This is helpful in shell scripts; I use it in my automated PHP install script, which is made to run slowly, to schedule deletion of the build files three hours later. It does require at, which some environments without crontab still have, and you can queue as many commands through at as you want. Here is how I delete the queued jobs in case the script gets killed (from a trap handler): atq | awk '{print $1}' | xargs -iJ atrm J &>/dev/null (see the cleanup sketch after this list).

    1
    echo "nohup command rm -rf /phpsessions 1>&2 &>/dev/null &" | at now + 3 hours 1>&2 &>/dev/null
    AskApache · 2009-08-18 07:31:17
  • Missed a class at UTOSC 2010, or need a refresher? Use this to curl down all the presentations from the UTOSC website (http://2010.utosc.com). NOTE/WARNING: it dumps them in the current directory, there are around 37, and some are big. Tested on OS X 10.6.1. A restructured version appears after this list.

    2
    b="http://2010.utosc.com"; for p in $( curl -s $b/presentation/schedule/ | grep /presentation/[0-9]*/ | cut -d"\"" -f2 ); do f=$(curl -s $b$p | grep "/static/slides/" | cut -d"\"" -f4); if [ -n "$f" ]; then echo $b$f; curl -O $b$f; fi; done
    danlangford · 2009-10-11 17:28:46
  • Check out the usage of 'trap'; you may not have seen this one much. This command schedules other commands by running them after a sleep finishes; in the example, 'sleep 2h' sleeps for two hours. What is cool is that 'trap '' 1' tells the subshell to ignore SIGHUP, the signal that normally terminates processes started by the shell when you log out, and the final 'trap 1' restores the default SIGHUP behaviour. 'nice -n 19' runs the sleep at minimal CPU priority, and the inner parentheses run everything in the background, so you can fire off as many of these as you want. Very helpful for shell scripts (see the sketch after this list).

    2
    ( trap '' 1; ( nice -n 19 sleep 2h && command rm -v -rf /garbage/ &>/dev/null && trap 1 ) & )
    AskApache · 2009-10-10 04:43:44
  • The only zipped version of an album available for download is often the lossy MP3 version. The lossless files, because of their size, must be downloaded individually. This command crawls the page and fetches all the FLAC files (a variant that also grabs SHN files appears after this list).

    0
    wget -rc -A.flac --tries=5 http://archive.org/the/url/of/the/album
    meunierd · 2010-01-20 07:36:25
  • The download-content part. NOTE: '-c' does not always behave well here; the download sometimes gets stuck at 99% even though wget itself finishes without errors, and a completed download may restart (you can also just cancel it). I don't know whether this is a wget or a RapidShare glitch, since I have no such problems with Megaupload, for example. UPDATE: as roebek pointed out, the restart glitch is solved by the '-t 1' option. Thanks a lot.

    6
    wget -c -t 1 --load-cookies ~/.cookies/rapidshare <URL>
    cammarin · 2009-03-28 09:13:35
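
A sketch of the pmset variants hobzcalvin lists, assuming macOS and sudo rights (the dates are placeholders):

    # one-off events: sleep, wake, poweron, shutdown, wakeorpoweron
    sudo pmset schedule sleep "08/31/2009 00:00:00"
    sudo pmset schedule wakeorpoweron "09/01/2009 07:00:00"
    # repeating: wake or power on every day, Monday (M) through Sunday (U), at 07:00
    sudo pmset repeat wakeorpoweron MTWRFSU 07:00:00
    # show what is currently scheduled
    pmset -g sched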
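
A sketch of AskApache's delayed-cleanup pattern; /phpsessions is his example path, and the EXIT trap is one way to wire up his queue-draining one-liner (note it removes every pending at job, not just this one):

    # queue the cleanup three hours out, discarding at's banner output
    echo "rm -rf /phpsessions" | at now + 3 hours &>/dev/null
    # drain any still-pending jobs if the script is killed or exits
    cleanup_jobs() { atq | awk '{print $1}' | xargs -n1 atrm &>/dev/null; }
    trap cleanup_jobs EXIT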
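
danlangford's loop, restructured for readability; it assumes the 2010-era page layout (talks linked as /presentation/NNN/, slides under /static/slides/), which may no longer exist:

    b="http://2010.utosc.com"
    # pull every talk link off the schedule page
    for p in $(curl -s "$b/presentation/schedule/" \
               | grep -o '/presentation/[0-9]*/' | sort -u); do
        # find the slide file linked from the talk page, if any
        f=$(curl -s "$b$p" | grep '/static/slides/' | cut -d'"' -f4)
        [ -n "$f" ] && curl -O "$b$f"
    done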
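
A minimal sketch of the trap/sleep pattern with a placeholder payload; HUP is the symbolic name for signal 1, so 'trap '' HUP' and 'trap '' 1' are equivalent:

    # ignore SIGHUP so logging out cannot kill the timer, sleep two hours
    # at minimal CPU priority, then run the payload; all in the background
    ( trap '' HUP; ( nice -n 19 sleep 2h && rm -rf /tmp/build-junk ) & )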
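
meunierd's command extended to also accept SHN files, with -np (don't ascend above the album page) and -nd (no directory hierarchy) added so files land flat in the current directory; the URL is the original placeholder:

    wget -r -c -np -nd -A '.flac,.shn' --tries=5 http://archive.org/the/url/of/the/album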
