Commands tagged automation (6)

  • This is a last-resort way to automate applications that offer no other means of automation; the command below sends 'Hello world' to the currently active window. See the man page (in particular the -text and -window entries) for how to send special characters and how to target specific windows. An example: using xwininfo, get the id of the XPlanet background window:

    alanceil@kvirasim:19:51:0:~> xwininfo
    xwininfo: Please select the window about which you would like information by clicking the mouse in that window.
    xwininfo: Window id: 0x3600001 "Xplanet 1.2.0"
    Absolute upper-left X: 0
    (..etc..)

    Now use xvkbd to tell it to close itself:

    xvkbd -xsendevent -window 0x3600001 -text "Q"

    Obviously, the best way is to put these commands in a shell script (see the sketch after this entry); just make sure to include a short sleep (sleep .1 should suffice) after each xvkbd call, or some programs will become confused.


    15
    xvkbd -xsendevent -text "Hello world"
    Alanceil · 2009-03-20 18:58:05 7
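
    A minimal shell-script sketch of the workflow described above, reusing the window id from the xwininfo example; the text that gets typed is purely illustrative:

    #!/bin/sh
    # Hypothetical sketch: drive an application with xvkbd, pausing after each call.
    WINID=0x3600001                                            # window id obtained from xwininfo above

    xvkbd -xsendevent -window "$WINID" -text "Hello world"     # type some text into the window
    sleep .1                                                   # short pause so the program keeps up
    xvkbd -xsendevent -window "$WINID" -text "Q"               # then ask it to quit
    sleep .1
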
  • Referring to the original post: if you are using $!, the process is a child of the current shell, so you can simply use `wait $!`. If you are waiting for a process created outside of the current shell, the loop on `kill -0 $PID` is fine, although you cannot get the exit status of the process that way. A short illustration follows this entry.


    3
    wait $!
    noahspurrier · 2010-06-07 21:56:36 2
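
    A short illustration of the point above (the background job and its exit status are made up): because the process is a child of the current shell, wait $! both blocks and hands back the child's exit status:

    #!/bin/bash
    # Hypothetical child process; exits with status 3 so there is something to observe.
    some_long_job() { sleep 2; return 3; }
    some_long_job &
    PID=$!

    wait "$PID"               # blocks until the child finishes
    echo "exit status: $?"    # prints 3 -- something the kill -0 loop cannot recover
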
  • This one-liner is for cron jobs that need to report some basic information about a filesystem and the time it takes to complete the check. You can swap the di command for df or du if that's your thing. The |& redirects both stdout and stderr to the mail command. Configure the variables like this (a cron-ready sketch follows this entry):

    TOFSCK=/path/to/mount
    FSCKDEV=/dev/path/device
    or
    FSCKDEV=`grep $TOFSCK /proc/mounts | cut -f1 -d" "`
    MAILSUB="weekly file system check $TOFSCK"


    1
    ( di $TOFSCK -h ; /bin/umount $TOFSCK ; time /sbin/e2fsck -y -f -v $FSCKDEV ; /bin/mount $TOFSCK ) |& /bin/mail $MAILTO -s "$MAILSUB"
    px · 2010-10-24 00:35:23 5
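
    Assembled as a cron-ready script, the above might look like the sketch below; the mount point is the placeholder from the post, and the mail recipient (root) is an assumption, not part of the original:

    #!/bin/bash
    # Hypothetical weekly filesystem check; adjust the placeholders for your system.
    TOFSCK=/path/to/mount                                  # filesystem to check
    FSCKDEV=`grep $TOFSCK /proc/mounts | cut -f1 -d" "`    # or set the device explicitly
    MAILTO=root                                            # recipient (assumed placeholder)
    MAILSUB="weekly file system check $TOFSCK"

    ( di $TOFSCK -h ; /bin/umount $TOFSCK ; time /sbin/e2fsck -y -f -v $FSCKDEV ; /bin/mount $TOFSCK ) |& /bin/mail $MAILTO -s "$MAILSUB"
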
  • If you really _must_ use a loop, this is better than parsing the output of 'ps':

    PID=$!; while kill -0 $PID &>/dev/null; do sleep 1; done

    kill -0 $PID returns 0 if the process still exists and 1 otherwise. A sketch for processes you did not start yourself follows this entry.


    0
    wait
    bhepple · 2010-01-15 04:03:11 4
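
    A sketch of the non-child case mentioned above; the process name and the use of pgrep to find its PID are assumptions for the example, not part of the original:

    #!/bin/bash
    # Wait for a process we did not start ourselves, so wait(1) is not available.
    PID=$(pgrep -o my_daemon)               # hypothetical process name; -o picks the oldest match

    while kill -0 "$PID" &>/dev/null; do    # kill -0 only tests for existence, it sends no signal
        sleep 1
    done
    echo "my_daemon ($PID) has exited"      # its exit status is unknown here
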
  • Use this command to execute the contents of http://www.example.com/automation/remotescript.sh in the local environment. The parameters are optional. Alternatives to wget:

    CURL: curl -s http://www.example.com/automation/remotescript.sh | bash /dev/stdin param1 param2
    W3M:  w3m -dump http://www.example.com/automation/remotescript.sh | bash /dev/stdin [param1] [param2]
    LYNX: lynx -source http://www.example.com/automation/remotescript.sh | bash /dev/stdin [param1] [param2]

    A hypothetical example of what such a remote script might look like follows this entry.


    0
    wget -q -O - http://www.example.com/automation/remotescript.sh | bash /dev/stdin parameter1 parameter2
    paulera · 2015-02-16 16:55:09 11
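
    For reference, the remote script simply receives the extra arguments as positional parameters; a hypothetical remotescript.sh might look like this (its contents are illustrative, not from the original post):

    #!/bin/bash
    # remotescript.sh -- the parameters passed after `bash /dev/stdin` arrive as $1 and $2.
    echo "running on $(hostname)"
    echo "first parameter:  ${1:-<none>}"
    echo "second parameter: ${2:-<none>}"
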
  • The '[r]' keeps grep from matching its own process in the ps output (swap 'r' for the appropriate letter of the program you are waiting on). Here is an example I use a lot (run it as root, or halt will not work):

    while (ps -ef | grep [w]get); do sleep 10; done; sleep 60; halt

    I add the 'sleep 60' just in case something went wrong, so that I have time to cancel. Very useful if you are going to bed while downloading something and do not want your computer running all night. A scripted variant is sketched after this entry.


    -2
    while (ps -ef | grep [r]unning_program_name); do sleep 10; done; command_to_execute
    m_a_xim · 2010-01-14 16:26:34 2
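
    A scripted variant of the same idea; substituting pgrep for the ps | grep pipeline is my assumption, not something the original author suggests:

    #!/bin/sh
    # Wait for all wget processes to finish, then shut the machine down.
    # Run as root, or halt will not work.
    while pgrep wget >/dev/null; do
        sleep 10
    done
    sleep 60    # grace period in case something went wrong
    halt
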


