Commands tagged background (14)

  • Doesn't require "at"; change the "2h" to whatever delay you want (the default unit for sleep is seconds). A concrete variation is sketched below the command.


    25
    ( ( sleep 2h; your-command your-args ) & )
    sitaram · 2009-08-19 17:39:11 3
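
  For illustration, the same pattern with a different delay; the 30m, the tar command, and the paths are placeholders rather than part of the original tip:

    ( ( sleep 30m; tar czf /tmp/home-backup.tgz "$HOME" ) & )
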
  • Very useful in shell scripts, because you can run a task in the background using job control and print progress until it completes. Here's an example of how I use it in backup scripts to run gpg in the background to encrypt an archive file (which I create the same way). $! is the process ID of the most recently backgrounded command; it is saved here in the variable PI, then sleeper is called with the PID of the gpg task and told to print ":" instead of the default "." every 3 seconds instead of the default 1, so a shorter version would simply be sleeper $!. The wait is also used, though it may not be needed on your system: echo ">>> ENCRYPTING SQL BACKUP"; gpg --output archive.tgz.asc --encrypt archive.tgz 1>/dev/null & PI=$!; sleeper $PI ":" 3; wait $PI && rm archive.tgz &>/dev/null (spelled out step by step in the sketch below the function). Previously, to get around $! not always being available, I checked for the existence of the process ID by testing whether the directory /proc/$PID existed, but not everyone uses /proc anymore. That version is currently the one at http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html, but I plan on upgrading to this new version soon.


    13
    sleeper(){ while `ps -p $1 &>/dev/null`; do echo -n "${2:-.}"; sleep ${3:-1}; done; }; export -f sleeper
    AskApache · 2009-09-21 07:36:25 1
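
  Putting the pieces of the description together, end to end; the gpg invocation and filenames are just the example quoted from the comment above, so adapt them to your own task:

    echo ">>> ENCRYPTING SQL BACKUP"
    gpg --output archive.tgz.asc --encrypt archive.tgz 1>/dev/null &
    PI=$!; sleeper "$PI" ":" 3; wait "$PI" && rm archive.tgz &>/dev/null
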
  • I needed a way to search all files in a web directory that contained a certain string and replace that string with another. In the example, I search for "askapache" and replace it with "htaccess". I wanted this to run as a cron job, and it was important that it ran as fast as possible without hogging the CPU, since the machine is a server. So the script uses the nice command to run the sh shell with the command, which makes the whole thing run at priority 19 so it won't hog CPU time. The -P5 option to xargs runs 5 separate grep and sed processes simultaneously, so this is much faster than running a single grep or sed. You may want -P0 (unlimited) if you aren't worried about spawning too many processes or don't have to deal with process killers in the background. The -m1 option to grep stops searching a file after the first match, which also saves time. A narrowed-down variation is sketched below the command.


    10
    sh -c 'S=askapache R=htaccess; find . -mount -type f|xargs -P5 -iFF grep -l -m1 "$S" FF|xargs -P5 -iFF sed -i -e "s%${S}%${R}%g" FF'
    AskApache · 2009-10-02 05:03:10 0
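
  A hedged variation on the same idea, narrowed to *.html files and with the nice invocation that the description mentions written out; the strings and the /var/www path are placeholders:

    nice -n19 sh -c 'S=oldstring R=newstring; find /var/www -mount -type f -name "*.html" | xargs -P5 -iFF grep -l -m1 "$S" FF | xargs -P5 -iFF sed -i -e "s%${S}%${R}%g" FF'
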
  • This command runs your shell script in the background with no output of any kind, and it will keep running even after you log out. A tidier equivalent is sketched below.


    7
    nohup /bin/sh myscript.sh 1>&2 &>/dev/null 1>&2 &>/dev/null&
    AskApache · 2009-08-18 07:24:52 4
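
  The repeated redirections in the command above are harmless but redundant; a minimal sketch with the same effect (myscript.sh is a placeholder name):

    nohup sh myscript.sh >/dev/null 2>&1 &
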
  • This is helpful in shell scripts. I use it in my custom PHP install script to schedule deletion of the build files in 3 hours, since the install script is completely automated and made to run slowly. It does require at, which some environments without crontab still have. You can add as many commands to the at job as you want. Here's how I delete the queued jobs in case the script gets killed (a trap sketch follows the command below): atq | awk '{print $1}' | xargs -iJ atrm J &>/dev/null


    1
    echo "nohup command rm -rf /phpsessions 1>&2 &>/dev/null 1>&2 &>/dev/null&" | at now + 3 hours 1>&2 &>/dev/null
    AskApache · 2009-08-18 07:31:17 6
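
  A sketch of how that cleanup might be wired into a trap so it fires if the script is killed; the function name and the INT/TERM trap are assumptions for illustration, not part of the posted command:

    cleanup_at_jobs() { atq | awk '{print $1}' | xargs -iJ atrm J &>/dev/null; }
    trap cleanup_at_jobs INT TERM
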
  • Needs package: gksu. Note: launching a GUI app that needs sudo in the background won't work well with the old familiar style, sudo gedit /etc/passwd &, because that would put sudo itself in the background! Using gksudo as demonstrated pops up a graphical sudo prompt instead. Maybe this is common knowledge, but not knowing it frustrated me during my newbie year.


    1
    gksudo gedit /etc/passwd &
    b_t · 2010-10-05 13:11:04 1

  • 1
    /System/Library/Frameworks/ScreenSaver.framework/Resources/ScreenSaverEngine.app/Contents/MacOS/ScreenSaverEngine -background &
    adkatrit · 2011-01-14 18:53:11 1
  • You're running a program that reads LOTS of files and takes a long time, but it doesn't tell you about its progress. First, run a command in the background, e.g. find /usr/share/doc -type f -exec cat {} + > output_file.txt & Then run the watch command; "watch -d" highlights the changes as they happen. In bash, $! is the process ID (PID) of the last command run in the background. You can change this to $(pidof my_command) to watch a particular process instead. Both steps are combined in the sketch below the command.


    1
    watch -d "ls -l /proc/$!/fd"
    flatcap · 2014-01-31 23:51:17 0
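
  The two steps from the description combined into one sketch; the find command and output_file.txt are just the example given above:

    find /usr/share/doc -type f -exec cat {} + > output_file.txt &
    watch -d "ls -l /proc/$!/fd"
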
  • A simple way to get random mrxvt backgrounds. Add this to your .bashrc and change the path names of the pictures. A shorter variant is sketched below the command.


    0
    LIST="/some/pic/file /another/picture /one/more/pic"; PIC=$(echo $LIST | sed s/"\ "/"\n"/g | shuf | head -1 | sed s/'\/'/'\\\/'/g ); sed -i s/Mrxvt.Pixmap:.*/"Mrxvt.Pixmap:\t$PIC"/ ~/.mrxvtrc
    dog · 2010-08-23 10:17:42 0
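
  A shorter sketch of the same idea using shuf -n1 and a different sed delimiter, which avoids having to escape the slashes in the path; the picture paths are placeholders and GNU sed is assumed:

    PIC=$(printf '%s\n' /some/pic/file /another/picture /one/more/pic | shuf -n1)
    sed -i "s|Mrxvt.Pixmap:.*|Mrxvt.Pixmap:\t$PIC|" ~/.mrxvtrc
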
  • "The -b (background) option tells sudo to run the given command in the background." -- after it asks you for the password in the foreground. Show Sample Output


    0
    sudo -b xterm
    shavenwarthog · 2010-10-05 23:03:01 1
  • List background jobs, grep their job number (not process ID), and kill each of them. A caveat and a simpler alternative are noted below the command.


    0
    jobs | grep -o "[0-9]" | while read j; do kill %$j; done
    haggen · 2012-04-12 17:29:58 0
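
  Note that grep -o "[0-9]" would split a multi-digit job number into separate digits; a simpler sketch that kills every background job of the current shell by PID instead:

    kill $(jobs -p)
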
  • Run a command with nohup in the background from within a shell script, without interrupting the main script's execution. A parent-script sketch follows the command.


    0
    nohup some_command/script.sh > /dev/null 2>&1&
    klausro · 2016-10-21 07:17:05 0
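
  A minimal sketch of the surrounding parent script (the script names are placeholders); the nohup'd task keeps running while the script carries on with its next step:

    #!/bin/sh
    nohup ./long_task.sh > /dev/null 2>&1 &
    echo "long_task.sh launched, continuing with the rest of the script"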

  • 0
    (nohup your-command your-args &>/dev/null &)
    socketz · 2018-05-23 17:35:43 0
  • Take advantage of sudo keeping you authenticated for ~15 minutes. The command is a little longer, but it does not require X (it can run on a headless server). A variant is sketched below the command.


    -3
    sudo ls ; sudo gedit /etc/passwd &
    aporter · 2010-10-05 21:01:34 0
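
  A variant sketch that refreshes the sudo timestamp without running a throwaway command first; note that the validation step must stay in the foreground and only the editor is backgrounded:

    sudo -v
    sudo gedit /etc/passwd &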
