Commands tagged jobs (4)

  • make, find, and many other programs can take a lot of time, or they may finish almost immediately. Suppose you write a long, complicated command and wonder whether it will be done in 3 seconds or in 20 minutes. Just add the suffix "R" (without quotes) and go do other things: zsh will inform you when the results are ready. You can replace zenity with any other X Window dialog program. A usage sketch follows the command below.


    alias -g R=' &; jobs | tail -1 | read A0 A1 A2 cmd; echo "running $cmd"; fg "$cmd"; zenity --info --text "$cmd done"; unset A0 A1 A2 cmd'
    pipeliner · 2010-12-13 17:44:36
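
    A minimal usage sketch, assuming the alias above is loaded in an interactive zsh with zenity installed (sleep 300 stands in for any slow command):

    sleep 300 R
    # zsh prints "running sleep 300", brings the job back to the foreground with fg,
    # and pops up a zenity dialog saying "sleep 300 done" when it exits.
    # Note: this relies on zsh running the final read of a pipeline in the current shell.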
  • The "-u USER" is optional if root user is used


    crontab -l -u USER | grep -v 'YOUR JOB COMMAND or PATTERN' | crontab -u USER -
    Koobiac · 2015-03-11 13:10:47
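
    A concrete example, with a hypothetical user "backup" and a made-up job pattern (adapt both to your system):

    # before: backup's crontab contains "0 2 * * * /usr/local/bin/rsync-nightly.sh"
    crontab -l -u backup | grep -v 'rsync-nightly' | crontab -u backup -
    crontab -l -u backup    # verify the nightly entry is gone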
  • List background jobs, grep out their job numbers (not their process IDs), and kill each one by job spec; a quick demonstration follows the command below.


    jobs | grep -o "^\[[0-9]*\]" | tr -d '[]' | while read j; do kill "%$j"; done
    haggen · 2012-04-12 17:29:58
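
    A quick demonstration in an interactive bash session (the sleep commands are stand-ins for real background work; anchoring the pattern to the leading "[N]" keeps digits in the command text from matching):

    sleep 100 & sleep 200 &    # start two background jobs
    jobs                       # [1]- Running sleep 100 &  /  [2]+ Running sleep 200 &
    jobs | grep -o "^\[[0-9]*\]" | tr -d '[]' | while read j; do kill "%$j"; done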
  • Run a job in the background and prefix its output with some string. This is particularly useful inside a Docker container's startup script (sue me, I'll run two jobs in a docker container if I want to), where you can run something like:

    /usr/sbin/nginx 2>&1 | awk '{print "[NGINX] " $0}' &
    /opt/jws-3.1/tomcat8/bin/catalina.sh run 2>&1 | awk '{print "[TOMCAT] " $0}' &
    while true; do sleep 1; done

    It can also be combined with tee to keep a file log as well as a stdout log. For example, if the script above were called "/bin/start-container.sh", you could run:

    /bin/start-container.sh | tee /var/log/containerlogs

    A fuller sketch of such a startup script follows the command below.


    nginx 2>&1 | awk '{print "[NGINX] " $0}' &
    hvindin · 2017-04-25 22:18:38
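
    A minimal sketch of such a startup script, reusing the nginx and Tomcat paths from the description above (adjust them to your image) and substituting wait for the busy loop to keep the container's main process alive:

    #!/bin/bash
    # hypothetical /bin/start-container.sh: run two services in one container,
    # tagging each service's output so the interleaved log stays readable
    # note: nginx may need "daemon off;" in its config to stay in the foreground
    /usr/sbin/nginx 2>&1 | awk '{print "[NGINX] " $0}' &
    /opt/jws-3.1/tomcat8/bin/catalina.sh run 2>&1 | awk '{print "[TOMCAT] " $0}' &
    wait    # block until both pipelines exit, keeping the container running

    Launched as /bin/start-container.sh | tee /var/log/containerlogs, the tagged lines land both on stdout (for docker logs) and in the log file.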
