List background jobs, grep for their job number (not their process ID), and then kill them.
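A minimal sketch of the idea, using a throwaway `sleep` as the background job. Job numbers such as `%1` only exist inside the shell that started the job:

```shell
# Start a throwaway background job, list it, then kill it by job
# number (%1) rather than by process ID.
sleep 300 &
jobs             # e.g. "[1]+  Running    sleep 300 &"
kill %1          # %1 refers to job 1 in this shell's job table
```

`kill %1` works in the shell's kill builtin even without interactive job control.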
Daemontools[1] won't always properly reap its children. Sometimes when you need to kill the main svscan process, you also want to clean up all of its children. The way to do that is to send a signal to the entire process group. It is a bit tricky. [1] http://cr.yp.to/daemontools.html
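A sketch of the process-group kill (it assumes svscan is running; the group leader's PID usually equals the PGID, and a negative argument tells kill to signal the whole group):

```shell
# Find svscan's process-group ID and signal the entire group.
# "--" stops option parsing so the negative PGID isn't read as a flag.
pgid=$(ps -o pgid= -p "$(pgrep -o svscan)" | tr -d ' ')
kill -TERM -- "-$pgid"
```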
You can also use gawk; note that the grep -v grep (or a !/gawk/ pattern) must filter the output before the PID column is extracted, otherwise it filters nothing, and the two pattern matches can be merged: ps auxww | gawk '/application/ && /processtobekilled/ && !/gawk/ {print $2}' | xargs kill -9
Kill all processes belonging to a user, with a minimum of resource usage. Great for the times when a user fork-bombs the system and it's difficult to log in or run commands.
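One low-overhead way to do this (a sketch; "baduser" is a placeholder account name):

```shell
# pkill does the matching in a single process, so it stays cheap even
# when the system is thrashing. -9 = SIGKILL, -u = match by user.
pkill -9 -u baduser
```

On some systems, `skill -KILL -u baduser` is an older equivalent.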
Execute commands serially on a list of hosts. Each ssh connection is made in the background so that if, after five seconds, it hasn't closed, it will be killed and the script will go on to the next system. Maybe there's an easier way to set a timeout in the ssh options...
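There is such an option: ConnectTimeout caps how long ssh waits for the connection, which removes the need for the background-and-kill dance. A sketch, assuming hosts.txt holds one hostname per line:

```shell
# Serially run a command on each host; give up on any host whose
# TCP connection doesn't come up within 5 seconds.
while read -r host; do
  ssh -o ConnectTimeout=5 -o BatchMode=yes "$host" uptime
done < hosts.txt
```

BatchMode=yes also prevents the script from hanging on a password prompt.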
Cleaned up and silent with &>/dev/null at the end.
Kills all processes matching a given string. This was done to kill all ssh sessions opened by Zenoss, which look like: usr/bin/ssh /opt/zenoss/bin/zenmodeler
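With pkill the same thing is a one-liner; -f matches against the full command line rather than just the process name:

```shell
# Kill every process whose full command line mentions zenmodeler.
pkill -f zenmodeler
```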
There are times when an X Window server hangs. When this happens, you can log in on a terminal and kill the Xorg process (i.e. the X server). This one-line command will do the trick.
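A sketch of that trick (the server process is usually named Xorg, sometimes just X, depending on the distribution):

```shell
# From a text console (e.g. Ctrl+Alt+F2, then log in), kill the hung
# X server. If a display manager is running, it will typically
# restart the server for you.
pkill -9 Xorg
```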
Display dd status on OS X (coreutils) every 10 seconds.
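A sketch of the loop: BSD dd on OS X prints its status when it receives SIGINFO, while GNU coreutils dd uses SIGUSR1 instead:

```shell
# Every 10 seconds, ask any running dd to report its progress.
while pgrep -x dd >/dev/null; do
  pkill -INFO dd      # substitute -USR1 for GNU coreutils dd on Linux
  sleep 10
done
```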
Kills all browser tabs, without killing browser or extensions.
Sometimes Firefox runs a plugin process named "plugin-containe". It can take so much CPU that it is impossible to keep browsing. When this happens, one can simply kill the process. If the process does not exist, the command returns an innocuous error message.
If logged in via ssh, you'll be knocked out.
Given a process name (kdiff3 in this example) that keeps auto-spawning/starting, auto-kill until it stops, assuming there is an upper limit (if not, reboot). This was beneficial after I clicked the DIFF button in a git GUI on a merge commit. 2000+ files were being opened one after the other in my diff program (kdiff3). Each time I closed one (or quit Kdiff3), the next file would be auto-opened in Kdiff3.
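The respawn-chasing can be sketched as a loop (it assumes the spawner eventually runs out of files to open, per the caveat above):

```shell
# Keep killing kdiff3 instances until none respawn.
while pgrep -x kdiff3 >/dev/null; do
  pkill -x kdiff3
  sleep 0.2    # give the spawner a moment to start the next one
done
```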
May need to substitute 'awk' for 'gawk'.
Useful when developing, when you don't usually have other Docker containers running.
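The usual shape of that command (a sketch; it kills every running container, so only use it on a box where that is acceptable):

```shell
# List the IDs of all running containers and kill them.
# xargs -r skips the kill entirely when nothing is running.
docker ps -q | xargs -r docker kill
```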
You cannot kill zombies, as they are already dead. If you have too many zombies, kill the parent process or restart the service instead. The above command gives you each zombie's PID, but sending kill -9 to it directly (e.g. # kill -9 4104) usually has no effect, precisely because the process is already dead - target the parent instead.
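To act on the parent, a sketch: list zombies together with their parent PIDs, then signal the parent (4104 below is just the example PID from above):

```shell
# Column 3 is the state; zombies show up as Z. Column 2 is the PPID.
ps -eo pid,ppid,stat,comm | awk '$3 ~ /^Z/'
# Then, for a zombie such as PID 4104, find and signal its parent:
# kill -TERM "$(ps -o ppid= -p 4104)"
```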
kill all processes of a program
Compared to the alternative: it directly tests the -STOP state of the process to decide whether to continue or stop the loop, and the background operator should be set (or not) when calling the function. As an extension, I suggest a slowPID() based on kill, as above, and a slowCMD() based on killall.
Replace 'sleep 10' with the command to wait for.
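If instead you need to wait for a process you didn't start, you can poll it with signal 0, which checks existence without sending anything (12345 is a placeholder PID):

```shell
# Block until PID 12345 exits, then continue. Note: kill -0 also
# fails with "operation not permitted" on other users' processes,
# so run this as the same user (or root).
while kill -0 12345 2>/dev/null; do
  sleep 1
done
echo "process gone"
```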