This solution is similar to [1], except that it has no dependency on GNU Parallel. It also tries to minimize the impact on the running system (using ionice and nice).
[1] http://www.commandlinefu.com/commands/view/7009/recompress-all-.gz-files-in-current-directory-using-bzip2-running-1-job-per-cpu-core-in-parallel
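For illustration, a minimal sketch of such a command, assuming GNU xargs, nproc, gunzip, bzip2, ionice and nice are available (this is not the exact original one-liner): it recompresses every .gz file in the current directory to .bz2, one job per CPU core, at idle I/O class and lowest CPU priority, and removes each original only if the recompression succeeds.

find . -maxdepth 1 -name '*.gz' -print0 | nice -n 19 ionice -c3 xargs -0 -P "$(nproc)" -I{} sh -c 'gunzip -c "$1" | bzip2 > "${1%.gz}.bz2" && rm -- "$1"' _ {}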
Replace "user/sbin/sshd" with the file you would like to check. If you are doing this due to intrusion, you obviously would want to check size, last modification date and md5 of the md5sum application itself. Also, note that "/var/lib/dpkg/info/*.md5sums" files might have been tampered with themselves. Neither to say, this is a useful command. Show Sample Output
While `echo rm * | batch` might seem to work, it can still raise the load of the system: `rm` will be _started_ when the load is low, but may then run for a long time. My proposed command starts a new `rm` at most once a minute, and only when the load is low. Obviously, the load could be lowered further using `ionice`, but I still think this is a useful example of a sequential batch job.
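A minimal sketch of the idea, assuming bash and the at/batch tools (not necessarily the exact original one-liner): queue one rm per file, one submission per minute, so that batch only starts each deletion while the load average is below its threshold.

for f in ./*; do printf 'rm -- %q\n' "$f" | batch; sleep 60; done

Quoting via printf %q is a bash convenience here and may not cover every exotic filename when batch hands the job to /bin/sh.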
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: