Commands tagged high memory usage (1)

  • Find processes with high memory usage and report them in a human-readable format.


    ps -eo size,pid,user,command --sort -size |awk '{hr[1024**2]="GB";hr[1024]="MB";for (x=1024**3; x>=1024; x/=1024){if ($1>=x){printf ("%-6.2f %s ", $1/x, hr[x]);break}}}{printf ("%-6s %-10s ", $2, $3)}{for (x=4;x<=NF;x++){printf ("%s ",$x)} print ("\n")}'
    rockon · 2012-11-27 04:29:08
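
For readability, here is the same awk program laid out across multiple lines with comments; it is functionally identical to the one-liner above (SIZE from ps is reported in kilobytes, hence the 1024-based unit table):

    ps -eo size,pid,user,command --sort -size | awk '
      {
        # SIZE from ps is in KB, so 1024 KB = 1 MB and 1024^2 KB = 1 GB
        hr[1024**2] = "GB"; hr[1024] = "MB"
        # walk down from the largest unit until one fits, then print the scaled size
        for (x = 1024**3; x >= 1024; x /= 1024) {
          if ($1 >= x) { printf("%-6.2f %s ", $1/x, hr[x]); break }
        }
        # PID and owning user
        printf("%-6s %-10s ", $2, $3)
        # the remaining fields are the command line
        for (x = 4; x <= NF; x++) printf("%s ", $x)
        print "\n"   # as in the original: leaves a blank separator line after each entry
      }'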

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.



Check These Out

Say something out loud
Whatever arguments you pass will be spoken out loud. (Put it in a script or shell function.)
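
The command itself isn't preserved in this listing; a minimal sketch of the idea, assuming the espeak speech synthesizer is installed (the original may have used a different text-to-speech tool, and the function name "say" here is illustrative):

    # hypothetical helper: speak all arguments out loud (assumes espeak is installed)
    say() { espeak "$*"; }

    say "Build finished"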

Obtain last stock quote from google API with xmlstarlet

Browse system RAM in a human readable form
This command lets you see and scroll through all of the strings that are stored in RAM at any given time. Press the space bar to scroll through to see more pages (or use the arrow keys, etc.). Sometimes, if you didn't save a file you were working on, or want to get back something you closed, it can be found floating around in here! The awk command only shows lines that are longer than 20 characters (to avoid seeing lots of junk that probably isn't "human readable"). If you want to dump the whole thing to a file, replace the final '| less' with '> memorydump'. This is great for searching through multiple times (with the added bonus that it doesn't overwrite any memory...). Here's a neat example that shows conversations that were had in pidgin (it will probably still work after pidgin has been closed)... $sudo cat /proc/kcore | strings | grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})' (depending on sudo settings it might be best to run $sudo su first to get to a # prompt)
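
The full command isn't shown in this listing; based on the description (strings from /proc/kcore, filtered to lines longer than 20 characters, paged with less), a plausible reconstruction is:

    # reconstruction based on the description above; the exact original may differ
    sudo cat /proc/kcore | strings | awk 'length > 20' | less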

Convert seconds to [DD:][HH:]MM:SS
Converts any number of seconds into days, hours, minutes and seconds.

    sec2dhms() {
      declare -i SS="$1"
      D=$(( SS / 86400 ))
      H=$(( SS % 86400 / 3600 ))
      M=$(( SS % 3600 / 60 ))
      S=$(( SS % 60 ))
      [ "$D" -gt 0 ] && echo -n "${D}:"
      [ "$H" -gt 0 ] && printf "%02g:" "$H"
      printf "%02g:%02g\n" "$M" "$S"
    }
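
For example, 90061 seconds is one day, one hour, one minute and one second:

    sec2dhms 90061   # prints 1:01:01:01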

Display all shell functions set in the current shell environment
Uses the shell builtin `declare` with the '-f' flag to output only functions, then greps out just the function names. You can use it as an alias or a function like so:

    alias shfunctions="builtin declare -f | command grep --color=never -E '^[a-zA-Z_]+\ \(\)'"

    shfunctions () { builtin declare -f | command grep --color=never -E '^[a-zA-Z_]+\ \(\)'; }
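
For example, with the sec2dhms function above loaded in the current shell, the output of shfunctions would include a line like:

    sec2dhms ()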

Which processes are listening on a specific port (e.g. port 80)
swap out "80" for your port of interest. Can use port number or named ports e.g. "http"

Put the machine to sleep after the download (wget) is done
[Note: this command needs to be run as root.] If you are downloading something large at night, you can start wget as a normal user and issue the above command as root. When the download is done, the computer will automatically go to sleep. If at any point you decide the computer should not go to sleep automatically (for example, if you find the download still running in the morning), just create an empty file called nosleep in the /tmp directory.
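
The original command isn't reproduced in this listing; a minimal sketch of the behaviour it describes (wait for wget to finish, then suspend unless /tmp/nosleep exists), assuming systemctl suspend as the sleep mechanism:

    # run as root; check once a minute whether wget is still running
    while pgrep -x wget > /dev/null; do sleep 60; done
    # an empty /tmp/nosleep file cancels the automatic suspend
    [ -e /tmp/nosleep ] || systemctl suspend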

Get AWS temporary credentials ready to export based on a MFA virtual appliance
You might want to secure your AWS operations by requiring an MFA token. But then, to use the API or tools, you need to pass credentials generated with that MFA token. This command asks you for the MFA code and retrieves the temporary credentials using the AWS CLI. To print the exports, you can use: `awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'` You must adapt the command line to include: * $MFA_ID, the ARN of the virtual MFA device or the serial number of the physical one * the TTL for the credentials
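
The full command isn't reproduced in this listing; a sketch of how the pieces could fit together with `aws sts get-session-token` ($MFA_ID is taken from the description above; the MFA_CODE variable name is illustrative):

    # $MFA_ID: ARN of the virtual MFA device (or serial number of a hardware token)
    read -p "MFA code: " MFA_CODE
    aws sts get-session-token \
      --serial-number "$MFA_ID" \
      --token-code "$MFA_CODE" \
      --duration-seconds 3600 \
      --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
      --output text |
    awk '{ print "export AWS_ACCESS_KEY_ID=\"" $1 "\"\n" "export AWS_SECRET_ACCESS_KEY=\"" $2 "\"\n" "export AWS_SESSION_TOKEN=\"" $3 "\"" }'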

cd to (or operate on) a file across parallel directories
This is useful for quickly jumping around branches in a file system, or for operating on a parallel file. Tested in bash. It cd's to (substitute, in PWD, a for b), where PWD is the bash environment variable holding the current working directory.
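
The command itself is missing from this listing; the usual bash idiom for this is parameter substitution on $PWD (the directory names here are illustrative):

    # from .../project/branchA/src, jump to the same spot under branchB
    cd "${PWD/branchA/branchB}"

    # or operate on the parallel file without leaving the current directory
    diff main.c "${PWD/branchA/branchB}/main.c"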

Rename duplicates from MusicBrainz Picard
Renames duplicates from MusicBrainz Picard, so you get the latest copy and not a bunch of duplicates.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
