Commands tagged sort (176)

  • Based on the MrMerry one; just adds some visuals and sorts directories and files separately


    2
    find . -maxdepth 1 -type d|xargs du -a --max-depth=0|sort -rn|cut -d/ -f2|sed '1d'|while read i;do echo "$(du -h --max-depth=0 "$i")/";done;find . -maxdepth 1 -type f|xargs du -a|sort -rn|cut -d/ -f2|sed '$d'|while read i;do du -h "$i";done
    nickwe · 2009-09-03 20:33:21 5

  • 2
    grep current_state= /var/log/nagios/status.dat|sort|uniq -c|sed -e "s/[\t ]*\([0-9]*\).*current_state=\([0-9]*\)/\2:\1/"|tr "\n" " "
    c3w · 2010-03-11 06:04:14 3
  • I've wanted this for a long time; finally just sat down and came up with it. This shows you the sorted output of ps in a pretty format, perfect for cron or startup scripts. You can sort by changing the k -vsz to, for example, k -pmem to sort by memory instead (see the sketch after this entry). If you want a function, here's one from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html: aa_top_ps(){ local T N=${1:-10};T=${2:-vsz}; ps wwo pid,user,group,vsize:8,size:8,sz:6,rss:6,pmem:7,pcpu:7,time:7,wchan,sched=,stat,flags,comm,args k -${T} -A|sed -u "/^ *PID/d;${N}q"; }


    2
    command ps wwo pid,user,group,vsize:8,size:8,sz:6,rss:6,pmem:7,pcpu:7,time:7,wchan,sched=,stat,flags,comm,args k -vsz -A|sed -u '/^ *PID/d;10q'
    AskApache · 2010-05-18 18:41:38 6
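
    For instance, per the description's k -pmem suggestion, a minimal variant that sorts by memory instead (a sketch with a trimmed column list, not from the original submission):

        command ps wwo pid,user,pmem,comm k -pmem -A | sed -u '/^ *PID/d;10q'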
  • This provides a way to sort output based on the length of the line, so that shorter lines appear before longer lines. It's an add-on to sort that I've wanted for years; sometimes it's very useful (see the usage sketch after this entry). Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    2
    sortwc () { local L;while read -r L;do builtin printf "${#L}@%s\n" "$L";done|sort -n|sed -u 's/^[^@]*@//'; }
    AskApache · 2010-05-20 20:13:52 7
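
    A quick usage sketch (with the sortwc function above loaded), feeding lines of differing lengths:

        printf 'ccc\na\nbb\n' | sortwc
        # a
        # bb
        # ccc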
  • Works with files containing spaces and for very large directories.


    2
    find -type f -print0 | xargs -r0 stat -c %y\ %n | sort
    dooblem · 2010-05-29 13:40:18 10
  • Shows a list of users that currently running processes are executing as. YMMV regarding ps and its many variants. For example, you might need: ps -axgu | cut -f1 -d' ' | sort -u


    2
    ps -eo user | sort -u
    dfaulkner · 2010-07-07 12:28:44 6
  • This uses some tricks I found while reading the bash man page to enumerate and display all current environment variables, including those not listed by the 'env' command, which according to the bash docs are more for internal use by BASH. The main trick is the way bash will list all variable names when performing expansion on ${!A*} (see the sketch after this entry). Then the eval builtin makes it work in a loop. I created a function for this and use it instead of env (by aliasing env). Given any parameters, it lists the variables whose names start with each one. So 'aae B' would list all env variables starting with B. And 'aae {A..Z} {a..z}' would list all variables starting with any letter of the alphabet. And 'aae TERM' would list all variables starting with TERM. aae(){ local __a __i __z;for __a in "$@";do __z=\${!${__a}*};for __i in `eval echo "${__z}"`;do echo -e "$__i: ${!__i}";done;done; } And my printenv replacement is: alias env='aae {A..Z} {a..z} "_"|sort|cat -v 2>&1 | sed "s/\\^\\[/\\\\033/g"' From: http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    2
    for _a in {A..Z} {a..z};do _z=\${!${_a}*};for _i in `eval echo "${_z}"`;do echo -e "$_i: ${!_i}";done;done|cat -Tsv
    AskApache · 2010-10-27 07:16:54 5
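
    The core trick is bash's ${!prefix*} expansion, which expands to the names of all variables beginning with prefix. A minimal sketch (the MY_* variable names here are hypothetical):

        MY_DEMO=1 MY_OTHER=2
        echo "${!MY_*}"    # prints: MY_DEMO MY_OTHER (plus any other MY_* vars already set)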

  • 2
    awk '{if ($1 ~ /Package/) p = $2; if ($1 ~ /Installed/) printf("%9d %s\n", $2, p)}' /var/lib/dpkg/status | sort -n | tail
    gb38 · 2010-12-14 14:59:42 4
  • Lists the top committers (and the number of their commits) of an svn repository. In this example it counts revisions of the current directory.


    2
    svn log -q | grep '^r[0-9]' | cut -f2 -d "|" | sort | uniq -c | sort -nr
    kkapron · 2011-01-03 15:23:08 4
  • Show disk space info, grepping out the uninteresting ones beginning with ^none while we're at it. The main point of this submission is the way it maintains the header row with the command grouping, by removing it from the pipeline before it gets fed into the sort command. (I'm surprised sort doesn't have an option to skip a header row, actually...) It took me a while to work out how to do this; I thought of it as I was drifting off to sleep last night! The same trick generalizes (see the sketch after this entry).


    2
    df -h | grep -v ^none | ( read header ; echo "$header" ; sort -rn -k 5)
    purpleturtle · 2011-03-16 14:25:45 14
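
    The header-preserving subshell trick works for any tabular output; e.g. a sketch (not from the original submission) sorting ps by CPU while keeping its header:

        ps aux | ( read -r header; echo "$header"; sort -rn -k 3 )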
  • List all MAC addresses on a Linux box. sort -u is useful when you have virtual interfaces.


    2
    sort -u /sys/class/net/*/address
    marssi · 2011-05-18 17:50:44 3
  • Randomizes a file. The opposite of sort is sort -R!


    2
    sort -R
    RyanM · 2011-07-15 15:35:27 3
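
    For example (note that GNU sort -R sorts by a hash of the key, so identical lines end up grouped together; shuf(1) is a true shuffle):

        seq 5 | sort -R    # prints the numbers 1-5 in a scrambled order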
  • Tells you everything you could ever want to know about all files and subdirectories. Great for package creators. Totally secure too. On my Slackware box, this gets set upon login: LS_OPTIONS='-F -b -T 0 --color=auto' and alias ls='/bin/ls $LS_OPTIONS', which works great.


    2
    lsr() { find "${@:-.}" -print0 |sort -z |xargs -0 ls $LS_OPTIONS -dla; }
    h3xx · 2011-08-15 03:10:58 3
  • (separator = $IFS)


    2
    ps aux | sort -nk 6
    totti · 2011-08-16 11:04:45 3
  • sort can sort month-wise (by the first three letters of each month); see the example after this entry. Note that the sorting is not stable, if that matters to you: the site's sample output shows two 'feb' entries whose relative order is not preserved.


    2
    sort -M filename
    b_t · 2011-12-10 12:50:30 563
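
    For example:

        printf 'mar\njan\nfeb\n' | sort -M
        # jan
        # feb
        # mar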

  • 2
    find . -type f -print0 | xargs -0 du -h | sort -hr | head
    mesuutt · 2012-06-29 12:43:06 6

  • 2
    du --max-depth=1 -h * |sort -h -k 1 |egrep 'M|G'
    leonteale · 2013-02-07 18:52:29 4
  • Get the longest match of file extension (e.g. for 'foo.tar.gz' you get '.tar.gz' instead of '.gz'); see the illustration after this entry.


    2
    find /some/path -type f -printf '%f\n' | grep -o '\..\+$' | sort | uniq -c | sort -rn
    skkzsh · 2013-03-18 14:42:29 7
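
    A quick illustration of the greedy match ('\..\+$' anchors at the first dot in the name and runs to the end):

        printf 'foo.tar.gz\nbar.txt\n' | grep -o '\..\+$'
        # .tar.gz
        # .txt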
  • Displays the duplicated lines in a file and how often each occurs (see the example after this entry).


    1
    cat file.txt | sort | uniq -dc
    Vadi · 2009-03-21 18:15:14 7
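
    For example, with input where only 'apple' is repeated:

        printf 'apple\nbanana\napple\n' | sort | uniq -dc
        # 2 apple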
  • A little bit smaller and faster, and should handle files with special characters in the name.


    1
    find . -maxdepth 1 ! -name '.' -execdir du -0 -s {} + | sort -znr | gawk 'BEGIN{ORS=RS="\0";} {sub($1 "\t", ""); print $0;}' | xargs -0 du -hs
    ashawley · 2009-09-11 16:07:39 7
  • Counts TCP states from netstat output and displays them in an ordered list.


    1
    netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c
    Kered557 · 2010-05-06 17:04:37 4
  • use Linux ;)


    1
    pgrep -cu ioggstream
    ioggstream · 2010-05-21 10:53:57 4
  • Just a little simplification.


    1
    find /path/to/dir -type f | grep -o '\.[^./]*$' | sort | uniq
    dooblem · 2010-08-12 14:32:48 7
  • If your grep doesn't have an -o option, you can use sed instead.


    1
    find /path/to/dir -type f -name '*.*' | sed 's@.*/.*\.@.@' | sort | uniq
    putnamhill · 2010-08-12 15:48:54 26
  • Grabs the cmdline used to execute the process, and the environment the process is being run under. This is much different from the 'env' command, which only lists the environment for the shell. This is very useful (to me at least) for debugging various processes on my server. For example, this lets me see the environment that my apache, mysqld, bind, and other server processes have (a minimal sketch of the /proc interface follows this entry). Here's a function I use: aa_ps_all () { ( cd /proc && command ps -A -opid= | xargs -I'{}' sh -c 'test $PPID -ne {}&&test -r {}/cmdline&&echo -e "\n[{}]"&&tr -s "\000" " "<{}/cmdline&&echo&&tr -s "\000\033" "\nE"<{}/environ|sort&&cat {}/limits' ); } From my .bash_profile at http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    1
    cd /proc&&ps a -opid=|xargs -I+ sh -c '[[ $PPID -ne + ]]&&echo -e "\n[+]"&&tr -s "\000" " "<+/cmdline&&echo&&tr -s "\000\033" "\nE"<+/environ|sort'
    AskApache · 2010-10-22 02:34:33 14
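
    The /proc files the command reads are NUL-separated; a minimal sketch of that interface, dumping the current shell's own environment ($$ is just an example PID):

        tr '\000' '\n' < /proc/$$/environ | sort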
