Commands tagged sort (176)

  • Based on MrMerry's version; just adds some visuals and sorts directories and files separately.


    2
    find . -maxdepth 1 -type d|xargs du -a --max-depth=0|sort -rn|cut -d/ -f2|sed '1d'|while read i;do echo "$(du -h --max-depth=0 "$i")/";done;find . -maxdepth 1 -type f|xargs du -a|sort -rn|cut -d/ -f2|sed '$d'|while read i;do du -h "$i";done
    nickwe · 2009-09-03 20:33:21 5

  • 2
    grep current_state= /var/log/nagios/status.dat|sort|uniq -c|sed -e "s/[\t ]*\([0-9]*\).*current_state=\([0-9]*\)/\2:\1/"|tr "\n" " "
    c3w · 2010-03-11 06:04:14 3
  • I've wanted this for a long time and finally sat down and came up with it. It shows the sorted output of ps in a pretty format, perfect for cron or startup scripts. To sort by a different column, change k -vsz to, say, k -pmem to sort by memory instead. If you want a function, here's one from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html (a usage sketch follows this entry):
    aa_top_ps(){ local T N=${1:-10};T=${2:-vsz}; ps wwo pid,user,group,vsize:8,size:8,sz:6,rss:6,pmem:7,pcpu:7,time:7,wchan,sched=,stat,flags,comm,args k -${T} -A|sed -u "/^ *PID/d;${N}q"; }


    2
    command ps wwo pid,user,group,vsize:8,size:8,sz:6,rss:6,pmem:7,pcpu:7,time:7,wchan,sched=,stat,flags,comm,args k -vsz -A|sed -u '/^ *PID/d;10q'
    AskApache · 2010-05-18 18:41:38 6
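
    A quick usage sketch of the aa_top_ps function quoted above; the first argument is the number of output lines, the second the ps sort key (both optional, defaulting to 10 and vsz):

    aa_top_ps 5 pmem    # top processes, sorted by memory percentage
    aa_top_ps           # defaults: ten lines, sorted by virtual size
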
  • This provides a way to sort output by line length, so that shorter lines appear before longer ones. It's an addition to sort that I've wanted for years, and it is sometimes very useful. Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html (a usage sketch follows this entry).


    2
    sortwc () { local L;while read -r L;do builtin printf "${#L}@%s\n" "$L";done|sort -n|sed -u 's/^[^@]*@//'; }
    AskApache · 2010-05-20 20:13:52 7
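
    A small usage sketch of sortwc above, fed three lines of differing lengths:

    printf 'longer line\nshort\nthe longest line of all\n' | sortwc
    # short
    # longer line
    # the longest line of all
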
  • Works with files containing spaces and for very large directories.


    2
    find -type f -print0 | xargs -r0 stat -c %y\ %n | sort
    dooblem · 2010-05-29 13:40:18 10
  • Shows a list of the users that currently running processes are executing as. YMMV regarding ps and its many variants; for example, you might need: ps -axgu | cut -f1 -d' ' | sort -u


    2
    ps -eo user | sort -u
    dfaulkner · 2010-07-07 12:28:44 6
  • This uses some tricks I found while reading the bash man page to enumerate and display all current environment variables, including those not listed by the 'env' command, which according to the bash docs are more for internal use by bash. The main trick is that bash lists all variable names with a given prefix when performing expansion on ${!A*}; the eval builtin then makes it work in a loop. I created a function for this and use it instead of env (by aliasing env). Given any parameters, it lists the variables that start with them: 'aae B' lists all variables starting with B, 'aae {A..Z} {a..z}' lists variables starting with any letter of the alphabet, and 'aae TERM' lists all variables starting with TERM (the expansion trick itself is sketched after this entry).
    aae(){ local __a __i __z;for __a in "$@";do __z=\${!${__a}*};for __i in `eval echo "${__z}"`;do echo -e "$__i: ${!__i}";done;done; }
    And my printenv replacement is:
    alias env='aae {A..Z} {a..z} "_"|sort|cat -v 2>&1 | sed "s/\\^\\[/\\\\033/g"'
    From: http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    2
    for _a in {A..Z} {a..z};do _z=\${!${_a}*};for _i in `eval echo "${_z}"`;do echo -e "$_i: ${!_i}";done;done|cat -Tsv
    AskApache · 2010-10-27 07:16:54 5
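
    The core trick, isolated: in bash, ${!PREFIX*} expands to the names of all variables beginning with PREFIX, and indirect expansion ${!name} then fetches each value. A minimal sketch (the names that appear are simply whatever happens to exist in your shell):

    for v in ${!HO*}; do echo "$v=${!v}"; done    # e.g. HOME, HOSTNAME, HOSTTYPE

    The eval in the command above is only needed because the prefix itself is held in a variable, which cannot be nested directly inside ${!...*}.
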

  • 2
    awk '{if ($1 ~ /Package/) p = $2; if ($1 ~ /Installed/) printf("%9d %s\n", $2, p)}' /var/lib/dpkg/status | sort -n | tail
    gb38 · 2010-12-14 14:59:42 4
  • Lists the top committers (and the number of their commits) of an svn repository. In this example it counts the revisions of the current directory.


    2
    svn log -q | grep '^r[0-9]' | cut -f2 -d "|" | sort | uniq -c | sort -nr
    kkapron · 2011-01-03 15:23:08 4
  • Show disk space info, grepping out the uninteresting lines beginning with ^none while we're at it. The main point of this submission is the way it keeps the header row out of the sort: the command grouping reads it off the pipeline before the rest is fed into the sort command. (I'm surprised sort doesn't have an option to skip a header row, actually.) It took me a while to work out how to do this; I thought of it as I was drifting off to sleep last night! The same trick applied to a plain file is sketched after this entry.


    2
    df -h | grep -v ^none | ( read header ; echo "$header" ; sort -rn -k 5)
    purpleturtle · 2011-03-16 14:25:45 14
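
    The same header-preserving trick works on any file or tabular output. A sketch on a hypothetical data.csv whose first line is a header, sorting the remaining rows numerically on the second comma-separated column:

    ( read -r header; echo "$header"; sort -t, -k2,2 -n ) < data.csv
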
  • List all MAC addresses on a Linux box. sort -u is useful when virtual interfaces are present.


    2
    sort -u /sys/class/net/*/address
    marssi · 2011-05-18 17:50:44 3
  • Shuffles the lines of a file; in that sense it's the opposite of sort. (Strictly, sort -R sorts by a random hash of each line, so identical lines end up grouped together.) A quick example follows this entry.


    2
    sort -R
    RyanM · 2011-07-15 15:35:27 3
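
    A quick sketch, pulling a few random words from the system dictionary (path assumed; adjust for your distribution). Note that -R is a GNU extension; shuf is an alternative when a true shuffle of duplicate lines is wanted:

    sort -R /usr/share/dict/words | head -n 5
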
  • Tells you everything you could ever want to know about all files and subdirectories. Great for package creators. It is also safe with arbitrary file names, since the whole pipeline is NUL-delimited (-print0, sort -z, xargs -0). On my Slackware box, this gets set upon login: LS_OPTIONS='-F -b -T 0 --color=auto' and alias ls='/bin/ls $LS_OPTIONS', which works great.


    2
    lsr() { find "${@:-.}" -print0 |sort -z |xargs -0 ls $LS_OPTIONS -dla; }
    h3xx · 2011-08-15 03:10:58 3
  • Sorts processes numerically by column 6 (RSS, the resident memory size). sort splits fields on whitespace by default, much like the default $IFS, so no separator needs to be given.


    2
    ps aux | sort -nk 6
    totti · 2011-08-16 11:04:45 3
  • sort can order lines month-wise, going by the first three letters of each month name (see the worked example after this entry). Note that the sort is not stable by default: the relative order of two 'feb' entries may change. Pass -s/--stable if that matters to you.


    2
    sort -M filename
    b_t · 2011-12-10 12:50:30 560
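
    A tiny worked example of -M, with months given by their first three letters (unrecognised strings sort before all months):

    printf 'mar\njan\ndec\nfeb\n' | sort -M
    # jan
    # feb
    # mar
    # dec
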

  • 2
    find . -type f -print0 | xargs -0 du -h | sort -hr | head
    mesuutt · 2012-06-29 12:43:06 6

  • 2
    du --max-depth=1 -h * |sort -n -k 1 |egrep 'M|G'
    leonteale · 2013-02-07 18:52:29 4
  • Counts file extensions under a path, using the longest match: for 'foo.tar.gz' you get '.tar.gz' instead of '.gz' (the difference is sketched after this entry).


    2
    find /some/path -type f -printf '%f\n' | grep -o '\..\+$' | sort | uniq -c | sort -rn
    skkzsh · 2013-03-18 14:42:29 7
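
    The difference between the longest and the shortest extension match, shown on a single name (the second pattern is the usual last-dot-only variant, not part of the command above):

    echo foo.tar.gz | grep -o '\..\+$'      # .tar.gz  -- matches from the first dot onwards
    echo foo.tar.gz | grep -o '\.[^.]*$'    # .gz      -- matches from the last dot onwards
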
  • Displays the duplicated lines in a file and how often each occurs (a small illustration follows this entry).


    1
    cat file.txt | sort | uniq -dc
    Vadi · 2009-03-21 18:15:14 7
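
    A small illustration: sort groups identical lines together, then uniq -dc keeps only the repeated ones, prefixed with their counts:

    printf 'b\na\nc\na\nb\na\n' | sort | uniq -dc
    #   3 a
    #   2 b
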
  • A little smaller and faster, and it should handle file names containing special characters.


    1
    find . -maxdepth 1 ! -name '.' -execdir du -0 -s {} + | sort -znr | gawk 'BEGIN{ORS=RS="\0";} {sub($1 "\t", ""); print $0;}' | xargs -0 du -hs
    ashawley · 2009-09-11 16:07:39 7
  • Counts TCP connection states from netstat output and lists them, ordered by state name (a count-ordered variation is sketched after this entry).


    1
    netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c
    Kered557 · 2010-05-06 17:04:37 4
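
    To order the states by how common they are rather than by name, append another sort (a minor variation, not part of the original command):

    netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c | sort -rn
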
  • use Linux ;)


    1
    pgrep -cu ioggstream
    ioggstream · 2010-05-21 10:53:57 4
  • Just a little simplification.


    1
    find /path/to/dir -type f | grep -o '\.[^./]*$' | sort | uniq
    dooblem · 2010-08-12 14:32:48 7
  • If your grep doesn't have an -o option, you can use sed instead.


    1
    find /path/to/dir -type f -name '*.*' | sed 's@.*/.*\.@.@' | sort | uniq
    putnamhill · 2010-08-12 15:48:54 26
  • Grabs the cmdline used to execute each process, and the environment each process is running under. This is quite different from the 'env' command, which only lists the environment of the shell. It is very useful (to me at least) for debugging the various processes on my server; for example, it lets me see the environment that my apache, mysqld, bind, and other server processes have. A single-process variant is sketched after this entry. Here's a function I use:
    aa_ps_all () { ( cd /proc && command ps -A -opid= | xargs -I'{}' sh -c 'test $PPID -ne {}&&test -r {}/cmdline&&echo -e "\n[{}]"&&tr -s "\000" " "<{}/cmdline&&echo&&tr -s "\000\033" "\nE"<{}/environ|sort&&cat {}/limits' ); }
    From my .bash_profile at http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    1
    cd /proc&&ps a -opid=|xargs -I+ sh -c '[[ $PPID -ne + ]]&&echo -e "\n[+]"&&tr -s "\000" " "<+/cmdline&&echo&&tr -s "\000\033" "\nE"<+/environ|sort'
    AskApache · 2010-10-22 02:34:33 14
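
    The same /proc information for a single process, without the loop (PID is a placeholder for the process id you are interested in; both files are NUL-separated, hence the tr):

    tr '\0' ' '  < /proc/PID/cmdline; echo          # the command line, NULs turned into spaces
    tr '\0' '\n' < /proc/PID/environ | sort         # its environment, one variable per line
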