Commands tagged sort (165)

  • Another one. Maybe not the quickest because of the sort command, but it will also look in other man sections. Updated with goodevilgenius's 'shuf' idea (the earlier sort-based form is sketched after this entry).


    -2
    man $(ls -1 /usr/share/man/man?/ | shuf -n1 | cut -d. -f1)
    dooblem · 2010-08-20 23:36:10 0
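    The earlier, slower variant alluded to above ("because of the sort command") was presumably along these lines; a sketch only, assuming GNU sort's -R (--random-sort) and restricting itself to section 1:

        man $(ls /usr/share/man/man1/ | sort -R | head -n 1 | cut -d. -f1)
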
  • All with only one pipe. Should be much faster as well (sort is slow). Use find instead of ls for recursion or reliability (a find-based variant is sketched after this entry). Edit: now case-insensitive.


    -3
    ls | perl -lne '++$x{lc $1} if /[.](.+)$/ }{ print for keys %x'
    recursiverse · 2010-08-13 20:05:15 1
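    A find-based variant along the lines the note suggests; a sketch only, with the regexp tightened so dots in directory names don't pollute the result:

        find . -type f | perl -lne '++$x{lc $1} if /\.([^.\/]+)$/ }{ print for keys %x'
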
  • If your grep doesn't have an -o option, you can use sed instead.


    1
    find /path/to/dir -type f -name '*.*' | sed 's@.*/.*\.@.@' | sort | uniq
    putnamhill · 2010-08-12 15:48:54 0
  • Just a little simplification.


    1
    find /path/to/dir -type f | grep -o '\.[^./]*$' | sort | uniq
    dooblem · 2010-08-12 14:32:48 1
  • Essentially the same as funky's alias, but will not traverse filesystems and has nicer formatting.


    -1
    alias dush="du -xsm * | sort -n | awk '{ printf(\"%4s MB ./\",\$1) ; for (i=1;i<=NF;i++) { if (i>1) printf(\"%s \",\$i) } ; printf(\"\n\") }' | tail"
    dopeman · 2010-07-15 10:38:27 1
  • Use this Bash trick to create a variable containing the TAB character and pass it as the argument to sort, join, cut and other commands which don't understand the \t notation (the variable form is sketched after this entry):

        sort -t $'\t' ...
        join -t $'\t' ...
        cut -d $'\t' ...


    5
    sort -t $'\t' -k 2 input.txt
    postrational · 2010-07-11 12:58:51 0
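    The variable form the description mentions; a sketch, where input.txt, a.txt and b.txt are placeholder file names:

        TAB=$'\t'
        sort -t "$TAB" -k 2 input.txt
        join -t "$TAB" a.txt b.txt
        cut -d "$TAB" -f 2 input.txt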

  • -3
    ps -axgu | cut -f1 -d' ' | sort -u
    dfaulkner · 2010-07-07 12:29:46 0
  • Shows a list of users that currently running processes are executing as. YMMV regarding ps and its many variants. For example, you might need:

        ps -axgu | cut -f1 -d' ' | sort -u


    2
    ps -eo user | sort -u
    dfaulkner · 2010-07-07 12:28:44 0

  • -2
    cut -d: -f1 /etc/passwd | sort
    dog · 2010-07-07 12:12:02 2
  • Most systems (at least my MacBook) have system users defined, such as _www, and using "users", for example, will not list them. This command allows you to see who the 'virtual' users are on your system (an awk-only variant is sketched after this entry).


    -4
    sudo lsof|sed 's/ */ /g'|cut -f3 -d' '|sort -u
    binaryten · 2010-07-07 08:20:28 4
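    The sed|cut pair can be collapsed into a single awk; a sketch, assuming USER is the third column of your lsof output, as in the command above:

        sudo lsof | awk 'NR > 1 {print $3}' | sort -u
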
  • I love this function because it tells me everything I want to know about files, more than stat, more than ls. It's very useful and infinitely expandable.

        find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%
        00761 drwxrw---x askapache:askapache 777:666 [06/10/10 | 06/10/10 | 06/10/10] [d] /web/cg/tmp

    The key is the format string:

        -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'

    which, believe it or not, took hundreds of tweaks before I was happy with the output. You can easily use it within a function to do whatever you want. This simple function works recursively if you call it with -r as an argument, and sorts by file permissions:

        lsl(){ O="-maxdepth 1";sed -n '/-r/!Q1'<<<$@ &&O=;find $PWD $O -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'|sort -rgbS 50%; }

    Personally I'm using this function:

        lll () { local a KS="1 -r -g"; sed -n '/-sort=/!Q1' <<< $@ && KS=`sed 's/.*-sort=\(.*\)/\1/g'<<<$@`; find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'|sort -k$KS -bS 50%; }

    so I can sort by user:

        lll -sort=3

    or sort by group, reversed:

        lll -sort=4 -r

    and sort by modification time:

        lll -sort=6

    If anyone wants to help me make this function handle multiple dirs/files like ls, go for it and I would appreciate it. Something very minimal would be awesome, maybe like: for a; do lll $a; done. Note this uses the latest version of GNU find built from source, which is easy to build from the GNU FTP tarball. Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    7
    find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'
    AskApache · 2010-06-10 22:03:08 4
  • Once you get into advanced/optimized scripts, functions, or CLI usage, you will use the sort command a lot. The options are difficult to master/memorize, however, and when you use sort commands as much as I do (some examples below), it's useful to have the help available with a simple alias. I love this alias as I never seem to remember all the options for sort, and I use sort like crazy (much better than uniq, for example).

        # Sorts by file permissions
        find . -maxdepth 1 -printf '%.5m %10M %p\n' | sort -k1 -r -g -bS 20%
        00761 drwxrw---x ./tmp
        00755 drwxr-xr-x .
        00701 drwx-----x ./askapache-m
        00644 -rw-r--r-- ./.htaccess

        # Shows uniq'd history fast
        history 1000 | sed 's/^[0-9 ]*//' | sort -fubdS 50%
        exec bash -lxv
        export TERM=putty-256color

    Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    3
    alias sorth='sort --help|sed -n "/^ *-[^-]/s/^ *\(-[^ ]* -[^ ]*\) *\(.*\)/\1:\2/p"|column -ts":"'
    AskApache · 2010-06-10 21:30:31 0
  • Works with files containing spaces and for very large directories (a stat-free GNU find variant is sketched after this entry).


    2
    find -type f -print0 | xargs -r0 stat -c %y\ %n | sort
    dooblem · 2010-05-29 13:40:18 0
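    With GNU find the stat calls can be skipped entirely; a sketch using -printf, with timestamps printed as seconds since the epoch rather than stat's formatted %y:

        find . -type f -printf '%T@ %p\n' | sort -n
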
  • Here's a version that doesn't use find.


    -2
    ls -rl --time-style=+%s * | sed '/^$/,/^total [0-9]*$/d' | sort -nk6
    putnamhill · 2010-05-27 19:14:12 1
  • use Linux ;)


    1
    pgrep -cu ioggstream
    ioggstream · 2010-05-21 10:53:57 0
  • This provides a way to sort output based on the length of the line, so that shorter lines appear before longer lines. It's an add-on to sort that I've wanted for years; sometimes it's very useful (a usage example follows this entry). Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    2
    sortwc () { local L;while read -r L;do builtin printf "${#L}@%s\n" "$L";done|sort -n|sed -u 's/^[^@]*@//'; }
    AskApache · 2010-05-20 20:13:52 1
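    Usage is as a plain filter; for example, with made-up input, this prints "short", then "mid sized", then "a much longer line":

        printf 'a much longer line\nshort\nmid sized\n' | sortwc
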
  • I've wanted this for a long time; I finally just sat down and came up with it. This shows you the sorted output of ps in a pretty format, perfect for cron or startup scripts. You can change the sort by replacing k -vsz with, for example, k -pmem to sort by memory instead. If you want a function, here's one from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html (usage examples follow this entry):

        aa_top_ps(){ local T N=${1:-10};T=${2:-vsz}; ps wwo pid,user,group,vsize:8,size:8,sz:6,rss:6,pmem:7,pcpu:7,time:7,wchan,sched=,stat,flags,comm,args k -${T} -A|sed -u "/^ *PID/d;${N}q"; }


    2
    command ps wwo pid,user,group,vsize:8,size:8,sz:6,rss:6,pmem:7,pcpu:7,time:7,wchan,sched=,stat,flags,comm,args k -vsz -A|sed -u '/^ *PID/d;10q'
    AskApache · 2010-05-18 18:41:38 1
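    Usage of the aa_top_ps function above: the first argument is the number of lines, the second the ps sort key (a sketch):

        aa_top_ps              # defaults: about 10 lines, sorted by vsz
        aa_top_ps 5 pmem       # 5 lines, sorted by memory use
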
  • Gives you a list of all installed Chrome (Chromium) extensions with the URL to each extension's page. With this you can easily add a new bookmark folder called "extensions" and add every URL to it, so it is synced and you can access the names from every computer you are logged in to. Only tested with Chromium; for Chrome you may have to change the find path (a sketch follows this entry).


    2
    for i in $(find ~/.config/chromium/*/Extensions -name 'manifest.json'); do n=$(grep -hIr name $i| cut -f4 -d '"'| sort);u="https://chrome.google.com/extensions/detail/";ue=$(basename $(dirname $(dirname $i))); echo -e "$n:\n$u$ue\n" ; done
    new_user · 2010-05-18 15:16:36 1
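    For Google Chrome the profile directory is usually ~/.config/google-chrome rather than ~/.config/chromium (an assumption here; only the find path in the command above changes):

        find ~/.config/google-chrome/*/Extensions -name 'manifest.json'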

  • 4
    tail -n2000 /var/www/domains/*/*/logs/access_log | awk '{print $1}' | sort | uniq -c | sort -n | awk '{ if ($1 > 20)print $1,$2}'
    allrightname · 2010-05-10 19:08:37 0
  • Counts TCP states from netstat output and displays them in an ordered list (a count-sorted variant follows this entry).


    1
    netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c
    Kered557 · 2010-05-06 17:04:37 1
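    To order the states by how common they are rather than alphabetically, append another sort; a sketch:

        netstat -an | awk '/tcp/ {print $6}' | sort | uniq -c | sort -rn
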
  • awk is evil!


    21
    ps hax -o user | sort | uniq -c
    buzzy · 2010-04-29 10:43:03 4

  • -3
    ls -lS
    javamaniac · 2010-04-08 14:37:46 0

  • 8
    du -cks * | sort -rn | while read size fname; do for unit in k M G T P E Z Y; do if [ $size -lt 1024 ]; then echo -e "${size}${unit}\t${fname}"; break; fi; size=$((size/1024)); done; done
    askedrelic · 2010-04-05 17:09:14 0
  • Thanks for the submit! My alternative produces summaries only for directories; the original post additionally lists all files in the current directory. Sometimes the files just clutter up the output. Once the big directory is located, *then* worry about which file(s) are consuming so much space (a GNU du equivalent is sketched after this entry).


    -1
    du -kd | egrep -v "/.*/" | sort -n
    rmbjr60 · 2010-03-30 15:40:35 0
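    GNU du needs an explicit depth with -d, so a roughly equivalent GNU invocation (a sketch; the depth limit makes the egrep filter unnecessary) is:

        du -k -d 1 | sort -n
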
  • Sorts the files by integer megabytes, which should be enough to (interactively) find the space wasters. Now you can run dush for the above output, dush -n 3 for only the 3 biggest files, and so on (expansion examples follow this entry). It's always a good idea to have this line in your .profile or .bashrc.


    29
    alias dush="du -sm *|sort -n|tail"
    funky · 2010-03-26 10:18:57 1
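    Because Bash appends alias arguments to the end of the expansion, extra arguments land on the tail command; for example:

        dush           # du -sm *|sort -n|tail          (ten biggest entries)
        dush -n 3      # du -sm *|sort -n|tail -n 3     (three biggest entries)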