Commands tagged sort (176)

  • Goes through all files in the specified directory, uses `stat` to print the last-modification time of each, sorts numerically in reverse, uses cut to strip the epoch timestamp, and finally uses head to output only the 10 most recently modified files. Note that on a Mac `stat` won't work like this; you'll need to use either: find . -type f -print0 | xargs -0 stat -f '%m%t%Sm %12z %N' | sort -nr | cut -f2- | head or, alternatively, `brew install coreutils` and then replace `stat` with `gstat` in the original command.


    5
    find . -type f -print0 | xargs -0 stat -c'%Y :%y %12s %n' | sort -nr | cut -d: -f2- | head
    HerbCSO · 2013-08-03 09:53:46 13
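
    For macOS users who go the coreutils route, the gstat variant mentioned above would presumably look like this (untested sketch, assuming Homebrew's g-prefixed binaries):

        find . -type f -print0 | xargs -0 gstat -c'%Y :%y %12s %n' | sort -nr | cut -d: -f2- | head
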
  • This command lists files sorted by size, in reverse order; the reverse order is very helpful when you have a very long list and want the biggest files at the bottom, so you don't have to scroll up. The file size info is in human-readable form, e.g. 1K, 234M, 3G. Tested on Linux (Red Hat Enterprise Linux).


    4
    ls -S -lhr
    rez0r · 2009-04-28 01:28:57 5
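
    If the listing is very long, one trivial variation is to keep only the last few lines, i.e. the largest files:

        ls -S -lhr | tail -n 20
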
  • Find the largest files in /var.


    4
    find /var -mount -ls -xdev | /usr/bin/sort -nr +6 | more
    mnikhil · 2009-05-16 10:53:55 6
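
    Note that `+6` is the obsolete sort key syntax; with a modern GNU sort the equivalent would presumably be `-k7,7` (field 7 of `find -ls` output is the file size in bytes):

        find /var -mount -xdev -ls | sort -nr -k7,7 | more
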
  • This prepends a random number as the first field of every line in SOMEFILE, then sorts by that first column, and finally cuts off the random numbers.


    4
    awk 'BEGIN{srand()}{print rand(),$0}' SOMEFILE | sort -n | cut -d ' ' -f2-
    axelabs · 2009-05-29 01:20:50 11
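
    On systems with GNU coreutils, `shuf` should do the same shuffle in one step:

        shuf SOMEFILE
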
  • A somewhat faster version to see the size of our directories. Sizes will be in kilobytes. To view the smallest first, change '-k1nr' to '-k1n'.


    4
    find . -depth -type d -exec du -s {} \; | sort -k1nr
    mohan43u · 2009-06-23 20:52:35 11
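
    A human-readable variant, assuming a GNU sort new enough (coreutils 7.5+) to understand `-h`:

        find . -depth -type d -exec du -sh {} \; | sort -hr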

  • 4
    tail -n2000 /var/www/domains/*/*/logs/access_log | awk '{print $1}' | sort | uniq -c | sort -n | awk '{ if ($1 > 20)print $1,$2}'
    allrightname · 2010-05-10 19:08:37 3
  • Works in sort (GNU coreutils) 7.4; I don't know exactly when it was implemented, but sometime in the last six years.


    4
    sort -R SOMEFILE
    miniker84 · 2010-09-16 22:29:27 7
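
    A common use is grabbing a handful of random lines rather than shuffling the whole file; note that `sort -R` orders lines by a hash of their keys, so duplicate lines end up adjacent (use `shuf` if that matters):

        sort -R SOMEFILE | head -n 10
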
  • Notes: 1) -n-1 means the sort key is the last field. 2) -l is important if each separate record is on its own line (usually the case for text files). 3) -j tells msort not to create a log file (msort.log) in the working directory. 4) You may need to install the msort package. 5) msort does a lot more; check man msort.


    4
    file /bin/* | msort -j -l -n-1 -n2 2> /dev/null
    b_t · 2010-10-05 00:37:33 4
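
    On Debian-based systems the msort package should be installable in the usual way, e.g.:

        sudo apt-get install msort
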
  • This sorts files in multiple directories by their modification date. Note that sorting is done at the end using "sort", instead of using the "-ltr" options to "ls". This ensures correct results when sorting a large number of files, in which case "find" will call "ls" multiple times.


    4
    find . -type f -exec ls -l --full-time {} + | sort -k 6,7
    quadcore · 2012-08-03 22:22:51 14
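
    With GNU find, a variant that avoids ls entirely is to have find print the modification time itself; epoch timestamps sort correctly with a plain numeric sort:

        find . -type f -printf '%T@ %p\n' | sort -n | cut -d' ' -f2-
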
  • Creates one-letter folders in the current directory and moves each file into the folder matching its first letter.


    4
    for i in *; do I=`echo $i|cut -c 1|tr a-z A-Z`; if [ ! -d "$I" ]; then mkdir "$I"; fi; mv "$i" "$I"/"$i"; done
    z3bu · 2018-06-29 11:37:04 640
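
    With bash 4 or later, the cut/tr pair could be replaced by parameter expansion; a rough, untested sketch:

        for i in *; do I=${i:0:1}; I=${I^^}; mkdir -p "$I"; mv -- "$i" "$I"/; done
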
  • If you're like me and want to keep all your music rated, and you use xmms2, you might like this command. It takes 10 random songs from your xmms2 library that don't have any rating and adds them to your current playlist. You can then rate them in another xmms2 client that supports rating (I like kuechenstation). I'm pretty sure there's a better way to do the grep ... | sed ... part, probably with awk, but I don't know awk, so I'd welcome any suggestions.


    3
    xmms2 mlib search NOT +rating | grep -r '^[0-9]' | sed -r 's/^([0-9]+).*/\1/' | sort -R | head | xargs -L 1 xmms2 addid
    goodevilgenius · 2009-04-16 20:27:30 6
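
    Regarding the grep/sed question above, one possible awk replacement (untested against xmms2's actual output format) that extracts the leading id in the same way:

        xmms2 mlib search NOT +rating | awk 'match($0, /^[0-9]+/) { print substr($0, RSTART, RLENGTH) }' | sort -R | head | xargs -L 1 xmms2 addid
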
  • Using MB it's still readable. ;) A symbol variation: $ du -ms {,.[^.]}* | sort -nk1


    3
    du -ms * .[^.]*| sort -nk1
    ioggstream · 2009-07-01 13:38:13 8

  • 3
    find $MAILDIR/ -type f -printf '%T@ %p\n' | sort --reverse | sed -e '{ 1,100d; s/[0-9]*\.[0-9]* \(.*\)/\1/g }' | xargs -i sh -c "cat {}&&rm -f {}" | gzip -c >>ARCHIVE.gz
    maergil · 2009-08-11 20:12:15 3
  • "cut" the user names from /etc/passwd and then run a loop over them.


    3
    for u in `cut -f1 -d: /etc/passwd`; do echo -n $u:; groups $u; done | sort
    hemanth · 2009-08-22 09:06:02 7
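
    A variation that also picks up accounts from other NSS sources (LDAP, NIS and the like), assuming getent is available:

        for u in $(getent passwd | cut -f1 -d:); do echo -n "$u:"; groups $u; done | sort
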
  • Based on the MrMerry one; just adds some visuals to differentiate files and directories.


    3
    du -a --max-depth=1 | sort -n | cut -d/ -f2 | sed '$d' | while read i; do if [ -f $i ]; then du -h "$i"; else echo "$(du -h --max-depth=0 "$i")/"; fi; done
    nickwe · 2009-09-03 20:43:43 3
  • random(6) - random lines from a file or random numbers


    3
    random -f <file>
    haplo · 2009-09-24 19:15:58 10
  • This is easy to type if you are looking for a few (hundred) "missing" megabytes (and don't mind the occasional K slipping in)... A variation without false positives that also finds gigabytes (but, depending on your keyboard setup, more painful to type): du -hs *|grep -P '^(\d|,)+(M|G)'|sort -n (NOTE: you might want to replace the ',' according to your locale!) Don't forget that you can modify the globbing as needed (e.g. '.[^\.]* *' to include hidden files and directories, with bash). At its core similar to: http://www.commandlinefu.com/commands/view/706/show-sorted-list-of-files-with-sizes-more-than-1mb-in-the-current-dir


    3
    du -hs *|grep M|sort -n
    tuxlifan · 2010-03-25 19:20:24 4
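
    With a GNU sort new enough to have `-h` (coreutils 7.5+), the grep filter and its false positives can be dropped entirely, since the K/M/G suffixes sort correctly on their own:

        du -hs * | sort -h
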
  • Gives you a list of all installed Chrome (Chromium) extensions, with the URL of each extension's page. With this you can easily add a new bookmark folder called "extensions" and add every URL to that folder, so it will be synced and you can access the names from every computer you are logged in on. Only tested with Chromium; for Chrome you may have to change the path in the find command.


    3
    for i in $(find ~/.config/chromium/*/Extensions -name 'manifest.json'); do n=$(grep -hIr name $i| cut -f4 -d '"'| sort);u="https://chrome.google.com/extensions/detail/";ue=$(basename $(dirname $(dirname $i))); echo -e "$n:\n$u$ue\n" ; done
    new_user · 2010-05-18 15:16:36 6
  • Once you get into advanced/optimized scripts, functions, or CLI usage, you will use the sort command a lot. The options are difficult to master/memorize, however, and when you use sort commands as much as I do (some examples below), it's useful to have the help available with a simple alias. I love this alias as I never seem to remember all the options for sort, and I use sort like crazy (much better than uniq, for example).

    # Sorts by file permissions
    find . -maxdepth 1 -printf '%.5m %10M %p\n' | sort -k1 -r -g -bS 20%
    00761 drwxrw---x ./tmp
    00755 drwxr-xr-x .
    00701 drwx-----x ./askapache-m
    00644 -rw-r--r-- ./.htaccess

    # Shows uniq history fast
    history 1000 | sed 's/^[0-9 ]*//' | sort -fubdS 50%
    exec bash -lxv
    export TERM=putty-256color

    Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    3
    alias sorth='sort --help|sed -n "/^ *-[^-]/s/^ *\(-[^ ]* -[^ ]*\) *\(.*\)/\1:\2/p"|column -ts":"'
    AskApache · 2010-06-10 21:30:31 9
  • sort is way slow by default. This tells sort to use a buffer equal to half of the available free memory. It will also use multiple processes for the sort, equal to the number of CPUs on your machine (if greater than 1). For me, it is magnitudes faster. If you put this in your bash_profile or startup file, it will be set correctly when bash is started.

    sort -S1 --parallel=2 <(echo) &>/dev/null && alias sortfast='sort -S$(($(sed '\''/MemF/!d;s/[^0-9]*//g'\'' /proc/meminfo)/2048)) $([ `nproc` -gt 1 ]&&echo -n --parallel=`nproc`)'

    Alternative:

    echo|sort -S10M --parallel=2 &>/dev/null && alias sortfast="command sort -S$(($(sed '/MemT/!d;s/[^0-9]*//g' /proc/meminfo)/1024-200)) --parallel=$(($(command grep -c ^proc /proc/cpuinfo)*2))"


    3
    alias sortfast='sort -S$(($(sed '\''/MemF/!d;s/[^0-9]*//g'\'' /proc/meminfo)/2048)) $([ `nproc` -gt 1 ]&&echo -n --parallel=`nproc`)'
    AskApache · 2012-02-28 01:34:58 6

  • 3
    groups $(cut -f1 -d":" /etc/passwd) | sort
    tpaisndbgps · 2013-04-27 07:12:22 15

  • 3
    dpkg-query -W --showformat='${Installed-Size}\t${Package}\n' | sort -nr | less
    dfear · 2014-01-06 01:11:36 10
  • I'm sure there's a more elegant sed version for the tr + grep section.


    3
    ls | tr '[[:punct:][:space:]]' '\n' | grep -v "^\s*$" | sort | uniq -c | sort -bn
    qdrizh · 2014-10-14 09:52:28 9
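
    In the spirit of the remark above, the tr + grep pair could probably be collapsed into a single grep that extracts runs of non-punctuation, non-whitespace characters:

        ls | grep -oE '[^[:punct:][:space:]]+' | sort | uniq -c | sort -bn
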
  • This set of commands was very convenient for me when I was preparing some XML files for typesetting a book. I wanted to check what styles I had to prepare but couldn't remember all the tags that I used. This one saved me from error-prone browsing of all my files. It should also be useful if one tries to process XML files with XSL, when using one's own XML application.


    2
    grep -h -o '<[^/!?][^ >]*' * | sort -u | cut -c2-
    thebodzio · 2009-06-17 00:22:18 10
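
    A quick illustration with hypothetical input: if one of the files contained <book id="1"><title lang="en">Foo</title></book>, the command above would print just the element names:

        book
        title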

  • 2
    du -ms * | sort -nk1
    Tekhne · 2009-07-08 22:11:50 25
