Commands tagged sort (176)

  • sorts the files by integer megabytes, which should be enough to (interactively) find the space wasters. Now you can run dush for the full output, or dush -n 3 for only the 3 biggest files, and so on (see the expansion sketch below). It's always a good idea to have this line in your .profile or .bashrc.


    29
    alias dush="du -sm *|sort -n|tail"
    funky · 2010-03-26 10:18:57 16
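
    Since bash appends any arguments to the end of the expanded alias text, the extra flags land on the trailing tail. A sketch of the expansion, assuming the alias above is loaded:

    dush -n 3    # runs: du -sm * | sort -n | tail -n 3
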
  • This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc.


    23
    alias busy='my_file=$(find /usr/include -type f | sort -R | head -n 1); my_len=$(wc -l < "$my_file"); let "r = $RANDOM % $my_len" 2>/dev/null; vim +$r "$my_file"'
    busybee · 2010-03-09 21:48:41 31
  • awk is evil! (None needed here, though: this counts how many processes each user is running. An awk rendition is sketched below for comparison.)


    22
    ps hax -o user | sort | uniq -c
    buzzy · 2010-04-29 10:43:03 15
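
    For comparison, a minimal awk rendition of the same per-user count (a sketch; the output layout differs slightly from uniq -c):

    ps hax -o user | awk '{count[$1]++} END {for (u in count) print count[u], u}' | sort -n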

  • 16
    find . -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort
    sammcj · 2011-08-14 23:34:10 39
  • Here is a command line to run on your server if you think your server is under attack. It prints out a list of open connections to your server and sorts them by count. BSD version: netstat -na |awk '{print $5}' |cut -d "." -f1,2,3,4 |sort |uniq -c |sort -nr (A Linux ss variant is sketched below.)


    14
    netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
    tiagofischer · 2009-03-28 21:02:26 83
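
    On newer Linux systems where netstat is deprecated, something like this ss pipeline may work; the field number is an assumption, since column layout varies across ss versions, so verify it against your own output first:

    ss -ntu | awk 'NR>1 {print $6}' | cut -d: -f1 | sort | uniq -c | sort -n
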
  • This can be much faster than downloading one or both trees to a common server and comparing the files there. Afterwards, only the differing files need be copied down for deeper comparison if needed.


    14
    diff <(ssh server01 'cd config; find . -type f -exec md5sum {} \;| sort -k 2') <(ssh server02 'cd config;find . -type f -exec md5sum {} \;| sort -k 2')
    arcege · 2009-09-11 15:24:59 17
  • In this case I'm just grabbing the next level of subdirectories (and same-level regular files) with the --max-depth=1 flag; leaving out that flag just gives you finer resolution. Note that you have to use the -h switch with both 'du' and 'sort'.


    14
    du -h --max-depth=1 |sort -rh
    jambino · 2011-11-15 20:30:00 13
  • Very useful when you need disk space. It calculates the disk usage of all files and dirs (recursing into them) located in the current directory, including hidden ones. Then sort puts them in order. (A bash dotglob alternative is sketched below.)


    11
    du -cs * .[^\.]* | sort -n
    cemsbr · 2009-03-02 18:43:48 18
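
    A sketch of an alternative, assuming bash: the dotglob shell option makes * match hidden entries too, avoiding the .[^\.]* glob dance:

    shopt -s dotglob; du -cs * | sort -n
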
  • dpigs shows which installed packages hog the most disk space; it is in the package debian-goodies (Debian/Ubuntu). (A plain dpkg-query equivalent is sketched below.)


    11
    dpigs
    xrm0 · 2009-10-26 20:38:17 18
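
    Where debian-goodies isn't available, a rough equivalent with plain dpkg-query (a sketch; ${Installed-Size} is in KiB):

    dpkg-query -W --showformat='${Installed-Size}\t${Package}\n' | sort -rn | head
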
  • Search for files and list the 20 largest. find . -type f gives us a list of files, recursively, starting from here (.). -print0 | xargs -0 du -h separates the names with NUL characters so we're not confused by spaces, then xargs runs du to get each file's size (in human-readable form -- 64M, not 64123456). | sort -hr arranges the list in size order; sort -h knows that 1M is bigger than 9K. | head -20 finally keeps only the top twenty.


    11
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -20
    flatcap · 2012-03-30 10:21:12 10
  • This command will find the biggest files recursively under a certain directory, no matter how many there are. If you try the usual approaches ("find -type f -exec ls -laSr {} +" or "find -type f -print0 | xargs -0 ls -laSr") the sorting won't be correct, because the command-line argument limit splits the files across several ls invocations, each sorted separately. This command sorts the list in the pipeline instead of relying on ls's arguments, so the sorted list displays correctly.


    10
    find . -type f -printf '%20s %p\n' | sort -n | cut -b22- | tr '\n' '\000' | xargs -0 ls -laSr
    fsilveira · 2009-08-13 13:13:33 14
  • Just refining the previous proposal for this check, showing awk's power to do more complex math (2^20 instead of /1024/1024). We don't need to declare a variable before running lsof, because $(command) returns its output. Also, awk can filter by regexp instead of calling grep. I replaced the messy 0.0000xxxx output with a more readable form, dropping the fractional part and any files under 1 MB.


    10
    lsof -p $(pidof firefox) | awk '/.mozilla/ { s = int($7/(2^20)); if(s>0) print (s)" MB -- "$9 | "sort -rn" }'
    tzk · 2010-01-13 22:45:53 9
  • This command counts the number of times someone has tried to log in to your server and failed, per user. If there are a lot for one user, that account is being targeted on your system and you might want to make sure it either has remote logins disabled, or has a strong password, or both. If your output has an "invalid" line, it is a summary of all logins from users that don't exist on your system. (A journalctl variant is sketched below.)


    9
    zgrep "Failed password" /var/log/auth.log* | awk '{print $9}' | sort | uniq -c | sort -nr | less
    dbart · 2009-03-03 13:45:56 14
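
    On systems that log auth messages only to the systemd journal, something along these lines may work (the _COMM match and the $9 field position are assumptions to check against your journal's short output format):

    journalctl _COMM=sshd | awk '/Failed password/ {print $9}' | sort | uniq -c | sort -nr | less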

  • 9
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -10
    netaxiz · 2012-06-30 10:03:31 5

  • 8
    du -k | sort -r -n | more
    puddleglum · 2009-03-18 18:14:19 8

  • 8
    du -cks * | sort -rn | while read size fname; do for unit in k M G T P E Z Y; do if [ $size -lt 1024 ]; then echo -e "${size}${unit}\t${fname}"; break; fi; size=$((size/1024)); done; done
    askedrelic · 2010-04-05 17:09:14 6
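
    With modern GNU coreutils (8.21 or later), numfmt can do the unit conversion instead of the shell loop; a sketch (du -k reports KiB, hence --from-unit=1024):

    du -cks * | sort -rn | numfmt --field=1 --from-unit=1024 --to=iec
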
  • I love this function because it tells me everything I want to know about files -- more than stat, more than ls. It's very useful and infinitely expandable. The key is the -printf format:

    -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'

    which, believe it or not, took hundreds of tweaks before I was happy with the output. A sample line:

    00761 drwxrw---x askapache:askapache 777:666 [06/10/10 | 06/10/10 | 06/10/10] [d] /web/cg/tmp

    You can easily use it within a function to do whatever you want. This simple function works recursively if you call it with -r as an argument, and sorts by file permissions:

    lsl(){ O="-maxdepth 1"; sed -n '/-r/!Q1' <<< $@ && O=; find $PWD $O -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%; }

    Personally I'm using this one, which takes a -sort=N argument:

    lll () { local a KS="1 -r -g"; sed -n '/-sort=/!Q1' <<< $@ && KS=`sed 's/.*-sort=\(.*\)/\1/g' <<< $@`; find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -k$KS -bS 50%; }

    lll -sort=3      # sort by user
    lll -sort=4 -r   # sort by group, reversed
    lll -sort=6      # sort by modification time

    If anyone wants to help me make this function handle multiple dirs/files like ls, go for it and I would appreciate it -- something very minimal would be awesome, maybe: for a; do lll $a; done. Note this uses a recent GNU find built from source, easy to build from the gnu ftp tarball. Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    8
    find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'
    AskApache · 2010-06-10 22:03:08 9
  • Check which files are opened by Firefox, then sort by largest size (in MB). You can see all opened files by replacing the ".mozilla" grep pattern with "/". Useful if you'd like to debug and check which extensions or files are taking too much memory resources in Firefox.


    6
    FFPID=$(pidof firefox-bin) && lsof -p $FFPID | awk '{ if($7>0) print ($7/1024/1024)" MB -- "$9; }' | grep ".mozilla" | sort -rn
    josue · 2009-08-16 08:58:22 7

  • 6
    tr A-Z a-z | tr -cs a-z '\n' | sort | uniq -c
    putnamhill · 2010-10-19 22:49:13 8
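
    This is the classic word-frequency pipeline: lowercase everything, squeeze runs of non-letters into newlines, then count. A usage sketch (the input file name is hypothetical):

    tr A-Z a-z < corpus.txt | tr -cs a-z '\n' | sort | uniq -c | sort -rn | head
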
  • Sort lines within the vi editor. In this example, lines 33-61 are sorted asciibetically; likewise, :4,9 !sort sorts lines 4-9. (vim's built-in :sort is sketched below.)


    6
    :33,61 !sort
    greggster · 2011-05-06 06:10:05 8
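
    Vim 7 and later also have a built-in :sort, which avoids the external command; a sketch:

    :33,61sort    " same range, built-in
    :33,61sort!   " descending
    :%sort u      " whole file, dropping duplicate lines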

  • 6
    cut -d: -f1 /etc/passwd | sort
    dan · 2011-12-20 10:46:52 12
  • This will symlink all your mp3 files in the current directory into $ARTIST/$ALBUM/$NAME.mp3. Make sure not to use sudo, as some weird things can happen if the mp3 file doesn't have ID3 tags. (A variant that fixes the relative link targets is sketched below.)


    5
    for file in *.mp3;do mkdir -p "$(mp3info -p "%a/%l" "$file")" && ln -s "$file" "$(mp3info -p "%a/%l/%t.mp3" "$file")";done
    matthewbauer · 2009-08-05 17:04:34 10
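
    Note that the symlink target above is relative, so the links created inside $ARTIST/$ALBUM/ point at a file that isn't there. A sketch using absolute targets, assuming the mp3s sit in the current directory:

    for file in *.mp3; do dir=$(mp3info -p "%a/%l" "$file") && mkdir -p "$dir" && ln -s "$PWD/$file" "$dir/$(mp3info -p "%t" "$file").mp3"; done
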
  • Use this BASH trick to produce a literal TAB character ($'\t', ANSI-C quoting) and pass it as the argument to sort, join, cut and other commands which don't understand the \t notation: sort -t $'\t' ... join -t $'\t' ... cut -d $'\t' ...


    5
    sort -t $'\t' -k 2 input.txt
    postrational · 2010-07-11 12:58:51 5
  • Save some CPU, and some PIDs. :)


    5
    awk -F ':' '{print $1 | "sort";}' /etc/passwd
    atoponce · 2011-12-20 12:46:52 4
  • This uses the ability of find (at least the one from GNU findutils that is shipped with most Linux distros) to display change time as part of its output. No xargs needed. (A human-readable variant is sketched below.)


    5
    find -printf "%C@ %p\n"|sort
    oivvio · 2013-06-19 10:42:49 12
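
    %C@ prints seconds since the epoch, which sorts well but reads poorly. The strftime-style letters used with %T elsewhere on this page also work with %C (a sketch, GNU find assumed):

    find -printf '%CY-%Cm-%Cd %CT %p\n' | sort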