Commands tagged sort (176)

  • Sorts the files by integer megabytes, which should be enough to (interactively) find the space wasters. Now you can run dush for the above output, dush -n 3 for only the 3 biggest files, and so on. It's always a good idea to have this line in your .profile or .bashrc.


    29
    alias dush="du -sm *|sort -n|tail"
    funky · 2010-03-26 10:18:57 16
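
    Since the alias body ends in tail, anything you pass to dush is handed to that final tail:
    dush          # the 10 biggest entries in the current directory, sizes in integer MB
    dush -n 3     # the args reach tail -n 3: only the 3 biggest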
  • This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc.


    23
    alias busy='my_file=$(find /usr/include -type f | sort -R | head -n 1); my_len=$(wc -l < "$my_file"); let "r = $RANDOM % $my_len" 2>/dev/null; vim +$r "$my_file"'
    busybee · 2010-03-09 21:48:41 26
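
    (The original used my_len=$(wc -l $my_file | awk "{print $1}"); inside the double quotes the shell expands $1 before awk ever sees it, so the line count came back with the filename attached and the let expression failed silently. Redirecting into wc -l avoids the quoting problem. A typical hook for the aliases file, matching the stock Debian/Ubuntu skeleton .bashrc:
    [ -f ~/.bash_aliases ] && . ~/.bash_aliases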
  • awk is evil! (This counts processes per user with no awk in sight.)


    22
    ps hax -o user | sort | uniq -c
    buzzy · 2010-04-29 10:43:03 14
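
    For comparison, a sketch of the awk version the description is teasing, assuming the same one-username-per-line output of ps hax -o user:
    ps hax -o user | awk '{count[$1]++} END {for (u in count) print count[u], u}'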

  • 16
    find . -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort
    sammcj · 2011-08-14 23:34:10 39
  • Here is a command line to run on your server if you think it is under attack. It prints out a list of open connections to your server and sorts them by count. BSD version: netstat -na |awk '{print $5}' |cut -d "." -f1,2,3,4 |sort |uniq -c |sort -nr


    14
    netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
    tiagofischer · 2009-03-28 21:02:26 80
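
    On modern Linux boxes netstat is often absent; a rough equivalent with ss (column position assumed from ss -ntu output, and like the original it mishandles the colons in IPv6 addresses):
    ss -ntu | awk 'NR>1 {print $6}' | cut -d: -f1 | sort | uniq -c | sort -n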
  • This can be much faster than downloading one or both trees to a common server and comparing the files there. Afterwards, only the differing files need be copied down for deeper comparison.


    14
    diff <(ssh server01 'cd config; find . -type f -exec md5sum {} \;| sort -k 2') <(ssh server02 'cd config;find . -type f -exec md5sum {} \;| sort -k 2')
    arcege · 2009-09-11 15:24:59 17
  • In this case I'm just grabbing the next level of subdirectories (and same-level regular files) with the --max-depth=1 flag. Leaving out that flag will just give you finer resolution. Note that you have to use the -h switch with both 'du' and 'sort'.


    14
    du -h --max-depth=1 |sort -rh
    jambino · 2011-11-15 20:30:00 13
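
    Raising or dropping the depth limit gives that finer resolution, for example:
    du -h --max-depth=2 | sort -rh | head -20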
  • Very useful when you need disk space. It calculates the disk usage of all files and dirs (descending into them) located in the current directory, including hidden ones. Then sort puts them in order.


    11
    du -cs * .[^\.]* | sort -n
    cemsbr · 2009-03-02 18:43:48 17
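
    The .[^\.]* glob is there to catch the hidden entries; a sketch of a bash alternative using dotglob, which makes the plain * match dotfiles too:
    shopt -s dotglob; du -cs * | sort -n; shopt -u dotglob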
  • dpigs is in the debian-goodies package (Debian/Ubuntu); it lists the installed packages using the most disk space.


    11
    dpigs
    xrm0 · 2009-10-26 20:38:17 18
  • Search for files and list the 20 largest.
    find . -type f: gives us a list of files, recursively, starting from here (.)
    -print0 | xargs -0 du -h: separates the names of files with NUL characters so we're not confused by spaces, then xargs runs du to find each size (in human-readable form: 64M, not 64123456)
    | sort -hr: arranges the list in size order; sort -h knows that 1M is bigger than 9K
    | head -20: finally, selects only the top twenty


    11
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -20
    flatcap · 2012-03-30 10:21:12 10
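
    With GNU du, a shorter variant covers much the same ground, though it lists directories as well as plain files:
    du -ah . | sort -hr | head -20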
  • This command will find the biggest files recursively under a certain directory, no matter how many there are. If you try the regular approaches ("find -type f -exec ls -laSr {} +" or "find -type f -print0 | xargs -0 ls -laSr") the sorting won't be correct because of the command-line argument limit: ls is invoked in batches, and each batch is sorted separately. This command doesn't rely on ls for the sorting, so it displays the sorted list correctly.


    10
    find . -type f -printf '%20s %p\n' | sort -n | cut -b22- | tr '\n' '\000' | xargs -0 ls -laSr
    fsilveira · 2009-08-13 13:13:33 14
  • Just refining the last proposal for this check, showing awk's power to do more complex math (2^20 instead of /1024/1024). We don't need to declare a variable before running lsof, because $(command) returns its output. Also, awk can filter by regexp instead of calling grep. I changed the messy 0.0000xxxx output to a more readable form, dropping the fractional part and any file under 1 MB.


    10
    lsof -p $(pidof firefox) | awk '/.mozilla/ { s = int($7/(2^20)); if(s>0) print (s)" MB -- "$9 | "sort -rn" }'
    tzk · 2010-01-13 22:45:53 9
  • This command checks how many times someone has tried and failed to log in to your server. If a user shows up a lot, that account is being targeted on your system, and you might want to make sure it either has remote logins disabled, or has a strong password, or both. If your output has an "invalid" line, it is a summary of all login attempts for users that don't exist on your system.


    9
    zgrep "Failed password" /var/log/auth.log* | awk '{print $9}' | sort | uniq -c | sort -nr | less
    dbart · 2009-03-03 13:45:56 13
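
    A variant counting by source IP instead of username; the field position assumes the usual OpenSSH log format "Failed password for USER from IP port N ssh2":
    zgrep "Failed password" /var/log/auth.log* | awk '{print $(NF-3)}' | sort | uniq -c | sort -nr | less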

  • 9
    find . -type f -print0 | xargs -0 du -h | sort -hr | head -10
    netaxiz · 2012-06-30 10:03:31 5

  • 8
    du -k | sort -r -n | more
    puddleglum · 2009-03-18 18:14:19 8

  • 8
    du -cks * | sort -rn | while read size fname; do for unit in k M G T P E Z Y; do if [ $size -lt 1024 ]; then echo -e "${size}${unit}\t${fname}"; break; fi; size=$((size/1024)); done; done
    askedrelic · 2010-04-05 17:09:14 6
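
    The loop just divides each du figure by 1024 until it fits under the next unit: e.g. 5300 (KB) prints as 5M, while 300 prints straight away as 300k.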
  • I love this function because it tells me everything I want to know about files: more than stat, more than ls. It's very useful and infinitely expandable.
    find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%
    Sample output line: 00761 drwxrw---x askapache:askapache 777:666 [06/10/10 | 06/10/10 | 06/10/10] [d] /web/cg/tmp
    The key is the format string, -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n', which, believe it or not, took hundreds of tweaks before I was happy with the output. You can easily use it within a function to do whatever you want.
    This simple function works recursively if you call it with -r as an argument, and sorts by file permissions:
    lsl(){ O="-maxdepth 1"; sed -n '/-r/!Q1' <<<$@ && O=; find $PWD $O -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%; }
    Personally I'm using this function, which takes a -sort= argument:
    lll(){ local a KS="1 -r -g"; sed -n '/-sort=/!Q1' <<< $@ && KS=`sed 's/.*-sort=\(.*\)/\1/g' <<<$@`; find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -k$KS -bS 50%; }
    lll -sort=3      # sort by user
    lll -sort=4 -r   # sort by group, reversed
    lll -sort=6      # sort by modification time
    If anyone wants to help me make this function handle multiple dirs/files like ls, go for it and I would appreciate it. Something very minimal would be awesome, maybe like: for a; do lll $a; done
    Note this uses the latest version of GNU find built from source, easy to build from the GNU FTP tarball. Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html


    8
    find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'
    AskApache · 2010-06-10 22:03:08 9
  • Check which files are opened by Firefox, then sort by largest size (in MB). You can see all open files by replacing the grep pattern with "/". Useful if you'd like to debug and check which extensions or files are taking too much memory in Firefox.


    6
    FFPID=$(pidof firefox-bin) && lsof -p $FFPID | awk '{ if($7>0) print ($7/1024/1024)" MB -- "$9; }' | grep ".mozilla" | sort -rn
    josue · 2009-08-16 08:58:22 7

  • 6
    tr A-Z a-z | tr -cs a-z '\n' | sort | uniq -c
    putnamhill · 2010-10-19 22:49:13 8
  • Sort lines within the vi editor. In this example, lines 33-61 are sorted asciibetically; the same form works for any range (e.g. :4,9 for lines 4-9).


    6
    :33,61 !sort
    greggster · 2011-05-06 06:10:05 8
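
    The same mechanism pipes any range through an external command; a few sketches:
    :4,9 !sort          (sort lines 4-9)
    :'<,'> !sort -n     (sort the current visual selection numerically)
    :% !sort -u         (sort the whole buffer, dropping duplicate lines)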

  • 6
    cut -d: -f1 /etc/passwd | sort
    dan · 2011-12-20 10:46:52 12
  • This will organize all the mp3 files in the current directory into $ARTIST/$ALBUM/$NAME.mp3 (the command actually symlinks them rather than moving them). Make sure not to use sudo, as some weird things can happen if the mp3 file doesn't have id3 tags.


    5
    for file in *.mp3;do mkdir -p "$(mp3info -p "%a/%l" "$file")" && ln -s "$PWD/$file" "$(mp3info -p "%a/%l/%t.mp3" "$file")";done
    matthewbauer · 2009-08-05 17:04:34 9
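
    (The original linked to the bare "$file"; a relative symlink target placed inside $ARTIST/$ALBUM dangles, so the target is anchored with $PWD above.) mp3info must be installed; to preview the layout before creating anything, a sketch using the same format codes:
    for file in *.mp3; do mp3info -p "%a/%l/%t.mp3\n" "$file"; done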
  • Use this bash trick to create a variable containing the TAB character and pass it as the argument to sort, join, cut and other commands which don't understand the \t notation: sort -t $'\t' ... join -t $'\t' ... cut -d $'\t' ...


    5
    sort -t $'\t' -k 2 input.txt
    postrational · 2010-07-11 12:58:51 5
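
    The $'...' form is bash's ANSI-C quoting, so the same trick covers other escapes:
    IFS=$'\n'               # split fields only on newlines
    printf '%s\n' $'a\tb'   # a literal TAB inside a string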
  • Save some CPU, and some PIDs. :)


    5
    awk -F ':' '{print $1 | "sort";}' /etc/passwd
    atoponce · 2011-12-20 12:46:52 4
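
    awk starts "sort" once and streams every print into that one pipe; an explicit close flushes it before the script exits:
    awk -F: '{print $1 | "sort"} END {close("sort")}' /etc/passwd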
  • This uses the ability of find (at least the one from GNU findutils that is shipped with most Linux distros) to display change time as part of its output. No xargs needed.


    5
    find -printf "%C@ %p\n"|sort
    oivvio · 2013-06-19 10:42:49 12
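
    %C@ prints seconds since the epoch, which sorts cleanly but reads poorly; for a human-readable (and still sortable) change time:
    find -printf "%CY-%Cm-%Cd %CT %p\n" | sort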