Commands using sort (800)


  • 1
grep -R Subject /var/spool/exim/input/ | sed 's/^.*Subject: //' | sort | uniq -c | sort -n > ~/email_sort.$(date +%m.%d.%y).txt
    unixmonkey27702 · 2011-11-24 14:13:29 3

  • 1
    du --max-depth=1 | sort -nr | awk ' BEGIN { split("KB,MB,GB,TB", Units, ","); } { u = 1; while ($1 >= 1024) { $1 = $1 / 1024; u += 1 } $1 = sprintf("%.1f %s", $1, Units[u]); print $0; } '
    threv · 2011-12-08 17:43:09 4
  • This one-line Perl script displays file sizes from smallest to largest across all directories on a server (a shorter sort -h variant follows below).


    1
    du -k | sort -n | perl -ne 'if ( /^(\d+)\s+(.*$)/){$l=log($1+.1);$m=int($l/log(1024)); printf ("%6.1f\t%s\t%25s %s\n",($1/(2**(10*$m))),(("K","M","G","T","P")[$m]),"*"x (1.5*$l),$2);}' | more
    Q_Element · 2012-02-07 15:49:19 10
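    If your sort supports -h (GNU coreutils 7.5 or later), a much shorter rough equivalent is possible, though without the histogram bars:
    du -h | sort -h | less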
  • It grabs the PIDs of the top resource users with $(ps -eo pid,pmem,pcpu| sort -k 3 -r|grep -v PID|head -10). The sort -k 3 sorts by the third field, which is CPU; change this to 2 to sort by memory instead (a sketch of that variant follows this entry). The rest of the command just uses diff to display the output of two commands side by side (the -y flag). I chose some good ones for ps. pidstat comes with the sysstat package (sar, mpstat, iostat, pidstat), so if you don't have it, you should. I should probably take off the timestamp... :|


    1
    for i in $(ps -eo pid,pmem,pcpu| sort -k 3 -r|grep -v PID|head -10|awk '{print $1}');do diff -yw <(pidstat -p $i|grep -v Linux) <(ps -o euser,pri,psr,pmem,stat -p $i|tail);done
    bob_the_builder · 2012-02-16 20:54:32 23
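    To rank by memory instead, as described above, change the sort key (adding -n so the percentages sort numerically); a minimal sketch:
    for i in $(ps -eo pid,pmem,pcpu| sort -k 2 -nr|grep -v PID|head -10|awk '{print $1}');do diff -yw <(pidstat -p $i|grep -v Linux) <(ps -o euser,pri,psr,pmem,stat -p $i|tail);done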

  • 1
    sed -e 's/[;|][[:space:]]*/\n/g' .bash_history | cut --delimiter=' ' --fields=1 | sort | uniq --count | sort --numeric-sort --reverse | head --lines=20
    WissenForscher · 2012-02-17 23:34:16 5

  • 1
    find <directory> -type f -printf "%T@\t%p\n"|sort -n|cut -f2|xargs ls -lrt
    rik · 2012-03-02 12:51:06 3
  • This works in combination with http://www.commandlinefu.com/commands/view/10496/identify-exported-sonames-in-a-path, as it reports the NEEDED entries present in the files within a given path. You can then compare them with the libraries that are exported, to make sure that, when cross-building a firmware image, you're not bringing in dependencies from the build host (a comm-based sketch of that comparison follows below). The short version, as seen in the sample output, is scanelf -RBnq -F "+n#f" $1 | tr ',' '\n' | sort -u


    1
    scanelf --nobanner --recursive --quiet --needed --format "+n#F" $1 | tr ',' '\n' | sort -u
    Flameeyes · 2012-03-29 18:30:45 3
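    To do the comparison itself, comm can list NEEDED entries that have no locally exported soname; a sketch assuming needed.txt and exported.txt hold the sorted output of this command and of the companion command linked above:
    comm -23 needed.txt exported.txt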
  • from my bashrc ;)


    1
    find . -mount -type f -printf "%k %p\n" | sort -rg | cut -d \ -f 2- | xargs -I {} du -sh {} | less
    bashrc · 2012-03-30 07:37:52 3
  • Or tree -ifsF --noreport .|sort -n -k2|grep -v '/$' (rows showing directory names are then hidden).


    1
    tree -ifs --noreport .|sort -n -k2
    knoppix5 · 2012-05-04 09:18:39 6
  • See the summary.


    1
    lsof +c 15 | awk '{print $1}' | sort | uniq -c | sort -rn | head
    SEJeff · 2012-05-25 16:31:46 3
  • This command gives a human-readable result without messing up the sorting (a GNU sort -h alternative follows below).


    1
    for i in G M K; do du -hx /var/ | grep "[0-9]$i" | sort -nr -k 1; done | less
    jlaunay · 2012-06-26 22:57:17 6
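    With GNU sort 7.5 or later, the -h flag compares human-readable sizes directly, so the per-unit loop can likely be collapsed to:
    du -hx /var/ | sort -hr | less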

  • 1
    tcpdump -ntr NAME_OF_CAPTURED_FILE.pcap 'tcp[13] = 0x02 and dst port 80' | awk '{print $4}' | tr . ' ' | awk '{print $1"."$2"."$3"."$4}' | sort | uniq -c | awk ' {print $2 "\t" $1 }'
    efuoax · 2012-08-22 21:26:10 7
  • The lastb command presents you with the history of failed login attempts (stored in /var/log/btmp). The reference file is read/write by root only by default. This can be quite an exhaustive list, with lots of bots hammering away at your machine. Sometimes it is more important to see the scale of things, or in this case the volume of failed logins tied to each source IP. The awk statement determines whether the 3rd element is an IP address and, if so, increments the running count of failed login attempts associated with it; when done, it prints the IP and count. The sort statement sorts numerically (-n) by column 3 (-k 3), so you can see the most aggressive sources of login attempts (a count-first variant follows below). Note that the ':' character is the 2nd column, and that the -n and -k can be combined to -nk. Please be aware that the btmp file will contain every instance of a failed login unless explicitly rolled over. It should be safe to delete/archive this file after you've processed it.


    1
    sudo lastb | awk '{if ($3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/)a[$3] = a[$3]+1} END {for (i in a){print i " : " a[i]}}' | sort -nk 3
    sgowie · 2012-09-11 14:51:10 4
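    Printing the count first puts the noisiest sources on top and simplifies the final sort; a variant sketch:
    sudo lastb | awk '{if ($3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/) a[$3]++} END {for (i in a){print a[i] " : " i}}' | sort -nr | head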
  • On OS X you have to "sudo -s" your way to happiness, since it will give a few "Permission denied" errors before finally spitting out the results. On OS X the directory structure has to start with the Users directory; the command then recurses from there. Your Lord and master, Mematron


    1
    sudo -s du -sm /Users/* | sort -nr | head -n 10
    mematron · 2012-09-13 10:15:23 4
  • You can simply run "largest" to list the top 10 files/directories in ./, or pass two parameters: the first being the directory, the second the number of entries to display (usage examples follow below). Best off putting this in your bashrc or bash_profile file.


    1
    largest() { dir=${1:-"./"}; count=${2:-"10"}; echo "Getting top $count largest files in $dir"; du -sx "$dir/"* | sort -nk 1 | tail -n $count | cut -f2 | xargs -I file du -shx file; }
    jhyland87 · 2013-01-21 09:45:21 7
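    Usage, once the function is defined in your shell (paths here are examples only):
    largest
    largest /var/log 5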
  • Enhanced version: fixes sorting by human-readable numbers, and the trailing \s in the regex keeps entries that merely contain a G or an M in their name, without being GB- or MB-sized, from matching (a filter-free variant follows below).


    1
    du --max-depth=1 -h * |sort -h -k 1 |egrep '(M|G)\s'
    TerDale · 2013-02-14 08:56:56 6
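    To skip the unit filter entirely, sorting everything human-readably and keeping the top entries should work just as well:
    du --max-depth=1 -h * | sort -hr | head -20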
  • Replace \-dev with whatever you want to search for (example below).


    1
    dpkg-query -Wf '${Installed-Size}\t${Package}\n' | grep "\-dev" | sort -n | awk '{ sum+=$1} END {print sum/1024 "MB"}'
    threv · 2013-02-15 20:29:18 4
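    For example, to total everything matching "python" instead (hypothetical pattern, same pipeline):
    dpkg-query -Wf '${Installed-Size}\t${Package}\n' | grep "python" | sort -n | awk '{ sum+=$1} END {print sum/1024 "MB"}'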
  • Interesting to see which packages are larger than the kernel package. Useful for spotting RPMs that might be candidates for removal when drive space is restricted (the kernel's own size can be queried as shown below).


    1
    rpm -qa --queryformat '%{size} %{name}-%{version}-%{release}\n' | sort -k 1,1 -rn | nl | head -16
    mpb · 2013-03-19 21:10:54 6
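    For reference, the kernel package's own size can be pulled out the same way (assuming the package is simply named kernel):
    rpm -q --queryformat '%{size} %{name}-%{version}-%{release}\n' kernel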
  • Here the column number is 6; adjust -f to match your file (a counting variant follows below).


    1
    cut -d',' -f6 file.csv | sort | uniq
    richie · 2013-04-10 14:05:32 6
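    Appending uniq -c counts how often each distinct value occurs:
    cut -d',' -f6 file.csv | sort | uniq -c | sort -nr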
  • Makes this usable on OS X with filenames containing spaces. Note: it will still break if filenames contain newlines... possible, but who does that?!


    1
    svn ls -R | egrep -v -e "\/$" | tr '\n' '\0' | xargs -0 svn blame | awk '{print $2}' | sort | uniq -c | sort -nr
    rymo · 2013-04-10 19:37:53 5
  • parallel can be installed on your central node and used to run a command multiple times. In this example, multiple ssh connections are used to run commands (-j is the number of jobs to run at the same time). The result can then be piped to commands that perform the "reduce" stage (sort then uniq in this example; see also the sketch after this entry). This example assumes "keyless ssh login" has been set up between the central node and all machines in the cluster. bashreduce may also do what you want.


    1
    parallel -j 50 ssh {} "ls" ::: host1 host2 hostn | sort | uniq -c
    macoda · 2013-04-12 11:56:41 7
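    A concrete map/reduce sketch in the same spirit, counting kernel versions across a cluster (assumes a hypothetical hosts.txt with one host per line, plus keyless ssh):
    parallel -j 50 ssh {} "uname -r" :::: hosts.txt | sort | uniq -c | sort -nr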

  • 1
    awk '{print $1}' ~/.bash_history | sort | uniq -c | sort -rn | head -n 10
    nesses · 2013-05-03 16:24:30 6

  • 1
    du -m --max-depth=1 [DIR] | sort -nr
    Paulus · 2013-07-26 07:04:41 12
  • The other commands were good, but they included packages that were installed and then removed. This command only shows packages that are currently installed, sorts from smallest to largest, and formats the sizes to be human-readable (an aptitude-free sketch follows below).


    1
    dpkg-query --show --showformat='${Package;-50}\t${Installed-Size}\n' `aptitude --display-format '%p' search '?installed!?automatic'` | sort -k 2 -n | grep -v deinstall | awk '{printf "%.3f MB \t %s\n", $2/(1024), $1}'
    EvilDennisR · 2013-07-26 23:18:20 13
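    If aptitude is not available, numfmt (GNU coreutils 8.21 or later) can produce the human-readable sizes in a rough sketch of the same idea; --invalid=ignore tolerates removed-but-not-purged packages whose size field is empty:
    dpkg-query -Wf '${Installed-Size}\t${Package}\n' | sort -n | numfmt --to=iec --from-unit=1024 --invalid=ignore | tail -20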
  • Compute the MD5 checksums for the contents of two mirrored directories, then sort and diff the results. If everything matches, nothing is returned; otherwise, any checksums which do not match, or which exist in one tree but not the other, are returned. As you might imagine, the output is useful only if no errors are found, because only the checksums, not filenames, are returned. I hope to address this, or that someone else will! (One possible fix follows below.)


    1
    diff <(sort <(md5deep -r /directory/1/) |cut -f1 -d' ') <(sort <(md5deep -r /directory/2/) |cut -f1 -d' ')
    unixmonkey64021 · 2013-08-18 22:13:07 6
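    One possible fix, assuming your md5deep supports -l (relative paths): hash each tree relative to its own root, so matching files produce identical checksum-plus-filename lines and diff reports the names too:
    diff <(cd /directory/1 && md5deep -rl . | sort) <(cd /directory/2 && md5deep -rl . | sort)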