Commands tagged sort (176)

  • This works by reading in two lines of input, turning each into a list of one-character matches that are sorted and compared.


    1
    (echo foobar; echo farboo) | perl -E 'say[sort<>=~/./g]~~[sort<>=~/./g]?"anagram":"not anagram"'
    doherty · 2011-02-17 02:15:46 119
  • as per eightmillion's comment. Simply economical :)


    1
    du -h | sort -hr
    mooselimb · 2011-11-06 23:15:36 3

  • 1
    sed -e 's/[;|][[:space:]]*/\n/g' .bash_history | cut --delimiter=' ' --fields=1 | sort | uniq --count | sort --numeric-sort --reverse | head --lines=20
    WissenForscher · 2012-02-17 23:34:16 5
  • The lastb command presents you with the history of failed login attempts (stored in /var/log/btmp). By default the file is readable and writable by root only. The list can get quite long, with lots of bots hammering away at your machine, so sometimes it is more useful to see the scale of things: in this case, the volume of failed logins tied to each source IP. The awk statement checks whether the 3rd field is an IP address and, if so, increments the running count of failed login attempts associated with it; when done it prints each IP and its count. The sort statement sorts numerically (-n) by column 3 (-k 3), so you can see the most aggressive sources of login attempts. Note that the ':' character is the 2nd column, and that -n and -k can be combined to -nk. Please be aware that the btmp file will contain every instance of a failed login unless explicitly rolled over. It should be safe to delete/archive this file after you've processed it (one way of doing that is sketched after this entry).


    1
    sudo lastb | awk '{if ($3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/)a[$3] = a[$3]+1} END {for (i in a){print i " : " a[i]}}' | sort -nk 3
    sgowie · 2012-09-11 14:51:10 4
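
    One possible way to archive and reset the log after processing (not part of the original entry; the dated suffix and the cp/truncate approach are just one option):
    sudo cp -a /var/log/btmp /var/log/btmp.$(date +%Y%m%d) && sudo truncate -s 0 /var/log/btmp  # keep a dated copy, then empty the live log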

  • 1
    getent passwd | cut -d: -f1 | sort
    theftf · 2012-09-12 17:16:54 4
  • Enhanced version: fixes sorting by human-readable numbers, and filters out entries that are not MB- or GB-sized even when they happen to have a G or an M in their name.


    1
    du --max-depth=1 -h * |sort -h -k 1 |egrep '(M|G)\s'
    TerDale · 2013-02-14 08:56:56 6

  • 1
    awk '{print $1}' ~/.bash_history | sort | uniq -c | sort -rn | head -n 10
    nesses · 2013-05-03 16:24:30 6
  • The other commands were good, but they included packages that were installed and then removed. This command only shows packages that are currently installed, sorts smallest to largest, and formats the sizes to be human readable (a dpkg-query-only variant is sketched after this entry).


    1
    dpkg-query --show --showformat='${Package;-50}\t${Installed-Size}\n' `aptitude --display-format '%p' search '?installed!?automatic'` | sort -k 2 -n | grep -v deinstall | awk '{printf "%.3f MB \t %s\n", $2/(1024), $1}'
    EvilDennisR · 2013-07-26 23:18:20 13
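
    If aptitude is not installed, a rough dpkg-query-only equivalent (an approximation: it lists every installed package, not just the manually installed ones the original selects; Installed-Size is in KiB):
    dpkg-query -W -f='${Status} ${Installed-Size} ${Package}\n' | grep '^install ok installed' | awk '{printf "%.3f MB \t %s\n", $4/1024, $5}' | sort -n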
  • Same as the rest, but handles IPv6 short IPs. Also, sorts in the order that you're probably looking for.


    1
    netstat -ntu | awk ' $5 ~ /^(::ffff:|[0-9|])/ { gsub("::ffff:","",$5); print $5}' | cut -d: -f1 | sort | uniq -c | sort -nr
    mrwulf · 2013-09-10 19:28:06 6
  • Credit for the non-gzipped version goes to: https://gist.github.com/marcanuy/a08d5f2d9c19ba621399


    1
    zcat error.log.gz | sed 's^\[.*\]^^g' | sed 's^\, referer: [^\n]*^^g' | sort | uniq -c | sort -n
    zanhsieh · 2014-09-24 05:26:24 8
  • capture 2000 packets and print the top 10 talkers


    1
    tcpdump -tnn -c 2000 -i eth0 | awk -F "." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | awk ' $1 > 10 '
    hochmeister · 2014-09-26 01:15:23 10
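
    As posted, the final awk keeps sources seen more than 10 times; a variant that literally prints the top 10 talkers, as the description says, ends with head instead:
    tcpdump -tnn -c 2000 -i eth0 | awk -F "." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -n 10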
  • Find biggest files in a directory


    1
    find . -printf '%.5m %10M %#9u %-9g %TY-%Tm-%Td+%Tr [%Y] %s %p\n'|sort -nrk8|head
    AskApache · 2014-12-10 23:48:20 9
  • List all open files of all processes. find /proc/*/fd looks through the /proc file descriptors; -xtype f keeps only symlinks that point to a regular file; -printf "%l\n" prints the symlink target; grep -P '^/(?!dev|proc|sys)' ignores files from /dev, /proc or /sys; sort | uniq -c | sort -n counts the results. Many processes will create and immediately delete temporary files. These can be filtered out by adding: ... | grep -v " (deleted)$" | ... (the full variant is shown after this entry).


    1
    find /proc/*/fd -xtype f -printf "%l\n" | grep -P '^/(?!dev|proc|sys)' | sort | uniq -c | sort -n
    flatcap · 2015-08-18 17:58:21 11
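
    The same pipeline with the "(deleted)" filter mentioned above spliced in:
    find /proc/*/fd -xtype f -printf "%l\n" | grep -P '^/(?!dev|proc|sys)' | grep -v " (deleted)$" | sort | uniq -c | sort -n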
  • This is my favorite music player; I use it on my beloved Linux systems, server or desktop. Enjoy :-)


    1
    find /home/user/Música/ -type f -name "*.mp3" | shuf --head-count=20 --output=/home/user/playlist.m3u ; sort -R /home/user/playlist.m3u | mplayer -playlist -
    abaddon · 2016-06-10 03:04:40 20
  • No need for sort


    1
    ps hax -o user --sort user | uniq -c
    sharkhands · 2016-09-15 23:31:51 15
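
    To go one step further and rank users by process count (an optional extension, not in the original entry), sort the counted output:
    ps hax -o user --sort user | uniq -c | sort -nr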
  • This is handy for use in variables if you have many script-created files and want to know which one is the last created/changed one (a variant that also copes with spaces in file names is sketched after this entry).


    1
    find . -type f -print0 | xargs -0 stat -c '%y %n' | sort -n -k 1,1 | awk 'END{print $NF}'
    emphazer · 2018-05-14 08:47:41 161
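
    A possible variant, assuming GNU find, that skips stat and keeps file names containing spaces intact:
    find . -type f -printf '%T@ %p\n' | sort -n | tail -n 1 | cut -d' ' -f2-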

  • 1
    comm -12 <(sort -u File1) <(sort -u File2)
    guillaume1306 · 2018-09-07 11:36:15 337
  • A random list of email addresses can be created here: https://www.randomlists.com/email-addresses


    1
    sort -t@ -k2 emails.txt
    cellcore · 2019-02-24 17:41:17 34
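
    A tiny illustration with made-up addresses, just to show that -t@ -k2 orders by the domain part:
    printf 'alice@zulu.example\nbob@alpha.example\n' | sort -t@ -k2
    This prints bob@alpha.example before alice@zulu.example.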
  • Sometimes things break. You can find the most recent errors using journalctl together with the classic tools sort and uniq.


    1
    journalctl --no-pager --since today --grep 'fail|error|fatal' --output json | jq '._EXE' | sort | uniq -c | sort --numeric --reverse --key 1
    mikhail · 2021-12-22 22:27:17 215

  • 0
    find /home/fizz -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort
    fizz · 2009-05-20 10:45:39 6
  • Save the script as sort_file. Usage: sort_file < sort_me.csv > out_file.csv. This script was originally posted by Admiral Beotch on LinuxQuestions.org in the Linux-Software forum. I modified this script to make it more portable (a script-file form is sketched after this entry).


    0
    infile=$1; for i in $(cat $infile); do echo $i | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"; echo; done
    iframe · 2009-07-12 21:23:37 6
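
    The same logic as a script file (a sketch assuming bash; when no file argument is given, cat falls back to stdin, which is what makes the redirect usage above work):
    #!/bin/bash
    # sort_file: numerically sort the comma-separated values within each input line
    # usage: sort_file < sort_me.csv > out_file.csv   (or: sort_file sort_me.csv)
    infile=$1
    for i in $(cat $infile); do
        echo $i | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"
        echo
    done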
  • This one will work a little better. The regular expression is not 100% accurate for XML parsing, but it will suffice for any valid XML document.


    0
    grep -Eho '<[a-zA-Z_][a-zA-Z0-9_:-]*' * | sort -u | cut -c2-
    inkel · 2009-08-05 21:54:29 3
  • The sort utility is well used, but sometimes you want a little chaos. This will randomize the lines of a text file. BTW, on OS X there is no sort -R option, and no shuf either; these are only in the newer GNU coreutils. This is also faster than the alternative of: | awk 'BEGIN { srand() } { print rand() "\t" $0 }' | sort -n | cut -f2- (written out as a full pipeline after this entry).


    0
    cat ~/SortedFile.txt | perl -wnl -e '@f=<>; END{ foreach $i (reverse 0 .. $#f) { $r=int rand ($i+1); @f[$i, $r]=@f[$r,$i] unless ($i==$r); } chomp @f; foreach $line (@f){ print $line; }}'
    drewk · 2009-09-24 15:42:43 5
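
    The awk alternative mentioned above, written out as a complete pipeline (the file name is just a placeholder):
    awk 'BEGIN { srand() } { print rand() "\t" $0 }' ~/SortedFile.txt | sort -n | cut -f2-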
  • List packages and their disk usage in decreasing order. This uses the "Installed-Size" from the package metadata. It may differ from the actual used space, because e.g. data files (think of databases) or log files may take additional space.


    0
    perl -ne '$pkg=$1 if m/^Package: (.*)/; print "$1\t$pkg\n" if m/^Installed-Size: (.*)/;' < /var/lib/dpkg/status | sort -rn | less
    hfs · 2009-10-19 12:55:59 7
  • Print the lines of a file in randomized order


    0
    perl -wl -e '@f=<>; for $i (0 .. $#f) { $r=int rand ($i+1); @f[$i, $r]=@f[$r,$i] if ($i!=$r); } chomp @f; print join $/, @f;' try.txt
    JohnGH · 2009-12-21 21:15:55 3