Commands tagged sort (176)

  • This works by reading in two lines of input, turning each into a list of one-character matches that are sorted and compared.


    1
    (echo foobar; echo farboo) | perl -E 'say[sort<>=~/./g]~~[sort<>=~/./g]?"anagram":"not anagram"'
    doherty · 2011-02-17 02:15:46 119
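    Note: the smartmatch operator (~~) used above is experimental and may warn or be unavailable on newer Perls. A hedged variant of the same idea that compares the sorted characters as plain strings instead:
    (echo foobar; echo farboo) | perl -E 'say join("",sort <>=~/./g) eq join("",sort <>=~/./g) ? "anagram" : "not anagram"'
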
  • As per eightmillion's comment. Simply economical :)


    1
    du -h | sort -hr
    mooselimb · 2011-11-06 23:15:36 3

  • 1
    sed -e 's/[;|][[:space:]]*/\n/g' .bash_history | cut --delimiter=' ' --fields=1 | sort | uniq --count | sort --numeric-sort --reverse | head --lines=20
    WissenForscher · 2012-02-17 23:34:16 5
  • The lastb command shows the history of failed login attempts (stored in /var/log/btmp; by default the file is readable and writable by root only). The list can be very long, with lots of bots hammering away at your machine, so sometimes it is more useful to see the scale of things, or in this case the volume of failed logins tied to each source IP. The awk statement checks whether the 3rd field is an IP address and, if so, increments the running count of failed attempts associated with it; when done it prints each IP and its count. The sort statement sorts numerically (-n) by column 3 (-k 3), so you can see the most aggressive sources of login attempts. Note that the ':' character is the 2nd column, and that -n and -k can be combined into -nk. Be aware that btmp will contain every failed login unless it is explicitly rolled over; it should be safe to delete or archive the file after you've processed it (a sketch of one way to do that follows this entry).


    1
    sudo lastb | awk '{if ($3 ~ /([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}/)a[$3] = a[$3]+1} END {for (i in a){print i " : " a[i]}}' | sort -nk 3
    sgowie · 2012-09-11 14:51:10 4
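    Since btmp keeps growing until it is rolled over, here is a sketch of one way to archive and reset it after processing (the dated filename is just an example):
    sudo cp /var/log/btmp /var/log/btmp.$(date +%F)   # keep a dated copy
    sudo truncate -s 0 /var/log/btmp                  # empty the live file in place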

  • 1
    getent passwd | cut -d: -f1 | sort
    theftf · 2012-09-12 17:16:54 4
  • Enhanced version: fixes sorting of the human-readable sizes and keeps only MB- and GB-sized entries, without being fooled by entries that merely have a G or an M in their name (a variant that filters on the size column alone follows this entry).


    1
    du --max-depth=1 -h * |sort -h -k 1 |egrep '(M|G)\s'
    TerDale · 2013-02-14 08:56:56 6
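    A variant (my assumption, not part of the original post) that anchors the filter to the size column itself, so a G or M in a file name can never produce a false match:
    du --max-depth=1 -h * | sort -h -k1 | awk '$1 ~ /[MG]$/'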

  • 1
    awk '{print $1}' ~/.bash_history | sort | uniq -c | sort -rn | head -n 10
    nesses · 2013-05-03 16:24:30 6
  • The other commands were good, but they included packages that were installed and then removed. This command only shows packages that are currently installed, sorts smallest to largest, and formats the sizes to be human readable.


    1
    dpkg-query --show --showformat='${Package;-50}\t${Installed-Size}\n' `aptitude --display-format '%p' search '?installed!?automatic'` | sort -k 2 -n | grep -v deinstall | awk '{printf "%.3f MB \t %s\n", $2/(1024), $1}'
    EvilDennisR · 2013-07-26 23:18:20 13
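    A roughly equivalent sketch that relies on dpkg-query alone (my variant, assuming the db:Status-Abbrev virtual field is available in your dpkg version):
    dpkg-query -W -f='${db:Status-Abbrev}\t${Installed-Size}\t${Package}\n' | awk -F'\t' '$1 ~ /^ii/ {printf "%.3f MB\t%s\n", $2/1024, $3}' | sort -n
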
  • Same as the rest, but handles IPv4-mapped IPv6 addresses (the ::ffff: prefix is stripped). Also sorts in the order that you're probably looking for.


    1
    netstat -ntu | awk ' $5 ~ /^(::ffff:|[0-9|])/ { gsub("::ffff:","",$5); print $5}' | cut -d: -f1 | sort | uniq -c | sort -nr
    mrwulf · 2013-09-10 19:28:06 6
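    On systems without netstat, a rough ss-based equivalent (a sketch; it assumes the peer address is the 6th column of ss -ntu output):
    ss -Hntu | awk '{print $6}' | sed 's/^::ffff://; s/:[0-9]*$//' | sort | uniq -c | sort -nr
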
  • Credit for the non-gzipped version goes to: https://gist.github.com/marcanuy/a08d5f2d9c19ba621399


    1
    zcat error.log.gz | sed 's^\[.*\]^^g' | sed 's^\, referer: [^\n]*^^g' | sort | uniq -c | sort -n
    zanhsieh · 2014-09-24 05:26:24 8
  • Capture 2000 packets and print the top talkers (any source address seen more than 10 times).


    1
    tcpdump -tnn -c 2000 -i eth0 | awk -F "." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | awk ' $1 > 10 '
    hochmeister · 2014-09-26 01:15:23 10
  • Find biggest files in a directory


    1
    find . -printf '%.5m %10M %#9u %-9g %TY-%Tm-%Td+%Tr [%Y] %s %p\n'|sort -nrk8|head
    AskApache · 2014-12-10 23:48:20 9
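    If you only need size and path, a shorter variant (my sketch, not the original author's command) that sorts on a plain byte count:
    find . -type f -printf '%s %p\n' | sort -nr | head
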
  • List all open files of all processes.
    find /proc/*/fd : look through the /proc file descriptors
    -xtype f : keep only symlinks that point to regular files
    -printf "%l\n" : print the symlink target
    grep -P '^/(?!dev|proc|sys)' : ignore files from /dev, /proc or /sys
    sort | uniq -c | sort -n : count the results
    Many processes create and immediately delete temporary files. These can be filtered out by adding ... | grep -v " (deleted)$" | ... (the full pipeline with that filter is shown after this entry).


    1
    find /proc/*/fd -xtype f -printf "%l\n" | grep -P '^/(?!dev|proc|sys)' | sort | uniq -c | sort -n
    flatcap · 2015-08-18 17:58:21 11
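    With the filter for deleted temporary files mentioned above spliced in, the pipeline becomes:
    find /proc/*/fd -xtype f -printf "%l\n" | grep -P '^/(?!dev|proc|sys)' | grep -v " (deleted)$" | sort | uniq -c | sort -n
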
  • This is my favorite way to play music on my beloved Linux systems, server or desktop. Enjoy :-)


    1
    find /home/user/Música/ -type f -name "*.mp3" | shuf --head-count=20 --output=/home/user/playlist.m3u ; sort -R /home/user/playlist.m3u | mplayer -playlist -
    abaddon · 2016-06-10 03:04:40 20
  • No need for sort


    1
    ps hax -o user --sort user | uniq -c
    sharkhands · 2016-09-15 23:31:51 15
  • This is handy to capture in a variable when you have many script-created files and want to know which one was created or changed last (a usage sketch follows this entry).


    1
    find . -type f -print0 | xargs -0 stat -c '%y %n' | sort -n -k 1,1 | awk 'END{print $NF}'
    emphazer · 2018-05-14 08:47:41 162
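    A usage sketch of the capture-it-in-a-variable idea (the variable name is mine):
    latest=$(find . -type f -print0 | xargs -0 stat -c '%y %n' | sort -n -k 1,1 | awk 'END{print $NF}')
    echo "most recently changed: $latest"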

  • 1
    comm -12 <(sort -u File1) <(sort -u File2)
    guillaume1306 · 2018-09-07 11:36:15 338
  • Sort a list of email addresses by their domain (the field after @). A random test list can be generated here: https://www.randomlists.com/email-addresses


    1
    sort -t@ -k2 emails.txt
    cellcore · 2019-02-24 17:41:17 34
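    A hypothetical example showing the effect of sorting on the domain field (field 2 when splitting on @):
    printf 'alice@zebra.org\nbob@apple.com\ncarol@apple.com\n' > emails.txt
    sort -t@ -k2 emails.txt
    # bob@apple.com
    # carol@apple.com
    # alice@zebra.org
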
  • Sometimes things break. You can see which executables produced today's errors by combining journalctl with the classic tools sort and uniq.


    1
    journalctl --no-pager --since today --grep 'fail|error|fatal' --output json | jq '._EXE' | sort | uniq -c | sort --numeric --reverse --key 1
    mikhail · 2021-12-22 22:27:17 216
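    A small variant (my tweak, not the original): jq -r emits the executable paths without JSON quotes, so the counts group on the bare path:
    journalctl --no-pager --since today --grep 'fail|error|fatal' --output json | jq -r '._EXE' | sort | uniq -c | sort -nr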

  • 0
    find /home/fizz -type f -printf '%TY-%Tm-%Td %TT %p\n' | sort
    fizz · 2009-05-20 10:45:39 6
  • Save the script as sort_file. Usage: sort_file < sort_me.csv > out_file.csv. This script was originally posted by Admiral Beotch on the Linux-Software forum at LinuxQuestions.org; I modified it to be more portable.


    0
    infile=$1; for i in $(cat $infile); do echo $i | tr "," "\n" | sort -n | tr "\n" "," | sed "s/,$//"; echo; done
    iframe · 2009-07-12 21:23:37 6
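    A hypothetical run, assuming the loop above has been saved as sort_file and made executable: each line's comma-separated fields come back numerically sorted.
    printf '3,1,2\n10,2,30\n' > sort_me.csv
    ./sort_file < sort_me.csv
    # 1,2,3
    # 2,10,30
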
  • This one will work a little better. The regular expression is not 100% accurate for XML parsing, but it will suffice for any valid XML document.


    0
    grep -Eho '<[a-zA-Z_][a-zA-Z0-9_:-]*' * | sort -u | cut -c2-
    inkel · 2009-08-05 21:54:29 3
  • The sort utility is well used, but sometimes you want a little chaos. This will randomize the lines of a text file. BTW, on OS X there is no sort -R option, and no shuf either; those only ship with newer GNU coreutils. This is also faster than the alternative of: | awk 'BEGIN { srand() } { print rand() "\t" $0 }' | sort -n | cut -f2-


    0
    cat ~/SortedFile.txt | perl -wnl -e '@f=<>; END{ foreach $i (reverse 0 .. $#f) { $r=int rand ($i+1); @f[$i, $r]=@f[$r,$i] unless ($i==$r); } chomp @f; foreach $line (@f){ print $line; }}'
    drewk · 2009-09-24 15:42:43 5
  • List packages and their disk usage in decreasing order. This uses the "Installed-Size" from the package metadata. It may differ from the actual used space, because e.g. data files (think of databases) or log files may take additional space.


    0
    perl -ne '$pkg=$1 if m/^Package: (.*)/; print "$1\t$pkg\n" if m/^Installed-Size: (.*)/;' < /var/lib/dpkg/status | sort -rn | less
    hfs · 2009-10-19 12:55:59 7
  • print the lines of a file in randomized order


    0
    perl -wl -e '@f=<>; for $i (0 .. $#f) { $r=int rand ($i+1); @f[$i, $r]=@f[$r,$i] if ($i!=$r); } chomp @f; print join $/, @f;' try.txt
    JohnGH · 2009-12-21 21:15:55 3