Commands using sort (800)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Find usb device in realtime
This command shows, in real time, the moment a USB device is attached.
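The original command is not reproduced in this listing. One common way to watch USB attach events as they happen, assuming a Linux system with udev, is:

```shell
# Print udev events as they happen, filtered to the usb subsystem.
# (udevadm ships with systemd/udev; this is an assumption, since the
# original command is not shown in this listing.)
udevadm monitor --udev --subsystem-match=usb

# A lower-tech alternative on recent util-linux: follow the kernel
# ring buffer and then plug the device in.
dmesg --follow
```

Either way, plugging in a device produces a burst of messages the instant the kernel sees it.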

move you up one directory quickly
In bash, this turns on auto cd. If a command is just a directory name, it cd's into that directory.
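The setting described above is the autocd shell option:

```shell
# Enable autocd: a command that is just a directory name is treated
# as an argument to cd.
shopt -s autocd
# Now typing ".." at the prompt behaves like "cd ..", "/tmp" like
# "cd /tmp", and so on.
```

With autocd on, typing a bare `..` is presumably the "move up one directory quickly" trick the title refers to.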

flush memcached via netcat
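The command itself is missing from this listing. The standard way to do this over memcached's text protocol, assuming memcached listens on localhost:11211, is a sketch like:

```shell
# Send the flush_all command over memcached's plain-text protocol.
# The protocol terminates commands with CRLF.
# -q 1 makes netcat quit one second after stdin closes (supported by
# GNU/Debian and recent OpenBSD netcat; other variants exit on EOF
# by default).
printf 'flush_all\r\n' | nc -q 1 localhost 11211
```

The server replies `OK` and all cached items are marked expired.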

Which processes are listening on a specific port (e.g. port 80)
Swap out "80" for your port of interest. You can use a port number or a named port, e.g. "http".
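The original one-liner is not shown here; two common equivalents, assuming a Linux box, are:

```shell
# lsof: list processes with TCP port 80 open in LISTEN state.
lsof -iTCP:80 -sTCP:LISTEN

# ss (iproute2): -l listening, -t TCP, -n numeric, -p show process.
# Named ports such as "http" also work in the filter expression.
ss -ltnp '( sport = :http )'
```

Both need root (or equivalent capabilities) to show processes you don't own.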

Recover deleted Binary files
The above command assumes the lost data is on /dev/sda and that you previously mounted _another_ disk or partition (/dev/sdb1) on /recovery: $ sudo mount /dev/sdb1 /recovery If you skip this step, the recovered data could be overwritten! foremost is a very powerful carving tool. By default foremost recovers all known file types. If you want to reduce the number of files recovered, you can specify the file type you are looking for; read the man page for the available types. E.g. to recover JPEG pictures, append the switch -t jpg to foremost.

Print lines in a text file with numbers in first column higher or equal than a value
A text file contains thousands of numbers. This command prints the lines where the number in the first column is greater than or equal to a specified value (134000000).
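The command is not reproduced in this listing; the usual tool for this kind of numeric filter is awk. A sketch, assuming the numbers sit in the first whitespace-separated column (numbers.txt is a hypothetical file name):

```shell
# awk's default action is to print the whole line when the condition
# holds, so no {print} block is needed.
awk '$1 >= 134000000' numbers.txt
```

For example, $ printf '200000000 a\n1000 b\n' | awk '$1 >= 134000000' prints only the first line.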

Copy an element from the previous command
These are Bash history word designators: !!:n reuses word n of the previous command, and you can specify a range via '-', e.g. $ echo !!:1-3

Pass TAB as field separator to sort, join, cut, etc.
Use this BASH trick to create a variable containing the TAB character and pass it as the argument to sort, join, cut and other commands which don't understand the \t notation. $ sort -t $'\t' ... $ join -t $'\t' ... $ cut -d $'\t' ...
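A quick way to convince yourself the trick works:

```shell
# $'\t' is Bash ANSI-C quoting: the shell expands it to a literal tab
# character, which sort then receives as its field separator.
printf 'b\t2\na\t1\n' | sort -t $'\t' -k1,1
# prints the "a" line before the "b" line
```

The same expansion works anywhere a literal tab is needed, e.g. tab=$'\t' to store it in a variable.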

Sum columns from CSV column $COL
More of the same but with more elaborate perl-fu :-)

Efficiently print a line deep in a huge log file
Sed stops parsing at the match and so is much more efficient than piping head into tail or similar. Grab a line range using $ sed '999995,1000005!d' < my_massive_file
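For a single line, the quit form makes sed stop reading as soon as it has printed the line; a sketch, assuming line 1000000 of the same file:

```shell
# 'q' quits immediately after printing line 1000000, so sed never reads
# the rest of the file; 'd' suppresses every earlier line. Without 'q'
# (as in the '!d' range form above), sed keeps scanning to EOF.
sed '1000000q;d' my_massive_file
```

For example, $ seq 1 100 | sed '50q;d' prints 50.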

