Commands using uniq (252)

  • This command might not be useful for most of us; I just wanted to share it to show the power of the command line. Download the plain-text version of the novel David Copperfield from Project Gutenberg, then generate a single column of words, after which the occurrences of each word are counted by the sort | uniq -c combination. The command also removes numbers and single characters from the count. I'm sure you can write a shorter version; one hedged attempt is sketched after this entry.


    -4
    wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr "\n" " " | tr " " "\n" | perl -lpe 's/\W//g;$_=lc($_)' | grep "^[a-z]" | awk 'length > 1' | sort | uniq -c | awk '{print $2"\t"$1}'
    alperyilmaz · 2009-05-04 16:00:39 8
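A hedged, shorter variant of the same idea (an untested sketch, not from the original entry): grep -o pulls out lowercase words of two or more letters directly, replacing the tr | perl | grep | awk chain, while the sed header removal is kept; word boundaries around apostrophes differ slightly from the perl version.

    wget -q -O- http://www.gutenberg.org/dirs/etext96/cprfd10.txt | sed '1,419d' | tr 'A-Z' 'a-z' | grep -oE '[a-z]{2,}' | sort | uniq -c | awk '{print $2"\t"$1}'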
  • netstat prints two header lines: "Active Internet connections (w/o servers)" and "Proto Recv-Q Send-Q Local Address Foreign Address State". A filter was added in the awk command to remove them; a stage-by-stage annotated version follows below.


    -4
    netstat -ntu | awk ' $5 ~ /^[0-9]/ {print $5}' | cut -d: -f1 | sort | uniq -c | sort -n
    letterj · 2011-07-04 20:23:21 1
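The same pipeline spelled out stage by stage (the commands are unchanged; only the comments are added here for explanation):

    netstat -ntu |                       # numeric TCP/UDP connections, no reverse DNS lookups
      awk '$5 ~ /^[0-9]/ {print $5}' |   # keep rows whose foreign-address field starts with a digit, dropping the two header lines
      cut -d: -f1 |                      # strip the :port, leaving just the IP
      sort | uniq -c |                   # count connections per IP
      sort -n                            # order from fewest to most connections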

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

print in-depth hardware info
wanna know something about your hardware? how about EVERYTHING?? then this should do ya well
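The command behind this entry isn't reproduced in this listing; on Linux, one tool that dumps essentially everything is lshw (the choice of tool is an assumption, not taken from the entry):

    sudo lshw            # full hardware tree; 'sudo lshw -short' prints a condensed table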

Show all machines on the network
Depending on the network setup, you may not get the hostname.
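The posted command isn't shown here; a common way to list hosts on the local subnet is an nmap ping sweep (nmap and the example subnet are assumptions):

    nmap -sn 192.168.1.0/24      # ping scan only (-sP on older nmap); hostnames appear only if reverse DNS resolves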

Check if the files in the current directory have the RPATH variable defined
Using Gentoo Prefix Portage I got into a situation where some packages did not contain the needed RPATH variable. This command helped me find out which ones I should recompile.
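The original command isn't included above; one way to get the same answer with readelf (readelf is an assumption, not necessarily what the author used):

    # list files in the current directory whose dynamic section defines RPATH (or RUNPATH)
    for f in *; do
        readelf -d "$f" 2>/dev/null | grep -qE 'RPATH|RUNPATH' && echo "$f"
    done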

Search for an active process without catching the search-process
This does the same thing as many of the 'grep'-based alternatives but allows finer control over the output. For example, if you only wanted the process ID you could change the command: $ ps -ef | awk '/mingetty/ && !/awk/ {print $2}' If you wanted to kill the returned PIDs: $ ps -ef | awk '/mingetty/ && !/awk/ {print $2}' | xargs -i kill {}

See entire packet payload using tcpdump.
This command will show you the entire payload of a packet. The final "s" increases the snaplength, grabbing the whole packet.
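The command itself isn't reproduced above; a typical invocation matching this description looks like the following (the capture filter is a placeholder):

    tcpdump -nnvXSs 1514 port 80     # -X prints the payload in hex and ASCII; -s 1514 raises the snap length to a full Ethernet frame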

List all symbolic links in a directory matching a string
Finds all symbolic links in the specified directory which match the specified string pattern. I used this when upgrading from an Apple-supported version of Java 6 (1.6.0_65) to an Oracle-supported version (1.7.0_55) on Mac OS X 10.8.5 to find out which executables were pointing to /System/Library/Frameworks/JavaVM.framework/Versions/Current/Commands (Apple version) vs. /Library/Java/JavaVirtualMachines/jdk1.7.0_55.jdk/Contents/Home/bin (Oracle version). However, it appears the current JDK installation script already takes care of modifying the links.
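The posted command isn't shown in this listing; one way to do the same with find's -lname test, which matches a pattern against the link target (the directory and pattern are placeholders):

    find /usr/bin -type l -lname '*JavaVM*' -exec ls -l {} +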

Analyze, check, auto-repair and optimize MySQL databases
A useful way to do a full check and auto-repair of damaged databases.
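The command isn't reproduced above; mysqlcheck is the usual tool for this, and an invocation along these lines covers all four operations (credentials are placeholders, and whether it matches the posted command exactly is an assumption):

    mysqlcheck -u root -p --all-databases --analyze --check --auto-repair --optimize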

Split a large file, without wasting disk space
It's common to want to split up large files, and the usual method is to use split(1). If you have a 10GiB file, you'll need 10GiB of free space. Then the OS has to read 10GiB and write 10GiB (usually on the same filesystem). This takes AGES.
The command uses a set of loop block devices to create fake chunks, but without making any changes to the file. This means the file splitting is nearly instantaneous. The example creates a 1GiB file, then splits it into 16 x 64MiB chunks (/dev/loop0 .. loop15).
Note: this isn't a drop-in replacement for using split. The results are block devices; tar and zip won't do what you expect when given block devices.
These commands will work:
$ hexdump /dev/loop4
$ gzip -9 < /dev/loop6 > part6.gz
$ cat /dev/loop10 > /media/usb/part10.bin
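The command itself isn't included here; a minimal sketch of the idea using util-linux losetup with --offset/--sizelimit, following the 1GiB / 16 x 64MiB example from the description:

    truncate -s 1G big.bin                               # the 1GiB example file
    for i in $(seq 0 15); do                             # 16 chunks of 64MiB each
        sudo losetup --offset $((i * 67108864)) --sizelimit 67108864 /dev/loop$i big.bin
    done
    # detach afterwards with: sudo losetup -d /dev/loopN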

Protect directory from an overzealous rm -rf *
Forces the -i flag on the rm command when using a wildcard delete.
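The posted command isn't shown; the classic trick matching this description is to create a file literally named -i in the directory (whether this is exactly the posted command is an assumption):

    touch -- -i    # '*' now expands to include -i, so 'rm -rf *' becomes 'rm -rf -i ...' and prompts before deleting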

Filter IPs in Apache access logs based on use
Shows, per IP, how many requests each client made to the Apache web server.
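The command isn't reproduced above; counting requests per client IP from a standard access log is typically a one-liner like this (the log path is a placeholder):

    awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c | sort -rn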

