Commands using awk (1,418)

What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.

Check These Out

Finds all files from / on down over a specified size.
Very useful for finding all files over a specified size, such as out-of-control log files chewing up all available disk space. Fedora Core x specific version.
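The command itself isn't reproduced here; a plausible sketch using find and awk (the 100 MB threshold and ls-based output are assumptions):
$ find / -type f -size +100M -exec ls -lh {} \; 2>/dev/null | awk '{ print $NF ": " $5 }'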

Check that a server is up. If it isn't, mail me.
This version uses netcat to check a particular service.
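The one-liner itself isn't shown; a minimal sketch with netcat and mail (host, port, and address are placeholders):
$ nc -z -w 5 example.com 22 || echo "example.com:22 is down" | mail -s "server down" you@example.com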

cpuinfo
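No command is listed for this entry; one common way to pull the CPU model on Linux with awk:
$ awk -F': ' '/model name/ { print $2; exit }' /proc/cpuinfo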

Count the number of pages of all PDFs in current directory and all subdirs, recursively
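The command isn't reproduced here; a workable sketch that sums page counts with pdfinfo (from poppler-utils) and awk:
$ find . -name '*.pdf' -exec pdfinfo {} \; | awk '/^Pages:/ { n += $2 } END { print n }'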

Send a file or directory via ssh, compressing with lzma for low traffic
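The original isn't shown; one plausible form pipes tar through lzma and ssh (user, host, and paths are placeholders):
$ tar cf - /path/to/dir | lzma -9 | ssh user@host 'lzma -d | tar xf - -C /destination'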

Determine if a TCP port is open.
For UDP, use nmap -sU -p 80 hostname
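For the TCP case, a simple check could look like this (hostname and port are placeholders, not the original one-liner):
$ nmap -p 80 hostname
$ nc -zv hostname 80   # netcat alternative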

Get the current logged-in user's shortname
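No command is shown here; either of these standard utilities prints the short login name:
$ id -un
$ whoami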

ping a host until it responds, then play a sound, then exit
Audio acknowledgement for host availability. When running the command from a Linux system, you can use "festival" or "espeak" instead of "say".
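The exact command isn't listed; a sketch of the loop described above, with "host" as a placeholder (swap say for festival or espeak on Linux, as noted):
$ until ping -c 1 host >/dev/null 2>&1; do sleep 1; done; say "host is up"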

Fetch the current human population of Earth
Fetches the world population JSON data from the US Census and parses it using jshon
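The command isn't reproduced here, and the URL and JSON key below are placeholders rather than the real Census endpoint; the general shape is curl piped into jshon:
$ curl -s "$CENSUS_POPCLOCK_URL" | jshon -e population   # URL and key are placeholders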

execute your commands and avoid history records
$ secret_command; export HISTCONTROL=
This will keep "secret_command" from appearing in the "history" list.
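A common variant of this trick uses bash's HISTCONTROL=ignorespace, under which any command typed with a leading space is never written to history:
$ export HISTCONTROL=ignorespace
$  secret_command   # the leading space keeps this line out of history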


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions, …).
