commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Now we can capture only a specific window (we have to choose it by clicking on it).
ffmpeg complains that the "Frame size must be a multiple of 2", so we round up to the next even number with (g)awk trickery.
We drop the grep: we are already using (g)awk here, so why waste time on grep? ;)
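The even-rounding trick can be sketched on its own like this (a minimal sketch: in the real command the width and height would come from clicking the window, e.g. via xwininfo; fixed values are used here for illustration):

```shell
# Adding (n % 2) bumps an odd dimension up to the next even number,
# which satisfies ffmpeg's "Frame size must be a multiple of 2" rule.
width=641
height=479
size=$(awk -v w="$width" -v h="$height" 'BEGIN {printf "%dx%d", w + w % 2, h + h % 2}')
echo "$size"
# → 642x480
```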
Remove old kernels (*-generic and *-generic-pae) via apt-get on Debian/Ubuntu-based systems. Tested on Ubuntu 10.04 - 12.04.
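The selection logic behind this can be sketched as follows (a hypothetical sketch: the package list is faked with printf so only the filtering is shown; the names and versions are illustrative):

```shell
# Keep every installed *-generic / *-generic-pae kernel package except the
# running one; the survivors are the purge candidates.
# The printf stands in for real `dpkg -l 'linux-image-*'` output.
current="3.2.0-23-generic"   # in real use: current=$(uname -r)
printf '%s\n' \
  linux-image-3.2.0-23-generic \
  linux-image-2.6.38-8-generic \
  linux-image-2.6.38-8-generic-pae |
  grep -v "$current"
# The real command would hand this list to `sudo apt-get purge`.
```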
Use this the next time you need to come up with a reasonably random bitstring, like for a WPA/WPA2 PSK or something. It takes a continuous stream of bytes coming from /dev/urandom, runs it through od(1), picks a random field ($0 and $1 excluded) from a random line, and prints it.
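A simpler variant of the same idea can be sketched like this (an assumption-laden sketch: it skips the random-field selection and just hex-encodes 32 urandom bytes with od):

```shell
# 32 bytes from /dev/urandom, hex-encoded via od(1), giving a 64-character
# string usable as a WPA2 PSK. No random field picking here; the entropy
# comes straight from the kernel.
psk=$(head -c 32 /dev/urandom | od -An -tx1 | tr -d ' \n')
echo "$psk"
```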
Depending on your Apache access log configuration you may have to change sum+=$11 to the previous or next awk field.
Beware: in an access log, the last field is usually the response time in microseconds and the penultimate field the response size in bytes. You can use this command line to calculate the sum and average of response sizes.
You can also refine the egrep regexp to match specific HTTP requests.
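The sum/average computation looks like this on two fake log lines (the field number is chosen to fit the fabricated format, not any particular LogFormat):

```shell
# Sketch with a made-up 4-field "log": field 4 plays the byte-count role that
# sum+=$11 plays in the real command; adjust the field number to match your
# LogFormat.
printf '%s\n' \
  'GET /a 200 512' \
  'GET /b 200 1536' |
  awk '{sum += $4; n++} END {printf "total=%d avg=%d\n", sum, sum / n}'
# → total=2048 avg=1024
```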
That's the easiest way to do it. -I (a capital i) displays all network addresses of the host.
gives you each configured IP on a separate line.
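The one-per-line splitting step can be shown with tr on a sample of `hostname -I` output (the addresses are made up so the example works anywhere):

```shell
# hostname -I prints all addresses space-separated on one line; tr splits
# that into one address per line. A fixed sample stands in for real output.
printf '%s\n' '192.168.1.5 10.0.0.7' | tr ' ' '\n'
# → 192.168.1.5
# → 10.0.0.7
```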
cut -f1,2 - IP range /16
cut -f1,2,3 - IP range /24
cut -f1,2,3,4 - IP range /32
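Assuming a dotted-quad address on stdin and '.' as cut's delimiter, keeping the first three octets yields the /24 prefix:

```shell
# Keep the first three octets of an IPv4 address: the /24 network prefix.
echo '192.168.10.42' | cut -d. -f1,2,3
# → 192.168.10
```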
This command allows you to revert every modified file one by one in a while loop; after "echo $file;" you can also add any processing you want before the revert happens.
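The loop's shape can be shown VCS-agnostically on a fake file list (the printf stands in for whatever status command produces the modified-file names):

```shell
# Sketch of the per-file loop: the echo is the hook point for extra
# processing, and the commented line is where the actual revert would go.
printf '%s\n' file1 file2 |
  while read -r file; do
    echo "$file"
    # svn revert "$file"    # or: git checkout -- "$file"
  done
```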
Splits the standard-input lines into files grouped by the content of the 5th column.
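The core of the idea fits in one awk statement (sketch; run in a scratch directory, since it creates files named after the column values):

```shell
# Each input line is appended to a file named after its 5th field.
cd "$(mktemp -d)"
printf '%s\n' 'a b c d red' 'e f g h blue' 'i j k l red' |
  awk '{print > $5}'
grep -c '' red
# → 2
```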
If your locale has Monday as the first day of the week, like mine in the UK, change the two occurrences of $7 to $6.
This was done in csh.
This is a little trickier than finding the last Sunday, because you know the last Sunday is in the first position of the last line. The trick is to select lines with NF less than or equal to 7, so all the day rows are picked up, and then grep out any empty lines.
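Against a fixed month (September 2012, chosen so the sketch works without depending on today's date), the last-Sunday logic looks like this in a Sunday-first locale:

```shell
# Day rows of cal output have at most 7 fields; remembering the last numeric
# first field leaves the date of the month's final Sunday.
cal_sep_2012='   September 2012
Su Mo Tu We Th Fr Sa
                   1
 2  3  4  5  6  7  8
 9 10 11 12 13 14 15
16 17 18 19 20 21 22
23 24 25 26 27 28 29
30'
echo "$cal_sep_2012" | awk 'NF <= 7 && $1 ~ /^[0-9]+$/ {last = $1} END {print last}'
# → 30
```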
Simpler and without all of the coloring gimmicks. This just returns a list of branches with the most recent first. This should be useful for cleaning your remotes.
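A similar uncolored listing can also come straight from git's for-each-ref (a sketch; swap refs/heads for refs/remotes when cleaning remote branches):

```shell
# Branches sorted most-recent-first by commit date, no coloring involved.
git for-each-ref --sort=-committerdate \
  --format='%(committerdate:short) %(refname:short)' refs/heads
```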
Like the original version, except it does not include the parent apache process or the grep process, and adds "sudo" so it can be run by a regular user.
Prints the top 5 Twitter topics. Not very well written at all, but none of the others worked.
Improvement on Coderjoe's solution: gets rid of grep and cut (implementing them in awk) and specifies some different mplayer options that speed things up a bit.
This will save parsing time for operations on very big files.
Convert to a readable date/time with the `date` command.
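For example, with GNU date (an assumption; BSD date spells this differently) an epoch timestamp converts like so:

```shell
# Epoch seconds to a readable UTC timestamp, using GNU date's -d @ syntax.
date -u -d @1234567890 '+%Y-%m-%d %H:%M:%S'
# → 2009-02-13 23:31:30
```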
Not figured out by me, but by a colleague of mine.
See the total amount of data on an AIX machine.
It removes the square brackets and converts the UNIX time to human-readable time on every line of a stream (or file).
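The stream version can be sketched like this (assumptions: GNU date, and the bracketed epoch is the first whitespace-delimited field of each line):

```shell
# Strip the [ ] around a leading epoch timestamp and print the line with
# the timestamp rendered as human-readable UTC time instead.
echo '[1234567890] some event' |
  while read -r ts rest; do
    ts=${ts#?}; ts=${ts%?}            # drop the surrounding brackets
    echo "$(date -u -d @"$ts" '+%F %T') $rest"
  done
# → 2009-02-13 23:31:30 some event
```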
Kill all processes that coincide with PATTERN.
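One classic shape of such a command, shown against a harmless background sleep so nothing real gets killed (pkill -f 'PATTERN' is the shorter modern spelling of the whole pipeline):

```shell
# The slee[p] bracket trick stops grep from matching its own process; awk
# pulls the PID column, which xargs kill would receive in the real command.
sleep 30 & pid=$!
ps axww | grep 'slee[p] 30' | awk '{print $1}'
kill "$pid"
```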
You can use awk alone.