commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Depending on your Apache access log configuration, you may have to change sum+=$11 to an earlier or later awk field.
Beware: in the access log the last token is usually the response time in microseconds, and the penultimate token is the response size in bytes. You can use this command line to calculate the sum and average of response sizes.
You can also refine the egrep regexp to match specific HTTP requests.
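For example, a minimal sketch, assuming the response size is the penultimate whitespace-separated field (the log path and the egrep pattern are placeholders to adjust):
egrep 'GET ' /var/log/apache2/access.log | awk '{ sum += $(NF-1); n++ } END { if (n) print "sum:", sum, "avg:", sum/n }'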
Get a list of all the unique hostnames from the Apache configuration files. Handy to see what sites are running on a server. When I saw the command I had some ideas to make it shorter. Here is my version.
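Something along these lines, assuming a Debian-style layout under /etc/apache2/sites-enabled (paths and directives may differ on your system):
grep -Eh 'Server(Name|Alias)' /etc/apache2/sites-enabled/* | awk '{print $2}' | sort -u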
Simple TCPDUMP grepping for common unsafe protocols (HTTP, POP3, SMTP, FTP)
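One possible sketch; the interface, ports and keywords are assumptions to adapt:
sudo tcpdump -nn -A -i eth0 'tcp port 80 or tcp port 110 or tcp port 25 or tcp port 21' | egrep -i 'POST |USER |PASS '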
First, get an API key for the Google URL shortener from https://developers.google.com/url-shortener/
Then replace API_KEY in the command.
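A rough sketch of the call, assuming the v1 JSON endpoint (the long URL is a placeholder):
curl -s -H 'Content-Type: application/json' -d '{"longUrl": "http://example.com/"}' "https://www.googleapis.com/urlshortener/v1/url?key=API_KEY"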
Very useful for finding all the source code that should be compiled.
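For instance, a minimal sketch (the extensions are assumptions; adjust them to your languages):
find . -type f \( -name '*.c' -o -name '*.cpp' -o -name '*.h' \) -print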
Returns the last day of the current month. Useful for implementing a bash backup script based on a GFS (Grandfather-Father-Son) strategy.
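A minimal sketch, assuming GNU date:
date -d "$(date +%Y-%m-01) +1 month -1 day" +%d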
Search for explicit incrementation in Java code (e.g. i = i + 1) in order to replace it with the postfix or compound assignment operator.
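A rough sketch using a grep back-reference to match patterns like i = i + 1 (GNU grep assumed for --include):
grep -rn --include='*.java' '\([a-zA-Z_][a-zA-Z0-9_]*\) *= *\1 *+ *1' .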
find -printf "%f\n" prints just the file name for each match, dropping the directory part of the path. This means directory names that happen to contain extensions will not be considered.
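One possible use, assuming the goal is to list the unique file extensions under a directory:
find . -type f -printf '%f\n' | grep -o '\.[^./]*$' | sort -u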
Adjusting the field ("f1") will give you additional information (see the sketch after this list), such as:
f1 = 98%
f2 = discharging
f3 = 2:02 remaining
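A minimal sketch, assuming macOS pmset output of the form "98%; discharging; 2:02 remaining" (change -f1 to -f2 or -f3 for the other fields):
pmset -g batt | egrep -o '[0-9]+%; [a-z]+; [0-9:]+ remaining' | cut -d';' -f1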
Speaks the latest Twitter update aloud using 'say'.
The other two commands listed will also kill the egrep process and any libexec processes, because the dot in .exe isn't escaped: an unescaped dot matches any character, so anything containing "exe" is matched. The command I posted escapes the dot in .exe and then filters out the egrep process itself so that it doesn't get killed along with the target processes. I also added the -9 switch so kill sends SIGKILL, in case people are wondering why processes aren't getting killed after running a plain kill. This should work better for people :)
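A sketch of the shape described above (the ps invocation is an assumption):
kill -9 $(ps aux | egrep '\.exe' | grep -v egrep | awk '{print $2}')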
The command works properly on HP-UX 11.31.
Does not print any line that either (see the sketch after this list):
- is empty
- contains only spaces or tabs
- starts with #
- starts with spaces/tabs followed by a #
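A minimal sketch covering those four cases (the file name is a placeholder):
grep -Ev '^[[:space:]]*(#|$)' file.conf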
Since most URL shorteners respond with a header containing the Location: ..., this works with most common shorteners.
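For example, a minimal sketch (the short URL is a placeholder):
curl -sI "$SHORT_URL" | grep -i '^location:'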
Hide comments and empty lines, including XML comments.
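A rough sketch that also strips single-line XML comments (multi-line XML comments are not handled; the file name is a placeholder):
sed 's/<!--.*-->//' file.xml | grep -Ev '^[[:space:]]*(#|$)'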
Provides cleaner output plus some more details about the IP address. Also corrects a flaw where the URL provided the results in Spanish by default.
Just use the file preinstalled on your system.
Discover the host and URL of media files (e.g. flv, mp4, m4v, ...).
It locates the URLs of audio and video files so that they can be recorded.
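One possible sketch using tcpdump (the interface, port and extensions are assumptions):
sudo tcpdump -nn -A -s 0 -i eth0 'tcp port 80' | egrep -i 'Host:|GET .*\.(flv|mp4|m4v)'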
Get all URLs from a website via a regular expression... You must have lynx installed on your computer to execute the command.
--> lynx --dump "<website>" | egrep -o "<regex>"
- <website>: substitute the URL of the website whose URLs you want to extract
- <regex>: the regular expression you want to filter the website's output with
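For example (the site and the URL-matching regexp here are illustrative assumptions):
lynx -dump "http://example.com/" | egrep -o 'https?://[^ ]+'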