Makes this usable on OS X with filenames containing spaces. Note: it will still break if filenames contain newlines... possible, but who does that?!
Backs up all databases, excluding test, mysql, performance_schema and information_schema. Requires GNU parallel to work; install parallel on Ubuntu by running: sudo aptitude install parallel
Is the better option on an openSUSE box.
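Pieced together from the description, the pipeline presumably looks something like the sketch below. The credential setup is an assumption (e.g. a ~/.my.cnf with user/password), and the function name is mine:

```shell
# Sketch: list all databases, filter out the system/test schemas, and
# let GNU parallel run one mysqldump per remaining database.
# Assumes mysql/mysqldump can authenticate via ~/.my.cnf (an assumption).
backup_all_dbs() {
  mysql -Bse 'show databases' \
    | grep -Ev '^(test|mysql|performance_schema|information_schema)$' \
    | parallel 'mysqldump {} > {}.sql'
}
```

Each database ends up in its own dbname.sql file, with the dumps running concurrently.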
Puts a splash of color in your access logs. IP addresses are gray, 200 and 304 are green, all 4xx errors are red. Works well with e.g. "colorize access_log | less -R" if you want to see your colors while paging.
Use as inspiration for other things you might be tailing, like syslog or vmstat
Usage:
tail -f access.log | colorize
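The original implementation isn't shown here, but a minimal sketch of such a colorizer using sed might look like this (the exact regexes and ANSI codes are my assumptions):

```shell
# Gray out leading IP addresses, color 200/304 statuses green and
# 4xx statuses red. Works as a filter: pipe log lines through it.
colorize() {
  gray=$(printf '\033[90m'); green=$(printf '\033[32m')
  red=$(printf '\033[31m');  off=$(printf '\033[0m')
  sed -E \
    -e "s/^([0-9]{1,3}(\.[0-9]{1,3}){3})/${gray}\1${off}/" \
    -e "s/ (200|304) / ${green}\1${off} /" \
    -e "s/ (4[0-9]{2}) / ${red}\1${off} /"
}
```

Usage as described above: tail -f access.log | colorize, or colorize access_log | less -R.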
Recent hardware may or may not enumerate both of these values.
This command runs fine on my Ubuntu machine, but on Red Hat I had to change the awk command to `awk '{print $10}'`.
This dumps the serial numbers of all the drives, but the HP warranty check does not say they are valid...
This command finds all occurrences of one or more patterns in a collection of files and deletes every line matching any of the patterns in every file.
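One way such a command might be put together (a sketch, assuming GNU grep/sed/xargs; on BSD/macOS sed, use -i '' instead of -i; the function name is mine):

```shell
# Delete every line matching an extended-regex pattern from each file
# that contains it. grep -l lists only the files with a match, so sed
# rewrites nothing else. (For filenames with spaces, prefer
# grep -lZE ... | xargs -0 instead.)
delete_matching() {
  pat=$1; shift
  grep -lE "$pat" "$@" | xargs -r sed -i -E "/$pat/d"
}
```

For example, delete_matching 'ERROR|WARN' *.log strips every ERROR or WARN line from every log file that has one.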
Search a file for string1 or string2.
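In its minimal form that is just an alternation in egrep (the file name and contents here are only examples):

```shell
# Create a small sample file, then match lines containing either string.
printf 'foo string1 bar\nnothing here\nstring2!\n' > sample.txt
egrep 'string1|string2' sample.txt
```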
This is useful for piping to other commands, as well:
svn status | egrep '^(M|A)' | egrep -o '[^MA\ ].*$' | xargs $EDITOR
I use this (well, I normally drop the F=*.log bit and put that straight into the awk command) to count how many times I get referred from another site. I know it's rough; it's just to give me an idea of where any posts I make are ending up.

The reason I do the Q="query" bit is that I often want to check another domain quickly, and it's quick to use CTRL+A to jump to the start and then CTRL+F to move forward the 3 steps to change the grep query. (I find this easier than moving backwards, because if you group a lot of domains with the pipe your command line can get quite messy, so it's normally easier to have it all at the front so you just have to edit it and hit enter.)

For people new to the shell, it does the following. The Q= and F= bits just make names we can refer to. awk -F\" '{print $4}' $F reads the file specified by $F and splits it up using double quotes, printing out the fourth column for egrep to work on; the 4th column in the log is the referer domain. egrep then matches our query against this list from awk. Finally, wc -l gives us the total number of lines (i.e. matches).
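Wrapped up as a reusable helper, the pipeline described above might look like this (a sketch; assumes the log is in the usual combined format where the referer is the fourth double-quoted field, and the function name is mine):

```shell
# Count hits whose referer matches an egrep pattern.
# Usage: count_refs 'domain1|domain2' access.log
count_refs() {
  awk -F\" '{print $4}' "$2" | egrep "$1" | wc -l
}
```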
Really only valuable in a PHP-only project directory. This uses the standard Linux versions of the tools. On most older BSD variants of sed, use -E instead of -r, or use sed 's/\+[[:space:]]\{1,\}//' instead.
LaTeX is not a smart compiler - you need to run it several times to make it back-patch all the missing references. The message telling you whether another run is needed is buried in its endless output and the log file. This grep line helps you find it.
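One way to wire that check into a script (a sketch: the exact warning wording varies across LaTeX versions, and 'mydoc' is an example name):

```shell
# True while the .log from the last run still asks for another pass.
needs_rerun() {
  grep -qi 'rerun to get' "$1"
}

# Typical use (commented out, as it needs a TeX installation):
#   pdflatex mydoc.tex
#   while needs_rerun mydoc.log; do pdflatex mydoc.tex; done
```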
You can use this to create an exclude file for nmap, to find hosts with no DHCP lease in your DHCP range.
If used without arguments, returns your own IP info. If used with an argument, returns info about the passed argument.
Can be used in a working copy to output the URL (extracted from svn info), or as part of another function, as $(svnurl some/path). Saves a lot of time in my SVN workflow.
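The description suggests a helper along these lines (the sed extraction is my guess at the implementation; only the name svnurl comes from the source):

```shell
# Print the repository URL of a working copy (defaults to the current
# directory), by picking the "URL:" line out of `svn info`.
svnurl() {
  svn info "${1:-.}" | sed -n 's/^URL: //p'
}
```

For example: svn log "$(svnurl some/path)" or cd into a checkout and run svnurl on its own.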
This command is just for the main IP settings of ndd. If you need IPv6 or ICMP, edit the text within the egrep inclusion area. Felix001 - www.Fir3net.com
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: