zsh: list files larger than 100 MB, sorted by size, and head the top 5. '**/*' recurses, and the glob qualifiers select: '.' = regular files only, 'L' = size, followed by 'm' = megabytes, and finally '+100' = greater than 100.
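For shells without zsh's glob qualifiers, a rough equivalent can be sketched with GNU tools (find's -printf and truncate are GNU-specific; the scratch directory and sample files here are purely for demonstration):

```shell
# Demo in a scratch directory; on a real tree you'd run the find
# from the directory of interest.
cd "$(mktemp -d)"
truncate -s 200M big; truncate -s 150M mid; truncate -s 1M small  # sparse sample files
# Regular files over 100 MB, largest first, top 5.
find . -type f -size +100M -printf '%s %p\n' | sort -rn | head -5
```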
I'm not sure how reliable this command is, but it works for my needs. Here's also a variant using grep: nslookup www.example.com | grep "^Address: " | awk '{print $2}'
shorter (thus better ;-)
This is similar to standard `pv`, but it retains the rate history instead of showing only the current rate, which is useful for spotting changes. To do this, -f is used to force pv to produce output, and stderr is redirected to stdout so that `tr` can swap the carriage returns for newlines. (This doesn't work correctly in zsh for some reason: tail's output isn't redirected to /dev/null like it is in bash. Does anyone know why?)
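The core trick is the carriage-return-to-newline swap; here it is sketched on canned input (the rate strings are invented), since the full pipeline only makes sense against a live transfer:

```shell
# pv rewrites its rate display in place using carriage returns (\r).
# Translating \r to \n turns that into an append-only history; with
# pv installed the full pipeline would look something like:
#   pv -f bigfile 2>&1 >/dev/null | tr '\r' '\n'
# The \r-to-\n translation itself:
printf '1.2MiB/s\r1.5MiB/s\r1.4MiB/s' | tr '\r' '\n'
```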
Tails a log and rewrites it line by line according to whatever you want to replace. Useful when the program writing the log can't be modified, so you need to modify its output instead.
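A minimal sketch of the idea on finite input (the password-masking pattern is invented for illustration; in real use the producer would be tail -f, and sed's -u flag, which keeps it unbuffered so lines appear as soon as they arrive, is a GNU extension):

```shell
# Rewrite each line as it arrives; with a live log the left side
# would be `tail -f logfile` instead of printf.
printf 'user=alice pass=hunter2\n' | sed -u 's/pass=[^ ]*/pass=****/'
```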
Displays the last 15 yum operations with their dates, most recent operation first. Change 15 to however many operations you need to display, or remove "| tac" to see them in reverse order (most recent operation last).
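The reversal itself is just tac (GNU coreutils); a sketch on simulated input, since the yum history lines here are made up:

```shell
# tac flips line order, so the most recent operation prints first.
printf 'ID 1 | install\nID 2 | update\nID 3 | erase\n' | tac
```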
Strangely enough, tail has no negative --lines=[-N] option the way head does, so we have to use sed, which is very short and clear, as you can see. Stranger still, skipping lines at the bottom with sed is neither short nor clear. From sed one-liners, to delete the last 10 lines of a file: sed -e :a -e '$d;N;2,10ba' -e 'P;D' (method 1), or sed -n -e :a -e '1,10!{P;N;D;};N;ba' (method 2).
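If GNU coreutils is available, head's negative count does the bottom-skipping directly and makes the sed loop unnecessary (this is a GNU extension; BSD/macOS head doesn't support it):

```shell
# Print all but the last 2 lines -- the GNU shortcut for
# "delete N lines from the bottom of a file".
printf '1\n2\n3\n4\n5\n' | head -n -2
```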
You can actually do the same thing with a combination of head and tail. For example, in a file of four lines, if you just want the middle two lines:
head -n3 sample.txt | tail -n2
Line 1 --\
Line 2    > these three lines are selected by head -n3,
Line 3 --/ which feeds the following filtered list to tail:
Line 4

Line 1
Line 2 --\  these two lines are selected by tail -n2,
Line 3 --/  which results in:

Line 2
Line 3
being printed to screen (or wherever you redirect it).
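The example above, runnable as one pipeline (using printf in place of sample.txt so it is self-contained):

```shell
# head -n3 keeps lines 1-3; tail -n2 then keeps the last two of
# those, i.e. lines 2-3 of the original input.
printf 'Line 1\nLine 2\nLine 3\nLine 4\n' | head -n3 | tail -n2
```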
say only processes a complete file, at EOF, so following a file isn't possible. This is a quick and dirty perl one-liner to feed each line from tail -f to say. Yes, it's expensive to launch a new process for each line. This little ditty was prompted by a discussion of how horrible it is to use VoiceOver on ncurses programs such as irssi.
Real-time lines per second in a log file using python ... identical to the perl version, except python is much better :)
end_w_nl filename
will check whether the last byte of filename is a Unix newline character (0x0a). tail -c1 yields the file's last byte, and xxd converts it to hex format.
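A sketch of what end_w_nl might look like, substituting od for xxd for portability (only the function name comes from the description; this implementation is an assumption):

```shell
# True when the file's last byte is 0x0a (LF): tail -c1 grabs the
# byte, od prints it as hex, tr strips od's padding and newline.
end_w_nl() {
  [ "$(tail -c1 "$1" | od -An -tx1 | tr -d ' \n')" = "0a" ]
}
```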
Simply add this to whatever Apache startup script you have, or if you're on a Mac, create a new Automator application. This shows a pretty Growl notification whenever there's a new Apache error-log entry. Useful for local development.
This can also find old commands you've used before.
Use this command to watch Apache access logs in real time and see which pages are being hit.
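The extraction step, sketched on one canned log line (the entry's exact command isn't shown here; with Apache's combined log format the request path is awk field 7, and in real use the input would come from tail -f on the access log):

```shell
# Pull the requested path out of a combined-format access-log line;
# a live version would read from `tail -f /var/log/apache2/access.log`.
printf '203.0.113.9 - - [10/Oct/2024:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 512\n' \
  | awk '{print $7}'
```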
In the field, I needed to script a process to scan a specific vendor's devices on the network. With the help of nmap, I found all the devices from that particular vendor and started a scripted netcat session to download configuration files from a tftp server. This is the nmap loop (part of the script). You can also add another pipe with grep to filter for the vendor/manufacturer's devices only. If you want to see the whole script, it's at http://pastebin.com/ju7h4Xf4
# displays the 10 largest files and folders in kilobytes # last entry is largest # similar output to this: du -sk * | sort -nr | head
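A largest-last variant matching the description (sort -nr | head puts the largest first instead); the scratch directory and sample files are only there to make the demo self-contained:

```shell
# Demo in a scratch directory; sizes in kilobytes (-k), ascending,
# so the biggest entry lands on the final line.
cd "$(mktemp -d)"
head -c 1048576 /dev/zero > big    # 1 MiB sample file
head -c 1024    /dev/zero > small  # 1 KiB sample file
du -sk * | sort -n | tail
```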
I wanted a method to display the last run of my script from my log file. I had a pattern I could grep for to find the beginning of each run. This command line greps for that pattern in the log, finds the last occurrence, and gives me its line number. Then I use that line number with tail to print everything from there to the end of the log file. I tested this on Linux Mint (a variant of Ubuntu) and on RHEL, but I suspect it will run on many Linux systems.
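A self-contained sketch of that pipeline (the "RUN START" marker and log contents are invented for illustration):

```shell
log=$(mktemp)                      # stand-in for the real log file
printf 'RUN START\nold output\nRUN START\nnew output\n' > "$log"
# grep -n numbers each match; tail -1 keeps the last one; cut takes
# its line number; tail -n +N prints from that line to EOF.
start=$(grep -n 'RUN START' "$log" | tail -1 | cut -d: -f1)
tail -n +"$start" "$log"
```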
commandlinefu.com is the place to record those command-line gems that you return to again and again. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for: