Same as the rest, but handles IPv6 short addresses. Also sorts in the order you're probably looking for.
Shorter (thus better ;-)
Download the latest NVIDIA GeForce x64 Windows 7/8 driver from NVIDIA's website. Pulls the latest download version (which includes betas). This is the "English" version; the command below includes a 'sed' step to replace "english" with "international" if needed. You can also replace the starting subdomain with "eu.", "uk.", and others. Enjoy this one-liner! 1 character under the max :)
wget "us.download.nvidia.com$(wget -qO- "$(wget -qO- "nvidia.com/Download/processFind.aspx?psid=95&pfid=695&osid=19&lid=1&lang=en-us" | awk '/driverResults.aspx/ {print $4}' | cut -d "'" -f2 | head -n 1)" | awk '/url=/ {print $2}' | sed -e "s/english/international/" | cut -d '=' -f3 | cut -d '&' -f1)"
Very quick! Based only on the content sizes and the character counts of the filenames. If both numbers match, the two (or more) directories are most likely identical.
if in doubt apply:
diff -rq path_to_dir1 path_to_dir2
AWK function taken from here:
http://stackoverflow.com/questions/2912224/find-duplicates-lines-based-on-some-delimited-fileds-on-line
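The quick fingerprint idea above can be sketched like this (directory paths are placeholders, and `find -printf` is GNU-specific, so this is an approximation rather than the original command):

```shell
# Rough size/name-length fingerprint per directory (GNU find assumed).
# Matching fingerprints suggest, but do not prove, identical contents.
for d in path_to_dir1 path_to_dir2; do
  bytes=$(find "$d" -type f -printf '%s\n' | awk '{s += $1} END {print s + 0}')
  namechars=$(find "$d" -type f -printf '%f' | wc -c)
  printf '%s: %s bytes, %s filename chars\n' "$d" "$bytes" "$namechars"
done
```

If the numbers differ you are done; if they match, fall back to the `diff -rq` check below.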
On some distros you have to replace "BogoMIPS" with "bogomips".
Show off how big your disks are.
While the posted solution works, I'm a bit uneasy about the "%d" part. This would be the hyper-correct approach:
lsof|gawk '$4~/txt/{next};/REG.*\(deleted\)$/{sub(/.$/,"",$4);printf ">/proc/%s/fd/%s\n", $2,$4}'
Oh, and you gotta pipe the result to sh if you want it to actually trim the files. ;)
Btw, this approach also removes false negatives (OP's command skips any deleted files with "txt" in their name).
Calculate the date of SysAdmin Day (the last Friday of July) for any given year.
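The original command isn't shown here, but the arithmetic can be sketched with GNU `date`: start from July 31 and step back to the nearest Friday.

```shell
# Last Friday of July for a given year (GNU date assumed; %u gives 1=Mon..7=Sun)
year=2024
dow=$(date -d "${year}-07-31" +%u)     # weekday of July 31
offset=$(( (dow - 5 + 7) % 7 ))        # days back to the previous Friday (5)
date -d "${year}-07-31 - ${offset} days" +%Y-%m-%d
```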
capture 2000 packets and print the top 10 talkers
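A hedged sketch of one common way to do this (the interface name, field positions, and exact tcpdump flags are assumptions, and the capture itself needs root):

```shell
# Capture 2000 packets, strip the port from the source field, count per IP,
# and print the 10 busiest sources. With -t -n a line looks like:
#   IP 1.2.3.4.80 > 5.6.7.8.443: Flags ...
sudo tcpdump -tnn -c 2000 -i eth0 ip 2>/dev/null \
  | awk '{print $2}' \
  | cut -d. -f1-4 \
  | sort | uniq -c | sort -rn | head -10
```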
https://stackoverflow.com/questions/10768160/ip-address-converter Show Sample Output
A worse alternative to Ctrl+R: grep the history, removing duplicates without sorting (case-insensitive search).
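The dedupe-without-sorting part is the classic `awk '!seen[$0]++'` idiom; a minimal illustration on canned input (the surrounding `history | grep -i` pipeline is an assumption):

```shell
# Print each distinct line once, preserving first-seen order
printf 'ls -l\ncd /tmp\nls -l\ngrep -i foo bar\n' | awk '!seen[$0]++'
```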
Spectrum Protect's dsmc command shows file names and the total amount restored. This command shows which files are actually open and their size in GB, and highlights changes from the previous output.
Parse an m3u file with seconds for each item and output the length of the entire playlist.
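A sketch of the summing step, assuming extended-m3u entries of the form `#EXTINF:<seconds>,<title>` (the filename is a placeholder):

```shell
# Sum the per-item seconds and print the total as minutes:seconds
awk -F'[:,]' '/^#EXTINF/ {total += $2}
              END {printf "%d:%02d\n", total / 60, total % 60}' playlist.m3u
```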
A text file contains thousands of numbers. This command prints the lines where the number is greater than or equal to a specified value (134000000).
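This is not necessarily the original command, but the idea reduces to a one-line awk filter (the threshold and filename are placeholders):

```shell
# Print lines whose first field is >= the threshold; +0 forces numeric comparison
awk -v min=134000000 '$1 + 0 >= min + 0' numbers.txt
```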
You need sysstat and gawk for this to work.
Helpful when we want to do mass file renaming (especially mp3s).
awk can clear the screen while displaying output. This is a handy way of seeing how many lines a 'tail -f' has hit, or how many files find has found. On Solaris you may have to use 'nawk', and your machine needs 'tput'.
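The idea, sketched with a raw ANSI clear sequence instead of `tput` so it works without a terminal database (the `find` producer is just an example):

```shell
# Redraw the screen with a running line count for whatever is piped in
find /etc -type f 2>/dev/null \
  | awk '{printf "\033[2J\033[H%d files found\n", NR}'
```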
This will drop you into vim to edit all files that contain your grep string.
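One common spelling of this idea (the pattern and path are placeholders; note that filenames containing whitespace will break the unquoted expansion):

```shell
# Open every file under the current tree that matches the pattern
vim $(grep -rl "pattern" .)
```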
saves one command. Needs GNU grep though :-(
Does not require input to function or complete. The number of iterations is controlled by the shell variable $NUM.