This uses Bash's "process substitution" feature to compare (using diff) the output of two different process pipelines.
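A minimal sketch of the idea (the original command isn't shown here, and the directory names are hypothetical):
diff <(ls /path/to/dir1) <(ls /path/to/dir2)
Each <(...) runs its pipeline and presents the output as a file, so diff can compare the two streams without temporary files.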
I have a bash alias for this command line and find it useful for searching C code for error messages. The -H tells grep to print the filename. You can omit the -i to match case exactly, or keep the -i for case-insensitive matching. The find command finds all .c and .h files.
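A sketch of what such an alias might wrap, assuming the search term is "error" (the actual alias isn't shown):
find . -name '*.[ch]' -exec grep -Hi 'error' {} +
The '*.[ch]' pattern matches both .c and .h files, and -exec ... {} + batches the filenames onto as few grep invocations as possible.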
Find files in a specific date range - in this case, the first half of last year.
-newermt = modification time of the file is more recent than this date
GNU find allows any date specification that GNU date would accept, e.g.
find . -type f -newermt "3 years ago" ! -newermt "2 years ago"
or
find . -type f -newermt "last monday"
This makes an alias for a command named 'busy'. The 'busy' command opens a random file in /usr/include at a random line with vim. Drop this in your .bash_aliases and make sure that file is sourced from your .bashrc.
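One way such an alias might look (a sketch, assuming GNU shuf is available; note it will trip over an empty file):
alias busy='f=$(find /usr/include -type f | shuf -n 1); vim +$((RANDOM % $(wc -l < "$f") + 1)) "$f"'
shuf -n 1 picks a random header, and vim +N opens it at line N.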
This command finds and prints all the symbolic and hard links to a file. Note that the file argument may itself be a link, and the command will find the original file as well.
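A sketch of the idea with -samefile (the filename is hypothetical; -L makes find follow symlinks, so symbolic links to the file match too):
find -L / -samefile /path/to/file -exec ls -ld {} +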
You can also do this with the inode number for a file or directory by first using stat or ls or some other tool to get the number like so:
stat -Lc %i file
or
ls -Hid file
And then using:
find -L / -inum INODE_NUMBER -exec ls -ld {} +
This command finds all files in the current dir and subdirs, and replaces all occurrences of "oldstring" in every file with "newstring".
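A typical spelling of that operation (a sketch; the original command isn't shown, and GNU sed's -i flag is assumed):
find . -type f -exec sed -i 's/oldstring/newstring/g' {} +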
This can be much faster than downloading one or both trees to a common server and comparing the files there. Afterwards, only the files that differ need to be copied down for deeper comparison.
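A sketch of the approach, combining process substitution with ssh (hostnames and paths are hypothetical):
diff <(ssh host1 'cd /tree && find . -type f -exec md5sum {} +' | sort -k2) <(ssh host2 'cd /tree && find . -type f -exec md5sum {} +' | sort -k2)
Only the checksums cross the network; files whose sums differ can then be copied down for a closer look.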
This helped me find a botnet that had made it into my system. Of course, this is not a foolproof or guaranteed way to find all of them, or even most of them. But it helped me find this one.
Search for files and list the 20 largest.
find . -type f
gives us a list of files, recursively, starting from here (.)
-print0 | xargs -0 du -h
separates the names of files with NUL characters, so we're not confused by spaces
then xargs runs the du command to find their size (in human-readable form -- 64M rather than 64123456)
| sort -hr
use sort to arrange the list in size order. sort -h knows that 1M is bigger than 9K
| head -20
finally, select only the top twenty from the list
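Putting the pieces together:
find . -type f -print0 | xargs -0 du -h | sort -hr | head -20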
This command will find the biggest files recursively under a certain directory, no matter how many there are. If you try the usual approaches ("find -type f -exec ls -laSr {} +" or "find -type f -print0 | xargs -0 ls -laSr"), the sorting can be wrong because the command-line argument limit forces ls to run in batches, each sorted separately. This command doesn't rely on command-line arguments for sorting, so it displays the sorted list correctly.
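A sketch of the technique, using find's own -printf so sorting never depends on argument lists (the original command isn't shown):
find . -type f -printf '%s\t%p\n' | sort -nr | head -20
%s prints the size in bytes and %p the path, so sort sees one line per file no matter how many files there are.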
Change the *.avi to whatever you want to match; you can remove it altogether if you want to check all files.
From the cwd, recursively find all rar files, extracting each rar into the directory where it was found, rather than the cwd. A nice time saver if you've used wget or similar to mirror something where each subdir contains a rar archive. It's likely this can be tuned to work with multi-part archives where all parts use ambiguous .rar extensions, but I didn't test this. Perhaps unrar would handle this gracefully anyway?
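One way to express this, as a sketch (-execdir runs unrar in each archive's own directory):
find . -name '*.rar' -execdir unrar x {} \;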
Run this in the directory you store your music in. mp3gain and vorbisgain apply the ReplayGain normalization routine to mp3 and ogg files (respectively) in a reversible way. ReplayGain uses psychoacoustic analysis to make all files sound about the same loudness, so you don't get knocked out of your chair by loud songs after cranking up the volume on quieter ones.
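A sketch of how the two tools might be driven from find (the original command isn't shown; mp3gain's -r applies track gain, and vorbisgain computes track gain by default):
find . -name '*.mp3' -exec mp3gain -r {} +
find . -name '*.ogg' -exec vorbisgain {} +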
Quick method of isolating filenames from a full path using parameter expansion. Much quicker than using "basename".
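A sketch (the variable name is illustrative):
f=/path/to/some/file.txt; echo "${f##*/}"
${f##*/} strips the longest leading match of */, leaving file.txt, and it runs in the shell without forking, which is why it beats basename.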
This command will give you the same list of files as "find /etc/ -name '*killall' | xargs ls -l". More simply, just do 'ls /etc/**/file'. It uses shell globbing, so it will also work with other commands, like "cp /etc/**/sshd sshd_backup".
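Note that in bash the recursive ** glob needs the globstar option (bash 4+); a sketch:
shopt -s globstar
ls -l /etc/**/*killall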
By putting the "-not \( -name .svn -prune \)" at the very front of the "find" command, you eliminate the .svn directories in the find command itself. No need to grep them out.
You can even create an alias for this command:
alias svn_find="find . -not \( -name .svn -prune \)"
Now you can do things like
svn_find -mtime -3
I used this to copy all PDFs recursively to a selected dir
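A sketch of such a copy (the target directory is hypothetical):
find . -type f -name '*.pdf' -exec cp {} /path/to/selected/dir/ \;
Beware that same-named PDFs from different subdirectories will overwrite each other at the destination.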
This command is useful for renaming clipart, a pic gallery or your photo collection. It will only change the big caps to small ones (on the extension).
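One way to do it for .JPG files, as a sketch (the original command isn't shown; a plain loop avoids the non-portable rename utility):
for f in *.JPG; do mv "$f" "${f%.JPG}.jpg"; done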
I love this function because it tells me everything I want to know about files, more than stat, more than ls. It's very useful and infinitely expandable.
find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%
00761 drwxrw---x askapache:askapache 777:666 [06/10/10 | 06/10/10 | 06/10/10] [d] /web/cg/tmp
The key is:
# -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'
which, believe it or not, took hundreds of tweaks before I was happy with the output.
You can easily use this within a function to do whatever you want. This simple function works recursively if you call it with -r as an argument, and sorts by file permissions.
lsl(){ O="-maxdepth 1"; sed -n '/-r/!Q1' <<< $@ && O=; find $PWD $O -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n' | sort -rgbS 50%; }
Personally, I'm using this variant, because it lets me choose the sort key:
lll () { local a KS="1 -r -g"; sed -n '/-sort=/!Q1' <<< $@ && KS=`sed 's/.*-sort=\(.*\)/\1/g'<<<$@`;
find $PWD -maxdepth 1 -printf '%.5m %10M %#9u:%-9g %#5U:%-5G [%AD | %TD | %CD] [%Y] %p\n'|sort -k$KS -bS 50%; }
# i can sort by user
lll -sort=3
# or sort by group reversed
lll -sort=4 -r
# and sort by modification time
lll -sort=6
If anyone wants to help me make this function handle multiple dirs/files like ls, go for it and I would appreciate it. Something very minimal would be awesome, maybe like:
for a; do lll $a; done
Note this uses the latest version of GNU find, built from source (easy to build from the GNU ftp tarball). Taken from my http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html