xargs is a more elegant approach than -exec for running a command on find's results, since -exec is meant as a filtering flag.
Uses find, plutil and xpath. Note: some applications don't carry proper information; system_profiler might be better to use, though it's a bit slow to query. Due to the command-length limit, I removed -name "*.app" and CFBundleName.
If you have a directory with a lot of backups (full backups, I mean), once it reaches a certain size you may want to free some space. With this command you'll remove half of the files. The command assumes that your backup files start with YYYYMMDD, or at least sort in alphabetical order.
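A runnable sketch of the idea (the directory and file names here are made up for the demo): sort the file names, take the first half, and delete them.

```shell
# Demo on a throwaway directory; point it at your real backup dir instead.
demo=$(mktemp -d)
touch "$demo/20230101.tar" "$demo/20230201.tar" "$demo/20230301.tar" "$demo/20230401.tar"

total=$(ls "$demo" | wc -l)                  # number of backup files
ls "$demo" | sort | head -n $((total / 2)) | # oldest half, by name order
  while IFS= read -r f; do
    rm -- "$demo/$f"                         # delete each of them
  done

ls "$demo"    # only the newer half remains
```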
If /home/sonic/archive/ were a symlink to /backup/sonic/archive, this would follow the link and give you the file listing. By default find will NOT follow symbolic links; its default behavior is to treat symlinks as literal files. I discovered this when trying to write a script, run via cron, to delete files with a modification time older than X days. The easiest solution was to use: /usr/bin/find -L /home/sonic/archive -name '*gz' -type f -mtime +14 -exec rm '{}' \;
I _think_ you were trying to delete files whether or not they had spaces. This would do that. You should probably be more specific though.
Find all files larger than 500M in your home directory and print them ordered by size, with full info about each file.
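The size-ordered listing can be sketched with GNU find's -size and ls -S. It's demonstrated here on a throwaway directory with a 500 KB threshold so it runs instantly; swap in "$HOME" and +500M for the real thing.

```shell
d=$(mktemp -d)
dd if=/dev/zero of="$d/big.bin"   bs=1024 count=600 2>/dev/null   # ~600 KB
dd if=/dev/zero of="$d/small.bin" bs=1024 count=10  2>/dev/null   # ~10 KB

# -size +500k: bigger than 500 KB; ls -lhS: long listing, largest first
find "$d" -type f -size +500k -exec ls -lhS {} +
```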
This is just another example of what the nocache package is useful for; I described it in http://www.commandlinefu.com/commands/view/12357/. It provides the commands
nocache <command to run with page cache disabled>
cachedel <single file to remove from page cache>
cachestats <single file> # to get the current cache state
Often, we do not want to disable caching, because several file reads are involved in a command and operations would be slowed down a lot, due to massive disk seeks. But after our operations, the file sits in the cache needlessly, if we know we're very likely never touching it again.
cachedel helps to reduce cache pollution, i.e. without it, frequently required files relevant for desktop interaction (libs/configs/etc.) would be pushed out of RAM.
So we can run cachedel after each data intensive job. Today I run commands like these:
<compile job> && find . -type f -exec cachedel '{}' \; &> /dev/null # no need to keep all source code and tmp files in memory
sudo apt-get dist-upgrade && find /var/cache/apt/archives/ -type f -exec cachedel '{}' \; # Debian/*buntu system upgrade
dropbox status | grep -Fi idle && find ~/Dropbox -type f -exec cachedel '{}' \; &> /dev/null # if Dropbox is idle, remove sync'ed files from cache
https://github.com/Feh/nocache
http://packages.debian.org/search?keywords=nocache
http://packages.ubuntu.com/search?keywords=nocache
http://askubuntu.com/questions/122857
Add `-maxdepth 1` to find to exclude subfolders.
Btrfs reports the inode numbers of files with failed checksums. Use `find` to look up the file names of those inodes.
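A sketch of the inode-to-path lookup. The inode here comes from a freshly created file, standing in for one reported by btrfs; stat -c is the GNU spelling (BSD stat uses -f %i).

```shell
d=$(mktemp -d)
touch "$d/flagged.txt"
inum=$(stat -c %i "$d/flagged.txt")   # inode number (as btrfs would report)
find "$d" -inum "$inum"               # prints the path owning that inode
```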
Finds all .nfo files except those named movie.nfo and deletes them.
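The keep-movie.nfo cleanup can be sketched like this (GNU find's -delete assumed; the demo files are made up):

```shell
d=$(mktemp -d)
touch "$d/movie.nfo" "$d/stale.nfo" "$d/other.nfo"

# every *.nfo except a file literally named movie.nfo
find "$d" -name '*.nfo' ! -name movie.nfo -delete

ls "$d"    # only movie.nfo is left
```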
Sometimes you just want to operate on files that were created after a specific date. This recipe consists of 3 commands:
- Create a dummy file with the custom date.
- Find all files with a "creation time" later than our custom date, using find's -newer option; add your crazy stuff here, like moving, deleting, printing, etc.
- Remove the dummy file.
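The three steps can be sketched as follows. The cutoff date 2024-01-15 and the file names are made up for the demo; note that -newer actually compares modification time.

```shell
marker=$(mktemp)
touch -t 202401150000 "$marker"              # 1. dummy file with the custom date

d=$(mktemp -d)
touch -t 202401010000 "$d/old.txt"           # modified before the cutoff
touch "$d/new.txt"                           # modified now, after the cutoff

newer=$(find "$d" -type f -newer "$marker")  # 2. files newer than the dummy
echo "$newer"                                #    (do your moving/deleting here)

rm -f "$marker"                              # 3. remove the dummy file
```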
For multilingual filenames (e.g. Japanese, Chinese, etc.).
Variant of find + grep that ignores files with .svn in their path. Useful for searching through a local repository of source code.
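One way to write such a variant with -prune (the search string "TODO" and the paths here are placeholders):

```shell
d=$(mktemp -d)
mkdir -p "$d/src/.svn"
echo 'TODO: fix'   > "$d/src/main.c"
echo 'TODO: stale' > "$d/src/.svn/entries"

# -prune skips .svn directories entirely; grep -l prints matching file names
find "$d" -name .svn -prune -o -type f -print0 | xargs -0 grep -l TODO
```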
Run as root and use it to find the file you're looking for.
For a Python project, I sometimes need to clean out all the compiled Python files. I have an alias 'rmpyc' for this command. It really saves me a lot of typing and hunting through folders to delete those files.
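A sketch of what such an alias might run ('rmpyc' is the post's name for it; GNU find's -delete assumed), demonstrated on a scratch tree:

```shell
d=$(mktemp -d)
mkdir -p "$d/pkg/__pycache__"
touch "$d/main.pyc" "$d/pkg/__pycache__/mod.cpython-312.pyc" "$d/pkg/mod.py"

# delete every compiled file, leave the .py sources alone
find "$d" -name '*.pyc' -delete

find "$d" -name '*.pyc'    # nothing left
```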
For quick validation of a folder's file contents (structure not taken into account) - I use it mostly to check whether two folders' contents are the same.
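One way to fingerprint a folder's contents (my own sketch, not necessarily the posted command): hash every file, sort the hashes so the directory structure drops out, and hash the result. Two folders with the same file contents get the same fingerprint.

```shell
fingerprint() {
  # content hashes only, sorted, then hashed again into one line
  find "$1" -type f -exec md5sum {} + | awk '{print $1}' | sort | md5sum
}

a=$(mktemp -d); echo hello > "$a/x.txt"
b=$(mktemp -d); mkdir "$b/sub"; echo hello > "$b/sub/renamed.txt"

fingerprint "$a"
fingerprint "$b"    # same output: identical contents, different layout
```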
Found here: http://xentek.net/xentek/315/recursively-delete-svn-folders/ This is fast and efficient because rm is only run once.
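The pattern can be sketched like this on a scratch tree: -print0 | xargs -0 keeps odd directory names safe, and xargs batches all the .svn directories into as few rm invocations as possible.

```shell
d=$(mktemp -d)
mkdir -p "$d/proj/.svn" "$d/proj/lib/.svn"
touch "$d/proj/.svn/entries"

# collect every .svn directory and hand the whole batch to rm -rf
find "$d" -type d -name .svn -print0 | xargs -0 rm -rf

find "$d" -type d -name .svn    # nothing left
```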
You can use this command to delete the CVS/svn folders in a given project.
"." is current dir, maxdepth is the level, -print0 | xargs -0 fix spaces in names, -i interactive , ./ is the current dir {} actual name , and {,.bak} is the atual name + bak
Run this in your music folder, or give the path directly after "find". The sed pattern filters away the basename.
Ever wanted to find the most recently modified files, but couldn't remember exactly where they were in a project directory with many subdirectories? The "find" command, using a combination of "-mtime -N" and "-depth -D", can be used to find those files. If your directory structure isn't very deep, just omit "-depth -D"; if it is very deep, you can limit the depth of the traversal with "-depth -D", where "D" is the maximum number of directory levels to descend.
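With GNU find the depth limit is spelled -maxdepth N rather than -depth -D (BSD find does accept a -depth n primary); a sketch under the GNU spelling, on a scratch tree:

```shell
d=$(mktemp -d)
mkdir -p "$d/a/b/c"
touch "$d/top.txt" "$d/a/mid.txt" "$d/a/b/c/deep.txt"

# files modified within the last day, descending at most 2 directory levels
find "$d" -maxdepth 2 -mtime -1 -type f
```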