Returns any file in the folder that would be rejected by Gmail if you were to send a zipped version. (Yes, you could just zip it, knock the extension off, and put it back on at the other side, but for some people that just isn't a solution.)
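A minimal sketch of the idea. Gmail blocks attachments with executable extensions even inside zip files; the extension list below is a partial, illustrative blocklist, not Gmail's exact one:

```shell
# Set up a sample folder (for demonstration only)
dir=$(mktemp -d)
touch "$dir/report.pdf" "$dir/setup.exe" "$dir/tool.bat"

# List files Gmail would likely reject even inside a zip;
# the extension list here is illustrative, not exhaustive
find "$dir" -type f \( -iname '*.exe' -o -iname '*.bat' \
    -o -iname '*.js' -o -iname '*.msi' \)
```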
Shows the largest file, with its size in human-readable form.
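One way to sketch this, assuming GNU du and sort (the `-h` flags are GNU extensions):

```shell
# Sample files (for demonstration only)
dir=$(mktemp -d)
head -c 100000 /dev/zero > "$dir/big.txt"
echo small > "$dir/small.txt"

# Human-readable sizes, sorted largest first, keep the top entry
biggest=$(find "$dir" -type f -exec du -h {} + | sort -rh | head -n 1)
echo "$biggest"
```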
Takes a folder full of files and splits it into smaller folders containing a maximum number of files; in this case, 100 files per directory. find creates the list of files; xargs breaks the list into groups of 100; for each group, a directory is created and the files are copied in. Note: this command won't work if there is whitespace in the filenames (but then again, neither do the alternative commands :-)
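The steps above can be sketched like this (directory names are illustrative; as noted, it assumes no whitespace in filenames):

```shell
# Sample source with 250 files (for demonstration only)
src=$(mktemp -d); dst=$(mktemp -d)
for n in $(seq 1 250); do touch "$src/file$n"; done

# find lists the files; xargs (with no command, i.e. echo)
# emits them 100 per line; each line becomes one directory
i=0
find "$src" -maxdepth 1 -type f | xargs -n 100 | while read -r group; do
  i=$((i+1))
  mkdir -p "$dst/dir_$i"
  cp $group "$dst/dir_$i/"   # unquoted on purpose: split on spaces
done
```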
One of my friends committed his code in GB2312 encoding, which broke the build job. I had to find his files and convert them.
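A hedged sketch of the find-and-convert step: use iconv to detect files that are not valid UTF-8, then convert those from GB2312 in place (the sample file content is hypothetical):

```shell
# Sample files (for demonstration only)
dir=$(mktemp -d)
printf '\xc4\xe3\xba\xc3\n' > "$dir/greeting.txt"   # "你好" in GB2312
echo 'plain ascii' > "$dir/ok.txt"

# Files that fail UTF-8 validation get converted from GB2312
find "$dir" -name '*.txt' | while read -r f; do
  if ! iconv -f UTF-8 -t UTF-8 "$f" >/dev/null 2>&1; then
    iconv -f GB2312 -t UTF-8 "$f" > "$f.tmp" && mv "$f.tmp" "$f"
  fi
done
```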
Deletes Capistrano-style release directories (except that here there are dashes between the YYYY, MM, and DD).
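One way this could look, assuming GNU find's `-regextype` extension (the directory layout is illustrative):

```shell
# Sample release tree (for demonstration only)
base=$(mktemp -d)
mkdir -p "$base/releases/2023-01-15" "$base/releases/2023-02-20" \
         "$base/releases/current"

# Delete directories named like YYYY-MM-DD, leave everything else
find "$base/releases" -maxdepth 1 -type d \
  -regextype posix-extended -regex '.*/[0-9]{4}-[0-9]{2}-[0-9]{2}' \
  -exec rm -rf {} +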
There's nothing particularly novel about this combination of find, grep, and wc; I'm just putting it here in case I want it again.
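The combination presumably looks something like this (pattern and file glob are illustrative):

```shell
# Sample files (for demonstration only)
dir=$(mktemp -d)
printf 'TODO: fix\nok\n' > "$dir/a.c"
printf 'TODO: later\nTODO: soon\n' > "$dir/b.c"

# Count matching lines across all .c files;
# grep -h suppresses filenames so wc counts cleanly
count=$(find "$dir" -name '*.c' -exec grep -h 'TODO' {} + | wc -l)
echo "$count"
```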
Sorts by time and reverses to get ascending order, displays a type marker next to each file, filters out directories, and selects only one result.
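A sketch of that pipeline; `ls -F` appends `/` to directories, which is what lets grep negate them:

```shell
# Sample directory (for demonstration only)
dir=$(mktemp -d)
touch "$dir/old"; mkdir "$dir/sub"; sleep 1; touch "$dir/new"

# -t sorts newest first, -r reverses to ascending,
# -F marks dirs with a trailing /, grep -v drops them,
# tail -1 keeps the most recently modified file
newest=$(cd "$dir" && ls -trF | grep -v '/$' | tail -n 1)
echo "$newest"
```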
Calls grep on all non-binary files returned by find in the current working directory.
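A sketch using grep's `-I` flag, which makes grep skip binary files:

```shell
# Sample files (for demonstration only)
dir=$(mktemp -d)
echo 'needle here' > "$dir/text.txt"
printf 'needle\0binary' > "$dir/bin.dat"   # NUL byte => binary

# -I skips binary files, -l prints only matching file names
matches=$(find "$dir" -type f -exec grep -Il 'needle' {} +)
echo "$matches"
```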
Needs a grep that supports '--recursive'.
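With such a grep (GNU grep and BSD grep both have it), no find is needed at all:

```shell
# Sample tree (for demonstration only)
dir=$(mktemp -d)
mkdir "$dir/sub"
echo 'needle' > "$dir/sub/f.txt"

# -r (--recursive) descends into the directory itself
grep -r 'needle' "$dir"
```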
These should be a little faster since they don't have to spawn grep.
Also shows files as they are found. Only works from a tty.
xargs is a more elegant approach than -exec for executing a command on find results, since -exec was originally intended as a filtering flag.
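A common form of that pairing; `-print0`/`-0` keep it safe for filenames with whitespace, and xargs batches many files into one invocation instead of one per file:

```shell
# Sample files (for demonstration only)
dir=$(mktemp -d)
touch "$dir/a.log" "$dir/b.log" "$dir/keep.txt"

# Delete all .log files in one batched rm call
find "$dir" -name '*.log' -print0 | xargs -0 rm -f
```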
If /home/sonic/archive/ were a symlink to /backup/sonic/archive, this would follow the link and give you the file listing. By default find will NOT follow symbolic links; it treats them as literal files. I discovered this when trying to write a cron script to delete files with a modification time older than X days. The easiest solution was to use: /usr/bin/find -L /home/sonic/archive -name '*gz' -type f -mtime +14 -exec rm '{}' \;
Finds all files larger than 500M in the home directory and prints them ordered by size, with full info about each file.
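A sketch of the approach, with the size threshold scaled down so it is cheap to demonstrate; swap in `-size +500M` and `~` for the real case:

```shell
# Sample files (threshold here is +1k, purely for demonstration)
dir=$(mktemp -d)
head -c 4096 /dev/zero > "$dir/big"
head -c 10 /dev/zero > "$dir/small"

# Full ls -l info for each match, sorted by size (column 5), largest first
out=$(find "$dir" -type f -size +1k -exec ls -l {} + | sort -k5 -rn)
echo "$out"
```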
Btrfs reports the inode numbers of files with failed checksums. Use `find` to look up the file names for those inodes.
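The lookup itself uses find's `-inum` test; here the inode number is taken from a sample file rather than from a btrfs error message:

```shell
# Sample file standing in for one flagged by btrfs (demonstration only)
dir=$(mktemp -d)
touch "$dir/suspect"
inode=$(ls -i "$dir/suspect" | awk '{print $1}')

# Translate the inode number back to a path; in the real case you
# would run this against the filesystem's mount point
find "$dir" -inum "$inode"
```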
Finds and replaces specific characters in a single line in multiple files with sed.
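A sketch assuming GNU sed (the in-place `-i` flag behaves slightly differently on BSD sed); the pattern and file glob are illustrative:

```shell
# Sample files (for demonstration only)
dir=$(mktemp -d)
echo 'color colour' > "$dir/a.txt"
echo 'colour' > "$dir/b.txt"

# Replace "colour" with "color" in every .txt file, in place
find "$dir" -name '*.txt' -exec sed -i 's/colour/color/g' {} +
```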
Ever wanted to find the most recently modified files, but couldn't remember exactly where they were in a project directory with many subdirectories? The "find" command, using a combination of "-mtime -N" and "-maxdepth D", can be used to find those files. If your directory structure isn't very deep, just omit "-maxdepth D"; but if your directory structure is very deep, you can limit the depth of the traversal using "-maxdepth D", where "D" is the maximum number of directory levels to descend.
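For example (directory names are illustrative), finding files modified in the last day while descending at most two levels:

```shell
# Sample tree (for demonstration only)
dir=$(mktemp -d)
mkdir -p "$dir/a/b/c"
touch "$dir/top.txt" "$dir/a/b/c/deep.txt"

# -mtime -1: modified within the last day;
# -maxdepth 2: don't descend more than two levels
find "$dir" -maxdepth 2 -type f -mtime -1
```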
This script lists all the files in the tarballs present in any folder or subfolder of the provided path. The while loop echoes the file name of each tarball before listing its contents, so the tarball can be identified.
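A sketch of that loop (the header format around each tarball name is illustrative):

```shell
# Sample tarball (for demonstration only)
dir=$(mktemp -d)
echo hello > "$dir/inner.txt"
tar -cf "$dir/a.tar" -C "$dir" inner.txt

# Echo each tarball's name, then list its contents
out=$(find "$dir" -name '*.tar' | while read -r tarball; do
  echo "== $tarball =="
  tar -tf "$tarball"
done)
echo "$out"
```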