function map-files() { find "$1" -name "$2" -exec "${@:3}" {} \; }

map a command over a list of files - map-files /lib '*.so' ls -la (quote the pattern so the calling shell doesn't expand it before find does)


gml · 2011-04-01 20:20:44 0
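
For illustration, a couple of hypothetical invocations (the paths and patterns below are my examples, not from the original post):

    map-files /lib '*.so' ls -la           # long-list every shared library under /lib
    map-files ~/src '*.c' grep -l 'TODO'   # list the .c files under ~/src that mention TODO

Each call expands to a plain find, e.g. find ~/src -name '*.c' -exec grep -l 'TODO' {} \;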

These Might Interest You

  • A quick find command that identifies all TAR files in a given path, lists the files contained within each tar, then searches that listing for a given string. The output is the list of TAR files found (enclosed in []) followed by any matching files inside each archive. TAR can easily be swapped for JAR if required.


    1
    find . -type f -name "*.tar" -printf "[%f]\n" -exec tar -tf {} \; | grep -iE "[\[]|<filename>"
    andrewtayloruk · 2011-01-06 13:01:38 0
  • Some source packages have many 'README'-style files among many other regular files and directories. This command is useful when you want to list only those 'README'-style files amid the jungle of other files. (e.g. I came across this situation after downloading the source for module-init-tools.) Warning: this command would miss a file like README.1 (or one with spaces in between). Corrections welcome.


    0
    ls | grep '^[A-Z0-9]*$'
    b_t · 2010-12-19 21:45:53 1
  • Take a folder full of files and split it into smaller folders containing a maximum number of files; in this case, 100 files per directory. find creates the list of files, xargs breaks the list into groups of 100, and for each group we create a directory and copy in the files. Note: this command won't work if there is whitespace in the filenames (but then again, neither do the alternative commands :-) A whitespace-safe sketch appears after this list.


    -1
    find . -type f | xargs -n100 | while read l; do mkdir $((++f)); cp $l $f; done
    flatcap · 2011-02-15 23:15:16 1
  • This command will list files and folders whose size is in gigabytes. Replace G with M to get sizes in megabytes.


    0
    du -h | grep -P "^\S*G"
    girish_patel · 2010-12-13 15:26:31 0
  • Say you've just found all the config files with this command: find . -name '*.config' and you need to edit them all. vi `!!` will re-execute the command and present the results to vi in the argument list. Don't use this if the list is really long, as it may overflow the command buffer.


    0
    vi `!!`
    libdave · 2009-07-15 15:20:58 2
  • Often you run a command, but afterwards you're not quite sure what it did. By adding this prefix/suffix around [COMMAND], you can list any files that were modified. First take a nanosecond timestamp (YYYY-MM-DD HH:MM:SS.NNNNNNNNN): date "+%F %T.%N". Then find any files that have been modified since that timestamp: find . -newermt "$D". This command only searches below the current directory; if you want to look elsewhere, change the find parameter, e.g. find /var/log . -newermt "$D"


    2
    D="$(date "+%F %T.%N")"; [COMMAND]; find . -newermt "$D"
    flatcap · 2015-10-15 21:09:54 2
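
As promised above, here is a minimal whitespace-safe sketch of the split-into-folders idea, assuming bash and GNU find; the -print0/read -d '' plumbing and the counters n and f are my additions, not part of the original submission:

    # split the files in the current directory into numbered
    # subdirectories of at most 100 files each, surviving
    # whitespace in filenames
    n=0; f=0
    find . -maxdepth 1 -type f -print0 |
    while IFS= read -r -d '' file; do
        [ $((n % 100)) -eq 0 ] && mkdir "$((++f))"
        cp -- "$file" "$f/"
        n=$((n+1))
    done

Compared with the original one-liner, -print0 and read -d '' keep filenames with spaces intact, at the cost of one cp call per file instead of one per group of 100.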

What Others Think

Huh. Would have never thought of that.
kaedenn · 372 weeks and 2 days ago
