find . -name "\.svn" -exec rm -rf {} ";"

Remove all .svn folders inside a folder


-1
By: caiosba
2009-05-14 12:02:25
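
The comments below suggest a few refinements to the command above: restrict the match to directories with -type d, prune matched directories so find does not try to descend into trees it has just deleted, and batch the deletions with + instead of ";". A combined sketch (untested, not the original submission):

    # sketch: match only directories named .svn, stop find from descending
    # into them, and pass many paths to each rm invocation
    find . -type d -name .svn -prune -exec rm -rf {} +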

These Might Interest You

  • If you have a folder with thousands of files and want to split them across folders holding only 100 files each, run this. It will create 0/, 1/, etc. and put 100 files inside each one. Note that find returns true even when it finds nothing left to move, so the loop does not stop on its own (a plain-shell sketch appears after this list).


    -1
    folder=0;mkdir $folder; while find -maxdepth 1 -type f -exec mv "{}" $folder \; -quit ; do if [ $( ls $folder | wc -l ) -ge 100 ]; then folder=$(( $folder + 1 )); mkdir $folder; fi ; done
    Juluan · 2011-02-11 21:28:01 0
  • Problem: I wanted to back up user data individually. In this example, all user data is located in "/mnt/storage/profiles", with about 25 folders inside, each named after a user ( /mnt/storage/profiles/mike; /mnt/storage/profiles/lucy ...). I need each individual folder backed up, not the whole "/mnt/storage/profiles". So, using find limited to one directory level and two variables (tarfile=username & destdir=destination), tar creates a .tgz file for each folder, resulting in "mike_full.tgz" and "lucy_full.tgz".


    1
    find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read d; do tarfile=`echo "$d" | cut -d "/" -f5`; destdir="/local/backupdir/"; tar -czf $destdir/"$tarfile"_full.tgz -P $d; done
    jaimerosario · 2013-12-05 19:07:17 0
  • Problem: I wanted to back up user data individually, using an incremental method. In this example, all user data is located in "/mnt/storage/profiles", with about 25 folders inside, each named after a user ( /mnt/storage/profiles/mike; /mnt/storage/profiles/lucy ...). I need each individual folder backed up, not the whole "/mnt/storage/profiles". So, using find limited to one directory level and two variables (tarfile=username & destdir=destination), tar creates a dated .tgz file for each folder, resulting in "mike_2013-12-05.tgz" and "lucy_2013-12-05.tgz" (a basename-based variant is sketched after this list).


    1
    find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read d; do tarfile=`echo "$d" | cut -d "/" -f5`; destdir="/local/backupdir"; tar -czvf "$destdir"/"$tarfile"_`date +%F`.tgz -P $d; done
    jaimerosario · 2013-12-05 19:18:03 0
  • Removes all node_modules folders recursively, making sure nested node_modules directories are not skipped. Credit: https://coderwall.com/p/guqrca/remove-all-node_module-folders-recursively


    0
    find . -name "node_modules" -type d -prune -exec rm -rf '{}' +
    ctcrnitv · 2018-02-10 04:23:36 0
  • For quick validation of a folder's file contents (directory structure is not taken into account) - I use it mostly to check whether two folders' contents are the same.


    -2
    find path/to/folder/ -type f -print0 | xargs -0 -n 1 md5sum | awk '{print $1}' | sort | md5sum | awk '{print $1}'
    mcover · 2009-02-16 19:39:37 3
  • This will move a folder and merge it with another folder which may contain duplicates. Technically it just creates hardlinks of everything in the folder; after it is done, delete the source (with rm -r source/ ) to complete the move. This is much faster than, for example, using rsync to merge the folders, which would actually copy the entire contents and, for a lot of files, take much longer. This uses the macutils gcp port of cp so it can be used on OS X/macOS. On Linux, or any Unix where cp can create links with -l, you can just use cp instead of gcp.


    2
    gcp -r -l source/ destination/
    fivestones · 2017-02-09 23:48:38 0
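
The file-splitting one-liner in the first item does the job, but as its description notes, find exits successfully even when nothing is left to move, so the loop never stops on its own. A plain-shell sketch of the same idea (untested; the 100-files-per-folder limit and the numbered 0/, 1/, ... folders are taken from the description):

    # sketch: split the files in the current directory into numbered
    # subdirectories holding at most 100 files each
    folder=0; count=0
    mkdir "$folder"
    for f in ./*; do
        [ -f "$f" ] || continue          # skip the numbered subdirectories
        if [ "$count" -ge 100 ]; then
            folder=$((folder + 1)); count=0
            mkdir "$folder"
        fi
        mv "$f" "$folder"/ && count=$((count + 1))
    done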

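The two per-user backup loops rely on cut -d "/" -f5, which only yields the username when it is the fifth "/"-separated field of the path, i.e. when the profiles really live at /mnt/storage/profiles/<user>. A variant using basename (a sketch, not the original submission) avoids that assumption:

    # sketch: per-user dated tar backup, deriving the archive name with basename
    find /mnt/storage/profiles/ -maxdepth 1 -mindepth 1 -type d | while read -r d; do
        tarfile=$(basename "$d")
        destdir="/local/backupdir"
        tar -czf "$destdir/${tarfile}_$(date +%F).tgz" -P "$d"
    done
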
What Others Think

Looking for files called .svn? ;-) find . -name "*\.svn" -exec rm -rf {} ";" might be better. Much better would be: find . -name "*\.svn" -exec echo {} +
Why? Performance!
    for n in `seq 1 1 1000`; do touch $n.svn; done
    time find . -name "*\.svn" -exec echo {} \; 1>/dev/null
    real 0m2.629s  user 0m0.970s  sys 0m1.655s
    time find . -name "*\.svn" -exec echo {} + 1>/dev/null
    real 0m0.020s  user 0m0.012s  sys 0m0.007s
OJM · 471 weeks ago
What about find . -name ".svn" -type d | xargs rm -rf ? -type d tells find to only find dirs, and xargs starts only one rm subprocess, not one rm subprocess for each found dir like find's -exec option does. So it is much more efficient.
Marco · 471 weeks ago
Also... If you're using find to remove directories, use the -depth option (search deep first). find . -depth -type d -name .svn -exec rm {} \; This will prevent errors from find trying to descend into directories that it's just deleted.
flatcap · 471 weeks ago
@OJM: the directories are called ".svn", it's not a file extension but a hidden directory. Agreed on the + vs "\;" suggestion
karel1980 · 459 weeks and 3 days ago
