Delete all files and folders except one file/dir

ls -R | grep -v skipme | xargs rm -Rf
This command will delete all files and folders except 'skipme', which can be either a file or a directory.
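A safer habit with a pipeline like this is to run the non-destructive front of it first and inspect what would be handed to rm before appending the xargs stage. A minimal sketch in a throwaway directory (the file names skipme, delme1, delme2 are hypothetical):

```shell
# Preview what the pipeline would feed to rm before making it destructive.
demo=$(mktemp -d)
cd "$demo"
touch skipme delme1 delme2

ls -R | grep -v skipme   # lists delme1 and delme2, but not skipme
```

Note that even the preview exposes a wrinkle: ls -R emits directory headers such as ".:" that would also be fed to rm.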

2013-10-18 08:11:39

What Others Think

this is extremely unsafe
malathion · 414 weeks and 4 days ago
I learned from this site that you can just do: rm -f !(survivor.txt) and survivor.txt won't be deleted. It also works with wildcards: rm !(*.foo|*.bar|*.baz). While your command may not be the most elegant way to exclude files from deletion, it still works, is easy to remember, and I don't consider it any more unsafe than the syntax above. I also like that you can run the command without xargs to see what it will delete before piping it to xargs.
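The !(...) form sonic mentions is a bash "extglob" pattern, so it needs bash with the extglob option enabled (shopt -s extglob, or -O extglob at startup). A minimal sketch in a throwaway directory (file names hypothetical):

```shell
demo=$(mktemp -d)
cd "$demo"
touch survivor.txt doomed1 doomed2

# !(pattern) is a bash extglob; -O extglob enables the option at startup,
# which avoids the parse-order gotcha of setting it inside the same -c string.
bash -O extglob -c 'rm -f !(survivor.txt)'

ls   # only survivor.txt remains
```

Note that plain rm (without -r) leaves directories alone, and the glob does not match dot files.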
sonic · 414 weeks and 4 days ago
malathion is right. Not only is it unsafe, it is ERRONEOUS: grep -v skipme does not match only once (you'd need the option -m 1, and even then there are potential problems - is it the correct file to skip?). Sorry, but that's the truth. I'll elaborate:

1. Erroneous: if the goal is to skip a single file or directory (and only a single one), it is a failure. grep -v filters out every line matching the pattern, not just the exact name: 'skipme' also matches donotskipme and skipmeplease, among any other line containing skipme.

2. The find command would be one better option, e.g. \! -name 'skipme' with -exec, -execdir, -ok, or -okdir (each executing rm, or otherwise passing to xargs - in that case give -print0 to find and -0 to xargs); exercise for the reader to check the man page if they don't know them. Otherwise, you could use -delete if you aren't (for example) enabling depth-first traversal (which some options imply, e.g. -prune). See also, among others, the -path option. But see 3 below.

3. As a general rule, running any command that acts on more than one file without knowing for sure which files will be acted upon is really dangerous (or foolish, if you want to go that way) - and with the given pipeline it is entirely impossible to know for 100% sure that you're skipping only the one file, and skipping is the goal, right? I'll also remind people that file names can contain newlines, spaces and other characters, which can make things more dangerous depending on what you're doing (hint: mode 777 on tmp - regardless of its sticky bit - has messed up administrators in ugly ways because of such devious names; temporary files in general are a security problem when much care is not taken).
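A sketch of the find-based route sigmet describes, using -prune so that a skipme entry (file or directory) is skipped along with its contents, and -print0/-0 so names containing spaces or newlines are passed safely (the demo names are hypothetical):

```shell
demo=$(mktemp -d)
cd "$demo"
mkdir skipme
touch skipme/precious.txt file1 file2

# Prune the skipme subtree, NUL-delimit everything else, and remove it.
find . -mindepth 1 -name skipme -prune -o -print0 | xargs -0 rm -rf

ls   # only skipme remains, contents intact
```

Using -mindepth 1 keeps "." itself out of the list; a plain \! -name 'skipme' test without -prune would still descend into a skipme directory and delete its contents.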
sigmet · 413 weeks and 5 days ago
The above is true, but as I mentioned, you can look at the output first and then decide whether or not to pipe it to xargs. It never really crossed my mind that some people might not know how a regex works and that grep uses one. I agree that this is dangerous because of #2 mentioned above. Playing with xargs is still fun, but the find command works best for this sort of thing. It's a little more of a pain to memorize the entire command, but well worth your time.
sonic · 413 weeks and 1 day ago
Late response, but I have a lot going on (and I don't spend much time on the web). Still, the end of my post gives another example of why using grep to find a file name (rather than file contents) is a bad idea. The problem is not that some people might not know how grep works; the problem is that grep is not a good way to find files, period, unless you're searching for content inside them. As you point out, the reason is simple: grep is for regular expressions. A file called 'skipme' is a single file, but piping the output of ls to grep -v gives no guarantee that only one file matches the pattern. The most important point, however, is this: file globbing is NOT the same thing as a regular expression, as shown in the example below:

echo '/etc/sysconfig/network' > testglob
grep testglob /etc/sysconfig/network
ls /etc/
ls: cannot access /etc/ No such file or directory

They're similar but not the same, and it's dangerous to treat them as equal (as it is with any two different-but-similar things). And yes, it's worth your time to learn the commands in full (unless of course you're afraid of the command prompt and its features, the way some are afraid of using > instead of >>, which is ridiculous if you know what you are doing). I would also add that xargs is still useful with find - quite useful, even; xargs is designed to work with find (example: find's -print0, or -printf with '\0', pairs with the xargs option -0). xargs is also used for command lines that would otherwise be too long (with the associated risks). Anyway, as I wrote, I rarely post or even check here, but I wanted to clarify these points (or at least the point that file globbing and regular expressions are different), because it is quite an important thing to consider.
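The glob-vs-regex difference can be shown very concretely: in a regular expression, '.' matches any single character, so a file name used verbatim as a grep pattern can match names it was never meant to (example file names here are hypothetical):

```shell
demo=$(mktemp -d)
cd "$demo"
touch a.c abc

ls a.c            # as a shell glob/literal argument: exactly one file, a.c
ls | grep 'a.c'   # as a regex: the '.' is a wildcard, so abc matches too
```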
sigmet · 410 weeks and 3 days ago

