
What's this?

commandlinefu.com is the place to record those command-line gems that you return to again and again.

Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.


If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/

Get involved!

You can sign in using OpenID credentials, or register a traditional username and password.

First-time OpenID users will be automatically assigned a username which can be changed after signing in.


Stay in the loop…

Follow the Tweets.

Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.

» http://twitter.com/commandlinefu
» http://twitter.com/commandlinefu3
» http://twitter.com/commandlinefu10

Subscribe to the feeds.

Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):


News

2011-03-12 - Confoo 2011 presentation
Slides are available from the commandlinefu presentation at Confoo 2011: http://presentations.codeinthehole.com/confoo2011/
2011-01-04 - Moderation now required for new commands
To try to put an end to the spamming, new commands now require moderation before they appear on the site.
2010-12-27 - Apologies for not banning the trolls sooner
Have been away from the interwebs over Christmas. Will be more vigilant henceforth.
2010-09-24 - OAuth and pagination problems fixed
Apologies for the delay in getting Twitter's OAuth supported. Annoying pagination gremlin also fixed.

Delete all files and folders except one file/dir

ls -R | grep -v skipme | xargs rm -Rf
2013-10-18 08:11:39
Functions: grep ls rm xargs
Score: -11

This command deletes all files and folders except 'skipme', which can be either a file or a folder.
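To see what the pipeline would do before letting it delete anything, it can be run without the xargs stage in a throwaway directory - a sketch only, with file names invented for illustration:

```shell
# Build a scratch directory with a 'skipme' folder and some other files.
tmp=$(mktemp -d)
cd "$tmp"
mkdir skipme
touch skipme/config keepme.txt notes.md

# Preview only: list what would be fed to "xargs rm -Rf".
# Any line containing "skipme" is filtered out; the rest would be deleted.
ls -R | grep -v skipme
```

Note that the preview also hints at a deeper problem: `ls -R` prints bare names under per-directory headers, so the paths handed to rm are not necessarily valid from the current directory.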

Alternatives

There are 2 alternatives - vote for the best!


What others think

this is extremely unsafe

Comment by malathion 26 weeks and 6 days ago

I learned from this site that you can just do:

rm -f !(survivor.txt)

and survivor.txt won't be deleted. It also works with wildcards:

rm !(*.foo|*.bar|*.baz)
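One caveat worth adding (not in the original comment): `!( )` is bash's extended globbing, which may be off by default and need enabling first. A minimal sketch in a scratch directory, with invented file names:

```shell
# Extended globbing is a bash feature; enable it before using !() patterns.
shopt -s extglob

# Scratch directory with one file to keep and two to delete.
tmp=$(mktemp -d)
cd "$tmp"
touch survivor.txt a.log b.log

# Delete everything in the current directory except survivor.txt.
rm -f !(survivor.txt)
ls
```

In an interactive shell extglob is often already on; in a script the `shopt` line must come before any line that uses the `!( )` syntax, since the pattern is rejected at parse time otherwise.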

While your command may not be the most elegant way to exclude files from deletion, it still works, is easy to remember, and I don't consider it to be any more unsafe than using the syntax above.

I also like the fact that you can run the command without xargs to see what it will delete before piping it to xargs.

Comment by sonic 26 weeks and 6 days ago

malathion is right. Not only is it unsafe, it is ERRONEOUS (due to ignorance, or the assumption that grep -v skipme will only match once - you'd need the -m 1 option, and even that leaves potential problems: is it the correct file being skipped?). Sorry, but that's the truth. I'll elaborate:

1. Erroneous: if the goal is to skip a single file or directory (and only that one), the command fails. grep -v does not skip the pattern alone, by itself; it skips every line containing it, so 'skipme' will also match 'donotskipme' and 'skipmeplease', among any other name containing 'skipme'.

2. The find command would be one better option, e.g. the following (but see 3 below): \! -name 'skipme' with -exec, -execdir, -ok or -okdir (all executing rm, or otherwise passing to xargs - in that case use -print0 and -0 with find and xargs respectively); it's an exercise for the reader to check the man page if they don't know how. Otherwise, you could use -delete, bearing in mind that it implies -depth (which doesn't mix with e.g. -prune). See also, among others, the -path option.

3. But as a general rule, using any command that acts on more than one file while not knowing for sure which files will be acted upon (and with the given command it is entirely impossible to know for 100% sure that you're skipping only one - and skipping one is the goal, right?) is a really dangerous (or foolish, if you want to go that way) thing to do. I'll also remind people that file names can contain newlines, spaces and other characters, and that can make things more dangerous depending on what you're doing - hint: mode 777 on /tmp, regardless of its sticky bit, has messed up administrators in ugly ways because of such devious names. Temporary files in general can be a security problem when much care is not taken.
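A concrete version of point 2 might look like the following - a sketch only, with 'skipme' and the scratch files invented for illustration, and -mindepth/-maxdepth assuming GNU (or BSD) find:

```shell
# Scratch directory: a 'skipme' folder to preserve, plus files to delete.
# Note 'donotskipme', which the grep version would wrongly spare.
tmp=$(mktemp -d)
cd "$tmp"
mkdir skipme
touch skipme/inside.txt keep_a keep_b donotskipme

# Match the exact name 'skipme' at the top level only, and NUL-delimit
# the results so odd file names (spaces, newlines) survive the pipe.
find . -mindepth 1 -maxdepth 1 ! -name 'skipme' -print0 | xargs -0 rm -rf
ls
```

Afterwards only the skipme directory (and its contents) remains; 'donotskipme' is correctly deleted, because find matched the exact name rather than a substring.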

Comment by sigmet 26 weeks and 1 day ago

The above is true, but as I mentioned, you can see the output first and then decide whether or not to pipe it to xargs.

It never really crossed my mind that some people might not know how regular expressions work, or that grep uses them. I agree that this is dangerous because of #2 mentioned above.

Playing with xargs is still fun, but the find command works best for this sort of thing. It's just a little bit more of a pain to memorize the entire command, but well worth your time.

Comment by sonic 25 weeks and 4 days ago

Late response, but I have a lot going on (and I don't spend much time on the web). Still, the end of my post gives another example of why using grep to find a file by name (rather than by its contents) is a bad idea.

The problem is not that some people might not know how grep works. The problem is that grep is not a good way to find files, period, unless you're searching file contents. As you point out, the reason is simple: grep works on regular expressions. A file called 'skipme' is a single file, but piping the output of ls through grep -v gives no guarantee that only one file matches the pattern.
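That lack of a guarantee is easy to demonstrate - a sketch in a scratch directory, with invented file names:

```shell
# Four files, three of which contain "skipme" as a substring.
tmp=$(mktemp -d)
cd "$tmp"
touch skipme donotskipme skipmeplease victim.txt

# grep -v drops every line containing the pattern, not just the exact
# name, so only victim.txt would actually be passed on for deletion.
ls | grep -v skipme
```

Instead of one file being spared, three are: the intent was to delete everything except 'skipme', but 'donotskipme' and 'skipmeplease' silently survive as well.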

Most important point, however, is this: file globbing is NOT the same thing as a regular expression as shown in the example below:

echo '/etc/sysconfig/network' > testglob
grep sysconfig.network testglob

/etc/sysconfig/network

ls /etc/sysconfig.network

ls: cannot access /etc/sysconfig.network: No such file or directory

They're similar but not the same, and it's dangerous to treat them as equal (and that goes for treating any two different but similar things as equal).

And yes, it's worth your time to learn the commands in full (unless of course you're afraid of the command prompt and its features, like some are afraid of using > instead of >>, which is ridiculous if you know what you're doing). I would also add that xargs is still useful with find - quite useful, even; xargs is designed to work with find (example: -printf with '\0', or -print0, is used together with the xargs option -0). xargs also helps with long command lines (and the risks associated with them).

Anyway, as I wrote, I will rarely post here or even check here, but I wanted to clarify these points (or at least the point about file globbing and regular expressions being different) because it is quite an important thing to consider.

Comment by sigmet 22 weeks and 5 days ago
