commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Prepend text to a file. It doesn't need temporary files, ed or sed.
$ prepend content to add [filename]
Uses ed, so no temp files created.
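Neither comment reproduces the command itself, but the two approaches they describe can be sketched like this (the file name `notes.txt` is a placeholder):

```shell
# Prepend without temp files, ed, or sed: the command substitution
# captures the old contents before the redirection truncates the file
# (assumes the file fits in memory).
printf '%s\n%s\n' "new first line" "$(cat notes.txt)" > notes.txt

# The ed variant: insert before line 1, write in place, quit.
printf '1i\nnew first line\n.\nw\nq\n' | ed -s notes.txt
```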
The original command is great, but I often want to prepend to every line.
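For the every-line case, a sed sketch (GNU sed's -i edits in place; the "> " prefix is just an example):

```shell
# Prefix every line of the file with "> ", editing it in place.
sed -i 's/^/> /' notes.txt
```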
A much shorter version of this command.
This command does the following:
- converts any sequence of multiple spaces/tabs to one space only
- completely removes any space(s)/tab(s) at the end of each line
(If spaces and tabs are mixed in a sequence, e.g. [tab][tab][space][tab], you have to execute this command twice!)
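The command itself isn't quoted here; a one-pass equivalent using a POSIX character class (which also handles mixed runs of spaces and tabs, so the run-it-twice caveat wouldn't apply) might look like:

```shell
# Squeeze every run of spaces/tabs down to one space, then drop the
# single space a trailing run leaves at the end of the line.
sed 's/[[:blank:]]\{1,\}/ /g; s/ $//' file.txt
```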
This is a more concise answer to http://blog.commandlinekungfu.com/2011/09/episode-158-old-switcheroo.html in my opinion.
Using the sed -i (inline), you can replace the beginning of the first line of a file without redirecting the output to a temporary location.
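For example (GNU sed; the pattern and file name are placeholders):

```shell
# Replace "old" only where it begins line 1, editing the file in
# place; later lines starting with "old" are untouched.
sed -i '1s/^old/new/' file.txt
```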
If the line you want to join starts with a character other than ", you can use \n.*"\n as the regex.
this line ends here
but must be concatenated with this one
"this line ends here"
and should NOT be concatenated with this one
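The command under discussion isn't quoted in this thread, so this is only a sketch of the idea shown by the examples: keep joining each line to the next until the accumulated line ends in a closing double quote.

```shell
# Join each line with the following one as long as it does not yet
# end in a double quote (adapt the pattern to your data).
sed -e :a -e '/[^"]$/N; s/\n/ /; ta' file.txt
```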
Recursive find and replace. The important pieces are grep -Z and xargs -0, which add a zero byte after each file name so sed works even with file names containing spaces.
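A sketch of that pipeline (GNU grep/sed; the search and replacement strings are placeholders):

```shell
# -r recurse, -l print matching file names only, -Z NUL-terminate
# them; xargs -0 then passes even names with spaces to sed intact.
grep -rlZ 'oldtext' . | xargs -0 sed -i 's/oldtext/newtext/g'
```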
Replaces tabs in output with spaces. Uses perl since sed seems to work differently across platforms.
Do a recursive (-r) search with grep for all files where your old mail address is mentioned (-l shows only the file names) and use sed to replace it with your new address. Works with other search/replacement patterns too.
Changed out the for loop for an xargs. It's a tad shorter, and a tad cleaner.
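The xargs version might look like this (GNU sed; the addresses are placeholders):

```shell
# List files mentioning the old address, then rewrite each in place.
grep -rl 'old@example.com' . | xargs sed -i 's/old@example.com/new@example.com/g'
```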
Recursively replace a string in files with lines matching a string. Lines containing the string "group name" will have the first > character replaced, while > characters on other lines will be ignored.
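The conditional part of that replacement can be sketched on a single file like this ("REPLACED" and the file name are placeholders):

```shell
# Replace only the first '>' on lines that contain "group name";
# '>' characters on every other line are left alone (GNU sed -i).
sed -i '/group name/s/>/REPLACED/' file.txt
```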
If you can install rpl, it's simpler to use and faster than combinations of find, grep and sed.
See man rpl for various options.
time on above operation: real 0m0.862s, user 0m0.548s, sys 0m0.180s
using find + sed: real 0m3.546s, user 0m1.752s, sys 0m1.580s
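As a sketch (flags as in the Debian rpl; the strings and directory are placeholders):

```shell
# Replace oldtext with newtext in every file under src/;
# -R descends into directories recursively.
rpl -R oldtext newtext src/
```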
This does the following:
1 - Search recursively for files whose names match REGEX_A
2 - From this list exclude files whose names match REGEX_B
3 - Open them as a group in TextMate (in the sidebar)
And now you can use Command+Shift+F to use TextMate's own find and replace on this particular group of files.
For advanced regex in the first expression you can use -regextype posix-egrep like this:
mate - `find * -type f -regextype posix-egrep -regex 'REGEX_A' | grep -v -E 'REGEX_B'`
Warning: this is not meant to open files or folders with spaces or special characters in the filename. If anyone knows a solution to that, tell me so I can fix the line.
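One space-safe variant (an assumption, not from the thread): let find hand the file list straight to mate via -exec, instead of relying on word-splitting of backtick output, and fold the exclusion into find with ! -regex.

```shell
# -exec ... {} + passes file names as whole arguments, so spaces
# survive; ! -regex replaces the grep -v filter (GNU find).
find . -type f -regextype posix-egrep -regex 'REGEX_A' \
  ! -regex 'REGEX_B' -exec mate {} +
```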
Yeah, there are many ways to do that.
Doing it with sed in a for loop is my favourite, because these are two basic tools in any *nix environment. By default sed does not save its output back to the same file, so we'll use mv to do that in batch along with the sed.
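The loop described above could look like this (the glob and the search/replace strings are placeholders):

```shell
# For each .txt file: write sed's output to a temporary name, then
# move it over the original; portable even where sed lacks -i.
for f in *.txt; do
  sed 's/oldtext/newtext/g' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```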