commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that reach a minimum of 3 votes and of 10 votes, so only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams, as well as feeds for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:
If you want certain files out of a directory hierarchy, this will copy just the listed files, but will create the directory hierarchy in the new location ($DIR/)
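The original command isn't shown here, so as a sketch, GNU cp's --parents flag achieves the same effect (the actual entry may use tar or cpio instead; the file names below are purely illustrative):

```shell
# Build a tiny hierarchy to illustrate, then copy only the listed
# files into $DIR while recreating their directory paths.
mkdir -p src/a src/b dest
echo one > src/a/file1.txt
echo two > src/b/file2.txt
DIR=dest
cp --parents src/a/file1.txt src/b/file2.txt "$DIR"/
ls "$DIR"/src/a    # file1.txt, with the hierarchy preserved
```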
Will return your internal IP address.
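The command itself isn't reproduced here; one common way to get the internal address on Linux (an assumption, since the original may use ifconfig or similar) is:

```shell
# hostname -I prints all local IPs; awk keeps just the first one.
hostname -I | awk '{print $1}'
```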
This command uses the recursive glob and glob qualifiers from zsh. This will remove all the empty directories from the current directory down.
The **/* recurses down through all the files and directories
The glob qualifiers are added inside the parentheses. The / means only directories. The F means 'full' directories, and the ^ negates that to mean non-full (i.e. empty) directories. For more info on these qualifiers see the zsh docs: http://zsh.dotsrc.org/Doc/Release/Expansion.html#SEC87
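Putting the pieces together, the zsh one-liner would look something like `rmdir ./**/*(/^F)`. A portable illustration of the same effect, using find's -empty test instead of zsh qualifiers (an assumption: GNU or BSD find with -empty and -delete):

```shell
# Build a small tree with one full and two empty directories.
mkdir -p tree/full tree/empty1 tree/empty2/emptier
echo data > tree/full/file.txt

# -delete implies depth-first order, so directories that become empty
# after their empty children are removed get deleted too.
find tree -type d -empty -delete
find tree -type d    # only tree and tree/full remain
```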
It creates a backup of the file. The advantage is that when you list the folder, the backups sort by date. The command works in bash on any Unix.
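The exact command isn't shown here, but a date-stamped copy along these lines gives the described date-sorted listing (an assumption: the stamp format is illustrative; any ISO-style timestamp sorts correctly under plain ls):

```shell
# Copy the file with a timestamp suffix; brace expansion saves typing.
f=notes.txt
echo hello > "$f"
cp "$f"{,".$(date +%Y%m%d-%H%M%S)"}
ls "$f".*    # backups list in chronological order
```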
Same thing as above, just uses fetch and ipchicken.com
The same as the other user, but smarter, using -d and -f
The default stack size is 10M, so a multithreaded app can fill your memory rapidly.
On my PC I was able to create only 300 threads with the default stack size.
Lowering the default stack size to the amount your threads actually use lets you create more.
For example, with 64k I was able to create more than 10,000 threads.
Obviously, your threads shouldn't need more than 64k of RAM each!
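The change described above can be made with the shell's ulimit builtin; threads created afterwards inherit the smaller stack (note the value is in kilobytes):

```shell
# Show the current per-thread stack size limit, in KB.
ulimit -s        # often 8192 or 10240 by default
# Lower it to 64 KB for this shell and its children.
ulimit -s 64
```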
Tweeting from the terminal to Twitter accounts.
Display information about the cores.
* sudo apt-get install schedtool
Converts a batch of images to a video.
Plays the file's raw data as sound; it should sound like *some* kind of music. Most files sound like static, but some are really cool.
sudo cat /dev/sda > /dev/dsp
sudo cat /dev/sda5 | aplay
Check out http://bbs.archlinux.org/viewtopic.php?id=70937 for more variations!
Semi-dupe, like http://www.commandlinefu.com/commands/view/985/generate-white-noise but with a different syntax and program.
Having to escape forward slashes when using sed can be a pain. However, it's possible to use : as the separator instead of / .
I found this by trying to substitute $PWD into my pattern, like so
sed "s/~.*/$PWD/" file.txt
Of course, $PWD will expand to a string that begins with a / , which makes sed spit out an error such as "sed: -e expression #1, char 8: unknown option to `s'".
So simply changing it to
sed "s:~.*:$PWD:" file.txt
did the trick.
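A minimal runnable demo of the trick (the input line and directory below are hypothetical stand-ins): with : as the delimiter, the slashes inside the replacement need no escaping.

```shell
# $dir stands in for $PWD so the output is predictable.
dir="/usr/local/bin"
echo "~/old/path" | sed "s:~.*:$dir:"
# prints /usr/local/bin
```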
xargs -P N spawns up to N worker processes. -n 40 means each grep command gets up to 40 file names on its command line.
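The combination can be sketched like this (the .log names and ERROR pattern are illustrative, not from the original entry):

```shell
# 4 parallel grep workers, each handed up to 40 file names,
# printing the names of files that mention ERROR.
find . -name '*.log' -print0 | xargs -0 -P 4 -n 40 grep -l "ERROR"
```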
This command uses mutt to send the mail. You must pipe in a body; otherwise mutt will prompt you for some stuff. If you don't have mutt, it should be dead easy to install.
Forces make to run as many compile processes as specified (4 in the example), so that each one goes onto one core or CPU and compilation happens in parallel. This can cut the time required to compile a program roughly in half on a CPU with 2 cores, to about a quarter on a quad core, and so on.
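A common variant matches the job count to the machine automatically (an assumption: GNU make and coreutils' nproc are available; the toy Makefile is only there to make the snippet runnable):

```shell
# A trivial Makefile, then a build with one job per core.
printf 'all:\n\t@echo built\n' > Makefile
make -j"$(nproc)"    # prints: built
```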
Colorize output of make, gcc/g++ or diff, making it easier to read at a glance.
These colorizers are not distributed with make, diff or gcc, but they are usually available in the repositories.
This one works a little better. The regular expression is not 100% accurate for XML parsing, but it will certainly suffice for any valid XML document.