commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign-in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not quite in prime time just yet. It's being hosted over at UpGuard (link) and you are more than welcome to give it a shot. A couple of things:
Find the length of the longest line of code in your files.
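One portable way to do this is a one-line awk (the filenames here are placeholders for your own; `wc -L` is a GNU-only shortcut):

```shell
# print the length of the longest line across the given files (POSIX awk)
awk 'length($0) > max { max = length($0) } END { print max + 0 }' file1 file2
# GNU coreutils shortcut: wc -L file1 file2
```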
Rip audio tracks from CD to wav files in current dir
Oftentimes you run a command in the terminal and don't realize it's going to take forever. You can open a new terminal, but then you lose the local history of the suspended one. You can stop the running command with Ctrl+C, but that may produce undesirable side effects. Ctrl+Z suspends the job, and (assuming you have no other jobs running in the background) %1 resumes it. Appending & tells it to run in the background.
You now have a job running concurrently with your terminal. Note this will still print any output to the same terminal you're working on.
Tested on zsh and bash.
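The same suspend/resume cycle can be driven with signals, which is roughly what Ctrl+Z and `bg` do under the hood (a sketch; `long_task` is a stand-in for your slow command, and Ctrl+Z actually sends SIGTSTP, the catchable sibling of SIGSTOP):

```shell
long_task() { sleep 30; }   # stand-in for the slow command
long_task &                 # interactively you'd press Ctrl+Z instead
pid=$!
kill -STOP "$pid"           # suspend, like Ctrl+Z
ps -o stat= -p "$pid"       # state 'T' = stopped
kill -CONT "$pid"           # resume in the background, like '%1 &' or 'bg %1'
ps -o stat= -p "$pid"       # back to 'S' (sleeping) or 'R' (running)
kill "$pid"                 # clean up
```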
Use the lshw command to display information about your video card. It gives more output when run as root.
Print out contents of file with line numbers.
This version will print a number for every line, and separates the numbering from the line with a tab.
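For comparison, here are the common variants; note that plain `nl` skips blank lines by default, while `nl -ba`, `cat -n`, and the awk one-liner number every line with a tab separator (the demo file is made up):

```shell
printf 'one\n\ntwo\n' > /tmp/demo.txt
nl -ba /tmp/demo.txt                       # number all lines ('nl' alone skips blanks)
cat -n /tmp/demo.txt                       # same idea on GNU/BSD cat
awk '{ print NR "\t" $0 }' /tmp/demo.txt   # fully portable
```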
if you have an alias like this:
alias cp='cp -i'
then
# cp file1 file1.bak
actually runs
# cp -i file1 file1.bak
(it will not overwrite file1.bak if it exists)
# \cp file1 file1.bak
# /bin/cp file1 file1.bak
(these skip the alias, so they will overwrite file1.bak if it exists)
d --> delete
!d --> delete others
The quality ranges from 0 to 9, with a smaller number giving a higher-quality but bigger file.
Next time you see a Mac fanboy bragging about the 64-bitness of 10.6, give him this so he might shut up.
I use it on a daily basis; not sure if it's any better than the OP's version, though.
One advantage is that you can replace 'bash' at the end of the line with, e.g., 'cat -' to check whether the generated command is OK.
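The trick generalizes to any pipeline that generates commands: swap the final `bash` for `cat -` to preview before executing (the `rm` generator below is a made-up example):

```shell
# preview the generated commands without running them
printf 'old1.log\nold2.log\n' | awk '{ print "rm -f -- " $0 }' | cat -
# when the output looks right, swap 'cat -' for 'bash' to actually run it:
# printf 'old1.log\nold2.log\n' | awk '{ print "rm -f -- " $0 }' | bash
```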
Perl version - just for completeness' sake ;)
Uses Unicode combining characters to produce strikethrough effect. Since commandlinefu doesn't display Unicode properly, you will need to replace the dash in the code above with the Unicode long stroke overlay (U+0336).
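A sketch of the idea: insert U+0336 (bytes 0xCC 0xB6 in UTF-8) after every character, so a Unicode-aware terminal renders each one struck through (assumes plain ASCII input):

```shell
stroke=$(printf '\314\266')            # U+0336 COMBINING LONG STROKE OVERLAY
echo "wrong" | sed "s/./&${stroke}/g"  # appends the overlay to every character
```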
rsync by itself doesn't support copying between two remote hosts, but if you use sshfs you can pretend one of them is local. If you have a passphrase-less ssh-key, you can even put this script into a cron job.
A faster alternative is to run ssh-keygen on remote1 and put the pubkey into remote2:~/.ssh/authorized_keys, running rsync on remote1 (or vice versa), but the problem with that is that now a hacker on remote1 can access remote2 at any time. The above method ensures your local computer stays the weak link.
Git uses secure hash sums for its revision numbers. I'm sure this is fine and dandy for ultra-secure computing, but it's less than optimal for humans. Thus, this will give you sequential revision numbers in Git all the way from the first commit.
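Assuming the goal is a commit count, modern Git has this built in as `git rev-list --count`; mapping a sequential number back to a hash is sketched in the comment (`N` is a placeholder):

```shell
git rev-list --count HEAD   # sequential revision number of the current commit
# sketch: hash of sequential revision N, counting from the first commit
# git rev-list --reverse HEAD | sed -n "${N}p"
```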
Per-country GET report, based on the access log. Easy to adapt to count unique IPs.
You cannot kill zombies, as they are already dead. But if you have too many zombies, kill the parent process or restart the service.
You can kill a zombie process using the PID obtained from the above command. For example, to kill the zombie process having PID 4104:
# kill -9 4104
Please note that kill -9 does not guarantee to kill a zombie process.
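To find zombies along with their parents (the parent is what you would actually kill), something like this works with any POSIX ps; sending SIGCHLD to the parent, shown in the comment, is a gentler first attempt than killing it outright:

```shell
# list zombie processes with their parent PIDs (stat starting with 'Z')
ps -eo pid,ppid,stat,comm | awk '$3 ~ /^Z/'
# gentler than killing the parent: ask it to reap its dead children
# kill -s CHLD <ppid>
```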
This alternative clears the HISTTIMEFORMAT environment variable and calls gnuplot only after /tmp/cmds is closed, to avoid some errors.
For Debian-like systems, that's in the python-xml package.