commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are separate Twitter accounts for commands that reach a minimum of 3 and 10 votes respectively - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
This alternative shows the differences as they occur, making them plain to see.
Extract multiline blocks of data with a perl pattern-matching loop.
Write the script or commands in Notepad, EditPlus, MS Word, etc., and copy the contents. Type the above command, press Enter, then paste by right-clicking the mouse; the entire clipboard contents are pasted. Press Enter again to move to the next line, then press Ctrl+D to save and close the file. You don't always need vi to create a new file.
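The trick boils down to redirecting cat's standard input into a file (the filename here is just an example):

```shell
# cat reads from the terminal until EOF; type or paste the content,
# then press Ctrl+D on an empty line to write and close newfile.txt.
cat > newfile.txt
```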
Like the original version, except it excludes the parent apache process and the grep process, and adds "sudo" so it can be run by a regular user.
Optionally, you can add
|cut -d' ' -f2|uniq
to the end of the command line.
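The original command isn't reproduced here, but the filtering it describes is commonly done with the bracket trick, which keeps the grep process from matching itself; a sketch, assuming "apache" is the process name and the parent runs as root:

```shell
# [a]pache matches processes named "apache" but not this grep's own
# command line; grep -v '^root' drops the root-owned parent process.
sudo ps aux | grep '[a]pache' | grep -v '^root'
```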
Depends on the locale's date format.
Useful for counting events in logs.
- Excel-compatible date with a separate hour field
- a fixed 1 added for easier request-counter aggregation
- URL split into directory, filename, file extension and query string
- used with a Tomcat valve, with response bytes replaced by elapsed time
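The command being described is not reproduced here, but the URL-splitting step might look something like this awk sketch (the sample URL and field layout are illustrative, not from the original):

```shell
# Split a request URL into directory, filename, extension and query
# string: cut at '?', then peel the filename off the path and match
# the extension after the last dot.
echo '/images/logo.png?size=2' | awk -F'?' '{
  query = (NF > 1) ? $2 : "";
  n = split($1, p, "/");
  file = p[n];
  dir = substr($1, 1, length($1) - length(file));
  ext = "";
  if (match(file, /\.[^.]+$/)) ext = substr(file, RSTART + 1);
  print dir, file, ext, query;
}'
```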
Similar to the previous version, except this exports to a temporary file, opens that file with your default web browser, and then deletes it.
Uses /dev/urandom to get random data, deletes non-letters with tr, and prints the first $1 bytes.
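A sketch of the idea (the function name randpw and the default length of 8 are illustrative additions, not from the original):

```shell
# Read random bytes, delete everything that is not a letter, and keep
# the first N characters. LC_ALL=C stops tr from choking on invalid
# multibyte sequences in the random stream.
randpw() { LC_ALL=C tr -dc 'A-Za-z' < /dev/urandom | head -c "${1:-8}"; echo; }
randpw 12
```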
cat without comments
Example: you have a package.txt you want to install on a system. Instead of this:
You want it flattened onto one line so you can run "yum install package1 package2 package3".
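A sketch of that flattening (package.txt and the package names are just the example from above):

```shell
# Drop comment lines, then let xargs join the remaining package
# names onto a single "yum install" command line.
grep -v '^#' package.txt | xargs echo yum install
```

Removing the `echo` would actually run the install instead of printing it.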
When dealing with system resource limits like max number of processes and open files per user, it can be hard to tell exactly what's happening. The /etc/security/limits.conf file defines the ceiling for the values, but not what they currently are, while
will show you the current values for your shell, and you can set them for new logins in /etc/profile and/or ~/.bashrc with a command like:
ulimit -S -n 100000 >/dev/null 2>&1
But with the variability in when those files get read (login vs any shell startup, interactive vs non-interactive) it can be difficult to know for sure what values apply to processes that are currently running, like database or app servers. Just find the PID via "ps aux | grep programname", then look at that PID's "limits" file in /proc. Then you'll know for sure what actually applies to that process.
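For example, to inspect the limits of the current shell itself (substitute a real server PID for $$; the /proc filesystem is Linux-specific):

```shell
# Each line of the limits file shows the soft and hard limit
# actually in force for that process.
cat /proc/$$/limits
```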
Works on any Debian-based GNU/Linux distro.