commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
If you have a new feature suggestion or find a bug, please get in touch via http://commandlinefu.uservoice.com/
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Starting with a large MySQL dump file (*.sql), remove any lines that contain inserts for the specified table. Sometimes one or two tables are very large and unneeded, e.g. log tables. To exclude multiple tables you can get fancy with sed, or just run the command again on subsequently generated files.
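For example, to drop all inserts for a hypothetical log_table (the file and table names here are placeholders):

grep -v 'INSERT INTO `log_table`' huge_dump.sql > dump_without_log_table.sql

A grep -E pattern such as 'INSERT INTO `(log_table|sessions)`' would cover several tables in one pass, if you'd rather not re-run it per file.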
A basic usage example.
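With placeholder user and database names, that might look like:

mysqldump -u someuser -p some_database > some_database_backup.sql

The bare -p flag makes mysqldump prompt for the password interactively.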
Sometimes, I just want to back up a single client's databases. Fortunately, all clients have a set prefix on their database names, which makes my life easy: I just use 'CLIENTNAME_%' as my MYSQL_PATTERN in this command.
-e - (optional) use extended inserts.
-B - what follows is a list of databases.
-v - (optional) give verbose output.
-N - don't write column names in the output (prevents us from trying to back up a database called "Database").
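A sketch of how those flags might fit together (the user, password and pattern are placeholders, and this is one plausible arrangement rather than the verbatim original):

MYSQL_USER=root                  # placeholder
MYSQL_PASS='secret'              # placeholder
MYSQL_PATTERN='CLIENTNAME_%'
mysql -u"$MYSQL_USER" -p"$MYSQL_PASS" -N -B -e "show databases like '$MYSQL_PATTERN'" \
  | xargs mysqldump -u"$MYSQL_USER" -p"$MYSQL_PASS" -e -v -B > clientname_backup.sql

The mysql half lists the matching database names (one per line, header suppressed by -N); xargs appends them after mysqldump's -B so they all end up in a single dump file.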
How to extract data from one table:
mysqldump --opt --where="true LIMIT 5000" dbinproduzione tabella > miodbditest_tabella.sql
This should probably only be used for testing in a dev environment as it's not terribly efficient, but if you're doing something that might trash a DB and you still want the old data available, this works like a charm.
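A hedged sketch of that kind of scratch copy (the names are placeholders, and the target database has to exist already):

mysqldump -u someuser -p'secret' production_db | mysql -u someuser -p'secret' production_db_copy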
This version compresses the data for transport.
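One plausible form of the compressed variant, with host and credentials as placeholders:

mysqldump -u someuser -p'secret' production_db | gzip -c | ssh user@otherhost "gunzip -c | mysql -u someuser -p'secret' production_db"

gzip shrinks the stream before it crosses the network, and gunzip on the far side feeds it straight into mysql.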
In the example above three tables are copied, but you can change the number of tables. By modifying the mysqldump part you should easily be able to come up with variants of the command that copy some other part of the remote MySQL DB.
`tar xfzO` extracts to STDOUT, which gets redirected directly into mysql. Really helpful when your hard drive can't fit two copies of the uncompressed database :)
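A sketch of that kind of pipeline, assuming the dump was packed into a gzipped tar archive (all names are placeholders):

mysqldump -u someuser -p'secret' remotedb table1 table2 table3 > tables.sql
tar cfz tables.tar.gz tables.sql
tar xfzO tables.tar.gz | mysql -u someuser -p'secret' localdb

The last line streams the archived dump straight into mysql, so no uncompressed .sql copy ever has to sit on the restoring machine's disk.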
It grabs all the database names granted to $MYSQLUSER, dumps them, and gzips them to a remote host via SSH.
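A sketch under those assumptions ($MYSQLUSER, the password and the backup host are placeholders):

MYSQLUSER=backupuser             # placeholder
MYSQLPASS='secret'               # placeholder
mysql -u"$MYSQLUSER" -p"$MYSQLPASS" -N -B -e 'show databases' \
  | xargs mysqldump -u"$MYSQLUSER" -p"$MYSQLPASS" -B \
  | gzip -c | ssh user@backuphost 'cat > mysql_backup.sql.gz'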
This command will dump a database on a remote server to stdout, compress it, stream it to your local machine, decompress it and put it into a file called database.sql. You could even pipe it into mysql on your local machine to restore it immediately. I had to use this recently because the server I needed a backup from didn't have enough disk space.
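Roughly, with placeholder host and credentials:

ssh user@remotehost "mysqldump -u dbuser -p'secret' dbname | gzip -c" | gunzip -c > database.sql

Replacing the final redirect with a pipe into a local mysql client would restore it in one go.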
No need to loop when we have `xargs`. The sed command filters out the first line of `show databases` output, which is always "Database".
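A sketch of that approach (credentials are placeholders):

mysql -u root -p'secret' -e 'show databases' | sed 1d | xargs mysqldump -u root -p'secret' -B > all_databases.sql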
The coolest way I've found to back up a WordPress MySQL database using encryption, with local variables created directly from the wp-config.php file so that you don't have to type them, since typing them would allow someone sniffing your terminal or viewing your shell history to see your info.
I use a variation of this for my servers that have hundreds of WordPress installs and databases, by using a find command to locate each wp-config.php file and passing the results through xargs to my function.
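A sketch of pulling the credentials out of wp-config.php and encrypting the dump; the cut fields assume the standard define('DB_NAME', 'value'); layout, and gpg -c is just one possible encryption step:

DB_NAME=$(grep DB_NAME wp-config.php | cut -d "'" -f 4)
DB_USER=$(grep DB_USER wp-config.php | cut -d "'" -f 4)
DB_PASSWORD=$(grep DB_PASSWORD wp-config.php | cut -d "'" -f 4)
mysqldump -u"$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" | gpg -c > "${DB_NAME}.sql.gpg"

For the many-installs case, something like find /var/www -name wp-config.php could feed each config path to a small function wrapping those four lines.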