commandlinefu.com is the place to record those command-line gems that you return to again and again.
Delete that bloated snippets file you've been using and share your personal repository with the world. That way others can gain from your CLI wisdom and you from theirs too. All commands can be commented on, discussed and voted up or down.
You can sign in using OpenID credentials, or register a traditional username and password.
First-time OpenID users will be automatically assigned a username which can be changed after signing in.
Every new command is wrapped in a tweet and posted to Twitter. Following the stream is a great way of staying abreast of the latest commands. For the more discerning, there are Twitter accounts for commands that get a minimum of 3 and 10 votes - that way only the great commands get tweeted.
Use your favourite RSS aggregator to stay in touch with the latest commands. There are feeds mirroring the 3 Twitter streams as well as for virtually every other subset (users, tags, functions,…):
Subscribe to the feed for:
Wow, didn't really expect you to read this far down. The latest iteration of the site is in open beta. It's a gentle open beta, not in prime time just yet. It's being hosted over at UpGuard and you are more than welcome to give it a shot. A couple of things:
Exports the result of a query to a CSV file.
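The command itself isn't reproduced here; one common shape for it, with hypothetical credentials, database and query, is to turn mysql's tab-separated batch output into commas:
# names are placeholders; -B gives tab-separated batch output, sed swaps tabs for commas
mysql -u user -pPASSWORD -D database -B -e "SELECT id, name FROM customers" | sed 's/\t/,/g' > result.csv
Note this naive substitution breaks if field values themselves contain tabs or commas.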
mtop allows you to monitor a MySQL server in real time: see, among other things, the number of queries per second, the slowest queries, and the number of active processes.
To install on Ubuntu:
sudo apt-get -y install mtop
MySQL command to list the disk usage of each database.
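One well-known query for this reads the sizes out of information_schema (a sketch, assuming root credentials):
mysql -u root -p -e "SELECT table_schema AS db, ROUND(SUM(data_length + index_length)/1024/1024, 1) AS size_mb FROM information_schema.tables GROUP BY table_schema;"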
This should probably only be used for testing in a dev environment as it's not terribly efficient, but if you're doing something that might trash a DB and you still want the old data available, this works like a charm.
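The underlying command isn't shown; one plausible shape, assuming the passwords and that the backup database already exists, is a dump piped straight back into a second database:
mysqldump -u root -pPASSWORD production_db | mysql -u root -pPASSWORD -D production_db_backup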
-H suppress Headers
-I Inserts instead of csv
-R to give ; as the row delimiter.
Each line can then presumably be terminated with a ; when importing into the database.
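Judging purely by the flags, the tool being described may be mdb-export from mdbtools; that's an assumption, since the entry doesn't name it. Under that assumption, exporting an Access table as INSERT statements would look something like:
# assumption: mdb-export from mdbtools; flag syntax varies between mdbtools versions
mdb-export -H -I -R ';' database.mdb TableName > table.sql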
This will show the locations, in order of preference, where MySQL will look for a configuration file.
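The usual way to get that list (a sketch, not necessarily the exact entry) is from mysql's own help output:
mysql --help | grep -A 1 'Default options'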
This version compresses the data for transport.
In the example above, 3 tables are copied; you can change the number of tables. By modifying the mysqldump part you can easily come up with variants that copy any subset of a remote MySQL database.
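A sketch of the pattern with hypothetical names: dump three tables, compress them in flight, and load them on the remote end:
mysqldump -u root -pPASSWORD sourcedb table1 table2 table3 | gzip -c | ssh user@remotehost 'gunzip -c | mysql -u root -pPASSWORD targetdb'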
`tar xfzO` extracts to STDOUT, which gets redirected directly into mysql. Really helpful when your hard drive can't fit two copies of an uncompressed database :)
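In other words (archive and database names assumed), something like:
tar xfzO backup.tar.gz | mysql -u root -p database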
-o : optimize
-p : asks for password
-u : user to use for authentication
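Those flags match mysqlcheck's optimize mode; assuming that's the tool in question, running it against every database would look like:
mysqlcheck -o -u root -p --all-databases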
This uses pv to monitor the progress of the MySQL import and displays it through Zenity. You could also do this
pv ~/database.sql | mysql -u root -pPASSWORD -D database_name
and get a display in the CLI that looks like this
2.19MB 0:00:06 [ 160kB/s] [> ] 5% ETA 0:01:40
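For the Zenity variant, one way to wire it up (a sketch; file and database names assumed) is to let pv emit numeric progress on stderr and feed that to zenity:
(pv -n ~/database.sql | mysql -u root -pPASSWORD -D database_name) 2>&1 | zenity --progress --title="Importing database" --auto-close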
My Nautilus script using this command is here
Show MySQL uptime.
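The entry itself isn't shown; mysqladmin is one standard way to get it, e.g.:
mysqladmin -u root -p status
The first field of the output is the server's uptime in seconds.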
This way you keep the file compressed, saving disk space.
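The command this note accompanies is presumably the pipe form (filename assumed):
zcat database.sql.gz | mysql -uroot -p'passwd' database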
A less optimal way, using named pipes:
mysql -uroot -p'passwd' database < <(zcat database.sql.gz)
It grabs all the database names granted to $MYSQLUSER and gzips each of them to a remote host via SSH.
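A sketch of how such a loop might look, with $MYSQLPASS, the remote host and the skip-list all assumed here:
# dump every database visible to $MYSQLUSER and gzip each onto a remote host
for db in $(mysql -u "$MYSQLUSER" -p"$MYSQLPASS" -N -B -e 'SHOW DATABASES' | grep -vE '^(information_schema|performance_schema)$'); do
  mysqldump -u "$MYSQLUSER" -p"$MYSQLPASS" "$db" | gzip -c | ssh user@remotehost "cat > /backups/$db.sql.gz"
done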
This command will dump a database on a remote server to stdout, compress it, stream it to your local machine, decompress it and put it into a file called database.sql. You could even pipe it into mysql on your local machine to restore it immediately. I had to use this recently because the server I needed a backup from didn't have enough disk space.
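A sketch of the idea with hypothetical host and names:
ssh user@remotehost 'mysqldump -u root -pPASSWORD database | gzip -c' | gunzip -c > database.sql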
Filters out all non-INSERT SQL operations (we couldn't just keep lines starting with "INSERT" because inserts can span multiple lines), quotes table names with backticks, saves the dump to a file, and pipes it straight to mysql.
This transfers only data; it expects your schema to already be in place. In Ruby on Rails, you can easily recreate the schema in MySQL with "rake db:schema:load RAILS_ENV=production".
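A rough equivalent of the idea, not the exact entry: mysqldump's --no-create-info already emits only INSERT statements with backtick-quoted table names, and tee both saves the dump and pipes it onward:
mysqldump --no-create-info -u root -pPASSWORD sourcedb | tee data_only.sql | mysql -u root -pPASSWORD destdb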
Output is from Debian Lenny
perror should already be installed if the mysql-server package is installed.
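Usage is simply perror followed by an error number; for example (output wording may vary slightly by version):
perror 13
OS error code  13:  Permission denied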