Terminal - Commands tagged mysql - 57 results
mysql -u[username] -p[password] [database_name] -B -e "SELECT * FROM [table] INTO OUTFILE '/tmp/ca.csv' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';"
2011-06-15 17:43:18
User: daneoshiga
Tags: mysql

Exports the result of a query to a CSV file.
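A quick sanity check on the exported file: a consistent export prints a single distinct field count per line. The sample data below is a hypothetical stand-in for the real /tmp/ca.csv.

```shell
# Stand-in for the real export at /tmp/ca.csv (hypothetical sample data).
printf '1,foo,2011\n2,bar,2012\n' > /tmp/sample.csv
# A well-formed export prints exactly one distinct field count:
awk -F',' '{print NF}' /tmp/sample.csv | sort -u
```

Note this simple check is fooled by fields that themselves contain commas.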

mtop se -1
2011-06-14 19:12:13
User: 0disse0
Tags: mysql mtop

mtop allows you to monitor a MySQL server in real time: among other things, the number of queries performed per second, the slowest queries, and the number of active processes.

To install on Ubuntu

sudo apt-get -y install mtop

SELECT table_schema "Data Base Name", sum( data_length + index_length ) / 1024 / 1024 "Data Base Size in MB" FROM information_schema.TABLES GROUP BY table_schema ;
2011-05-23 08:42:38
User: igorfu
Tags: mysql

MySQL query to list the disk usage of each database, in MB.
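The same query can also be run non-interactively from the shell; the credentials below are placeholders.

```shell
# Placeholder credentials; -B prints tab-separated batch output.
mysql -u root -p -B -e 'SELECT table_schema AS db_name,
  SUM(data_length + index_length) / 1024 / 1024 AS size_mb
  FROM information_schema.TABLES GROUP BY table_schema;'
```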

mysqldump OLD_DB | cat <(echo "CREATE DATABASE NEW_DB; USE NEW_DB;") - | mysql
2011-05-16 20:42:01
User: michaelmior
Functions: cat echo

This should probably only be used for testing in a dev environment as it's not terribly efficient, but if you're doing something that might trash a DB and you still want the old data available, this works like a charm.

mdb-export -H -I -R database.mdb table >table.sql
2011-04-09 22:18:24
User: vasundhar

-H suppresses headers

-I writes INSERT statements instead of CSV

-R gives ; as the row delimiter

Since each row then ends with a ;, you can probably feed the file straight to the db when importing.
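Assuming the generated INSERT statements match your target schema, the export can be piped straight into mysql. The file, table, and database names below are hypothetical, and flag syntax varies between mdbtools versions (newer releases expect a backend argument after -I, e.g. -I mysql).

```shell
# Hypothetical names: database.mdb, table, targetdb.
# On newer mdbtools, use -I mysql instead of bare -I.
mdb-export -H -I -R ';' database.mdb table | mysql -u root -p targetdb
```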

mysql -? | grep ".cnf"
2011-04-02 15:45:52
User: DarkSavant
Functions: grep
Tags: mysql

This will show the locations, in order of preference, in which MySQL will look for a configuration file.

ssh username@remotehost 'mysqldump -u <dbusername> -p<dbpassword> <dbname> tbl_name_1 tbl_name_2 tbl_name_3 | gzip -c -' | gzip -dc - | mysql -u <localusername> -p<localdbpassword> <localdbname>
ssh username@remotehost 'mysqldump -u <dbusername> -p<dbpassword> <dbname> tbl_name_1 tbl_name_2 tbl_name_3' | mysql -u <localusername> -p<localdbpassword> <localdbname> < /dev/stdin
2011-03-09 18:35:07
User: tur_ki_sh
Functions: ssh

In the example above, 3 tables are copied. You can change the number of tables, and by modifying the mysqldump part you can easily come up with variants that copy any subset of a remote MySQL DB.

tar xfzO <backup_name>.tar.gz | mysql -u root <database_name>
2011-02-10 22:18:42
User: alecnmk
Functions: tar

`tar xfzO` extracts to STDOUT, which is piped directly into mysql. Really helpful when your hard drive can't fit two copies of a non-compressed database :)
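The round trip can be tried safely on a dummy dump file before pointing it at a real backup (file names below are assumptions; a real backup.sql would come from mysqldump):

```shell
cd /tmp
# Dummy stand-in for a real mysqldump output file.
echo 'CREATE TABLE t (id INT);' > backup.sql
tar cfz backup.tar.gz backup.sql
# Extraction to stdout matches the original, so it could be piped to mysql.
tar xfzO backup.tar.gz
```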

tshark -i any -T fields -R mysql.query -e mysql.query
mysqlcheck -op -u<user> <db>
2010-10-06 00:25:49
User: Weboide
Tags: mysql optimize

-o : optimize

-p : asks for password

-u : user to use for authentication

(pv -n ~/database.sql | mysql -u root -pPASSWORD -D database_name) 2>&1 | zenity --width 550 --progress --auto-close --auto-kill --title "Importing into MySQL" --text "Importing into the database"
2010-06-19 22:40:10
User: kbrill
Tags: mysql pv zenity

This uses PV to monitor the progress of the MySQL import and displays it though Zenity. You could also do this

pv ~/database.sql | mysql -u root -pPASSWORD -D database_name

and get a display in the CLI that looks like this

2.19MB 0:00:06 [ 160kB/s] [> ] 5% ETA 0:01:40

My Nautilus script using this command is here


mysql -e"SHOW STATUS LIKE '%uptime%'"|awk '/ptime/{ calc = $NF / 3600;print $(NF-1), calc"Hour" }'
zcat database.sql.gz | mysql -uroot -p'passwd' database
2010-03-23 12:41:57
User: rubenmoran
Functions: zcat
Tags: mysql gzip zcat

This way you keep the file compressed saving disk space.

Another, less optimal, way using named pipes:

mysql -uroot -p'passwd' database <

mysql>use DBNAME; mysql>source FILENAME
for I in $(mysql -e 'show databases' -u root --password=root -s --skip-column-names); do mysqldump -u root --password=root $I | gzip -c | ssh user@server.com "cat > /remote/$I.sql.gz"; done
2010-03-07 15:03:12
User: juliend2
Functions: gzip ssh

It grabs all the database names granted to the user (root in this example), dumps each one, and gzips it to a remote host via SSH.
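On the remote host, a matching restore loop might look like the sketch below; the /remote path, credentials, and one-file-per-database layout are all assumptions carried over from the example.

```shell
# Assumed layout: one /remote/<dbname>.sql.gz per database.
for F in /remote/*.sql.gz; do
  DB=$(basename "$F" .sql.gz)
  gunzip -c "$F" | mysql -u root --password=root "$DB"
done
```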

mysqldump -uUSERNAME -pPASSWORD database | gzip > /path/to/db/files/db-backup-`date +%Y-%m-%d`.sql.gz ;find /path/to/db/files/* -mtime +5 -exec rm {} \;
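The rotation half of the one-liner (`find ... -mtime +5 -exec rm`) can be rehearsed on dummy files before pointing it at real backups. The directory and file names below are made up, and GNU touch/find are assumed.

```shell
mkdir -p /tmp/dbfiles
# An "old" dummy backup and a fresh one (GNU touch -d assumed).
touch -d "10 days ago" /tmp/dbfiles/db-backup-old.sql.gz
touch /tmp/dbfiles/db-backup-$(date +%Y-%m-%d).sql.gz
# Delete anything last modified more than 5 days ago.
find /tmp/dbfiles/* -mtime +5 -exec rm {} \;
ls /tmp/dbfiles
```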
ssh user@host "mysqldump -h localhost -u mysqluser -pP@$$W3rD databasename | gzip -cf" | gunzip -c > database.sql
2009-10-05 00:57:51
User: daws
Functions: gunzip ssh

This command will dump a database on a remote host to stdout, compress it, stream it to your local machine, decompress it and put it into a file called database.sql. You could even pipe it into mysql on your local machine to restore it immediately. I had to use this recently because the server I needed a backup from didn't have enough disk space.
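As the description notes, you can skip the intermediate file and restore directly into a local database. Host names, credentials, and database names below are placeholders.

```shell
# Placeholders throughout: user@host, mysqluser, databasename, localdatabase.
ssh user@host "mysqldump -h localhost -u mysqluser -pPASSWORD databasename | gzip -cf" \
  | gunzip -c | mysql -u root -p localdatabase
```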

sqlite3 mydb.sqlite3 '.dump' | grep -vE '^(BEGIN|COMMIT|CREATE|DELETE)|"sqlite_sequence"' | sed -r 's/"([^"]+)"/`\1`/' | tee mydb.sql | mysql -p mydb
2009-10-02 14:40:51
User: mislav
Functions: grep sed tee
Tags: mysql sqlite dump

Filters out all non-insert SQL operations (we couldn't filter out only lines starting with "INSERT" because inserts can span multiple lines), quotes table names with backticks, saves dump to a file and pipes it straight to mysql.

This transfers only data; it expects your schema to already be in place. In Ruby on Rails, you can easily recreate the schema in MySQL with "rake db:schema:load RAILS_ENV=production".

grep CONFIG $(which mysqlbug)
cat `whereis mysqlbug | awk '{print $2}'` | grep 'CONFIGURE_LINE='
perror NUMBER
2009-03-31 19:19:44
User: alperyilmaz

perror should already be installed if the mysql-server package is installed.

slave start; SELECT MASTER_POS_WAIT('master.000088','8145654'); slave stop;
2009-03-26 14:11:43
User: slim

Say you want to reinitialize the slave database without resetting the master positions: stop the slave, dump the master database with --master-data=2, then execute the command on the slave and wait for it to stop at the exact position of the dump. Reinitialize the slave db and start the slave. Enjoy.
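Put together, the sequence might look like the sketch below. The log file name and position must be taken from the "-- CHANGE MASTER TO" comment inside the --master-data=2 dump; the values here are just the ones from the example, and the credentials and database name are placeholders.

```shell
# On the master (placeholder credentials and database name):
mysqldump -u root -p --master-data=2 masterdb > masterdb.sql
# On the slave: wait until replication reaches the dump's position, then stop.
# (START SLAVE / STOP SLAVE is the standard SQL spelling of "slave start/stop".)
mysql -u root -p -e "START SLAVE;
SELECT MASTER_POS_WAIT('master.000088', 8145654);
STOP SLAVE;"
```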