Commands tagged postgresql (12)

  • List all of your Postgres databases and their sizes, largest first. Requires that the -d parameter be given a valid database name that you can connect to.

    psql -c "SELECT pg_database.datname, pg_database_size(pg_database.datname), pg_size_pretty(pg_database_size(pg_database.datname)) FROM pg_database ORDER BY pg_database_size DESC;" -d <ANYDBNAME>
    bbbco · 2011-11-30 15:22:48 4
  • Sniff PostgreSQL traffic on the loopback interface. It's certainly not nicely formatted SQL, but you can see the SQL in there... Note that lo0 is the loopback interface on BSD/macOS; on Linux use -i lo.

    sudo tcpdump -nnvvXSs 1514 -i lo0 dst port 5432
    ethanmiller · 2009-12-18 17:12:44 3
  • This command uses "su" to run the command as the postgres user (it assumes you are already logged in as root), and exports the result of the query to a file in CSV format. Adjust the fields and database information to your own needs.

    # su -c "psql -d maillog -c \"copy (select date,sender,destination,subject from maillog where destination like '') to '/tmp/mails.csv' with csv;\" " postgres
    Risthel · 2013-02-13 13:03:17 2
  • This command drops all the tables of the 'public' schema from the database. First, it constructs a 'drop table' statement for each table found in the schema, then it pipes the result to the psql interactive command. Useful when you have to recreate your schema from scratch in development, for example. I mainly use this command in conjunction with a similar command which drops all sequences as well (see the next entry).

    psql -h <pg_host> -p <pg_port> -U <pg_user> <pg_db> -t -c "select 'drop table \"' || tablename || '\" cascade;' from pg_tables where schemaname='public'" | psql -h <pg_host> -p <pg_port> -U <pg_user> <pg_db>
    cuberri · 2013-12-11 15:39:56 7
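The generate-and-pipe pattern above can be sketched without a live server: build the drop statements from a list of names and inspect them before ever piping into psql. The table names below are hypothetical.

```shell
#!/bin/sh
# Sketch of the generate-then-pipe pattern (hypothetical table names).
# Against a real server, the name list would come from pg_tables and the
# generated statements would be piped into a second psql invocation.
printf '%s\n' users orders audit_log |
while read -r t; do
  printf 'drop table "%s" cascade;\n' "$t"
done
```

Reviewing the generated statements first is a cheap safety net before letting the second psql execute them.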
  • This command drops all the sequences of the 'public' schema from the database. First, it constructs a 'drop sequence' instruction for each sequence found in the schema, then it pipes the result to the psql interactive command.

    psql -h <pg_host> -p <pg_port> -U <pg_user> <pg_db> -t -c "select 'drop sequence \"' || relname || '\" cascade;' from pg_class where relkind='S'" | psql -h <pg_host> -p <pg_port> -U <pg_user> <pg_db>
    cuberri · 2013-12-11 15:42:34 2
  • In an environment with multiple PostgreSQL servers, knowing each server's version can be important. Note that psql --version reports only the local psql client's version, which may not be what you want. This command dumps the PostgreSQL server's version on a single line. You may need to add more command-line options to psql for your connection environment.

    psql -h <SERVER NAME HERE> -c 'SELECT version();' | grep -v 'version\|---\|row\|^ *$' | sed 's/^\s*//'
    pnelsonsr · 2014-03-17 18:36:40 4
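The grep/sed cleanup in this entry can be tried without a server by feeding it text shaped like psql's default output (header, dashed rule, data row, row count). The version string below is made-up sample data, and grep -E is used in place of the backslash-escaped alternation.

```shell
#!/bin/sh
# Simulate psql's default output for "SELECT version();" and strip the
# decoration, leaving only the (made-up) version row.
printf '%s\n' \
  '                version                ' \
  '-----------------------------------------' \
  ' PostgreSQL 13.4 on x86_64-pc-linux-gnu' \
  '(1 row)' \
  '' |
grep -vE 'version|---|row|^ *$' | sed 's/^ *//'
```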
  • -t, --tuples-only: print rows only.

    psql -h <SERVER NAME HERE> -t -c 'SELECT version();' |head -1
    hxre · 2014-04-25 08:26:23 3
  • Without using a pipe. -X ignores the user's .psqlrc configuration file; -A sets unaligned table output mode; -t prints rows only (no headers or footers).

    psql -X -A -t -c "SELECT version();"
    malathion · 2014-05-01 18:10:20 4
  • Replace the psql credentials if necessary, and the my-query part with your query.

    psql -U quassel quassel -c "SELECT message FROM backlog ORDER BY time DESC LIMIT 1000;" | grep my-query
    Tatsh · 2014-10-12 19:53:06 4
  • To kill, use `kill PID`. Credit: user Craig Ringer, who recommends killing the process rather than deleting its pid file when there is an orphan PostgreSQL server process.

    cat /usr/local/var/postgres/
    ctcrnitv · 2017-02-07 02:38:28 8
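For context: the first line of postmaster.pid in the data directory is the postmaster's PID, which is the value you would hand to kill. A minimal sketch using a throwaway file (the path and contents below are fabricated, not a real cluster):

```shell
#!/bin/sh
# postmaster.pid's first line is the server PID; later lines hold the data
# directory, start timestamp, port, and so on. This file is a stand-in.
pidfile=$(mktemp)
printf '%s\n' 12345 /usr/local/var/postgres 1486435108 5432 > "$pidfile"
pid=$(head -n 1 "$pidfile")
echo "$pid"    # this is the value you would pass to: kill "$pid"
rm -f "$pidfile"
```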
  • Continuously watches postgres processes, showing the ones using the most RAM at the top.

    watch -n 1 '{ ps aux | head -n 1; ps aux --sort -rss | grep postgres | grep -v grep; } | cat'
    carbocation · 2017-05-14 17:01:44 7
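The same pipeline works as a one-shot snapshot if you drop watch; this keeps the ps header and shows the top memory consumers once (note that --sort is a GNU procps option):

```shell
#!/bin/sh
# One-shot variant of the watch pipeline: print the ps header, then the
# five processes with the largest resident set size.
{ ps aux | head -n 1; ps aux --sort -rss | sed -n '2,6p'; }
```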
  • Check if an SSH tunnel is open, and open it if it isn't. NB: in this example, 3333 would be your local port and 5432 the remote port (PostgreSQL's default); of course, you should replace REMOTE_HOST with any valid IP or hostname. The example above lets you work on remote PostgreSQL databases from your local shell, like this: psql -E -h localhost -p 3333

    while true; do nc -z localhost 3333 >|/dev/null || (ssh -NfL 3333:REMOTE_HOST:5432 USER@REMOTE_HOST); sleep 15; done
    rxw · 2015-09-21 02:25:49 6
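Where nc isn't installed, bash's /dev/tcp can stand in for the nc -z port probe in the loop. A sketch (the port number is arbitrary, and this is bash-specific, not POSIX sh):

```shell
#!/bin/bash
# Succeeds if something is listening on the given local port; fails
# otherwise. Same role as "nc -z localhost PORT" in the tunnel loop.
port_open() {
  (exec 3<>"/dev/tcp/localhost/$1") 2>/dev/null
}
if port_open 3333; then
  echo "tunnel is up"
else
  echo "tunnel is down"
fi
```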
