How many Postgres connections are needed

I’m using the stock, up-to-date Docker image on a fairly standard Ubuntu machine, with a Postgres database on a separate service. A couple of times I have run into the connection limit being exhausted on the Postgres side, which ends with the server crashing. At least 11 simultaneous connections appear to be opened shortly after startup, and right now I am at 16 without much usage.
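For anyone wanting to see where those connections come from: `pg_stat_activity` is a standard Postgres view you can query (e.g. via `psql`) to group connections by client application and state, which makes the idle ones easy to spot:

```sql
-- Group server connections by client application and state, so the
-- idle Baserow connections show up with a count per application.
SELECT application_name, state, count(*) AS connections
FROM pg_stat_activity
GROUP BY application_name, state
ORDER BY connections DESC;
```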

I could not find anything about this in the FAQ or the DB docs; it might be good to mention it as a requirement.

As they are all idle, it makes me wonder whether something in Django or the Postgres driver is holding them open. In the tech specs I found the persistent connections setting, but I am not sure how to configure it in Baserow. It would be good to have a way to reduce, or at least cap, the number of simultaneous connections.
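For context, in a plain Django project the persistent connections behaviour is controlled by `CONN_MAX_AGE` in the database settings. This is a sketch of the standard Django configuration, not a documented Baserow setting (the database name here is illustrative):

```python
# Standard Django database settings dict. CONN_MAX_AGE controls how long
# a connection is reused: 0 closes it after each request, a positive
# number keeps it open for that many seconds, and None keeps it open
# indefinitely (which is what produces long-lived idle connections).
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "baserow",  # illustrative database name
        "CONN_MAX_AGE": 60,  # reuse connections for up to 60 seconds
    }
}
```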

Hey @loleg, in its most basic form Baserow needs at least 5 database connections. This is because it runs an ASGI worker (for websocket connections), a WSGI worker (for HTTP requests), a fast background worker, a slow background worker, and a periodic task scheduler. Each needs its own PostgreSQL connection.
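The count above can be sketched as a back-of-the-envelope estimate. This little helper is my own illustration, not an official formula; it just assumes one connection per process for the five services listed, with the ASGI/WSGI and background worker counts as the variables:

```python
def min_db_connections(web_workers: int = 1, background_workers: int = 1) -> int:
    """Rough lower bound on PostgreSQL connections Baserow will hold,
    assuming one connection per process (an assumption, not official)."""
    asgi = web_workers          # websocket (ASGI) workers
    wsgi = web_workers          # HTTP (WSGI) workers
    fast = background_workers   # fast background worker(s)
    slow = background_workers   # slow background worker(s)
    scheduler = 1               # periodic task scheduler
    return asgi + wsgi + fast + slow + scheduler

print(min_db_connections())      # 5 with a single worker of each type
print(min_db_connections(2, 2))  # 9 once worker counts are raised to 2
```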

Have you by any chance set the BASEROW_AMOUNT_OF_GUNICORN_WORKERS or BASEROW_AMOUNT_OF_WORKERS environment variables? These spin up more workers for these services, and each worker requires its own PostgreSQL connection. If you don’t set them, Baserow automatically spins up a number of workers based on the number of CPU cores you have. Note that setting them to 1 means Baserow might not be able to handle concurrent API requests or process concurrent exports, for example.
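To cap the connection count, those env vars can be pinned in the container configuration. A minimal docker-compose sketch, assuming the stock `baserow/baserow` image (the service name, image tag, and worker values here are illustrative, and lowering them trades away concurrency as noted above):

```yaml
# Illustrative docker-compose fragment: capping worker counts so the
# number of PostgreSQL connections Baserow opens stays bounded.
services:
  baserow:
    image: baserow/baserow:latest
    environment:
      BASEROW_AMOUNT_OF_GUNICORN_WORKERS: "2"
      BASEROW_AMOUNT_OF_WORKERS: "2"
```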