What's the way to back up Baserow when the data is stored on an external Postgres server?

Please fill in the questionnaire below.

Technical Help Questionnaire

Have you read and followed the instructions at: *READ ME FIRST* Technical Help FAQs - #2 by nigel?

Answer: Yes

How have you self-hosted Baserow?

Hosted on Azure. Database is running on a dedicated server. Docker.

What are the specs of the service or server you are using to host Baserow?

Sufficiently large: 16 GB

Which version of Baserow are you using?

1.10.1

How have you configured your self-hosted installation?

Nothing peculiar except for the DATABASE-related .env settings.

What commands, if any, did you use to start your Baserow server?

docker run

Describe the problem

I’ve been using this setup for the last one or two years and everything has been working well. Now I want to move away from this dedicated database server and move everything into something simpler (to save costs).

However, when I follow the instructions in the Baserow Postgres backup section, the backup created is only a 90 KB file with nothing relevant inside it. My backup should be much larger than that.

What’s the right way to take a backup when you’ve hosted your data on a dedicated server?

I would like to know as well.
My scenario and plan to move my Baserow from one environment to another are, I think, similar.
I am self-hosting my Baserow in Docker (on a Synology NAS). Soon I will upgrade my system and NAS device, and on the new setup I would like to have all data, including the database, in one place: a subfolder of the Docker container, following the docker compose settings described in this tutorial: https://mariushosting.com/how-to-install-baserow-on-your-synology-nas/.

In short:
I would be very thankful for some help or a manual (something like step 1, step 2, step 3, …) from the Baserow tech team on how to export (or back up) everything and migrate it to another system, ideally with some tips for when the configuration on the destination system is slightly different.

I tried a whole lot of things over the last four hours and finally found a way that works:

  1. Stop the Baserow Docker container to avoid any data loss.
  2. Export using pg_dump. I used pg_dump to create a backup of the data, dumping directly from the Azure Database for PostgreSQL server.
  3. Created a new database container and a database user with the exact same username and password combo, created a database called baserow, and added superuser privileges to the user I created. I also had to add a few Azure-related roles and assign them to my DB user (this may not be relevant for you).
  4. Restored the backup with pg_restore, then ran the docker command passing in the DB info, and it worked. Make sure you use the same version of the Baserow image on both instances. A rough sketch of these commands is shown below.
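
For anyone else going down this route, here is a rough sketch of the commands behind those steps. Everything in it is a placeholder assumption: the host my-server.postgres.database.azure.com, the container name baserow-db, the user/database name baserow, the password, and the postgres:12 image tag all need to be replaced with your own values.

```bash
# Sketch only; all host names, credentials, and image tags are placeholders.

# 1. Stop Baserow so nothing writes to the database during the dump.
docker stop baserow

# 2. Dump the external database (here: a Postgres server hosted on Azure)
#    into a custom-format archive that pg_restore understands.
pg_dump -h my-server.postgres.database.azure.com -U baserow -d baserow \
  -Fc -f baserow_backup.dump

# 3. Start a fresh Postgres container with the same user, password, and
#    database name that Baserow was already configured with. The user
#    created this way is a superuser inside that container.
docker run -d --name baserow-db \
  -e POSTGRES_USER=baserow \
  -e POSTGRES_PASSWORD=yourpassword \
  -e POSTGRES_DB=baserow \
  -v baserow_pgdata:/var/lib/postgresql/data \
  postgres:12

# 4. Copy the dump into the container and restore it, then start Baserow
#    again with the usual DATABASE_* environment variables pointing at
#    the new container (same Baserow image version on both sides).
docker cp baserow_backup.dump baserow-db:/tmp/baserow_backup.dump
docker exec -it baserow-db pg_restore -U baserow -d baserow /tmp/baserow_backup.dump
```

Depending on how your Azure server is configured, pg_dump may also need SSL options or a different user name format; check the connection strings shown for your server.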

Thanks. I forgot to add the link to the docker compose tutorial; I have edited it into the previous post.
This is the configuration I would like to use on the new device. Before that I used the standard docker compose settings that Baserow publishes officially on their pages.
The difference is that the new one from mariushosting includes not only a “baserow:” service section, but also “redis:” and “db:” sections, while the old one has only the “baserow:” part.

That means I would like to migrate away from the old one (back up everything from it) and import it into the new one.

Any help here is very welcome. 🙂
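
While waiting for the Baserow team, here is a rough sketch of how blizzerand’s approach might translate to your migration from the all-in-one image to the mariushosting compose stack. It is only a sketch under assumptions: the old container is called old-baserow, the embedded database and user are both named baserow, pg_dump is available and able to connect inside that container, and the new stack’s Postgres service is called db as in that tutorial’s compose file. If the embedded database does not accept local connections like this, the official Baserow backup/restore documentation for the all-in-one image is the safer route.

```bash
# Dump from the old all-in-one Baserow container (assumes the embedded
# Postgres uses "baserow" for both the database and the user).
docker exec -it old-baserow pg_dump -U baserow -d baserow -Fc -f /tmp/baserow_backup.dump
docker cp old-baserow:/tmp/baserow_backup.dump ./baserow_backup.dump

# Bring up only the "db" service of the new compose stack, restore the
# dump into it, then start the rest of the stack.
docker compose up -d db
docker compose cp ./baserow_backup.dump db:/tmp/baserow_backup.dump
docker compose exec db pg_restore -U baserow -d baserow /tmp/baserow_backup.dump
docker compose up -d
```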

Thanks for sharing what worked for you, @blizzerand. 🙌