Context for my question:
I exported an existing Baserow table (the table doesn’t contain links to other tables or lookup fields; it’s entirely self-contained) as a CSV file. I then imported that CSV into a new Baserow table and discovered that the field types were not maintained.
For example, the data switched from multi-select fields to single line text fields, which makes sense given the limits of the CSV format.
I could see a JSON export containing that additional context, though.
With the premium membership, do the import/export features allow a user to import a Baserow-exported XML/JSON file and retain Baserow features such as the options in a multi-/single-select field, or the formulas in a formula field?
The CSV, XML and JSON exports do not maintain column/field type data. This is because we want to keep the content of the file as generic as possible. That matters because when you export data, you probably want to use it in other software rather than re-import it into Baserow.
What you can do is create a new empty table, create the fields before the import, and then use the “Import file” option. This is available when you click on the three dots next to the view. From there you can import a file into an existing table. You would need to manually map the columns of the file to the fields in the table. If a cell value is compatible, it will be imported.
Would love to learn more about your use case, especially because we also have duplication and snapshot functionality in Baserow directly. If you are technical and comfortable with the CLI, you can also use the import_group_applications commands to export all databases of a workspace into a format that can later be imported again. This will also keep relations, files, etc.
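A rough sketch of what running such a management command could look like on a self-hosted Docker install. The command name comes from the reply above, but the wrapper script, container name, workspace ID, and export name here are all assumptions that depend on your deployment — check the command’s `--help` output on your version for the exact arguments:

```shell
# Assumed invocation for a self-hosted all-in-one Docker deployment.
# "baserow" (container name), "1" (workspace/group ID) and the export
# name are placeholders — substitute your own values.
docker exec -it baserow \
  ./baserow.sh backend-cmd manage import_group_applications 1 my_export
```

The point is that these are Django-style management commands run inside the Baserow backend, so they operate on the full internal representation of a workspace (field types, select options, relations, files) rather than a flattened CSV/XML/JSON view of it.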
So for my use case: I’m just lazy. I was exploring some of Baserow’s templates, specifically the health one, which includes a workout-tracking table. I wanted to replace that table with one I had built, to see if that template would then be more useful to me.