Context for my question:
I exported an existing Baserow table (the table doesn't contain links to other tables or lookup fields; it's entirely self-contained) as a CSV file. I then imported that CSV into a new Baserow table and discovered that the field types were not maintained.
For example, the data switched from multi-select to single line text fields. This makes sense given the limitations of the CSV format.
I could see a JSON export containing that additional context, though.
TLDR:
With the premium membership, do the import/export features allow a user to import a Baserow-XML/JSON-exported file and retain the Baserow features, like the options under a multi-/single select field, or the formulas from a formula field?
The CSV, XML and JSON exports do not maintain column/field type data. This is because we want to make the content of the file as generic as possible. This is important because when you export data, you probably want to use it in other software and not re-import it into Baserow again.
What you can do is create a new empty table, create the fields before the import, and use the “Import file” option. This is available when you click on the three dots next to the view. From here you can import a file into an existing table. You would need to manually map the columns of the file to the fields in the table. If a cell value is compatible, it will be imported.
Would love to learn more about your use case! Especially because we also have duplication and snapshot functionality in Baserow directly. If you are technical and comfortable with the CLI, you can also use the export_group_applications and import_group_applications commands to export all databases of a workspace into a format that can later be imported again. This will also keep relations, files, etc.
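For anyone who wants to try the CLI route, a rough sketch of the flow might look like the following. The command names come from the post above; the workspace/group IDs (123, 456) and the exact wrapper you use to reach the Django management commands are assumptions — check the docs for your deployment (Docker, standalone, etc.):

```shell
# Run inside the Baserow backend environment (e.g. the backend container).
# Export every database in workspace/group 123 to files on disk;
# unlike CSV/XML/JSON exports, this format keeps field types,
# relations, and uploaded files.
baserow export_group_applications 123

# Later, import the exported applications into workspace/group 456.
# The second argument refers to the export produced above; the exact
# file/prefix naming is an assumption here.
baserow import_group_applications 456 export_123
```

Note that, as mentioned later in the thread, this clones entire databases at once — it is not a way to copy individual tables selectively.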
So for my use case I’m just lazy. I was exploring some of Baserow’s templates, specifically the health one which includes a workout tracking table. I wanted to replace that table with one I built to see if that template would then be more useful to me.
What is the point of the JSON or XML export not maintaining the column/field type data? There must be at least one file format that retains them.
I am experiencing the same issue, where multiple tables need to be ‘copied’ using the Baserow web interface (not the CLI), and just like the author, I would pay for being able to do so.
Instead, every time a table is imported, I need to adjust tens or hundreds of columns across tables. Please note that the export/import_group_applications commands you are referring to serve well for cloning an entire database, but not for partial cloning of selected database tables.
To give you an example:
Database 1:
Blog Categories
Blog Subcategories
Articles
Products
Users
User Roles
Database 2:
Orders
Users
User Roles
If we want to manually copy only selected tables into another workspace, we need to adjust each column type.
Essentially, there’s no way to quickly combine multiple tables from different databases without using the CLI.
Hello @Sulitzer, we double-checked and the mapping is working on our side. You’re not supposed to map the fields manually, so something seems off. Could you please share a few screenshots showing that the fields weren’t recognized and that you had to set them up manually?