Hey @Vlas, thanks for popping into the Baserow community
To answer your first and third questions:
Are you planning on using the hosted or self-hosted version of Baserow?
Baserow can handle 100K+ rows, but the level of performance might vary based on the number of fields, filters, and views in your database once you’re above 100K rows. (This is bound to change in the future, but as of this writing, we’re actively working on supporting vast amounts of data beyond 100K rows.)
If you’re self-hosting, there’s no limit.
Yes, you can absolutely import by CSV. Here’s how:
Once you choose the CSV file you want to import, select your delimiter (column separator) and encoding, and indicate whether the first row is a header. You’ll also be able to preview your import before adding the table.
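If it helps to see what those options actually do, here’s a rough Python sketch of the same parsing choices (delimiter, encoding, header row) using the standard library. The function name and the placeholder column names are just illustrative, not anything from Baserow itself:

```python
import csv
import io

def preview_csv(data: bytes, delimiter: str = ",", encoding: str = "utf-8",
                first_row_is_header: bool = True, limit: int = 5):
    """Parse raw CSV bytes and return (header, first `limit` rows),
    mirroring the delimiter/encoding/header choices in the import dialog."""
    text = data.decode(encoding)
    rows = list(csv.reader(io.StringIO(text), delimiter=delimiter))
    if not rows:
        return [], []
    if first_row_is_header:
        header, body = rows[0], rows[1:]
    else:
        # No header row: generate placeholder column names instead.
        header = [f"Field {i + 1}" for i in range(len(rows[0]))]
        body = rows
    return header, body[:limit]
```

So a semicolon-separated file with a header row would preview as its header plus the first few data rows, exactly like the import dialog shows.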
I wanted to try working with the cloud version, but the ability to work with big tables (over 100,000 rows) is required for my goals. So I have only one way – self-hosting.
Yeah, I’ve seen the feature for creating a new table from a .csv file, but when I tried to add 30k rows, Baserow showed me a message that I can’t add more than 5,000 rows. I tried to add 5,000 rows, but it showed me the same message. I deleted rows a few times until only 1,000 were left, and only after that did the import succeed.
So, does Baserow have a function to add rows from a .csv file to an existing table?
Indeed, that row import limitation, like the one from your previous question, exists in the hosted version. You wouldn’t encounter it in the self-hosted version.
Given your immediate needs, it’d probably be best to self-host using Cloudron.
This would allow you to import everything you need at once instead of in chunks.
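And if you ever do want to script adding rows to an existing table (your second question), a rough sketch along these lines could work against Baserow’s REST API. To be clear about assumptions: the `/batch/` endpoint path, the `Token` auth header, and the 200-rows-per-request limit are my reading of Baserow’s public API docs, not something confirmed in this thread, so double-check against your instance’s API documentation:

```python
import json
import urllib.request

def chunked(rows, size):
    """Yield successive slices of `rows` with at most `size` items each."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def import_rows(base_url, token, table_id, rows, chunk_size=200):
    """Upload dict rows in chunks via what I believe is Baserow's
    batch-create endpoint (assumption: max 200 rows per request)."""
    url = (f"{base_url}/api/database/rows/table/{table_id}/batch/"
           "?user_field_names=true")
    for chunk in chunked(rows, chunk_size):
        req = urllib.request.Request(
            url,
            data=json.dumps({"items": chunk}).encode("utf-8"),
            headers={
                "Authorization": f"Token {token}",
                "Content-Type": "application/json",
            },
            method="POST",
        )
        # Raises urllib.error.HTTPError if the server rejects a chunk.
        urllib.request.urlopen(req)
```

You’d pair this with a CSV reader that turns each row into a dict of field names to values, then call `import_rows()` once; the chunking keeps each request under the assumed batch limit.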