Hi. I'm using n8n to write data to Baserow (self-hosted with Cloudron) and the write speed seems a little slow. I'm currently using NocoDB (I want to switch to Baserow for automated record linking), but NocoDB is 10 times faster at writing data for the same operations (tested with both the cloud and self-hosted versions of Baserow).
Is there an environment variable or a parameter I can modify to increase the write speed, or is the default the limit?
Hi @jakenielson, updating rows in Baserow can be really fast, but it depends on some other factors.
- Which version of Baserow do you have installed?
- How many milliseconds does it take to update a single row via an API request?
- What does your schema look like in Baserow, i.e. which tables, fields, and number of rows do you have?
- How many API requests are being made to Baserow per second/minute?
- What are the specifications of your server running Cloudron?
- Do you have any resource constraints configured (CPU, memory, etc.)?
There are definitely environment variables you can configure to speed things up, but it depends a bit on the answers to the questions above.
Baserow also has some additional features like undo/redo, trash, advanced permissions, and real-time collaboration. They definitely shouldn't make a 10 times difference, but they do play a small part.
Hey, thanks for your response. The issue was with the default n8n Baserow node. I'm now batch processing the requests and everything is working fine.
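For anyone who lands here with the same problem: sending one request per row is what's slow, and batching is what fixes it. A minimal sketch, assuming Baserow's batch-update rows endpoint (available in recent versions); `base_url`, `token`, and `table_id` are placeholders for your own instance:

```python
import json
import urllib.request

def chunk(rows, size=200):
    """Split rows into batches so each API call stays within Baserow's batch limit."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def batch_update(base_url, token, table_id, rows):
    """Send one PATCH per batch of rows instead of one request per row.

    Each item in `rows` is a dict with an "id" key plus the fields to update,
    e.g. {"id": 1, "field_123": "new value"}.
    """
    url = f"{base_url}/api/database/rows/table/{table_id}/batch/"
    for batch in chunk(rows):
        req = urllib.request.Request(
            url,
            data=json.dumps({"items": batch}).encode(),
            headers={
                "Authorization": f"Token {token}",
                "Content-Type": "application/json",
            },
            method="PATCH",
        )
        with urllib.request.urlopen(req) as resp:
            resp.read()
```

With batches of 200, updating 200k rows takes about 1,000 requests instead of 200,000, which is where the big speedup comes from.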
Glad to hear that Baserow was not the problem. Please let me know if you have any other questions.
I'm now facing issues when it comes to retrieving records. I've set BASEROW_ROW_PAGE_SIZE_LIMIT to 9999999, and n8n already processes GET requests in bulk, but I'm still not retrieving the records fast enough. Do you have any idea why this might be the case?
Hi @jakenielson, how many rows are you trying to fetch in one API request? I would be able to advise better if you answered the questions in my previous message.
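As a general note, rather than raising BASEROW_ROW_PAGE_SIZE_LIMIT to a huge value and pulling everything in one giant request, it's usually more predictable to page through the rows. A rough sketch, assuming Baserow's standard list-rows endpoint and response shape (`base_url`, `token`, and `table_id` are placeholders):

```python
import json
import urllib.request

def pages_needed(total_rows, page_size):
    """Number of page requests required to cover total_rows (ceiling division)."""
    return -(-total_rows // page_size)

def fetch_all_rows(base_url, token, table_id, page_size=200):
    """Page through the list-rows endpoint until the 'next' link is empty.

    Assumes the usual response shape: {"count": ..., "next": ..., "results": [...]}.
    """
    rows, page = [], 1
    while True:
        url = (f"{base_url}/api/database/rows/table/{table_id}/"
               f"?page={page}&size={page_size}")
        req = urllib.request.Request(url, headers={"Authorization": f"Token {token}"})
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        rows.extend(data["results"])
        if not data.get("next"):  # last page reached
            return rows
        page += 1
```

With 200k rows and a page size of 200, that's 1,000 requests; fetching several pages in parallel helps, as long as the server has enough workers to handle the concurrency.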
Which version of Baserow do you have installed?

How many milliseconds does it take to update a single row via an API request?
- I'm able to retrieve and create rows pretty quickly, but I'm limited to updating 50 rows a second.

What does your schema look like in Baserow, i.e. which tables, fields, and number of rows do you have?
- 200k rows, 13 text/number columns, 1 linked field column (what I'm trying to automatically update).

How many API requests are being made to Baserow per second/minute?
- I can make as many requests as I want to create or retrieve rows, but I can only make 50 a second to update rows.

What are the specifications of your server running Cloudron?
- 4 vCPU cores, 8 GB RAM, 50 GB NVMe

Do you have any resource constraints configured (CPU, memory, etc.)?
- I've attached images of my Cloudron setup below.
Hi @jakenielson, a table with just some text fields and one link to another table should update super fast. Do I understand correctly that you currently can't make more than 50 update requests per second? I'm curious whether you're updating the same rows within that second, or usually updating different ones?
To handle more requests per second, you can set the `BASEROW_AMOUNT_OF_GUNICORN_WORKERS` environment variable to `8`, for example. By default, it's `3` in Cloudron, which means you can handle 3 concurrent requests. Changing it to `8` will increase your memory usage, so you would have to monitor that. If you're able to make more requests per second because of this, you'll also see your CPU usage increase, so you have to monitor that as well.
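How exactly you set that variable depends on your deployment (Cloudron's app environment editor, a docker-compose override, etc.); as a sketch, it's just:

```shell
# Sketch only: set this wherever your deployment defines Baserow's environment.
# More gunicorn workers = more concurrent API requests, at the cost of memory.
export BASEROW_AMOUNT_OF_GUNICORN_WORKERS=8
```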
Based on your server metrics, I can see that you still have some headroom, but it might also mean you have to upgrade your server hardware if you run out of resources.