API Call Returns Error when Table Content is Too Big

When I call one of our tables in Baserow that has a lot of long text fields full of content, I get this error in Make:

Bundle 1 (Collection)

  • log (status)

Data too long. The log content was cut. The size of the original buffer was: 5800006

  • data (Long String)

The API call then fails because we seem to be hitting a memory buffer limit: the Baserow API tries to output the entire table content, which exceeds the buffer. I understand that we cannot fetch all the content in one API call, so I need a workaround.

The API call error doesn't tell me how many rows were found. I think my filter worked, though, and Baserow should have that information. The API call should first return the match count (how many rows matched my filter) before failing with the error.

The issue is that the API call loads the content of every single column for each row.
I would need an option to first get only the row count and the row IDs; then I could use the iterator in Make.com to process each row individually.

Maybe there could be a criterion in the API query to load only the matching row numbers and their IDs instead of every single column and its content?

Hello @artoflogic,

Which endpoint are you calling to get the data for your table?

Have you considered using the limit query parameter to restrict the number of rows returned? You can also use the offset query parameter to retrieve the subsequent or previous set of rows.

Here you can also see other query parameters you can use to exclude fields from the request: Baserow API spec.
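As a concrete illustration of davide's suggestion, here is a minimal sketch that builds a list-rows URL for one page. It assumes the endpoint accepts `limit`, `offset` and `exclude` as plain query parameters, as described above; the table id and field names are placeholders:

```python
import urllib.parse

# Placeholder base URL for Baserow's hosted API; adjust for self-hosted.
BASE_URL = "https://api.baserow.io/api/database/rows/table/{table_id}/"

def build_page_url(table_id, limit=100, offset=0, exclude=None):
    """Build a list-rows URL for one page, excluding the given fields."""
    params = {"user_field_names": "true", "limit": limit, "offset": offset}
    if exclude:
        # exclude is assumed to be a comma-separated list of field names
        params["exclude"] = ",".join(exclude)
    return BASE_URL.format(table_id=table_id) + "?" + urllib.parse.urlencode(params)

# Hypothetical table 123, skipping two heavy long-text fields:
url = build_page_url(123, limit=50, offset=100, exclude=["Long Text", "Notes"])
```

A URL built this way can be pasted into Make's HTTP module (with the `Authorization: Token …` header) to check that the trimmed response stays under the buffer limit.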

I hope this helps,
davide

Hi @davide, thanks for the suggestion. I have not tried that yet; I will give it a try.

I think I need to exclude the content-rich fields via the exclude parameter first. If that doesn't work, I will try limit and offset.
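For what it's worth, the limit/offset fallback could be sketched like this. It assumes the list-rows response is shaped like `{"results": [...], "next": url_or_None}` and paginates with limit/offset; `get_page` stands in for the real HTTP call (e.g. via the `requests` library with your database token), so every name here is a placeholder:

```python
def collect_row_ids(get_page, limit=100):
    """Page through a table, keeping only the row ids.

    get_page(limit, offset) is assumed to return one page of Baserow's
    list-rows response: {"results": [...], "next": url_or_None}.
    """
    ids, offset = [], 0
    while True:
        page = get_page(limit, offset)
        ids.extend(row["id"] for row in page["results"])
        if page.get("next") is None:  # no more pages to fetch
            return ids
        offset += limit
```

The id list could then feed Make's iterator, fetching each row individually (and only the fields you need) instead of the whole table at once.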