Advanced triggers for Webhooks

@bram - sorry - my error :slight_smile:
I used an older Make.com flow for tests, which assumed one call per row.

I adjusted the flow to account for the fact that all changed “rows” are indicated in one HTTP request, which actually makes much more sense for this feature. Now everything works as expected. Even better than expected, actually!
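
In case it helps anyone else, the gist of the fix was iterating over the list of changed rows in the request body instead of expecting one row per call. A minimal Python sketch of that idea, assuming a payload shaped roughly like Baserow's batch events (an `items` array next to `event_type`/`table_id`); the exact field names may differ in your setup:

```python
import json

def handle_webhook(payload: dict) -> None:
    """Process one webhook call that may describe many changed rows.

    Assumes a body roughly like {"event_type": ..., "table_id": ...,
    "items": [...]}; verify against what your webhook actually receives.
    """
    event_type = payload.get("event_type")
    rows = payload.get("items", [])  # all changed rows arrive in one request
    for row in rows:
        # run the per-row step of the automation here
        print(f"{event_type}: row {row.get('id')} changed")

# Illustrative payload only (not a real capture):
handle_webhook(json.loads(
    '{"event_type": "rows.updated", "table_id": 1, '
    '"items": [{"id": 10, "Name": "foo"}, {"id": 11, "Name": "bar"}]}'
))
```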

Thanks!

No worries at all, thanks for letting me know. Glad to hear that it is working as expected.

@bram I thought I’d write here for consistency, but do let me know if you’d rather I post it in a separate thread.

Is there a way to limit which data/columns make it into the webhook payload?
I could not find it in the UI or documentation (but I may have missed it).

That would be a super nice feature (let’s call it a cherry on top of the recent webhook cake : )

We’re having issues with webhook payload size limits, which for Make.com are rather low at 5 MB (n8n is at 16 MB, I think), but even that is not enough, because for larger batches of changed rows the webhook ends up sending large parts of the entire table as payload :slight_smile:

It would be nice to have at least the option to send just the data from selected columns, along with some basic metadata like row IDs.

Hi @dev-rd, there currently isn’t a way to limit the fields included in the webhook payload. I do like the idea, though. Out of curiosity, how did the payload end up that big? Is that because the row update limit has been increased, or do you have many fields?

@bram
In short:
we need long texts in a few fields :slight_smile: there is a lot of metadata, and the table has 100+ columns

long version:
I was considering using webhooks as triggers for some custom automations.
For simplicity of use, the user would choose a single select value named something like “send data” or “trigger action xyz”, which would act as a kind of “button” and perform an action on the data within that row or batch of rows.
This actually works quite well for smaller batches of rows and single rows, so we will certainly use this feature in that way!

However, some workflows require batches of more than 1000 rows to be ‘automated’ (with 100+ columns, some of which include a lot of metadata), which is why we end up with huge webhook payloads.
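As a rough back-of-the-envelope (my numbers, not measured): 1000 rows × 100 fields at ~50 bytes of JSON per field is already about 5 MB before any long-text fields are counted, so Make.com’s limit is easy to hit.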

That said, for larger batches we will likely go with a different approach, where we use the webhook just as a trigger from a separate table and then get the data from the target table ‘properly’ via the API (rough sketch below).
Such a workaround should be viable for no-code users. The idea is to use one table as a kind of “control panel”: normal users specify “filters” for the target table there and trigger an automation (a webhook into either Make or n8n) from that “control table”; the automation then reads the desired filters and fetches the matching rows from the target table.
That way, we circumvent the need for huge webhook payloads, but no-code users can still “select” rows for a specific batch task in the target table.
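
To make the “fetch via API” step concrete, here is a rough Python sketch of what the automation could run once the control-table webhook fires. The endpoint and parameter names (`user_field_names`, `include`, `filter__...`) are my reading of the Baserow REST API docs, and the token, table ID, and field names are placeholders, so treat it as a sketch rather than a drop-in:

```python
import requests

BASEROW_URL = "https://api.baserow.io"   # or your self-hosted instance
API_TOKEN = "YOUR_DATABASE_TOKEN"        # placeholder
TARGET_TABLE_ID = 123                    # placeholder: the 'target' table

def fetch_selected_rows(filter_field: str, filter_value: str, fields: list[str]) -> list[dict]:
    """Pull only the rows and columns a batch job needs, instead of
    receiving them all in a webhook payload.

    Parameter names follow my understanding of Baserow's 'list rows'
    endpoint; double-check them against your Baserow version, and note
    that the filter type depends on the field (e.g. single selects use
    a dedicated filter rather than plain 'equal').
    """
    rows, page = [], 1
    while True:
        resp = requests.get(
            f"{BASEROW_URL}/api/database/rows/table/{TARGET_TABLE_ID}/",
            headers={"Authorization": f"Token {API_TOKEN}"},
            params={
                "user_field_names": "true",
                f"filter__{filter_field}__equal": filter_value,  # rows 'selected' by users
                "include": ",".join(fields),                     # only the columns we need
                "page": page,
                "size": 200,                                     # page through large batches
            },
            timeout=30,
        )
        resp.raise_for_status()
        data = resp.json()
        rows.extend(data["results"])
        if not data.get("next"):
            break
        page += 1
    return rows

# Hypothetical usage: rows flagged via the single-select “button” field:
# batch = fetch_selected_rows("Trigger", "send data", ["Name", "Notes"])
```

Because the webhook now only acts as a trigger, its payload size stops mattering; the API call pulls just the columns the job needs.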

Perhaps not the most elegant solution, but it seems doable, at least as a temporary workaround, until we have proper automations/scripts implemented natively in Baserow.

Another way around the payload size issue would be to just send one webhook call per row.

Would it be possible to have a toggle for that in the webhook options?
I assume some people might prefer to have all the data sent in one payload.

Hey @dev-rd, it would be possible to implement an option to send one HTTP request per modified row, but to be honest, it doesn’t look like a feature we’ll be able to prioritize any time soon. By default, a maximum of 200 rows can be edited at once in Baserow, to avoid huge, slow updates. Changing that to a higher number is possible, but it can come with side effects like this one.

It also sounds like the problem is not per se on the side of Baserow, but rather on the side of the automation tool, because it doesn’t accept payloads over a certain size. Another solution might be to self-host n8n and increase the payload size limit; it seems like they have a configurable environment variable for that: Payload Limit for HTTP Node - Questions - n8n Community.


I’ve already found a sensible workaround, so the one webhook call per row is indeed not a priority :slight_smile:
I’ve recently gotten around to diving deeper into self-hosted n8n, and it is proving to be extremely helpful :slight_smile: so, well worth the time investment.

I’ve already used the custom payload size feature. It works OK locally, but we will likely use a different trigger in production anyway, so again, a non-issue right now.

Thanks for all the support!

Glad to hear that @dev-rd. You’re welcome!