Make API timeout due to data dump size. API generate_csv doesn't respect "limit" parameter.
Yari Schutzer
I've created a Make API to handle a data dump of backups, based on this:
However, as the table data grows, the request times out for each table that takes over a minute to respond.
I tried it as part of the URL string, as in:
The link above doesn't successfully pass the ?limit=1000 parameter on to generate_csv, so I cannot use Make to cycle through the data, because I cannot reduce the size of each response.
Note that I did separately implement the backup at the solution level, but that doesn't export data as text the way a CSV does. So when a CSV gets too big and the request times out, there is no way to use the backup for those tables: the export has too many records, yet I cannot limit each loop to 1,000 records.
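To show why a working limit parameter matters here, below is a minimal Python sketch of how the export could be cycled in fixed-size pages. The URL and the `limit`/`offset` parameters are assumptions for illustration only: per this post, the real generate_csv endpoint does not yet honor them.

```python
import urllib.parse

def page_urls(base_url, page_size, total_records):
    """Build one generate_csv URL per page of records.

    The limit/offset query parameters are hypothetical here; the
    real endpoint (per the post) does not yet honor a limit.
    """
    urls = []
    for offset in range(0, total_records, page_size):
        query = urllib.parse.urlencode({"limit": page_size, "offset": offset})
        urls.append(f"{base_url}?{query}")
    return urls

# Example: 2,500 records in pages of 1,000 -> three requests,
# each small enough to finish before Make's module timeout.
urls = page_urls("https://api.example.com/tables/123/generate_csv", 1000, 2500)
```

Each request would then return at most 1,000 records, so no single call has to outlive the Make module's timeout.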
@Peter Novosel recommended I create this as a request. Since I'm certain most users don't use APIs or won't run into the need for data backups, this post is really for tracking purposes more than a feature any casual user will upvote!
Jon Darbyshire
Hiya Yari Schutzer, thanks for this post! I have a few more questions for you:
- Can you provide more details about the size of the data you are trying to export?
- Are there specific tables that are causing the timeout or is it a general issue across all tables?
- Have you tried any other methods to limit the data size or to prevent the timeout?
Yari Schutzer
Jon Darbyshire The size is not specific; the failure is the timeout from Make, which aborts when it doesn't get a response in time. That's really about the time-to-live of the Make module. By using limit, the response would at least complete. If not for the timeout it would be a non-issue, but surely a timeout is necessary in modules to establish how long a reasonable response should take.
This eventually happens to every table: as each table got bigger, its backups started to fail.
There are no other methods, given that every module has a timeout by design, I think. Therefore, "limit" is the only way I can imagine to control the size of the data.
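The full cycle described above can be sketched as a loop that requests pages until a short page signals the end of the table. This is a hedged sketch: `fetch_page` is a hypothetical stand-in for the HTTP call to generate_csv, since the real endpoint does not yet honor a limit.

```python
def export_all(fetch_page, limit=1000):
    """Cycle through an export in pages of at most `limit` records.

    fetch_page(offset, limit) is a placeholder for the HTTP call;
    a page shorter than `limit` means the table is exhausted.
    """
    records = []
    offset = 0
    while True:
        page = fetch_page(offset, limit)
        records.extend(page)
        if len(page) < limit:
            return records
        offset += limit

# Fake fetcher standing in for the API: a table of 2,345 records.
data = list(range(2345))
rows = export_all(lambda off, lim: data[off:off + lim], limit=1000)
```

With this shape, each iteration of the Make scenario stays well under the module timeout regardless of how large the table eventually grows.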