Answered

Hello,

The scenario I am working on involves writing/posting large data sets to an API. At times there may be 100,000+ entries to post.

To audit and validate the post, is there an option to split it into smaller, throttled POST requests so the backend API service is not overrun?

For example, segment and throttle the requests into POSTs of 500 lines each, log each response, and, if there are no errors, continue sending the remaining POST requests?

The AWS API Gateway provides throttling options that may meet these requirements. Does autoflow support any similar throttling options?

https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-request-throttling.html

Best Answer

There are two different kinds of throttling being discussed here. One is throttling your POST requests on the client side; the AWS feature you referred to is throttling on the server side. We plan to add server-side throttling, but for throttling your POST requests today, you can split your data into chunks and use any of the iteration actions to iterate over those chunks and send the POST requests. We may be able to provide an iteration action that makes this easier to achieve. Will get back to you on that.
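In the meantime, here is a minimal sketch of the chunk-and-iterate idea in plain Python rather than autoflow itself; the endpoint URL, the JSON payload shape, and the 500-line chunk size are all placeholders you would adjust for your API:

```python
import requests

# Hypothetical endpoint and chunk size -- adjust both for your API.
API_URL = "https://example.com/api/records"
CHUNK_SIZE = 500

def post_in_chunks(records, url=API_URL, chunk_size=CHUNK_SIZE):
    """POST `records` in fixed-size chunks, stopping at the first error."""
    for start in range(0, len(records), chunk_size):
        chunk = records[start:start + chunk_size]
        response = requests.post(url, json=chunk, timeout=30)
        # Log each response for auditing before deciding to continue.
        print(f"chunk {start // chunk_size + 1}: HTTP {response.status_code}")
        response.raise_for_status()  # abort the run if the API reported an error
```

The same pattern applies in autoflow: break the data into chunks, have the iteration action send one chunk per POST, and check each response before moving on.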


Thanks David,

Any chance you have an example of how to build the iterations in autoflow to split a POST request into smaller, predefined chunks (say 500 lines per POST)?
