Hello @Kishore
Here are some suggestions that might help:
- Consider using Durable Functions: Durable Functions are designed for long-running, stateful workflows. They persist orchestration state for you and handle retries and failures gracefully, which makes it much easier to break one large fetch into smaller, resumable steps instead of doing everything in a single run.
- Use a queue to process requests asynchronously: Fetching all the data in a single invocation can overload the function and risks hitting the execution timeout. Pushing individual work items onto a queue and processing them asynchronously spreads the load and keeps each invocation small.
- Optimize your code: Profile first to find the hot spots. I/O-bound API calls can often run in parallel rather than sequentially, and a better caching strategy can cut down the number of API calls you need to make in the first place.
- Monitor your function: Use Application Insights to track execution time, failure rates, and dependency calls, so you can see where the time is actually being spent and which of the changes above helps most.
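On the Durable Functions point: the core benefit is splitting one long fetch into small, retryable units of work (the fan-out pattern). Here is a plain-Python sketch of that idea, with `fetch_page` as a hypothetical activity; in a real Durable orchestration, the retry loop below is what `RetryOptions` handles for you:

```python
import time

def fetch_page(page):
    # Hypothetical activity function: fetch one page of data
    # from the upstream API instead of fetching everything at once.
    return [f"record-{page}-{i}" for i in range(3)]

def call_with_retry(fn, arg, attempts=3, delay=0.01):
    # Sketch of a retry policy. Durable Functions gives you this
    # declaratively; shown inline here only to illustrate the behaviour.
    for attempt in range(attempts):
        try:
            return fn(arg)
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)

def fetch_all(pages):
    # Fan-out: many small, independently retryable calls,
    # rather than one monolithic long-running call.
    results = []
    for page in range(pages):
        results.extend(call_with_retry(fetch_page, page))
    return results
```

Each page fetch can now fail and retry on its own without restarting the whole job, which is exactly what an orchestrator checkpoint buys you.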
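For the queue suggestion, in Azure you would typically use a Storage Queue or Service Bus trigger; the sketch below uses an in-process `queue.Queue` with worker threads purely to illustrate the pattern (`process` is a hypothetical stand-in for one API call):

```python
import queue
import threading

def process(item):
    # Hypothetical per-item work (e.g. one API call for one record).
    return item * 2

def worker(tasks, results):
    # Each worker handles one item at a time, so no single run
    # ever has to hold or fetch the whole data set.
    while True:
        item = tasks.get()
        if item is None:  # sentinel: no more work
            tasks.task_done()
            break
        results.append(process(item))
        tasks.task_done()

def run(items, workers=4):
    tasks = queue.Queue()
    results = []
    threads = [threading.Thread(target=worker, args=(tasks, results))
               for _ in range(workers)]
    for t in threads:
        t.start()
    for item in items:
        tasks.put(item)
    for _ in threads:
        tasks.put(None)  # one sentinel per worker
    tasks.join()
    for t in threads:
        t.join()
    return results
```

With a real queue trigger, each message becomes its own short function invocation, so a spike in data volume just means a longer queue, not an overwhelmed function.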
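For the optimization bullet, the two easiest wins for I/O-bound fetches are a thread pool (so calls overlap while waiting on the network) and a cache (so repeated keys are only fetched once). A minimal sketch, where `fetch` is a hypothetical API call:

```python
from concurrent.futures import ThreadPoolExecutor
from functools import lru_cache

@lru_cache(maxsize=None)
def fetch(key):
    # Hypothetical API call; lru_cache means each distinct key
    # is only fetched once per process, cutting repeat API calls.
    return {"key": key, "value": key.upper()}

def fetch_many(keys, max_workers=8):
    # Parallel fan-out over I/O-bound calls: threads overlap the
    # waiting time, and results come back in input order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch, keys))
```

Tune `max_workers` to what the upstream API tolerates; more threads help only while the work is dominated by waiting on responses.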
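And for monitoring, Application Insights already captures request durations automatically, but it can help to time specific steps yourself. A minimal sketch of a timing decorator; in Azure you would emit the duration as a custom metric or trace rather than keeping it in a dict as done here:

```python
import time
from functools import wraps

def timed(fn):
    # Records wall-clock duration of each call. In a real function app,
    # report this value to Application Insights instead of storing it.
    stats = {}

    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            stats[fn.__name__] = time.perf_counter() - start

    wrapper.stats = stats
    return wrapper
```

Timing the fetch step separately from the processing step tells you quickly whether parallelism or caching is the change worth making.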
I hope these suggestions help. If so, please mark this response as Answered. This not only acknowledges our efforts but also helps other community members who may be looking for similar solutions.