I get an error in TimeGen1 deployment: The model execution took longer than the timeout supplied in request_timeout_ms under request_settings of your model deployment config.

maxime 0 Reputation points
2025-02-13T15:12:52.38+00:00

The error started appearing one week ago even though I had not changed anything in my deployment.

Azure AI services
A group of Azure services, SDKs, and APIs designed to make apps more intelligent, engaging, and discoverable.

Accepted answer
  1. JAYA SHANKAR G S 330 Reputation points Microsoft Vendor
    2025-02-19T09:48:37.9933333+00:00

    Hi @maxime,

    Check the Nixtla client for the request timeout parameter adjustment.

    There, you can set the timeout to None to disable it, or set it to whatever timeout you need.
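    As a configuration sketch, the client-side timeout can be passed when constructing the Nixtla client. The endpoint URL and key below are placeholders, and the exact parameter name may vary between client versions, so check the documentation for your installed version:

    ```python
    # Sketch assuming the `nixtla` Python client; placeholders must be
    # replaced with your own Azure endpoint URL and API key.
    from nixtla import NixtlaClient

    client = NixtlaClient(
        base_url="<your-timegen-endpoint-url>",
        api_key="<your-api-key>",
        timeout=None,  # disable the client-side timeout, or pass a value in seconds
    )
    ```

    Note this only controls how long the client waits; the deployment-side request_timeout_ms named in the error message is configured on the deployment itself.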


1 additional answer

Sort by: Most helpful
  1. Manas Mohanty 655 Reputation points Microsoft Vendor
    2025-02-14T11:02:46.9933333+00:00

    Hi maxime!

    The error below indicates that the model is taking longer than the "request_timeout_ms" value declared in your model deployment:

    The model execution took longer than the timeout supplied in request_timeout_ms under request_settings of your model deployment config.

    You can take the following steps to address the issue:

    1. Adjust the timeout parameters on your endpoint (score_timeout, and the timeouts in custom liveness and readiness probes).
    2. Enable autoscaling on your endpoint so it scales automatically on metrics such as CPU utilization, latency, or requests per minute if the existing endpoint is under load or taking too long.
    3. Use a higher SKU for faster processing.
    4. Optimize your code to predict smaller batches of data to lower inference time.

    Please don't forget to Accept Answer and click Yes for "was this answer helpful" wherever the information provided helps you; this can be beneficial to other community members.
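    The request_timeout_ms named in the error can be raised on a managed online deployment. A minimal sketch using the Azure ML SDK v2, assuming an existing endpoint and deployment (all names below are placeholders; the maximum timeout may be capped by the service):

    ```python
    from azure.ai.ml import MLClient
    from azure.ai.ml.entities import OnlineRequestSettings
    from azure.identity import DefaultAzureCredential

    # Placeholder names -- substitute your own workspace, endpoint, and deployment.
    ml_client = MLClient(
        DefaultAzureCredential(),
        subscription_id="<subscription-id>",
        resource_group_name="<resource-group>",
        workspace_name="<workspace>",
    )

    deployment = ml_client.online_deployments.get(
        name="<deployment-name>", endpoint_name="<endpoint-name>"
    )
    # Raise the scoring timeout (milliseconds) and redeploy.
    deployment.request_settings = OnlineRequestSettings(request_timeout_ms=90000)
    ml_client.online_deployments.begin_create_or_update(deployment).result()
    ```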

    Thank you.
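    For step 4, one generic way to keep each request under the timeout is to split the work into smaller batches and send one inference request per batch. The helper below is a hypothetical sketch, not part of any SDK:

    ```python
    from typing import Iterator, List


    def batches(items: List, batch_size: int) -> Iterator[List]:
        """Yield successive fixed-size chunks so each inference request stays small."""
        for start in range(0, len(items), batch_size):
            yield items[start:start + batch_size]


    # Example: 10 series IDs sent in requests of at most 4 series each.
    series_ids = [f"series_{i}" for i in range(10)]
    chunks = list(batches(series_ids, batch_size=4))
    print([len(c) for c in chunks])  # -> [4, 4, 2]
    ```

    Each chunk would then be forecast in its own request, trading a few extra round trips for a shorter per-request inference time.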

