What is the context length for DeepSeek-R1?

Leo Chow 20 Reputation points
2025-03-10T10:37:41.3766667+00:00

I am aware that when Azure AI Foundry first launched the DeepSeek-R1 model not too long ago, the max context length was 40k tokens. This was also when Azure AI Foundry was offering the DeepSeek-R1 model for free.

Now that DeepSeek-R1 is no longer free on Azure AI Foundry, what is the max context length for DeepSeek-R1? Is it still 40k tokens? Note: the original DeepSeek-R1 model supports a context length of up to 128k tokens.

Azure AI Language

Accepted answer
  Pavankumar Purilla 4,280 Reputation points · Microsoft External Staff
    2025-03-10T21:40:47.0066667+00:00

    Hi Leo Chow,
    Currently, the DeepSeek-R1 model is in Preview mode, and it supports a maximum context length of 128k tokens. This extended context length enables the model to excel at complex reasoning tasks, including language understanding, scientific reasoning, and coding.

    Hope this helps. Do let us know if you have any further queries.


    If this answers your query, please click Accept Answer and select Yes for "Was this answer helpful?".

    1 person found this answer helpful.

0 additional answers
