How to fix "ERROR: pip's dependency resolver does not currently take into account all the packages that are installed." while using Promt Flow.

Lovedeep Singh 225 Reputation points
2024-12-17T07:06:14.37+00:00

I have made a custom image to use Python version 3.10.1 with Azure AI Foundry's Prompt Flow (earlier AI Studio). The flow runs successfully if I ignore the following error:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
mlflow 2.13.0 requires protobuf<5,>=3.12.0, but you have protobuf 5.29.1 which is incompatible.
mlflow-skinny 2.13.0 requires protobuf<5,>=3.12.0, but you have protobuf 5.29.1 which is incompatible.

However, I believe this is causing trouble in the final production deployment. I have also checked the version of protobuf in my custom image; it was 4.25.5, which should have satisfied this requirement. Below is the screen capture of the error and the docker container.
[Screenshot: Possible_Reason_For_Deployment_Failure_ML_FLow_Docker_Check]
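
For reference, this is a minimal sketch of how protobuf could be pinned below 5 inside the image so it satisfies mlflow 2.13.0 (placeholder base image and requirements file, not my exact Dockerfile):

    # Sketch only: placeholder base image and requirements file.
    FROM <your-current-base-image>
    COPY requirements.txt .
    # Pin protobuf to the range mlflow 2.13.0 accepts (>=3.12.0,<5) so a later
    # install cannot upgrade it to 5.x, then fail the build on any remaining conflict.
    RUN pip install -r requirements.txt "protobuf>=3.12.0,<5" && pip check

Running pip check during the build surfaces the same dependency conflict at image build time instead of only as a warning later on.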

The error which I am getting while doing the final deployment is as below:
[Screenshot: Error_For_Random_Try_1]

The error is ResourceNotReady, and the documentation for troubleshooting it (https://learn.microsoft.com/en-us/azure/machine-learning/how-to-troubleshoot-online-endpoints?view=azureml-api-2&tabs=cli#error-resourcenotready) mentions a score.py file. I want to know what the score.py file is: does it get generated automatically during deployment, or do I need to create it separately when using a custom image? And ultimately, how can I resolve this issue?



1 answer

  1. romungi-MSFT 48,126 Reputation points Microsoft Employee
    2024-12-18T05:35:13.28+00:00

    @Lovedeep Singh The first two conditions that should be satisfied are:

    • the docker image must be created from the prompt flow base image, mcr.microsoft.com/azureml/promptflow/promptflow-runtime-stable:<newest_version>. You can find the newest version here. (A minimal Dockerfile sketch follows this list.)
    • the environment definition must include the inference_config.
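
    For the first condition, a minimal Dockerfile sketch built on the prompt flow base image could look like this (replace <newest_version> and the requirements file with your own values):

    # Sketch only: start from the prompt flow runtime base image.
    FROM mcr.microsoft.com/azureml/promptflow/promptflow-runtime-stable:<newest_version>

    # Install the flow's additional requirements on top of the base image.
    COPY requirements.txt .
    RUN pip install -r requirements.txt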

    If your image is already created on top of the base image, then the current flow.dag.yaml file should point to the correct location of this image.
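
    As a rough illustration, the image reference in that yaml would look something like this (registry, repository and tag are placeholders):

    # Placeholder values; point this at wherever your custom image is pushed.
    image: <your-registry>.azurecr.io/<your-custom-promptflow-image>:<tag>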

    The second requirement, inference_config, should be added after the image field in your current yaml:

    inference_config:
      liveness_route:
        port: 8080
        path: /health
      readiness_route:
        port: 8080
        path: /health
      scoring_route:
        port: 8080
        path: /score
    
    
    

    If this answers your query, do click Accept Answer and Yes for was this answer helpful. And, if you have any further query do let us know.

