
Langchain Model Deploy failed when putting input_sample #3618

Open
prise6 opened this issue Nov 25, 2024 · 1 comment


prise6 commented Nov 25, 2024

Hello. When I use something like the following to log a LangChain model:

import mlflow

input_example = {"user_message": "foobar"}
with mlflow.start_run():
    model_info = mlflow.langchain.log_model(
        lc_model=lc_model_path,
        artifact_path=artifact_path,
        input_example=input_example,
        signature=mlflow.models.infer_signature(
            model_input=input_example, model_output="output"
        ),
    )

and then try to deploy the logged model with the default image endpoint (for example, azureml://registries/azureml/environments/mlflow-py39-inference/versions/2), the deployment fails with this error:

 File "/opt/miniconda/envs/userenv/lib/python3.11/site-packages/azureml_inference_server_http/server/user_script.py", line 77, in load_script
   main_module_spec.loader.exec_module(user_module)
 File "<frozen importlib._bootstrap_external>", line 940, in exec_module
 File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
 File "/var/mlflow_resources/mlflow_score_script.py", line 374, in <module>
   input_param, output_param, params_param = get_parameter_type(sample_input, sample_output, sample_params)
                                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/var/mlflow_resources/mlflow_score_script.py", line 342, in get_parameter_type
   param_arg[key] = NumpyParameterType(value, enforce_shape=False)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 File "/opt/miniconda/envs/userenv/lib/python3.11/site-packages/inference_schema/parameter_types/numpy_parameter_type.py", line 33, in __init__
   raise Exception("Invalid sample input provided, must provide a sample Numpy array.")
Exception: Invalid sample input provided, must provide a sample Numpy array.

This is because mlflow_score_script.py contains these lines:

#...
            elif isinstance(sample_input_ex, dict):
                _logger.info("sample input is a dict")
                # TODO keeping this around while _infer_schema doesn't work on dataframe string signatures
                param_arg = {}
                for key, value in sample_input_ex.items():
                    param_arg[key] = NumpyParameterType(value, enforce_shape=False)
                input_param = StandardPythonParameterType(param_arg)
#...
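The snippet above wraps every value of the sample input dict in `NumpyParameterType`. Based on the traceback, that constructor rejects anything that is not a NumPy array, so a plain string value like `"foobar"` fails the check. A minimal illustration of why (using NumPy directly, not the `inference_schema` package):

```python
import numpy as np

# The scoring script passes each dict value to NumpyParameterType(value).
# Per the traceback, the constructor requires a numpy array; a plain
# string value from the input_example fails that requirement.
input_example = {"user_message": "foobar"}

for key, value in input_example.items():
    print(key, isinstance(value, np.ndarray))  # user_message False
```

So any `input_example` whose dict values are plain Python scalars or strings will hit this exception at endpoint startup.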

Am I doing something wrong?

Thank you for your help.
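As a possible workaround (an assumption on my side, not verified against the scoring script), converting the sample dict values to NumPy arrays before logging should let the `NumpyParameterType(value)` check pass:

```python
import numpy as np

# Hypothetical workaround: wrap each sample value in a 1-element numpy
# array so that NumpyParameterType(value) receives an ndarray.
input_example = {"user_message": "foobar"}
np_example = {k: np.array([v]) for k, v in input_example.items()}

print(all(isinstance(v, np.ndarray) for v in np_example.values()))  # True
```

`np_example` would then be passed as `input_example` to `mlflow.langchain.log_model`; whether the rest of the scoring pipeline accepts array-wrapped strings is untested.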

@Abhi7739309

HELP IN
