Hello, when I use something like the code below to log a LangChain model and then try to deploy the logged model with the default image endpoint (example: azureml://registries/azureml/environments/mlflow-py39-inference/versions/2), I get this error:
File "/opt/miniconda/envs/userenv/lib/python3.11/site-packages/azureml_inference_server_http/server/user_script.py", line 77, in load_script
main_module_spec.loader.exec_module(user_module)
File "<frozen importlib._bootstrap_external>", line 940, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/var/mlflow_resources/mlflow_score_script.py", line 374, in <module>
input_param, output_param, params_param = get_parameter_type(sample_input, sample_output, sample_params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/var/mlflow_resources/mlflow_score_script.py", line 342, in get_parameter_type
param_arg[key] = NumpyParameterType(value, enforce_shape=False)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/miniconda/envs/userenv/lib/python3.11/site-packages/inference_schema/parameter_types/numpy_parameter_type.py", line 33, in __init__
raise Exception("Invalid sample input provided, must provide a sample Numpy array.")
Exception: Invalid sample input provided, must provide a sample Numpy array.
This is because mlflow_score_script.py contains these lines:
# ...
elif isinstance(sample_input_ex, dict):
    _logger.info("sample input is a dict")
    # TODO keeping this around while _infer_schema doesn't work on dataframe string signatures
    param_arg = {}
    for key, value in sample_input_ex.items():
        param_arg[key] = NumpyParameterType(value, enforce_shape=False)
    input_param = StandardPythonParameterType(param_arg)
# ...
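To illustrate the failure mode, here is a minimal sketch. The `NumpyParameterType` stand-in below is a hypothetical simplification of the class in `inference_schema` (not its real implementation), reduced to the check that matters: per the traceback, its constructor rejects any sample that is not a NumPy array. Since a LangChain model's sample input is typically a dict of strings, every value fails that check:

```python
import numpy as np


class NumpyParameterType:
    """Hypothetical stand-in for inference_schema's NumpyParameterType,
    reduced to the validation shown in the traceback: the sample input
    must be a numpy.ndarray."""

    def __init__(self, sample_input, enforce_shape=True):
        if not isinstance(sample_input, np.ndarray):
            raise Exception(
                "Invalid sample input provided, must provide a sample Numpy array."
            )
        self.sample_input = sample_input
        self.enforce_shape = enforce_shape


# A dict sample input whose values are strings (as for a LangChain model)
# fails for every value, while an ndarray value passes:
sample_input_ex = {"question": "What is MLflow?"}

for key, value in sample_input_ex.items():
    try:
        NumpyParameterType(value, enforce_shape=False)
    except Exception as e:
        print(f"{key}: {e}")  # same error as in the deployment log

# An ndarray value is accepted:
NumpyParameterType(np.array(["What is MLflow?"]), enforce_shape=False)
```

So the score script's dict branch assumes every dict value can be wrapped as a NumPy array, which does not hold for string-valued signatures.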
Am I doing something wrong?
Thank you for your help.