Hello!
I'm using the clearml-serving module, and when I try to test it I get this error:
{"detail":"Error [<class 'ValueError'>] processing request: Error: Failed loading preprocess code for 'py_code_transformer_model': No module named 'transformers'\n\nTraceback (most recent call last):\n File "/root/clearml/clearml_serving/serving/preprocess_service.py", line 51, in init\n self._instantiate_custom_preprocess_cls(task)\n File "/root/clearml/clearml_serving/serving/preprocess_service.py", line 81, in _instantiate_custom_preprocess_cls\n spec.loader.exec_module(_preprocess)\n File "<frozen importlib._bootstrap_external>", line 940, in exec_module\n File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed\n File "/root/.clearml/cache/storage_manager/global/1d18e4bdb21fe8036a0aa878c460e72b.preprocess.py", line 3, in <module>\n from transformers import AutoTokenizer, PreTrainedTokenizer, TensorType\n File "/usr/local/lib/python3.11/site-packages/clearml/binding/import_bind.py", line 54, in __patched_import3\n mod = builtins.org_import(\n ^^^^^^^^^^^^^^^^^^^^^^^^\nModuleNotFoundError: No module named 'transformers'\n"}
This happens when I send a POST request to the endpoint, even though I can see from the triton-gpu container that the packages are being downloaded as intended.
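For reference, the request I'm testing with looks roughly like this (host, port, and payload are placeholders; the endpoint name is the one from the error above):

# minimal sketch of the test request, assuming the default clearml-serving URL layout
import requests

response = requests.post(
    "http://127.0.0.1:8080/serve/py_code_transformer_model",  # placeholder host/port
    json={"text": "example input"},  # example payload, not my real schema
)
print(response.status_code, response.text)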
Does anyone know how to solve this?