As I understand it, if I'm running an sklearn experiment locally, I can also save the model artifact by using joblib.dump. How do I set the metadata of the artifact within the source code of the experiment as well, or am I meant to add the metadata separately?
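For reference, a minimal sketch of registering the dumped model and attaching metadata from the experiment code, assuming ClearML's OutputModel metadata API is available in the installed clearml version; the project/task names and metadata values below are placeholders:

```python
import joblib
from clearml import Task, OutputModel
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Local sklearn experiment tracked by ClearML
task = Task.init(project_name="examples", task_name="sklearn local run", output_uri=True)

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

# Serialize the model and register the file as an output model of this task
joblib.dump(model, "model.joblib")
output_model = OutputModel(task=task, framework="scikit-learn")
output_model.update_weights(weights_filename="model.joblib")

# Attach metadata directly from the experiment source code (placeholder values)
output_model.set_metadata("input_size", "4", "int")
output_model.set_metadata("output_size", "3", "int")
```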
Ahhh, I see. Now I know what I was missing. I thought I could skip the preprocessing part. Does this mean that for other engines/frameworks, especially TF/Keras, the serving also sets the input/output based on the preprocessing as well?
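For context, a minimal sketch of the kind of preprocessing module clearml-serving expects, loosely based on the examples in the clearml-serving repo (exact method signatures may vary between versions, and the tensor shapes and payload keys below are placeholders):

```python
from typing import Any
import numpy as np

# preprocess.py registered with the endpoint via --preprocess
class Preprocess(object):
    def __init__(self):
        # Instantiated once when the endpoint is loaded
        pass

    def preprocess(self, body: dict, state: dict, collect_custom_statistics_fn=None) -> Any:
        # Turn the request payload into the input tensor the model expects,
        # e.g. a flattened 28x28 image for a Keras MNIST model
        return np.array(body["image"], dtype=np.float32).reshape(1, 784)

    def postprocess(self, data: Any, state: dict, collect_custom_statistics_fn=None) -> dict:
        # Map the raw model output back to a JSON-serializable response
        return {"digit": int(np.argmax(data))}
```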
I think as long as you install clearml in that venv, it would only be executed within it.
Yeah, because I thought it would be able to figure that out from the model file, and I'm just missing some code/configuration.
Alright. Can you at least point me to an example of setting the input-size and output-size (via the clearml-serving CLI)? I can't find it in the main docs.
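For what it's worth, the Triton examples in the clearml-serving repository register input/output sizes roughly along these lines (a sketch from memory; the service ID, endpoint, model ID, layer names, and shapes are placeholders):

```bash
clearml-serving --id <service_id> model add \
    --engine triton \
    --endpoint "keras_mnist" \
    --preprocess "preprocess.py" \
    --model-id <model_id> \
    --input-size 1 784 --input-name "dense_input" --input-type float32 \
    --output-size -1 10 --output-name "activation_2" --output-type float32
```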
Alright, thanks for the second pair of eyes.
Yeah, it was previously restarted.
Ah, alright. I can look in that direction, thanks.
On Windows, the clearml.conf is located in the user folder. Is there any way to configure it to be moved inside the project's folder, or maybe configure it using the CLI?
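A possible sketch of two workarounds, assuming the CLEARML_CONFIG_FILE environment variable and Task.set_credentials are supported by the installed clearml version; the server URLs and keys below are placeholders:

```python
import os

# Option 1: keep clearml.conf inside the project folder and point the SDK at it.
# Set the variable before clearml is imported/initialized (or set it in the shell).
os.environ["CLEARML_CONFIG_FILE"] = os.path.join(os.getcwd(), "clearml.conf")

from clearml import Task

# Option 2 (alternative): skip the config file and set credentials from code
# Task.set_credentials(
#     api_host="https://api.clear.ml",
#     web_host="https://app.clear.ml",
#     files_host="https://files.clear.ml",
#     key="<access_key>",
#     secret="<secret_key>",
# )

task = Task.init(project_name="examples", task_name="windows config test")
```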
Um, that is not a valid command. And what I want to do is remove the serving instance, not an endpoint.
Never mind, the docs already say to point to serving-inference.