
and then have a wrapper that gets the model data and selects which way to construct and deserialise the model class.
```
from clearml import Task

def get_model(task_id, model_name):
    task = Task.get_task(task_id)
    try:
        model_data = next(model for model in task.models['output'] if model.name == model_name)
    except StopIteration:
        raise ValueError(f'Model {model_name} not found in: {[model.name for model in task.models["output"]]}')
    filename = model_data.get_local_copy()
    model_type = ...
```
while in our own code:
```
if model_type == 'XGBClassifier':
    model = XGBClassifier()
    model.load_model(filename)
```
Yes, this is exactly how I solved it in the end
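For completeness, a fuller version of that wrapper might look like the sketch below; the config_dict lookup for model_type and the single XGBClassifier branch are assumptions for illustration, not the exact code:
```
from clearml import Task
from xgboost import XGBClassifier


def get_model(task_id, model_name):
    # Find the named output model on the training task
    task = Task.get_task(task_id)
    try:
        model_data = next(model for model in task.models['output'] if model.name == model_name)
    except StopIteration:
        raise ValueError(f'Model {model_name} not found in: {[model.name for model in task.models["output"]]}')
    filename = model_data.get_local_copy()
    # Assumption: model_type was stored in the model design (see update_design further down)
    model_type = (model_data.config_dict or {}).get('model_type')
    if model_type == 'XGBClassifier':
        model = XGBClassifier()
        model.load_model(filename)
        return model
    # Fail loudly on anything we do not know how to deserialise
    raise ValueError(f'Unknown model_type: {model_type}')
```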
```
pickle.dump(
    {
        'model': model,
        'X_train': X_train,
        'Y_train': Y_train,
        'X_test': X_test,
        'Y_test': Y_test,
        'impute_values': impute_values,
    },
    open(self.output_filename, 'wb'),
)
```
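And the read-back side of that bundle is just the mirror image, as a sketch (output_filename is assumed to be the same path as self.output_filename above):
```
import pickle

# Load the bundle saved above; the keys mirror the pickle.dump call
with open(output_filename, 'rb') as f:  # output_filename: placeholder for the path written above
    bundle = pickle.load(f)

model = bundle['model']
X_train, Y_train = bundle['X_train'], bundle['Y_train']
X_test, Y_test = bundle['X_test'], bundle['Y_test']
impute_values = bundle['impute_values']
```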
That's because then I need to teach every DS how to use the ClearML API
this is a bit WIP but we save it with the design of the model:
```
parameters = dict(self.parameters, model_type='XGBClassifier')
...
output_model.update_design(config_dict=parameters)
```
```
import xgboost  # noqa
self._model = xgboost.Booster()
self._model.load_model(self._get_local_model_file())
```
I think this is because of the version of xgboost that serving installs. How can I control that?
I want the model to be stored in a way that clearml-serving can recognise it as a model
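A sketch of one way to do that (the model name, file path and the Framework constant are assumptions here): register the weights file explicitly as an OutputModel with the framework set, so it shows up as a proper model rather than a generic artifact:
```
from clearml import Task, OutputModel
from clearml.model import Framework

task = Task.current_task()

# Register the saved weights file as an output model on the task,
# tagging the framework so it is recognised as an XGBoost model
output_model = OutputModel(task=task, name='churn-xgb', framework=Framework.xgboost)
output_model.update_weights(weights_filename='model.xgb')
```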
The DSes would expect the same interface as they used in the code that saved the model (me too TBH)
I'd rather just fail if they try to use a model that is unknown.
I just disabled all of them with auto_connect_frameworks=False
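For reference, auto_connect_frameworks also accepts a dict, so individual integrations can be switched off instead of everything; the project/task names are placeholders and the exact keys below are assumptions based on the clearml docs:
```
from clearml import Task

task = Task.init(
    project_name='DS Experiments',  # placeholder
    task_name='train-xgb',          # placeholder
    # Disable only the noisy integrations; everything else keeps auto-logging
    auto_connect_frameworks={'xgboost': False, 'scikit': False, 'joblib': False},
)
```
If sklearn models still get picked up after disabling 'scikit', it may be the joblib hook capturing the saved file, hence the extra key in the sketch.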
I didn't realise that pickling is what triggers clearml to pick it up. I am actually saving a dictionary that contains the model as a value (+ training datasets)
well, it just shoved the dataset files with cryptic data_1/data_2 names among the artifacts.
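A sketch of uploading those pieces as explicitly named artifacts instead, so they don't land as data_1/data_2 (the variable names are the ones from the pickle bundle above):
```
from clearml import Task

task = Task.current_task()

# Explicitly named artifacts instead of relying on auto-pickle capture
task.upload_artifact('X_train', artifact_object=X_train)
task.upload_artifact('Y_train', artifact_object=Y_train)
task.upload_artifact('X_test', artifact_object=X_test)
task.upload_artifact('Y_test', artifact_object=Y_test)
task.upload_artifact('impute_values', artifact_object=impute_values)
```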
This receives the payload from the server and turns it into something that can be fed to the model. This in our case depends on a data structure that is stored on the clearml server as an artifact. I need to communicate this to the class so it can pick it up and use it when called
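Roughly what I mean, as a sketch (the Preprocess method names follow the clearml-serving examples and exact signatures vary by version; the task id and artifact name are placeholders):
```
from clearml import Task


class Preprocess(object):
    def __init__(self):
        # Pull the helper data structure from the training task's artifacts
        train_task = Task.get_task(task_id='TRAINING_TASK_ID')  # placeholder id
        self.impute_values = train_task.artifacts['impute_values'].get()

    def preprocess(self, body, state, collect_custom_statistics_fn=None):
        # Fill any missing fields in the request from the stored impute values
        # (assumes the payload is a flat dict of feature name -> value)
        features = dict(self.impute_values)
        features.update(body)
        return features

    def postprocess(self, data, state, collect_custom_statistics_fn=None):
        # Wrap the raw model output in a response dict
        return {'prediction': data}
```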
yeah, Friday afternoon
BTW you are not exporting Framework in __init__ so you need to import it like from clearml.model import Framework
If I do this it still autorecords the sklearn one
I am running a script
also random tasks are popping up in the DevOps project in the UI
I absolutely need to pin the packages (incl. the main DS packages) I use.
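One option for that, as a sketch (versions and names below are placeholders, not recommendations): pin them through Task.add_requirements before Task.init so they are recorded in the task's installed packages:
```
from clearml import Task

# Must be called before Task.init() for the pins to be recorded
Task.add_requirements('xgboost', '1.7.6')        # placeholder version
Task.add_requirements('scikit-learn', '1.3.2')   # placeholder version
Task.add_requirements('pandas', '2.1.4')         # placeholder version

task = Task.init(project_name='DS Experiments', task_name='train-xgb')  # placeholder names
```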
Having human-readable ids always helps communication, but programmatically we are definitely going to use the "real" id. But I think we are too early into this and I will report back on how it is going.