Yeah, it's about trying to plan ahead for model deployment. Whilst it's easy to save out a Keras SavedModel or similar and have that artifact uploaded into the store, I just wanted to check whether there's a more generic solution. I could create a Python class and serialise it out so that it exposes a standard interface, but it's good to check. For example, some artifact representing an arbitrary maths function.

For better context, the idea is to make deploying any artifact we upload with ClearML as easy as possible. In my last project, which was Airflow + MLflow, all models were executable through a standard interface (pyfunc), and making a custom model that was interfaced with the same way as sklearn/Keras models, and thus deployed/served the same way, was done by extending PythonModel ( https://www.mlflow.org/docs/latest/models.html#example-creating-a-custom-add-n-model ).

I'm trying to get the hang of how to do similar things with ClearML, and I've been over the docs for clearml.model.Model, but that doesn't seem to be what I want, which is to be able to get a Task's model and run with it, with bonus points if I don't have to care about what the model itself is.
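To make it concrete, here's a rough sketch of the kind of workflow I'm imagining with ClearML, using a pickled custom class uploaded as a task artifact. I'm not sure artifacts are even the right mechanism here (versus OutputModel/InputModel), and the AddN class, project/task names, etc. are just made up for illustration:

```python
from clearml import Task


# A hand-rolled "pyfunc-style" model: an arbitrary maths function
# behind a standard predict() interface (made-up example class)
class AddN:
    def __init__(self, n):
        self.n = n

    def predict(self, model_input):
        return [x + self.n for x in model_input]


# Producing side: upload the object as a task artifact
# (ClearML pickles arbitrary Python objects passed to upload_artifact)
task = Task.init(project_name="examples", task_name="custom-model-upload")
task.upload_artifact(name="model", artifact_object=AddN(n=5))
task.close()

# Consuming side: fetch the task, pull the artifact back, and call it
# without caring what kind of model it actually is
source_task = Task.get_task(project_name="examples", task_name="custom-model-upload")
model = source_task.artifacts["model"].get()
print(model.predict([1, 2, 3]))  # -> [6, 7, 8]
```

The obvious catch is that the consuming side still needs the class definition importable to unpickle it, which is part of why I'm asking whether there's a more standard ClearML way to do this.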