I had initially just pasted the new credentials in place of the existing ones in my conf file;
Running clearml-init now fails at verifying credentials
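For reference, this is roughly what the api block in my ~/clearml.conf looks like after pasting the new credentials (the server URLs and keys below are placeholders, not my real values):

```
api {
    # default self-hosted server URLs; substitute your own
    web_server: http://localhost:8080
    api_server: http://localhost:8008
    files_server: http://localhost:8081
    credentials {
        "access_key" = "<new-access-key>"
        "secret_key" = "<new-secret-key>"
    }
}
```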
Configuration completed now; it was a proxy issue from my end
However, running my pipeline from a different machine still gives me a problem
2022-07-12 13:41:39,309 - clearml.Task - ERROR - Action failed <400/12: tasks.create/v1.0 (Validation error (error for field 'name'. field is required!))> (name=custom pipeline logic, system_tags=['development'], type=controller, comment=Auto-generated at 2022-07-12 08:11:38 UTC by mbz1kor@BMH1125053, project=2ecfc7efcda448a6b6e7de61c8553ba1, input={'view': {}})
Traceback (most recent call last):
File "main.py", line 189, in <module>
executing_pipeline(
File "/home/mbz1kor...
Hey, so I was able to get the local .py files imported by adding the folder to sys.path
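Roughly what I added at the top of main.py (the folder and module names are just examples, not my real ones):

```python
import sys
from pathlib import Path

# make the folder containing my local .py modules importable before anything else
sys.path.append(str(Path(__file__).parent / "local_modules"))

from preprocessing import build_dataset  # hypothetical local module/function
```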
So I did exactly that, and the name and path of the model in the local repo are noted;
However, I want to upload it to the fileserver
Hey, is it possible for me to upload a pdf as an artefact?
Basically I have a script that generates a pipeline report in pdf format, was wondering if that can be logged
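Something like this is what I had in mind (the artifact name and file path are just examples); as far as I can tell, upload_artifact accepts a local file path:

```python
from clearml import Task

task = Task.current_task()  # inside the component/controller that generates the report
# upload the generated PDF as an artifact so it shows up on the task in the UI
task.upload_artifact(name="pipeline_report", artifact_object="reports/pipeline_report.pdf")
```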
Is there a way to store the return values after each pipeline stage in a format other than pickle?
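The workaround I'm considering in the meantime (not sure it's the intended way) is to skip the pickled return value and explicitly upload the stage output as a JSON artifact from inside the component:

```python
import json
from clearml import Task

# example stage output: write it to JSON and upload the file explicitly,
# so it is stored in a readable format rather than as a pickle
results = {"accuracy": 0.93, "epochs": 10}
with open("stage_results.json", "w") as f:
    json.dump(results, f)
Task.current_task().upload_artifact("stage_results", artifact_object="stage_results.json")
```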
Yep, the pipeline finishes but the status is still at "running". Do we need to close a logger that we use for scalars or anything?
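If it helps, this is the cleanup I was planning to try at the end of the controller (just a guess on my part):

```python
from clearml import Task, Logger

# flush pending scalar reports and wait for any uploads before the controller exits
Logger.current_logger().flush()
Task.current_task().flush(wait_for_uploads=True)
```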
Yep, no clue why I had two of them either;
It started my pipeline and, a few seconds in, another pipeline showed up
More context:
I have agents running the stages, and the pipeline itself is being executed locally here.
More context:
I'm using agents to remotely execute the pipeline,
A weird observation is that adding ‘import timm’ to the pipeline controller doesn't make the agent running the pipeline controller install timm.
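To illustrate what I mean: this is roughly how my components look now, and the packages argument is the only way I've found so far to get timm installed on the agent running a step (names below are just examples):

```python
from clearml.automation.controller import PipelineDecorator

# packages= forces the agent running this step to install timm, since the
# controller's own `import timm` doesn't seem to propagate to the step
@PipelineDecorator.component(return_values=["backbone"], packages=["timm", "torch"])
def pick_backbone(name="resnet18"):
    import timm  # imported inside the component so the step has it at runtime
    model = timm.create_model(name, pretrained=False)
    return type(model).__name__
```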
Adding to this,
Here I want to run the entire pipeline remotely - both the controller and the components run on agents
How do I provide a specific output path to store the model? (Say I want the server to store it in ~/models)
I'm training my model via a remote agent.
Thanks to your suggestion I could log the model as an artefact (using PipelineDecorator.upload_model()), but only the path is reflected; I can't seem to download the model from the server
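Just to check I'm asking the right thing: my current understanding is that unless output_uri is set, only the local path of the weights gets registered, so something like this is what I'm after (project/task names are placeholders):

```python
from clearml import Task

# output_uri controls where artifacts/models produced by this task are uploaded:
# True = the default ClearML fileserver; it can also be a directory path or an
# s3:// style bucket URI (values here are just examples)
task = Task.init(project_name="my_project", task_name="train_model", output_uri=True)
```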
Also,
How do I just submit a pipeline to the server to be executed by an agent?
Currently I am able to use PipelineDecorator.run_locally() to run it;
However, I just want to push it to a queue and make the agent do its trick; any recommendations?
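What I think I need, based on my reading of the docstrings (so please correct me), is to set the execution queues and drop run_locally(), so that calling the decorated function just registers and enqueues everything (queue/project names below are placeholders):

```python
from clearml.automation.controller import PipelineDecorator

# queue that the component tasks will be sent to
PipelineDecorator.set_default_execution_queue("default")

@PipelineDecorator.pipeline(
    name="custom pipeline logic", project="my_project", version="1.0",
    # as far as I understand, this is the queue the controller task itself goes to
    # when run_locally() is NOT called; "services" seems to be the default
    pipeline_execution_queue="services",
)
def executing_pipeline(seed=42):
    pass  # calls to the pipeline components go here

if __name__ == "__main__":
    # no PipelineDecorator.run_locally() here: running the script should just
    # register and enqueue the pipeline, and the agents do the rest
    executing_pipeline(seed=123)
```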
http://localhost:9000/<bucket>
My MinIO instance is hosted locally on port 9000.
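For completeness, this is how I've configured it on my side (keys are placeholders); as far as I understand, the MinIO endpoint goes under sdk.aws.s3 in clearml.conf:

```
sdk {
    aws {
        s3 {
            credentials: [
                {
                    # local MinIO endpoint; keys are placeholders
                    host: "localhost:9000"
                    key: "<minio-access-key>"
                    secret: "<minio-secret-key>"
                    multipart: false
                    secure: false
                }
            ]
        }
    }
}
```

and then output_uri would be something like s3://localhost:9000/<bucket>, if I've understood correctly.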
Can PipelineDecorator.upload_model be used to store models on the clearml fileserver?
Thanks for actively replying, David
Any update on the example for saving a model from within a pipeline (specifically in .pth or .h5 formats)?
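Meanwhile, this is what I've pieced together from the docs (might not be the recommended way): register the .pth file through OutputModel so the file itself gets uploaded rather than only its local path being recorded:

```python
import torch
from clearml import Task, OutputModel

model = torch.nn.Linear(4, 2)  # toy model standing in for the real one

task = Task.current_task()  # inside the pipeline component that trained the model
torch.save(model.state_dict(), "model.pth")  # .pth here; same idea for .h5 with Keras

# register and upload the weights; with output_uri / default_output_uri pointing at
# the fileserver (or MinIO), the file itself should get uploaded, not just its path
OutputModel(task=task, framework="PyTorch").update_weights("model.pth")
```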