Hi @<1558986867771183104:profile|ShakyKangaroo32> , what version of clearml are you using?
Think of it this way: you have the pipeline controller, which is the 'special' task that manages the logic, and then you have the pipeline steps. Both the controller and the steps need an agent to execute them, so you need one agent to run the controller and another to run the steps themselves.
I would suggest clicking on 'task_one' and going into full details. My guess is that it's in the 'enqueued' state, probably in the 'default' queue.
Can you add the full log?
SoreDragonfly16 , let me take a look 🙂
GentleSwallow91 , you can also use Task.create()
https://clear.ml/docs/latest/docs/references/sdk/task#taskcreate
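A minimal sketch of what that could look like (the project/task names and the wrapper function are placeholders, not part of the SDK):

```python
def create_task_skeleton(project, name):
    """Create a new draft task entry on the server without running any code locally.

    Unlike Task.init(), Task.create() does not attach to the current process.
    """
    from clearml import Task  # deferred import; requires the clearml package and a configured server

    return Task.create(project_name=project, task_name=name)
```

You can then enqueue or edit the created task from the UI or via the SDK.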
You can do it in one API call as follows:
https://clear.ml/docs/latest/docs/references/api/tasks#post-tasksget_all
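For example, the single call could be shaped like this (a minimal sketch; the concrete filter values, server URL, and helper function are assumptions you'd adapt to your setup):

```python
# Request body for a single tasks.get_all call.
# Field names follow the tasks.get_all reference; the concrete filter
# values (project id, status) are placeholders.
payload = {
    "project": ["<PROJECT_ID>"],
    "status": ["completed"],
    "only_fields": ["id", "name", "status"],
    "order_by": ["-last_update"],
}

def get_all_tasks(api_server, headers):
    """POST the payload to <api_server>/tasks.get_all (requires a reachable ClearML server)."""
    import requests  # deferred so the sketch itself runs without network access

    resp = requests.post(f"{api_server}/tasks.get_all", json=payload, headers=headers)
    resp.raise_for_status()
    return resp.json()["data"]["tasks"]
```

As mentioned, opening the browser dev tools and watching what the web UI sends to this endpoint is a good way to discover which filter fields you need.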
Hi @<1639799308809146368:profile|TritePigeon86> , can you please elaborate? What do you mean by external way?
Hi SoreHorse95 ,
Does ClearML not automatically log all outputs?
Regarding logging, maybe try the following setting in ~/clearml.conf: sdk.network.metrics.file_upload_threads: 16
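In the nested HOCON form of clearml.conf, that setting would look like this (16 is just a starting point to tune against your bandwidth):

```
sdk {
    network {
        metrics {
            # number of threads used for uploading reported files/debug samples
            file_upload_threads: 16
        }
    }
}
```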
@<1547028079333871616:profile|IdealElephant83> , what are you trying to do during the code execution?
Is it a self hosted server?
Hi @<1668427950573228032:profile|TeenyShells80> , you would need to configure it in the clearml.conf of the machine running the clearml-agent
RipeAnt6 , you have to manage your storage on the NAS yourself. We delete data only on the fileserver.
However, you could try mounting the NAS to the fileserver docker as a volume and then deletion should also handle files on the NAS 🙂
Hi @<1529271098653282304:profile|WorriedRabbit94> , you can sign up with a new email
I see, thanks for the input!
Hi @<1690896105262288896:profile|EnergeticTiger5> , can you add a full log of the run please?
Hi @<1539780284646428672:profile|PoisedElephant79> , please post in the same thread you started, no need to spam the main channel 🙂
Regarding your issue - it looks like you have some problem with authentication. How are you spinning up the server?
DilapidatedDucks58 , I think this is what you're looking for
https://github.com/allegroai/clearml/blob/master/docs/clearml.conf#L69
Hi @<1874989039501709312:profile|LividDragonfly0> , you can easily achieve that with pipelines
Try configuring the following in your ~/clearml.conf and tell me if it helps 🙂
sdk.development.default_output_uri: "<S3_bucket>"
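If you'd rather set it per task instead of globally, the same destination can be passed to Task.init - a minimal sketch (the bucket URI and wrapper function are placeholders):

```python
def init_with_output_uri(project, name, bucket="s3://<S3_bucket>/<prefix>"):
    """Override the default output destination for a single task via output_uri."""
    from clearml import Task  # deferred import; requires the clearml package and a configured server

    return Task.init(project_name=project, task_name=name, output_uri=bucket)
```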
Hmmm maybe SuccessfulKoala55 can help 🙂
Ok, will do once I get back to the office, thanks for the heads up! 🙂
SmugDolphin23 , maybe you have an idea?
I would suggest directly using the API for this. Then simply look at what the web UI sends as a reference 🙂
Did you provide an entry point?
Are you using a self hosted server or app.clear.ml ?
EnormousCormorant39 , there are SDK methods for using the datasets. I think this will simplify your process immensely.
https://clear.ml/docs/latest/docs/references/sdk/dataset
Also here is a small example for the usage 🙂
` from clearml import Task, Dataset

task = Task.init(project_name="<PROJECT_NAME>", task_name="<TASK_NAME>")

# Create a dataset and attach files to it
ds = Dataset.create(dataset_name="<DATASET_NAME>", dataset_project="<PROJECT_NAME>")
ds.add_files("<PATH_TO_FILE_OR_FOLDER>")
ds.upload()
ds.finalize() `
The metadata would relate to the entire dataset.
For your use case I think what's relevant is HyperDatasets
Hi @<1561885921379356672:profile|GorgeousPuppy74> , ClearML does support running with multiple GPUs
@<1523701553372860416:profile|DrabOwl94> , I would suggest restarting the elastic container. If that doesn't help, check the ES folder permissions - maybe something changed
I was also curious if I've missed an easy way to specify required packages for the controller, like you can in components (by providing packages=[...]).
I think you can do this using https://clear.ml/docs/latest/docs/references/sdk/task#taskadd_requirements
I'm not sure how the pipeline controller is handled, but on a deeper level it should still be regarded as a task. Meaning that you might need to use task = Task.current_task() and then task.add_requirements(...)
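As a sketch (the wrapper function is hypothetical, and whether add_requirements takes effect on an already-running controller task is an assumption to verify - it is normally called before Task.init):

```python
def add_controller_requirements(packages):
    """Attach required packages to the currently running (controller) task.

    NOTE: Task.add_requirements is normally called before Task.init; applying it
    through Task.current_task() on a live controller is untested here.
    """
    from clearml import Task  # deferred import; requires the clearml package

    task = Task.current_task()
    for pkg in packages:
        task.add_requirements(pkg)
    return task
```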