Not sure what you mean, it looks like your experiment ran for 6 minutes. Can you attach the full log?
AlertCrow40 Hi!
How are you trying to connect to your Jupyter notebook? Can you provide a snippet? What version of clearml are you using?
Hi DrabCockroach54 , what version of ClearML are you using? What OS is this running on?
Moving objects between steps is usually done via the artifacts mechanism. How are you building the pipeline, with decorators?
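For reference, passing objects between decorator-based pipeline steps looks roughly like this (a minimal sketch; names and values are placeholders):
```python
from clearml import PipelineDecorator

@PipelineDecorator.component(return_values=["data"])
def step_one():
    # Whatever the step returns is stored as an artifact of the step's task
    return {"numbers": [1, 2, 3]}

@PipelineDecorator.component(return_values=["total"])
def step_two(data):
    # The artifact produced by step_one is passed in here as a regular argument
    return sum(data["numbers"])

@PipelineDecorator.pipeline(name="artifact-passing-demo", project="examples", version="0.1")
def pipeline_logic():
    data = step_one()
    print(step_two(data))

if __name__ == "__main__":
    PipelineDecorator.run_locally()  # remove to enqueue the steps instead of running locally
    pipeline_logic()
```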
I think something might be blocking ports on your local machine. Did you change the port mapping for the ClearML dockers?
There aren't any specific functions for this, but all of this information sits on the task object. I suggest running dir(task) to see where this attribute is stored
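For example (assuming you already have a task object, e.g. from Task.get_task()):
```python
from clearml import Task

task = Task.get_task(task_id="<your_task_id>")  # placeholder ID
# List the public attributes/methods of the Task object to find what you need
print([name for name in dir(task) if not name.startswith("_")])
```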
It looks like it's hanging on the connection to the ClearML server
UnevenDolphin73 , you can configure the timeout through the apiserver settings in ..\config\default\apiserver.conf
and just set it there as auth.default_expiration_sec
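Something along these lines, assuming the file follows the usual HOCON layout (the value is just an example):
```
auth {
    # token expiration time in seconds (example: 30 days)
    default_expiration_sec: 2592000
}
```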
You mean you would like to delete an output model of a task if other models in the task surpass it?
You need to separate the Task object itself from the code that is running. If you're manually 'reviving' a task but then nothing happens and no code is running, the task will eventually get aborted. I'm not sure I entirely understand what you're doing, but I have a feeling it's something 'hacky'.
Hi @<1610445887681597440:profile|WittyBadger59> , how are you reporting the plots?
I would suggest taking a look at the reporting examples in the documentation and running all the different examples to see the reporting capabilities.
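As a quick taste, explicit reporting looks roughly like this (project/task names are placeholders):
```python
import numpy as np
from clearml import Task

task = Task.init(project_name="examples", task_name="reporting demo")
logger = task.get_logger()

# Scalars show up under the SCALARS tab
for i in range(10):
    logger.report_scalar(title="loss", series="train", value=1.0 / (i + 1), iteration=i)

# Plots (e.g. a 2D scatter) show up under the PLOTS tab
logger.report_scatter2d(
    title="example scatter", series="points", scatter=np.random.rand(50, 2), iteration=0
)
```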
For example, in the response of tasks.get_by_id you get the data in data.tasks.0.started and data.tasks.0.completed
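If you're calling the REST API directly, it would look roughly like this (server URL, credentials and task ID are placeholders, and the exact nesting may differ slightly):
```python
import requests

resp = requests.post(
    "https://api.clear.ml/tasks.get_by_id",
    json={"task": "<task_id>"},
    auth=("<access_key>", "<secret_key>"),  # HTTP basic auth with your API credentials
)
print(resp.json()["data"])  # the started/completed timestamps live inside this structure
```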
I hope this helps 🙂
Hi FierceHamster54 ,
Does squashing two datasets delete the two original datasets?
I don't think so. Should just create a new one.
Is it possible to edit tags using the SDK on a finalized dataset?
I think so. I don't see a dedicated method for this on the Dataset module, but Datasets are basically tasks, so you can fetch the dataset's underlying task and then use Task.add_tags()
https://clear.ml/docs/latest/docs/references/sdk/task#add_tags
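Something like this should work (a sketch; the dataset ID is a placeholder, and the dataset ID should map to the underlying task ID):
```python
from clearml import Dataset, Task

ds = Dataset.get(dataset_id="<dataset_id>")
# Fetch the task backing the dataset and tag it
task = Task.get_task(task_id=ds.id)
task.add_tags(["my-tag"])
```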
Hi @<1523702786867335168:profile|AdventurousButterfly15> , are the models logged in the artifacts section?
JitteryCoyote63 , Hi 🙂
Why would you expect to see the enqueued experiments when sorting by 'started' if they haven't started yet and are only in the enqueued state? You can sort by 'updated' to get this result.
I think you can either add the requirement manually through code ( https://clear.ml/docs/latest/docs/references/sdk/task#taskadd_requirements ) or force the agent to use the requirements.txt when running remotely
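The manual route would look roughly like this (note it has to be called before Task.init(); the package name is just an example):
```python
from clearml import Task

# Must be called before Task.init() so the requirement is recorded on the task
Task.add_requirements("torch")
task = Task.init(project_name="examples", task_name="manual requirements")
```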
When you generate new credentials in the GUI, it comes up with a section to copy and paste into either clearml-init or ~/clearml.conf. I want the files server displayed here to be a GCP address
Regarding this - I think you should open a GitHub feature request, since there is currently no way to do this via the UI
MammothParrot39 , yes it is available. It's part of the Dataset module of clearml
Now try logging in
Hi @<1639074542859063296:profile|StunningSwallow12> , to answer your questions:
- Technically speaking the UI uses the API to do this, so you can generate it via the API. I would suggest opening dev tools (F12) and seeing what the web UI sends to the backend when you create the credentials through the UI.
- Yes. Simply point all clearml.conf files to the NAS instead of the files server (see the sketch below)
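Roughly, the relevant part of clearml.conf would look like this (the server URLs and the path are example values, adjust to your setup):
```
api {
    web_server: https://app.clear.ml
    api_server: https://api.clear.ml
    # point file storage at your NAS mount (or a bucket) instead of the default files server
    files_server: "file:///mnt/nas/clearml"
}
```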
GrievingTurkey78 , can you try disabling the cpu/gpu detection?
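If it helps, resource monitoring can be turned off at init time, something like this (assuming the standard Task.init() flow):
```python
from clearml import Task

task = Task.init(
    project_name="examples",
    task_name="no resource monitoring",
    auto_resource_monitoring=False,  # disables the CPU/GPU/memory monitoring thread
)
```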
Is it your own server installation or are you using the SaaS?
BoredPigeon26 , what do you mean in the server view?
It looks like it's failing to log in to the backend. Did you make sure you have valid credentials in clearml.conf?
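For reference, the api section of clearml.conf should look roughly like this (server URLs and keys are placeholders):
```
api {
    api_server: https://api.clear.ml
    web_server: https://app.clear.ml
    files_server: https://files.clear.ml
    credentials {
        access_key: "<YOUR_ACCESS_KEY>"
        secret_key: "<YOUR_SECRET_KEY>"
    }
}
```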
Hi TrickyFox41 , I think this issue is solved in 1.9.0; please update to the latest version of clearml
Hi @<1673501397007470592:profile|RelievedDuck3> , should be possible. What errors are you getting?
Is there a reason it requires pytorch? The script you provided has only clearml as a requirement
https://clear.ml/docs/latest/docs/references/sdk/dataset/#get_num_chunks
I think this might also be helpful. Look over the functions available in the documentation, I think you might find what you're looking for 🙂
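For example (the dataset ID is a placeholder):
```python
from clearml import Dataset

ds = Dataset.get(dataset_id="<dataset_id>")
print(ds.get_num_chunks())
```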
From my understanding, ClearML uses the Apache-2.0 license, so it depends on whether that covers your use case or not
I see. It appears to be an open bug. As a workaround for now, you can go into 'Projects' -> 'All Experiments' and then search for the ID with the search bar at the top right (magnifying glass icon)