Sorry, I am a noob and not sure how I can do that, but happy to help if I can
so it looks like the server is there (docker ps) and I can see the artifacts (Web UI), but I’m not sure where things are stored; as per the documentation there should be a /mnt/fileserver, but there isn’t (?)
@<1523701435869433856:profile|SmugDolphin23> I had the same issue uploading a torch model. Thank you for being a life 🛟
Hey @<1593051292383580160:profile|SoreSparrow36> I am trying to test that if I delete a project the S3 storage gets also deleted. But I am not sure this is even a good assumption as I haven’t found anywhere what the expected/default behaviour is. Do you happen to know anything about this? Thanks.
Thanks @<1523701205467926528:profile|AgitatedDove14> happy to PR on the docs 😉
Hi @<1523701087100473344:profile|SuccessfulKoala55>, thanks for your response. What I mean is that in the Web UI, when you create a project, there is a storage (S3) field at the bottom of the create-project pop-up, where you enter the S3 bucket you want to associate with the project. Now, the thing is, as far as I can tell, you can’t visualize that information anywhere in the UI after the project is created. So, it would be great to be able to see the configured bucket somewhere in...
Will this work?
task.connect(OmegaConf.to_object(cfg))
assuming cfg is my Hydra dict
I can’t see anything under /mnt so no fileserver there (?)
I see this in the docker-compose.yml file:
fileserver:
  networks:
    - backend
    - frontend
  command:
    - fileserver
  container_name: clearml-fileserver
  image: allegroai/clearml:1.12.1-397
  environment:
    CLEARML__fileserver__delete__allow_batch: "true"
  restart: unless-stopped
  volumes:
    - /opt/clearml/logs:/var/log/clearml
    - /opt/clearml/data/fileserver:/mnt/fileserver
    - /opt/clearml/config:/opt/clearml/config
  ports:
    - "8081:...
I am not a docker expert, but am I correct to say that here ‘/mnt/fileserver’ is the container path rather than the source (host) path?
so on the host, the fileserver data should be under /opt/clearml/data/fileserver
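To illustrate the host-vs-container distinction (a generic docker-compose sketch, nothing ClearML-specific): each volume entry is a host_path:container_path pair, so /mnt/fileserver only exists inside the container, while the files actually live on the host side of the mapping.

```python
# A docker-compose volume entry has the form "host_path:container_path"
volume = "/opt/clearml/data/fileserver:/mnt/fileserver"

# Split at the first colon to separate the two sides of the mapping
host_path, container_path = volume.split(":", 1)

print(host_path)        # where the artifacts live on the host
print(container_path)   # where the same files appear inside the container
```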
Hi @<1523701205467926528:profile|AgitatedDove14> , I see _allow_omegaconf_edit_ under HYPERPARAMETERS > Hydra
I just ran a dummy experiment logging images, plots, etc and I can see them in my server’s Web UI.
3fdcf5db64d allegroai/clearml:1.12.1-397 “/opt/clearml/wrappe…” 10 days ago Up 9 minutes 8008/tcp, 8080/tcp, 0.0.0.0:8081->8081/tcp, :::8081->8081/tcp clearml-fileserver
@<1547028031053238272:profile|MassiveGoldfish6> check this:
- Does your local clearml.conf use use_credentials_chain: true?
- Do you have the needed AWS credentials in your local environment?
- Do you have an S3 bucket as the storage for your project (did you set this up when you created the project)?
- Do your local AWS credentials give you write access to that S3 bucket?
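For anyone following along, this is roughly what the relevant clearml.conf section looks like (a sketch based on the option named above; check the docs for your ClearML version for the exact layout):

```
sdk {
    aws {
        s3 {
            # Let boto3 resolve credentials from the environment,
            # shared credentials file, or instance profile
            use_credentials_chain: true
        }
    }
}
```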
@<1523701205467926528:profile|AgitatedDove14> None
Hi @<1523701205467926528:profile|AgitatedDove14> thanks for your reply. It seems this is an issue with torch 2.0.1, because it does not install the needed CUDA dependencies:
Adding this info here, in case anyone here has this issue. It looks like switching to torch 2.0.0 fixes the issue. I will update here after I test that. Thanks again 🙏
Thanks Martin. This is the first step out of many…
This is what I see:
Hi @<1523701087100473344:profile|SuccessfulKoala55> it’s failing again… I haven’t rebooted the agent or changed anything, and I am able to connect over SSH with ssh -vT git@github.com in a different tmux session.
This is the error I am seeing running the agent with the -debug flag:
Using cached repository in "/home/ubuntu/.clearml/vcs-cache/clearml-tutorial.git.e1c2351b09f3d661b6f0dbf85e92be2e/clearml-tutorial.git"
git@github.com: Permission denied (pub...
Hey @<1523701087100473344:profile|SuccessfulKoala55> it just worked. Maybe there was some GitHub refresh delay… not sure, but thanks anyway for the debug suggestion. 👍
but from a terminal I can do:
ubuntu@***:~/sw/clearml-tutorial$ git fetch --all --recurse-submodules
Fetching origin
and it works
@<1523701087100473344:profile|SuccessfulKoala55> I changed my agent to poetry mode and it worked like magic. Thanks Jake!
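For context, switching the agent to poetry mode is done in the agent’s clearml.conf (a sketch; the relevant key is agent.package_manager.type):

```
agent {
    package_manager {
        # Resolve the environment from the repo's poetry.lock /
        # pyproject.toml instead of pip requirements
        type: poetry
    }
}
```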

