Hi @<1523701070390366208:profile|CostlyOstrich36> I ran this with the repo and script arguments, and it seems that the package installation reverted to cloning the repo:
Executing task id [task]:
repository = my_repo
branch = my_branch
version_num =
tag =
docker_cmd = test
entry_point = clearml_pipeline/scheduler.py
working_dir = .
The docker image is correctly identified here, but I still face the installation errors I had before, which I had hoped to circumvent by using the image.
And if so, does that mean the Dockerfile isn't necessary?
Thanks @<1523701070390366208:profile|CostlyOstrich36> , what would be the intended use case of the docker option?
I was using it because I have some packages I'd like to install from a private repo within a poetry environment, so I found it easier to containerize this setup process, since some authentication was necessary.
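For context, the kind of image I build beforehand looks roughly like this. It's only a sketch: the `private` repo name, index credentials, and Python version are placeholders, not my actual setup.

```dockerfile
# Sketch of the pre-built image; the repo name "private" and the
# PRIVATE_REPO_TOKEN build arg are placeholders for my actual credentials.
FROM python:3.10-slim

# Install poetry itself
RUN pip install --no-cache-dir poetry

WORKDIR /app
COPY pyproject.toml poetry.lock ./

# Authenticate against the private package repo at build time,
# then install the locked dependencies
ARG PRIVATE_REPO_TOKEN
RUN poetry config http-basic.private __token__ "${PRIVATE_REPO_TOKEN}" && \
    poetry install --no-root
```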
I thought the docker option meant I could simply run the task using nothing but the docker image.
Thanks @<1523701070390366208:profile|CostlyOstrich36> , so then I must still reference the repo and script?
This is the command I'm using:
clearml-agent build --id ${ID} --docker --target new-docker --entry-point clone_task --cpu-only
Is there a way to get this to skip over cached venvs and instead create a new env?
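In case it's relevant: my understanding (which may be wrong) is that the agent's venv cache is controlled from `clearml.conf`, so commenting out the cache path should force a fresh environment on every run. This is my reading of the default config file, so treat the exact keys as an assumption:

```
agent {
    venvs_cache: {
        # Commenting out the path should disable venv caching,
        # so the agent builds a fresh environment each run
        # path: ~/.clearml/venvs-cache
        max_entries: 10
        free_space_threshold_gb: 2.0
    }
}
```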
Sorry, I'm still new to how this environment works; would this then work if I deployed it from, say, an EC2 instance rather than my machine?
Hi @<1627478122452488192:profile|AdorableDeer85> were you able to resolve this? I also faced the same challenge with remote execution.
Hi @<1523701070390366208:profile|CostlyOstrich36> yes, this is what I'd want to do. Would this be the right way to do it?
Yes @<1523701087100473344:profile|SuccessfulKoala55> , a self-deployed server.